TOK Friends: Group items tagged "job"

Javier E

What Is Wrong with the West's Economies? by Edmund S. Phelps | The New York Review of Books - 1 views

  • What is wrong with the economies of the West—and with economics?
  • With little or no effective policy initiative giving a lift to the less advantaged, the jarring market forces of the past four decades—mainly the slowdowns in productivity that have spread over the West and, of course, globalization, which has moved much low-wage manufacturing to Asia—have proceeded, unopposed, to drag down both employment and wage rates at the low end. The setback has cost the less advantaged not only a loss of income but also a loss of what economists call inclusion—access to jobs offering work and pay that provide self-respect.
  • The classical idea of political economy has been to let wage rates sink to whatever level the market takes them, and then provide everyone with the “safety net” of a “negative income tax,” unemployment insurance, and free food, shelter, clothing, and medical care
  • ...32 more annotations...
  • This failing in the West’s economies is also a failing of economics
  • many people have long felt the desire to do something with their lives besides consuming goods and having leisure. They desire to participate in a community in which they can interact and develop.
  • Our prevailing political economy is blind to the very concept of inclusion; it does not map out any remedy for the deficiency
  • injustice of another sort. Workers in decent jobs view the economy as unjust if they or their children have virtually no chance of climbing to a higher rung in the socioeconomic ladder
  • “Money is like blood. You need it to live but it isn’t the point of life.”4
  • justice is not everything that people need from their economy. They need an economy that is good as well as just. And for some decades, the Western economies have fallen short of any conception of a “good economy”—an economy offering a “good life,” or a life of “richness,” as some humanists call it
  • The good life as it is popularly conceived typically involves acquiring mastery in one’s work, thus gaining for oneself better terms—or means to rewards, whether material, like wealth, or nonmaterial—an experience we may call “prospering.”
  • As humanists and philosophers have conceived it, the good life involves using one’s imagination, exercising one’s creativity, taking fascinating journeys into the unknown, and acting on the world—an experience I call “flourishing.”
  • prospering and flourishing became prevalent in the nineteenth century when, in Europe and America, economies emerged with the dynamism to generate their own innovation.
  • What is the mechanism of the slowdown in productivity
  • prospering
  • In nineteenth-century Britain and America, and later Germany and France, a culture of exploration, experimentation, and ultimately innovation grew out of the individualism of the Renaissance, the vitalism of the Baroque era, and the expressionism of the Romantic period.
  • What made innovating so powerful in these economies was that it was not limited to elites. It permeated society from the less advantaged parts of the population on up.
  • High-enough wages, low-enough unemployment, and wide-enough access to engaging work are necessary for a “good-enough” economy—though far from sufficient. The material possibilities of the economy must be adequate for the nonmaterial possibilities to be widespread—the satisfactions of prospering and of flourishing through adventurous, creative, and even imaginative work.
  • today’s standard economics. This economics, despite its sophistication in some respects, makes no room for economies in which people are imagining new products and using their creativity to build them. What is most fundamentally “wrong with economics” is that it takes such an economy to be the norm—to be “as good as it gets.”
  • Since around 1970, or earlier in some cases, most of the continental Western European economies have come to resemble more completely the mechanical model of standard economics. Most companies are highly efficient. Households, apart from the very low-paid or unemployed, have gone on saving
  • In most of Western Europe, economic dynamism is now at lows not seen, I would judge, since the advent of dynamism in the nineteenth century. Imagining and creating new products has almost disappeared from the continent
  • The bleak levels of both unemployment and job satisfaction in Europe are testimony to its dreary economies.
  • a recent survey of household attitudes found that, in “happiness,” the median scores in Spain (54), France (51), Italy (48), and Greece (37) are all below those in the upper half of the nations labeled “emerging”—Mexico (79), Venezuela (74), Brazil (73), Argentina (66), Vietnam (64), Colombia (64), China (59), Indonesia (58), Chile (58), and Malaysia (56)
  • The US economy is not much better. Two economists, Stanley Fischer and Assar Lindbeck, wrote of a “Great Productivity Slowdown,” which they saw as beginning in the late 1960s.11 The slowdown in the growth of output relative to capital and labor combined—what is called “total factor productivity”—is stark
  • though the injustices in the West’s economies are egregious, they ought not to be seen as a major cause of the productivity slowdowns and globalization. (For one thing, a slowdown of productivity started in the US in the mid-1960s and the sharp loss of manufacturing jobs to poorer countries occurred much later—from the late 1970s to the early 1990s.) Deeper causes must be at work.
  • The plausible explanation of the syndrome in America—the productivity slowdown and the decline of job satisfaction, among other things—is a critical loss of indigenous innovation in the established industries like traditional manufacturing and services that was not nearly offset by the innovation that flowered in a few new industries
  • What then caused this narrowing of innovation? No single explanation is persuasive. Yet two classes of explanations have the ring of truth. One points to suppression of innovation by vested interests
  • some professions, such as those in education and medicine, have instituted regulation and licensing to curb experimentation and change, thus dampening innovation
  • established corporations—their owners and stakeholders—and entire industries, using their lobbyists, have obtained regulations and patents that make it harder for new firms to gain entry into the market and to compete with incumbents.
  • The second explanation points to a new repression of potential innovators by families and schools. As the corporatist values of control, solidarity, and protection are invoked to prohibit innovation, traditional values of conservatism and materialism are often invoked to inhibit a young person from undertaking an innovation.
  • How might Western nations gain—or regain—widespread prospering and flourishing? Taking concrete actions will not help much without fresh thinking: people must first grasp that standard economics is not a guide to flourishing—it is a tool only for efficiency.
  • Widespread flourishing in a nation requires an economy energized by its own homegrown innovation from the grassroots on up. For such innovation a nation must possess the dynamism to imagine and create the new—economic freedoms are not sufficient. And dynamism needs to be nourished with strong human values.
  • a reform of education stands out. The problem here is not a perceived mismatch between skills taught and skills in demand
  • The problem is that young people are not taught to see the economy as a place where participants may imagine new things, where entrepreneurs may want to build them and investors may venture to back some of them. It is essential to educate young people to this image of the economy.
  • It will also be essential that high schools and colleges expose students to the human values expressed in the masterpieces of Western literature, so that young people will want to seek economies offering imaginative and creative careers. Education systems must put students in touch with the humanities in order to fuel the human desire to conceive the new and perchance to achieve innovations
  • This reorientation of general education will have to be supported by a similar reorientation of economic education.
maxwellokolo

Italy's biggest bank to slash 14,000 jobs and raise nearly $14 billion - 0 views

  •  
    The sweeping overhaul announced Tuesday by UniCredit SpA will push the total number of expected job losses at the Milan-based bank to 14,000 by 2019. That's about 10% of its employees. The job cuts will reduce costs by €1.1 billion ($1.2 billion).
sissij

Trump Administration Orders Tougher Screening of Visa Applicants - The New York Times - 0 views

  •  
    Since Trump took the presidency, immigration policy has become harsher and stricter. Many people are facing a choice: stay or leave. There are a lot of foreign students who want to stay in the United States and find jobs after graduation. However, it is getting harder for them to get a chance, as the time they have to find a job is very limited and companies have to weigh whether they want to hire foreign employees and go through those complicated processes. Although Trump cites security as the reason, I think the policy is overly strict. --Sissi (3/24/2017)
sissij

Home Inspectors on Their Weirdest Discoveries - The New York Times - 0 views

  • When a home is sold, its many secrets can come out of the closet. Brokers, potential buyers and home inspectors step inside properties that may have been completely private for years.
  • Sometimes, owners hide flaws in the hopes a buyer will miss an expensive problem. Other times, homeowners are caught completely unaware that, say, a family of raccoons has taken up residence in the chimney.
  • The buyer, who was supposed to put down a large deposit that afternoon, was livid. The seller’s broker tried to assure her that the problem could be easily fixed.
  •  
    I found this article very interesting, as it talks about the job of home inspectors. Home inspectors are the middlemen between seller and buyer, making sure that the home's problems and its pricing are transparent and clear to both sides. It reminded me of the rating agencies we talked about in economics. In the film "Inside Job," the rating agencies were supposed to give good guidance and create transparency between sellers and buyers. However, some of them failed to give honest advice to clients, and this lack of information was one of the reasons for the collapse of the economy. Every market needs a responsible middleman to operate efficiently. --Sissi (3/24/2017)
Duncan H

The White Underclass - NYTimes.com - 0 views

  • Persistent poverty is America’s great moral challenge, but it’s far more than that.
  • As a practical matter, we can’t solve educational problems, health care costs, government spending or economic competitiveness so long as a chunk of our population is locked in an underclass. Historically, “underclass” has often been considered to be a euphemism for race, but increasingly it includes elements of the white working class as well.
  • Liberals sometimes feel that it is narrow-minded to favor traditional marriage. Over time, my reporting on poverty has led me to disagree: Solid marriages have a huge beneficial impact on the lives of the poor (more so than in the lives of the middle class, who have more cushion when things go wrong).
  • ...3 more annotations...
  • I fear we’re facing a crisis in which a chunk of working-class America risks being calcified into an underclass, marked by drugs, despair, family decline, high incarceration rates and a diminishing role of jobs and education as escalators of upward mobility. We need a national conversation about these dimensions of poverty, and maybe Murray can help trigger it. I fear that liberals are too quick to think of inequality as basically about taxes. Yes, our tax system is a disgrace, but poverty is so much deeper and more complex than that.
  • to blame liberal social policies for the pathologies he examines. Yes, I’ve seen disability programs encourage some people to drop out of the labor force. But there were far greater forces at work, such as the decline in good union jobs.
  • Eighty percent of the people in my high school cohort dropped out or didn’t pursue college because it used to be possible to earn a solid living at the steel mill, the glove factory or sawmill. That’s what their parents had done. But the glove factory closed, working-class jobs collapsed and unskilled laborers found themselves competing with immigrants. There aren’t ideal solutions, but some evidence suggests that we need more social policy, not less. Early childhood education can support kids being raised by struggling single parents. Treating drug offenders is far cheaper than incarcerating them. A new study finds that a jobs program for newly released prison inmates left them 22 percent less likely to be convicted of another crime. This initiative, by the Center for Employment Opportunities, more than paid for itself: each $1 brought up to $3.85 in benefits.
  •  
    What should we do about this?
Javier E

What If Everybody Didn't Have to Work to Get Paid? - The Atlantic - 0 views

  • Santens, for his part, believes that job growth is no longer keeping pace with automation, and he sees a government-provided income as a viable remedy. “It’s not just a matter of needing basic income in the future; we need it now,” says Santens, who lives in New Orleans. “People don’t see it, but we are already seeing the effects all around us, in the jobs and pay we take, the hours we accept, the extremes inequality is reaching, and in the loss of consumer spending power.”
  • People in other countries, especially in safety-net-friendly Europe, seem more open to the idea of a basic income than people in the U.S. The Swiss are considering a basic income proposal. Most of the candidates in Finland’s upcoming parliamentary elections support the idea
  • the stories told by the winners are inspiring. For example, one recipient is using his newfound freedom to write his dissertation. Another winner quit his job at a call center to study and become a teacher. Perhaps one anonymous commentator summed it up best: “I did not realize how unfree we all are.”
  • ...2 more annotations...
  • But in the U.S., the issue is still a political non-starter for mainstream politicians, due to lingering suspicions about the fairness and practicality of a basic income, as well as a rejection of the premise that automation is actually erasing white-collar jobs.
  • “The sad reality is that a lot of the people who will most need a basic income are not likely to generate a lot of sympathy among volunteer donors,” Ford says. “You see this already with charitable giving—people will give for families, children, and pets—but not so much for single homeless men.” Ford cautions against what he calls the “libertarian/techno-optimistic fantasy” of a private market solution. “Government, for all its deficiencies, is going to be the only real tool in the toolbox here.”
Javier E

Technology's Man Problem - NYTimes.com - 0 views

  • computer engineering, the most innovative sector of the economy, remains behind. Many women who want to be engineers encounter a field where they not only are significantly underrepresented but also feel pushed away.
  • Among the women who join the field, 56 percent leave by midcareer, a startling attrition rate that is double that for men, according to research from the Harvard Business School.
  • A culprit, many people in the field say, is a sexist, alpha-male culture that can make women and other people who don’t fit the mold feel unwelcome, demeaned or even endangered.
  • ...12 more annotations...
  • “I’ve been a programmer for 13 years, and I’ve always been one of the only women and queer people in the room. I’ve been harassed, I’ve had people make suggestive comments to me, I’ve had people basically dismiss my expertise. I’ve gotten rape and death threats just for speaking out about this stuff.”
  • “We see these stories, ‘Why aren’t there more women in computer science and engineering?’ and there’s all these complicated answers like, ‘School advisers don’t have them take math and physics,’ and it’s probably true,” said Lauren Weinstein, a man who has spent his four-decade career in tech working mostly with other men, and is currently a consultant for Google. “But I think there’s probably a simpler reason,” he said, “which is these guys are just jerks, and women know it.”
  • once programming gained prestige, women were pushed out. Over the decades, the share of women in computing has continued to decline. In 2012, just 18 percent of computer-science college graduates were women, down from 37 percent in 1985, according to the National Center for Women & Information Technology.
  • Some 1.2 million computing jobs will be available in 2022, yet United States universities are producing only 39 percent of the graduates needed to fill them, the N.C.W.I.T. estimates.
  • Twenty percent of software developers are women, according to the Labor Department, and fewer than 6 percent of engineers are black or Hispanic. Comparatively, 56 percent of people in business and financial-operations jobs are women, as are 36 percent of physicians and surgeons and one-third of lawyers.
  • an engineer at Pinterest has collected data from people at 133 start-ups and found that an average of 12 percent of the engineers are women.
  • “It makes a hostile environment for me,” she said. “But I don’t want to raise my hand and call negative attention toward myself, and become the woman who is the problem — ‘that woman.’ In start-up culture they protect their own tribe, so by putting my hand up, I’m saying I’m an ‘other,’ I shouldn’t be there, so for me that’s an economic threat.”
  • “Many women have come to me and said they basically have had to hide on the Net now,” said Mr. Weinstein, who works on issues of identity and anonymity online. “They use male names, they don’t put their real photos up, because they are immediately targeted and harassed.”
  • “It’s a boys’ club, and you have to try to get into it, and they’re trying as hard as they can to prove you can’t,” said Ephrat Bitton, the director of algorithms at FutureAdvisor, an online investment start-up that she says has a better culture because almost half the engineers are women.
  • Writing code is a high-pressure job with little room for error, as are many jobs. But coding can be stressful in a different way, women interviewed for this article said, because code reviews — peer reviews to spot mistakes in software — can quickly devolve.
  • “Code reviews are brutal — ‘Mine is better than yours, I see flaws in yours’ — and they should be, for the creation of good software,” said Ellen Ullman, a software engineer and author. “I think when you add a drop of women into it, it just exacerbates the problem, because here’s a kind of foreigner.”
  • But some women argue that these kinds of initiatives are unhelpful. “My general issue with the coverage of women in tech is that women in the technology press are talked about in the context of being women, and men are talked about in the context of being in technology,” said a technical woman who would speak only on condition of anonymity because she did not want to be part of an article about women in tech.
charlottedonoho

Weekend Roundup: Preparing to Be Disrupted | Nathan Gardels - 0 views

  • Discussion around the theme "prepare to be disrupted" ranged from how the emergent sharing economy, along with 3D desktop manufacturing, would take work back into the home to worries that automation could eliminate as much as 47 percent of current jobs in the United States.
  • In The WorldPost, Ian Goldin of the Oxford Martin School writes that technological advance can lead to greater inequality or inclusive prosperity depending on how we govern ourselves.
  • Speaking at the London conference, MIT's Andrew McAfee argues that digital technology is "the best economic news in human history" but says that it poses many challenges to job creation in the future.
Javier E

Ivy League Schools Are Overrated. Send Your Kids Elsewhere. | New Republic - 1 views

  • a blizzard of admissions jargon that I had to pick up on the fly. “Good rig”: the transcript exhibits a good degree of academic rigor. “Ed level 1”: parents have an educational level no higher than high school, indicating a genuine hardship case. “MUSD”: a musician in the highest category of promise. Kids who had five or six items on their list of extracurriculars—the “brag”—were already in trouble, because that wasn’t nearly enough.
  • With so many accomplished applicants to choose from, we were looking for kids with something special, “PQs”—personal qualities—that were often revealed by the letters or essays. Kids who only had the numbers and the résumé were usually rejected: “no spark,” “not a team-builder,” “this is pretty much in the middle of the fairway for us.” One young person, who had piled up a truly insane quantity of extracurriculars and who submitted nine letters of recommendation, was felt to be “too intense.”
  • On the other hand, the numbers and the résumé were clearly indispensable. I’d been told that successful applicants could either be “well-rounded” or “pointy”—outstanding in one particular way—but if they were pointy, they had to be really pointy: a musician whose audition tape had impressed the music department, a scientist who had won a national award.
  • ...52 more annotations...
  • When I speak of elite education, I mean prestigious institutions like Harvard or Stanford or Williams as well as the larger universe of second-tier selective schools, but I also mean everything that leads up to and away from them—the private and affluent public high schools; the ever-growing industry of tutors and consultants and test-prep courses; the admissions process itself, squatting like a dragon at the entrance to adulthood; the brand-name graduate schools and employment opportunities that come after the B.A.; and the parents and communities, largely upper-middle class, who push their children into the maw of this machine.
  • Our system of elite education manufactures young people who are smart and talented and driven, yes, but also anxious, timid, and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.
  • “Super People,” the writer James Atlas has called them—the stereotypical ultra-high-achieving elite college students of today. A double major, a sport, a musical instrument, a couple of foreign languages, service work in distant corners of the globe, a few hobbies thrown in for good measure: They have mastered them all, and with a serene self-assurance
  • Like so many kids today, I went off to college like a sleepwalker. You chose the most prestigious place that let you in; up ahead were vaguely understood objectives: status, wealth—“success.” What it meant to actually get an education and why you might want one—all this was off the table.
  • It was only after 24 years in the Ivy League—college and a Ph.D. at Columbia, ten years on the faculty at Yale—that I started to think about what this system does to kids and how they can escape from it, what it does to our society and how we can dismantle it.
  • I taught many wonderful young people during my years in the Ivy League—bright, thoughtful, creative kids whom it was a pleasure to talk with and learn from. But most of them seemed content to color within the lines that their education had marked out for them. Very few were passionate about ideas. Very few saw college as part of a larger project of intellectual discovery and development. Everyone dressed as if they were ready to be interviewed at a moment’s notice.
  • Look beneath the façade of seamless well-adjustment, and what you often find are toxic levels of fear, anxiety, and depression, of emptiness and aimlessness and isolation. A large-scale survey of college freshmen recently found that self-reports of emotional well-being have fallen to their lowest level in the study’s 25-year history.
  • So extreme are the admission standards now that kids who manage to get into elite colleges have, by definition, never experienced anything but success. The prospect of not being successful terrifies them, disorients them. The cost of falling short, even temporarily, becomes not merely practical, but existential. The result is a violent aversion to risk.
  • There are exceptions, kids who insist, against all odds, on trying to get a real education. But their experience tends to make them feel like freaks. One student told me that a friend of hers had left Yale because she found the school “stifling to the parts of yourself that you’d call a soul.”
  • What no one seems to ask is what the “return” is supposed to be. Is it just about earning more money? Is the only purpose of an education to enable you to get a job? What, in short, is college for?
  • The first thing that college is for is to teach you to think.
  • College is an opportunity to stand outside the world for a few years, between the orthodoxy of your family and the exigencies of career, and contemplate things from a distance.
  • it is only through the act of establishing communication between the mind and the heart, the mind and experience, that you become an individual, a unique being—a soul. The job of college is to assist you to begin to do that. Books, ideas, works of art and thought, the pressure of the minds around you that are looking for their own answers in their own ways.
  • College is not the only chance to learn to think, but it is the best. One thing is certain: If you haven’t started by the time you finish your B.A., there’s little likelihood you’ll do it later. That is why an undergraduate experience devoted exclusively to career preparation is four years largely wasted.
  • Elite schools like to boast that they teach their students how to think, but all they mean is that they train them in the analytic and rhetorical skills that are necessary for success in business and the professions.
  • Everything is technocratic—the development of expertise—and everything is ultimately justified in technocratic terms.
  • Religious colleges—even obscure, regional schools that no one has ever heard of on the coasts—often do a much better job in that respect.
  • At least the classes at elite schools are academically rigorous, demanding on their own terms, no? Not necessarily. In the sciences, usually; in other disciplines, not so much
  • professors and students have largely entered into what one observer called a “nonaggression pact.”
  • higher marks for shoddier work.
  • today’s young people appear to be more socially engaged than kids have been for several decades and that they are more apt to harbor creative or entrepreneurial impulses
  • they tend to be played out within the same narrow conception of what constitutes a valid life: affluence, credentials, prestige.
  • Experience itself has been reduced to instrumental function, via the college essay. From learning to commodify your experiences for the application, the next step has been to seek out experiences in order to have them to commodify
  • there is now a thriving sector devoted to producing essay-ready summers
  • To be a high-achieving student is to constantly be urged to think of yourself as a future leader of society.
  • what these institutions mean by leadership is nothing more than getting to the top. Making partner at a major law firm or becoming a chief executive, climbing the greasy pole of whatever hierarchy you decide to attach yourself to. I don’t think it occurs to the people in charge of elite colleges that the concept of leadership ought to have a higher meaning, or, really, any meaning.
  • The irony is that elite students are told that they can be whatever they want, but most of them end up choosing to be one of a few very similar things
  • As of 2010, about a third of graduates went into finance or consulting at a number of top schools, including Harvard, Princeton, and Cornell.
  • Whole fields have disappeared from view: the clergy, the military, electoral politics, even academia itself, for the most part, including basic science
  • It’s considered glamorous to drop out of a selective college if you want to become the next Mark Zuckerberg, but ludicrous to stay in to become a social worker. “What Wall Street figured out,” as Ezra Klein has put it, “is that colleges are producing a large number of very smart, completely confused graduates. Kids who have ample mental horsepower, an incredible work ethic and no idea what to do next.”
  • It almost feels ridiculous to have to insist that colleges like Harvard are bastions of privilege, where the rich send their children to learn to walk, talk, and think like the rich. Don’t we already know this? They aren’t called elite colleges for nothing. But apparently we like pretending otherwise. We live in a meritocracy, after all.
  • Visit any elite campus across our great nation, and you can thrill to the heart-warming spectacle of the children of white businesspeople and professionals studying and playing alongside the children of black, Asian, and Latino businesspeople and professionals
  • That doesn’t mean there aren’t a few exceptions, but that is all they are. In fact, the group that is most disadvantaged by our current admissions policies are working-class and rural whites, who are hardly present
  • The college admissions game is not primarily about the lower and middle classes seeking to rise, or even about the upper-middle class attempting to maintain its position. It is about determining the exact hierarchy of status within the upper-middle class itself.
  • This system is exacerbating inequality, retarding social mobility, perpetuating privilege, and creating an elite that is isolated from the society that it’s supposed to lead. The numbers are undeniable. In 1985, 46 percent of incoming freshmen at the 250 most selective colleges came from the top quarter of the income distribution. By 2000, it was 55 percent
  • The major reason for the trend is clear. Not increasing tuition, though that is a factor, but the ever-growing cost of manufacturing children who are fit to compete in the college admissions game
  • Wealthy families start buying their children’s way into elite colleges almost from the moment they are born: music lessons, sports equipment, foreign travel (“enrichment” programs, to use the all-too-perfect term)—most important, of course, private-school tuition or the costs of living in a place with top-tier public schools.
  • Is there anything that I can do, a lot of young people have written to ask me, to avoid becoming an out-of-touch, entitled little shit? I don’t have a satisfying answer, short of telling them to transfer to a public university. You cannot cogitate your way to sympathy with people of different backgrounds, still less to knowledge of them. You need to interact with them directly, and it has to be on an equal footing
  • Elite private colleges will never allow their students’ economic profile to mirror that of society as a whole. They can’t afford to—they need a critical mass of full payers and they need to tend to their donor base—and it’s not even clear that they’d want to.
  • Elite colleges are not just powerless to reverse the movement toward a more unequal society; their policies actively promote it.
  • The SAT is supposed to measure aptitude, but what it actually measures is parental income, which it tracks quite closely
  • U.S. News and World Report supplies the percentage of freshmen at each college who finished in the highest 10 percent of their high school class. Among the top 20 universities, the number is usually above 90 percent. I’d be wary of attending schools like that. Students determine the level of classroom discussion; they shape your values and expectations, for good and ill. It’s partly because of the students that I’d warn kids away from the Ivies and their ilk. Kids at less prestigious schools are apt to be more interesting, more curious, more open, and far less entitled and competitive.
  • The best option of all may be the second-tier—not second-rate—colleges, like Reed, Kenyon, Wesleyan, Sewanee, Mount Holyoke, and others. Instead of trying to compete with Harvard and Yale, these schools have retained their allegiance to real educational values.
  • Not being an entitled little shit is an admirable goal. But in the end, the deeper issue is the situation that makes it so hard to be anything else. The time has come, not simply to reform that system top to bottom, but to plot our exit to another kind of society altogether.
  • The education system has to act to mitigate the class system, not reproduce it. Affirmative action should be based on class instead of race, a change that many have been advocating for years. Preferences for legacies and athletes ought to be discarded. SAT scores should be weighted to account for socioeconomic factors. Colleges should put an end to résumé-stuffing by imposing a limit on the number of extracurriculars that kids can list on their applications. They ought to place more value on the kind of service jobs that lower-income students often take in high school and that high achievers almost never do. They should refuse to be impressed by any opportunity that was enabled by parental wealth
  • More broadly, they need to rethink their conception of merit. If schools are going to train a better class of leaders than the ones we have today, they’re going to have to ask themselves what kinds of qualities they need to promote. Selecting students by GPA or the number of extracurriculars more often benefits the faithful drudge than the original mind.
  • reforming the admissions process. That might address the problem of mediocrity, but it won’t address the greater one of inequality
  • The problem is the Ivy League itself. We have contracted the training of our leadership class to a set of private institutions. However much they claim to act for the common good, they will always place their interests first.
  • I’ve come to see that what we really need is to create one where you don’t have to go to the Ivy League, or any private college, to get a first-rate education.
  • High-quality public education, financed with public money, for the benefit of all
  • Everybody gets an equal chance to go as far as their hard work and talent will take them—you know, the American dream. Everyone who wants it gets to have the kind of mind-expanding, soul-enriching experience that a liberal arts education provides.
  • We recognize that free, quality K–12 education is a right of citizenship. We also need to recognize—as we once did and as many countries still do—that the same is true of higher education. We have tried aristocracy. We have tried meritocracy. Now it’s time to try democracy.
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • ...97 more annotations...
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free, and therefore I’m responsible for everything I do, a dizzying fact which causes an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating.
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’—the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
ilanaprincilus06

Retail Sales Fall For 3rd Straight Month : NPR - 0 views

  • Restaurants and bars are reeling from persistent spikes of coronavirus cases and related restrictions in their communities, driving retail spending in December down for the third month in a row.
  • Even as people continued to splurge on shopping online, they cut back on going out to eat and shop in person.
  • Gas stations saw the biggest jump in spending last month, up 6.6%, as people traveled for holiday visits despite health warnings.
  • ...8 more annotations...
  • People spent almost $790 billion on gifts and other purchases in the last two months of 2020
  • This growth, up 8.3% compared with 2019, was nearly double that seen in previous years.
  • Spending at restaurants and bars, meanwhile, was still down 21.2% in December compared to a year earlier,
  • This was a reflection of an unusual economic downturn. Even as millions lost jobs, Americans have continued to buy and renovate homes, splurging online on devices, workout gear and pricey purchases such as appliances and furniture that drove a lot of 2020 spending.
  • money that was no longer being spent on services freed up budgets to spend on goods
  • "We don't expect economic activity to return to pre-pandemic levels until late 2021 and employment at those levels won't return until well into 2022 and possibly 2023."
  • In December, leisure and hospitality businesses lost almost half a million jobs, most of them in eating and drinking establishments.
  • the U.S. has so far recovered less than 56% of the jobs that were lost last spring.
Javier E

The New History Wars - The Atlantic - 0 views

  • Critical historians who thought they were winning the fight for control within the academy now face dire retaliation from outside the academy. The dizzying turn from seeming triumph in 2020 to imminent threat in 2022 has unnerved many practitioners of the new history. Against this background, they did not welcome it when their association’s president suggested that maybe their opponents had a smidgen of a point.
  • a background reality of the humanities in the contemporary academy: a struggle over who is entitled to speak about what. Nowhere does this struggle rage more fiercely than in anything to do with the continent of Africa. Who should speak? What may be said? Who will be hired?
  • One obvious escape route from the generational divide in the academy—and the way the different approaches to history, presentist and antiquarian, tend to map onto it—is for some people, especially those on the older and whiter side of the divide, to keep their mouths shut about sensitive issues
  • ...15 more annotations...
  • The political and methodological stresses within the historical profession are intensified by economic troubles. For a long time, but especially since the economic crisis of 2008, university students have turned away from the humanities, preferring to major in fields that seem to offer more certain and lucrative employment. Consequently, academic jobs in the humanities and especially in history have become radically more precarious for younger faculty—even as universities have sought to meet diversity goals in their next-generation hiring by expanding offerings in history-adjacent specialties, such as gender and ethnic studies.
  • The result has produced a generational divide. Younger scholars feel oppressed and exploited by universities pressing them to do more labor for worse pay with less security than their elders; older scholars feel that overeager juniors are poised to pounce on the least infraction as an occasion to end an elder’s career and seize a job opening for themselves. Add racial difference as an accelerant, and what was intended as an interesting methodological discussion in a faculty newsletter can explode into a national culture war.
  • One of the greatest American Africanists was the late Philip Curtin. He wrote one of the first attempts to tally the exact number of persons trafficked by the transatlantic slave trade. Upon publication in 1972, his book was acclaimed as a truly pioneering work of history. By 1995, however, he was moved to protest against trends in the discipline at that time in an article in the Chronicle of Higher Education: “I am troubled by increasing evidence of the use of racial criteria in filling faculty posts in the field of African history … This form of intellectual apartheid has been around for several decades, but it appears to have become much more serious in the past few years, to the extent that white scholars trained in African history now have a hard time finding jobs.”
  • Much of academia is governed these days by a joke from the Soviet Union: “If you think it, don’t speak it. If you speak it, don’t write it. If you write it, don’t sign it. But if you do think it, speak it, write it, and sign it—don’t be surprised.”
  • Yet this silence has consequences, too. One of the most unsettling is the displacement of history by mythmaking
  • mythmaking is spreading from “just the movies” to more formal and institutional forms of public memory. If old heroes “must fall,” their disappearance opens voids for new heroes to be inserted in their place—and that insertion sometimes requires that new history be fabricated altogether, the “bad history” that Sweet tried to warn against.
  • If it is not the job of the president of the American Historical Association to confront those questions, then whose is it?
  • Sweet used a play on words—“Is History History?”—for the title of his complacency-shaking essay. But he was asking not whether history is finished, done with, but Is history still history? Is it continuing to do what history is supposed to do? Or is it being annexed for other purposes, ideological rather than historical ones?
  • Advocates of studying the more distant past to disturb and challenge our ideas about the present may accuse their academic rivals of “presentism.”
  • In real life, of course, almost everybody who cares about history believes in a little of each option. But how much of each? What’s the right balance? That’s the kind of thing that historians do argue about, and in the arguing, they have developed some dismissive labels for one another
  • Those who look to the more recent past to guide the future may accuse the other camp of “antiquarianism.”
  • The accusation of presentism hurts because it implies that the historian is sacrificing scholarly objectivity for ideological or political purposes. The accusation of antiquarianism stings because it implies that the historian is burrowing into the dust for no useful purpose at all.
  • In his mind, he was merely reopening one of the most familiar debates in professional history: the debate over why? What is the value of studying the past? To reduce the many available answers to a stark choice: Should we study the more distant past to explore its strangeness—and thereby jolt ourselves out of easy assumptions that the world we know is the only possible one?
  • Or should we study the more recent past to understand how our world came into being—and thereby learn some lessons for shaping the future?
  • The August edition of the association’s monthly magazine featured, as usual, a short essay by the association’s president, James H. Sweet, a professor at the University of Wisconsin at Madison. Within hours of its publication, an outrage volcano erupted on social media. A professor at Cornell vented about the author’s “white gaze.”
Javier E

'The Godfather of AI' Quits Google and Warns of Danger Ahead - The New York Times - 0 views

  • he officially joined a growing chorus of critics who say those companies are racing toward danger with their aggressive campaign to create products based on generative artificial intelligence, the technology that powers popular chatbots like ChatGPT.
  • Dr. Hinton said he has quit his job at Google, where he has worked for more than a decade and became one of the most respected voices in the field, so he can freely speak out about the risks of A.I. A part of him, he said, now regrets his life’s work.
  • “I console myself with the normal excuse: If I hadn’t done it, somebody else would have,”
  • ...24 more annotations...
  • Industry leaders believe the new A.I. systems could be as important as the introduction of the web browser in the early 1990s and could lead to breakthroughs in areas ranging from drug research to education.
  • But gnawing at many industry insiders is a fear that they are releasing something dangerous into the wild. Generative A.I. can already be a tool for misinformation. Soon, it could be a risk to jobs. Somewhere down the line, tech’s biggest worriers say, it could be a risk to humanity.
  • “It is hard to see how you can prevent the bad actors from using it for bad things,” Dr. Hinton said.
  • After the San Francisco start-up OpenAI released a new version of ChatGPT in March, more than 1,000 technology leaders and researchers signed an open letter calling for a six-month moratorium on the development of new systems because A.I. technologies pose “profound risks to society and humanity.”
  • Several days later, 19 current and former leaders of the Association for the Advancement of Artificial Intelligence, a 40-year-old academic society, released their own letter warning of the risks of A.I. That group included Eric Horvitz, chief scientific officer at Microsoft, which has deployed OpenAI’s technology across a wide range of products, including its Bing search engine.
  • Dr. Hinton, often called “the Godfather of A.I.,” did not sign either of those letters and said he did not want to publicly criticize Google or other companies until he had quit his job
  • Dr. Hinton, a 75-year-old British expatriate, is a lifelong academic whose career was driven by his personal convictions about the development and use of A.I. In 1972, as a graduate student at the University of Edinburgh, Dr. Hinton embraced an idea called a neural network. A neural network is a mathematical system that learns skills by analyzing data. At the time, few researchers believed in the idea. But it became his life’s work. [A toy code sketch illustrating this definition appears after this item’s annotations.]
  • Dr. Hinton is deeply opposed to the use of artificial intelligence on the battlefield — what he calls “robot soldiers.”
  • In 2012, Dr. Hinton and two of his students in Toronto, Ilya Sutskever and Alex Krizhevsky, built a neural network that could analyze thousands of photos and teach itself to identify common objects, such as flowers, dogs and cars.
  • In 2018, Dr. Hinton and two other longtime collaborators received the Turing Award, often called “the Nobel Prize of computing,” for their work on neural networks.
  • Around the same time, Google, OpenAI and other companies began building neural networks that learned from huge amounts of digital text. Dr. Hinton thought it was a powerful way for machines to understand and generate language, but it was inferior to the way humans handled language.
  • Then, last year, as Google and OpenAI built systems using much larger amounts of data, his view changed. He still believed the systems were inferior to the human brain in some ways but he thought they were eclipsing human intelligence in others.
  • “Maybe what is going on in these systems,” he said, “is actually a lot better than what is going on in the brain.”
  • As companies improve their A.I. systems, he believes, they become increasingly dangerous. “Look at how it was five years ago and how it is now,” he said of A.I. technology. “Take the difference and propagate it forwards. That’s scary.”
  • Until last year, he said, Google acted as a “proper steward” for the technology, careful not to release something that might cause harm. But now that Microsoft has augmented its Bing search engine with a chatbot — challenging Google’s core business — Google is racing to deploy the same kind of technology. The tech giants are locked in a competition that might be impossible to stop, Dr. Hinton said.
  • His immediate concern is that the internet will be flooded with false photos, videos and text, and the average person will “not be able to know what is true anymore.”
  • He is also worried that A.I. technologies will in time upend the job market. Today, chatbots like ChatGPT tend to complement human workers, but they could replace paralegals, personal assistants, translators and others who handle rote tasks. “It takes away the drudge work,” he said. “It might take away more than that.”
  • Down the road, he is worried that future versions of the technology pose a threat to humanity because they often learn unexpected behavior from the vast amounts of data they analyze. This becomes an issue, he said, as individuals and companies allow A.I. systems not only to generate their own computer code but actually run that code on their own.
  • And he fears a day when truly autonomous weapons — those killer robots — become reality.
  • “The idea that this stuff could actually get smarter than people — a few people believed that,” he said. “But most people thought it was way off. And I thought it was way off. I thought it was 30 to 50 years or even longer away. Obviously, I no longer think that.”
  • Many other experts, including many of his students and colleagues, say this threat is hypothetical. But Dr. Hinton believes that the race between Google and Microsoft and others will escalate into a global race that will not stop without some sort of global regulation.
  • But that may be impossible, he said. Unlike with nuclear weapons, he said, there is no way of knowing whether companies or countries are working on the technology in secret. The best hope is for the world’s leading scientists to collaborate on ways of controlling the technology. “I don’t think they should scale this up more until they have understood whether they can control it,” he said.
  • Dr. Hinton said that when people used to ask him how he could work on technology that was potentially dangerous, he would paraphrase Robert Oppenheimer, who led the U.S. effort to build the atomic bomb: “When you see something that is technically sweet, you go ahead and do it.”
  • He does not say that anymore.
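
A toy illustration of the neural-network definition flagged above — not any of the systems Dr. Hinton built; the network size, learning rate and task (learning XOR from four examples) are arbitrary choices for a minimal sketch in plain NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # four input examples
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros((1, 8))  # input -> hidden weights
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros((1, 1))  # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10_000):
    # forward pass: compute predictions from the current weights
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # backward pass: squared-error gradients, then a small weight adjustment
    grad_p = (p - y) * p * (1 - p)
    grad_h = (grad_p @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ grad_p
    b2 -= lr * grad_p.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ grad_h
    b1 -= lr * grad_h.sum(axis=0, keepdims=True)

# predictions after training; should approach [0, 1, 1, 0]
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel(), 2))
```

After a few thousand passes over the data the outputs approach the targets: the system has “learned a skill by analyzing data” in the smallest possible sense.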
Javier E

For Lee Tilghman, There Is Life After Influencing - The New York Times - 0 views

  • At her first full-time job since leaving influencing, the erstwhile smoothie-bowl virtuoso Lee Tilghman stunned a new co-worker with her enthusiasm for the 9-to-5 grind.
  • The co-worker pulled her aside that first morning, wanting to impress upon her the stakes of that decision. “This is terrible,” he told her. “Like, I’m at a desk.” “You don’t get it,” Ms. Tilghman remembered saying. “You think you’re a slave, but you’re not.” He had it backward, she added. “When you’re an influencer, then you have chains on.”
  • In the late 2010s, for a certain subset of millennial women, Ms. Tilghman was wellness culture, a warm-blooded mood board of Outdoor Voices workout sets, coconut oil and headstands. She had earned north of $300,000 a year — and then dropped more than 150,000 followers, her entire management team, and most of her savings to become an I.R.L. person.
  • ...8 more annotations...
  • The corporate gig, as a social media director for a tech platform, was a revelation. “I could just show up to work and do work,” Ms. Tilghman said. After she was done, she could leave. She didn’t have to be a brand. There’s no comments section at an office job.
  • In 2019, a Morning Consult report found that 54 percent of Gen Z and millennial Americans were interested in becoming influencers. (Eighty-six percent said they would be willing to post sponsored content for money.)
  • If social media has made audiences anxious, it’s driving creators to the brink. In 2021, the TikTok breakout star Charli D’Amelio said she had “lost the passion” for posting videos. A few months later, Erin Kern announced to her 600,000 Instagram followers that she would be deactivating her account @cottonstem; she had been losing her hair, and her doctors blamed work-induced stress
  • Other influencers faded without fanfare — teens whose mental health had taken too much of a hit and amateur influencers who stopped posting after an algorithm tweak tanked their metrics. Some had been at this for a decade or more, starting at 12 or 14 or 19.
  • She posted less, testing out new identities that she hoped wouldn’t touch off the same spiral that wellness had. There were dancing videos, dog photos, interior design. None of it stuck. (“You can change the niche, but you’re still going to be performing your life for content,” she explained over lunch.)
  • Ms. Tilghman’s problem — as the interest in the workshop, which she decided to cap at 15, demonstrated — is that she has an undeniable knack for this. In 2022, she started a Substack to continue writing, thinking of it as a calling card while she applied to editorial jobs; it soon amassed 20,000 subscribers. It once had a different name, but now it’s called “Offline Time.” The paid tier costs $5 a month.
  • Casey Lewis, who helms the After School newsletter about Gen Z consumer trends, predicts more pivots and exits. TikTok has elevated creators faster than other platforms and burned them out quicker, she said.
  • Ms. Lewis expects a swell of former influencers taking jobs with P.R. agencies, marketing firms and product development conglomerates. She pointed out that creators have experience not just in video and photo editing, but in image management, crisis communication and rapid response. “Those skills do transfer,” she said.
Javier E

The Obama legacy that can't be repealed - The Washington Post - 0 views

  • There is no mystery about Barack Obama’s greatest presidential achievement: He stopped the Great Recession from becoming the second Great Depression. True, he had plenty of help, including from his predecessor, George W. Bush, and from the top officials at the Treasury Department and Federal Reserve. But if Obama had made one wrong step, what was a crushing economic slump could have become something much worse.
  • It is Obama’s unfortunate fate that the high-water mark of his presidency occurred in the first months, when the world flirted with financial calamity. The prospect of another Great Depression — a long period of worsening economic decline — was not far-fetched.
  • In the first quarter of 2009, as Obama was moving into the White House, monthly job losses averaged 772,000. The ultimate decline in employment was 8.7 million jobs, or 6.3 percent. Housing prices and stock values were collapsing. From their peak in February 2007 to their low point, housing prices dropped 26 percent. Millions of homeowners were “underwater” — their houses were worth less than the mortgages on them. Stock prices fell roughly by half from August 2007 to March 2009.
  • ...7 more annotations...
  • There was no guarantee that the economy’s downward spiral wouldn’t continue, as frightened businesses and consumers curbed spending and, in the process, increased unemployment. The CEA presents a series of charts comparing the 2008-2009 slump with the Great Depression. In every instance, the 2008-2009 downturn was as bad as — or worse than — the first year of the Great Depression: employment loss, drop in global trade and change in households’ net worth.
  • The starkest of these was the fall in households’ net worth (people’s assets, such as homes and stock, minus their debts, such as mortgages and credit-card balances). It dropped by $13 trillion, about a fifth, from its high point in 2007 to its trough in 2009. This decline, the CEA notes, “was far larger than the reduction [adjusted for inflation] . . . at the onset of the Great Depression.”
  • What separates then from now is that, after 18 months or so, spending turned up in 2009 while it continued declining in the 1930s. This difference reflected, at least in part, the aggressive policies adopted to blunt the downturn. The Fed cut short-term interest rates to zero and provided other avenues of cheap credit; the Troubled Asset Relief Program (TARP), enacted in the final months of the Bush administration, poured money into major banks to reassure the public of their solvency.
  • Still, Obama’s role was crucial. Against opposition, he decided to rescue General Motors and Chrysler. Throwing them onto the tender mercies of the market would have been a huge blow to the industrial Midwest and to national psychology. He also championed a sizable budget “stimulus.” Advertised originally as $787 billion, it was actually $2.6 trillion over four years when the initial program is combined with later proposals and so-called “automatic stabilizers” are included, the CEA says
  • More generally, Obama projected reason and calm when much of the nation was fearful and frazzled. Of course, he didn’t single-handedly restore confidence, but he made a big contribution
  • the recovery from the Great Recession is mostly complete. This seems plausible. Since the low point, employment is up 15.6 million jobs. Rising home and stock prices have boosted inflation-adjusted household net worth by 16 percent. Gross domestic product — the economy — is nearly 12 percent higher than before the financial crisis
  • his impact is underestimated. Suppose we had had a second Great Depression with, say, peak unemployment of 15 percent. Almost all our problems — from poverty to political polarization — would have worsened. Obama’s influence must be considered in this context. When historians do, they may be more impressed.
Javier E

It's Not About You - NYTimes.com - 1 views

  • This year’s graduates are members of the most supervised generation in American history. Through their childhoods and teenage years, they have been monitored, tutored, coached and honed to an unprecedented degree.
  • they will confront amazingly diverse job markets, social landscapes and lifestyle niches. Most will spend a decade wandering from job to job and clique to clique, searching for a role
  • you see that many graduates are told to: Follow your passion, chart your own course, march to the beat of your own drummer, follow your dreams and find yourself. This is the litany of expressive individualism, which is still the dominant note in American culture.
  • ...7 more annotations...
  • this talk is of no help to the central business of adulthood, finding serious things to tie yourself down to. The successful young adult is beginning to make sacred commitments — to a spouse, a community and calling — yet mostly hears about freedom and autonomy.
  • very few people at age 22 or 24 can take an inward journey and come out having discovered a developed self.
  • Most successful young people don’t look inside and then plan a life. They look outside and find a problem, which summons their life.
  • Most people don’t form a self and then lead a life. They are called by a problem, and the self is constructed gradually by their calling.
  • The graduates are also told to pursue happiness and joy. But, of course, when you read a biography of someone you admire, it’s rarely the things that made them happy that compel your admiration. It’s the things they did to court unhappiness — the things they did that were arduous and miserable, which sometimes cost them friends and aroused hatred. It’s excellence, not happiness, that we admire most.
  • Today’s grads enter a cultural climate that preaches the self as the center of a life.
  • Most of us are egotistical and most are self-concerned most of the time, but it’s nonetheless true that life comes to a point only in those moments when the self dissolves into some task. The purpose in life is not to find yourself. It’s to lose yourself.
Javier E

ThinkUp Helps the Social Network User See the Online Self - NYTimes.com - 1 views

  • In addition to a list of people’s most-used words and other straightforward stats like follower counts, ThinkUp shows subscribers more unusual information such as how often they thank and congratulate people, how frequently they swear, whose voices they tend to amplify and which posts get the biggest reaction and from whom. [A hypothetical sketch of such tallies appears after this item’s annotations.]
  • after using ThinkUp for about six months, I’ve found it to be an indispensable guide to how I navigate social networks.
  • Every morning the service delivers an email packed with information, and in its weighty thoroughness, it reminds you that what you do on Twitter and Facebook can change your life, and other people’s lives, in important, sometimes unforeseen ways.
  • ...14 more annotations...
  • ThinkUp is something like Elf on the Shelf for digitally addled adults — a constant reminder that someone is watching you, and that you’re being judged.
  • “The goal is to make you act like less of a jerk online,” Ms. Trapani said. “The big goal is to create mindfulness and awareness, and also behavioral change.”
  • One of the biggest dangers is saying something off the cuff that might make sense in a particular context, but that sounds completely off the rails to the wider public. The problem, in other words, is acting without thinking — being caught up in the moment, without pausing to reflect on the long-term consequences. You’re never more than a few taps away from an embarrassment that might ruin your career, or at least your reputation, for years to come.
  • Because social networks often suggest a false sense of intimacy, they tend to lower people’s self-control.
  • Like a drug or perhaps a parasite, they worm into your devices, your daily habits and your every free moment, and they change how you think.
  • For those of us most deeply afflicted, myself included, every mundane observation becomes grist for a 140-character quip, and every interaction a potential springboard into an all-consuming, emotionally wrenching flame battle.
  • people often tweet and update without any perspective about themselves. That’s because Facebook and Twitter, as others have observed, have a way of infecting our brains.
  • getting a daily reminder from ThinkUp that there are good ways and bad ways to behave online — has a tendency to focus the mind.
  • More basically, though, it’s helped me pull back from social networks. Each week, ThinkUp tells me how often I’ve tweeted. Sometimes that number is terribly high — a few weeks ago it was more than 800 times — and I realize I’m probably overtaxing my followers
  • ThinkUp charges $5 a month for each social network you connect to it. Is it worth it? After all, there’s a better, more surefire way of avoiding any such long-term catastrophe caused by social media: Just stop using social networks.
  • The main issue constraining growth, the founders say, is that it has been difficult to explain to people why they might need ThinkUp.
  • your online profile plays an important role in how you’re perceived by potential employers. In a recent survey commissioned by the job-hunting site CareerBuilder, almost half of companies said they perused job-seekers’ social networking profiles to look for red flags and to see what sort of image prospective employees portrayed online.
  • even though “never tweet” became a popular, ironic thing to tweet this year, actually never tweeting, and never being on Facebook, is becoming nearly impossible for many people.
  • That may change as more people falter on social networks, either by posting unthinking comments that end up damaging their careers, or simply by annoying people to the point that their online presence becomes a hindrance to their real-life prospects.
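
To make the kind of tallies described in the first annotation above concrete, here is a hypothetical sketch — not ThinkUp’s actual code or API; the word lists, field names and sample posts are invented for illustration — of how a report along those lines could be computed from a list of posts:

```python
from collections import Counter
import re

# Stand-in word lists; a real tool would use far larger ones.
THANKS = {"thanks", "thank"}
CONGRATS = {"congrats", "congratulations"}
SWEARS = {"damn", "hell"}

def weekly_report(posts):
    """Tally post count, most-used words, thanks, congratulations and swearing."""
    words = [w for post in posts for w in re.findall(r"[a-z']+", post.lower())]
    counts = Counter(words)
    return {
        "posts": len(posts),
        "top_words": counts.most_common(3),
        "thank_yous": sum(counts[w] for w in THANKS),
        "congratulations": sum(counts[w] for w in CONGRATS),
        "swears": sum(counts[w] for w in SWEARS),
    }

sample = [
    "Thanks for reading, everyone!",
    "Congrats to the team on the launch.",
    "Damn, I tweeted 800 times this week.",
]
print(weekly_report(sample))
```

A real service would pull posts through each network’s API and use much richer word lists, but the reporting logic reduces to counting along these lines.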
sissij

The Gig Economy's False Promise - The New York Times - 0 views

  • Its digital technology lets workers become entrepreneurs, we are told, freed from the drudgery of 9-to-5 jobs.
  • In reality, there is no utopia at companies like Uber, Lyft, Instacart and Handy, whose workers are often manipulated into working long hours for low wages while continually chasing the next ride or task.
  • A recent story in The Times by Noam Scheiber vividly described how Uber and other companies use tactics developed by the video game industry to keep drivers on the road when they would prefer to call it a day, raising company revenue while lowering drivers’ per-hour earnings.
  • ...3 more annotations...
  • they do not qualify for basic protections like overtime pay and minimum wages.
  • independent contractors
  • many of which lose money and rely on investors to keep pouring in billions of dollars of capital, might find that it pays to treat workers better and even make some of them employees.
  • 
    As Uber and other innovative companies develop, new forms of jobs appear. The independent contractor is a new idea that has become very popular in recent years. I find that more and more people are tired of working regularly under the command of a boss. They want to take hold of their own time, so they would rather sign independent contracts and have more flexible working hours. However, this new form of economy is not yet mature, since laws and regulations do not yet cover independent contractors. --Sissi (4/10/2017)
Javier E

What Is College For? (Part 2) - NYTimes.com - 0 views

  • How, exactly, does college prepare students for the workplace? For most jobs, it provides basic intellectual skills: the ability to understand relatively complex instructions, to write and speak clearly and cogently, to evaluate options critically. Beyond these intellectual skills, earning a college degree shows that you have the “moral qualities” needed for most jobs: you have (to put it a bit cynically), for a period of four years and with relatively little supervision, deferred to authority, met deadlines and carried out difficult tasks even when you found them pointless and boring.
  • This sort of intellectual and moral training, however, does not require studying with experts doing cutting-edge work on, say, Homeric poetry, elementary particle theory or the philosophy of Kant. It does not, that is, require the immersion in the world of intellectual culture that a college faculty is designed to provide. It is, rather, the sort of training that ought to result from good elementary and high school education.
  • students graduating from high school should, to cite one plausible model, be able to read with understanding classic literature (from, say, Austen and Browning to Whitman and Hemingway) and write well-organized and grammatically sound essays; they should know the basic outlines of American and European history, have a good beginner’s grasp of at least two natural sciences as well as pre-calculus mathematics, along with a grounding in a foreign language.
  • ...4 more annotations...
  • Is it really possible to improve grade school and high school teaching to the level I’m suggesting? Yes, provided we employ the same sort of selection criteria for pre-college teachers as we do for other professionals such as doctors, lawyers and college professors. In contrast to other professions, teaching is not now the domain of the most successful students — quite the contrary. I’ve known many very bright students who had an initial interest in such teaching but soon realized that there is no comparison in terms of salary, prestige and working conditions.
  • Given this transformation in pre-college education, we could expect it to provide basic job-training for most students. At that point, we would still face a fundamental choice regarding higher education. We could see it as a highly restricted enterprise, educating only professionals who require advanced specialized skills. Correspondingly, only such professionals would have access to higher education as a locus of intellectual culture.
  • On the other hand, we could — as I would urge — see college as the entrée to intellectual culture for everyone who is capable of and interested in working at that level of intellectual engagement
  • Raising high school to the level I am proposing and opening college to everyone who will profit from it would be an expensive enterprise. We would need significant government support to ensure that all students receive an education commensurate with their abilities and aspirations, regardless of family resources. But the intellectual culture of our citizens should be a primary part of our national well-being, not just the predilection of an eccentric elite. As such, it should be among our highest priorities.