TOK Friends: Group items tagged "obsolete"

Javier E

Opinion | Will Translation Apps Make Learning Foreign Languages Obsolete? - The New Yor... - 0 views

  • In Europe, nine out of 10 students study a foreign language. In the United States, only one in five do. Between 1997 and 2008, the share of American middle schools offering foreign languages dropped from 75 percent to 58 percent. Between 2009 and 2013, one American college closed its foreign language program; between 2013 and 2017, 651 others did the same.
  • At first glance, these statistics look like a tragedy. But I am starting to harbor the odd opinion that maybe they are not. What is changing my mind is technology.
  • what about spoken language? I was in Belgium not long ago, and I watched various tourists from a variety of nations use instant speech translation apps to render their own languages into English and French. The newer ones can even reproduce the tone of the speaker’s voice; a leading model, iTranslate, publicizes that its Translator app has had 200 million downloads so far.
  • I don’t think these tools will ever render learning foreign languages completely obsolete. Real conversation in the flowing nuances of casual speech cannot be rendered by a program, at least not in a way that would convey full humanity.
  • even if it may fail at genuine, nuanced conversation — for now, at least — technology is eliminating most of the need to learn foreign languages for more utilitarian purposes.
  • The old-school language textbook scenarios, of people reserving hotel rooms or ordering meals in the language of the country they are visiting — “Greetings. Please bring me a glass of lemonade and a sandwich!” — will now be obsolete
  • to actively enjoy piecing together how other languages work is an individual quirk, not a human universal
  • Obsessive language learners have come to call themselves the polyglot community over the past couple of decades, and I am one of them, to an extent. As such, I know well how hard it can be to recognize that most human beings are numb to this peculiar desire.
  • Most human beings are interested much less in how they are saying things, and which language they are saying them in, than in what they are saying.
  • Learning to express this what — beyond the very basics — in another language is hard. It can be especially hard for us Anglophones, as speaking English works at least decently in so many places
  • To polyglots, foreign languages are Mount Everests daring us to climb them — a metaphor used by Hofstadter in his article. But to most people, they are just a barrier to get to the other side of.
  • After all, despite the sincere and admirable efforts of foreign language teachers nationwide, fewer than one in 100 American students become proficient in a language they learned in school.
  • Because I love trying to learn languages and am endlessly fascinated by their varieties and complexities, I am working hard to wrap my head around this new reality. With an iPhone handy and an appropriate app downloaded, foreign languages will no longer present most people with the barrier or challenge they once did
  • Learning to genuinely speak a new language will hardly be unknown. It will continue to beckon, for instance, for those actually relocating to a new country. And it will persist with people who want to engage with literature or media in the original language, as well as those of us who find pleasure in mastering these new codes just because they are “there.”
  • In other words, it will likely become an artisanal pursuit, of interest to a much smaller but more committed set of enthusiasts. And weird as that is, it is in its way a kind of progress.
Javier E

The End of Courtship? - NYTimes.com - 0 views

  • “The word ‘date’ should almost be stricken from the dictionary,” Ms. Silver said. “Dating culture has evolved to a cycle of text messages, each one requiring the code-breaking skills of a cold war spy to interpret.”
  • Raised in the age of so-called “hookup culture,” millennials — who are reaching an age where they are starting to think about settling down — are subverting the rules of courtship.
  • Instead of dinner-and-a-movie, which seems as obsolete as a rotary phone, they rendezvous over phone texts, Facebook posts, instant messages and other “non-dates” that are leaving a generation confused about how to land a boyfriend or girlfriend.
  • Blame the much-documented rise of the “hookup culture” among young people, characterized by spontaneous, commitment-free (and often, alcohol-fueled) romantic flings. Many students today have never been on a traditional date,
  • Hookups may be fine for college students, but what about after, when they start to build an adult life? The problem is that “young people today don’t know how to get out of hookup culture,”
  • In interviews with students, many graduating seniors did not know the first thing about the basic mechanics of a traditional date. “They’re wondering, ‘If you like someone, how would you walk up to them? What would you say? What words would you use?’ ”
  • Traditional courtship — picking up the telephone and asking someone on a date — required courage, strategic planning and a considerable investment of ego (by telephone, rejection stings). Not so with texting, e-mail, Twitter or other forms of “asynchronous communication,” as techies call it. In the context of dating, it removes much of the need for charm; it’s more like dropping a line in the water and hoping for a nibble.
  • Online dating services, which have gained mainstream acceptance, reinforce the hyper-casual approach by greatly expanding the number of potential dates. Faced with a never-ending stream of singles to choose from, many feel a sense of “FOMO” (fear of missing out), so they opt for a speed-dating approach — cycle through lots of suitors quickly.
  • “I’ve seen men put more effort into finding a movie to watch on Netflix Instant than composing a coherent message to ask a woman out,” said Anna Goldfarb, 34, an author and blogger in Moorestown, N.J. A typical, annoying query is the last-minute: “Is anything fun going on tonight?” More annoying still are the men who simply ping, “Hey” or “ ’sup.”
  • The mass-mailer approach necessitates “cost-cutting, going to bars, meeting for coffee the first time,” he added, “because you only want to invest in a mate you’re going to get more out of.”
  • in a world where “courtship” is quickly being redefined, women must recognize a flirtatious exchange of tweets, or a lingering glance at a company softball game, as legitimate opportunities for romance, too.
  • THERE’S another reason Web-enabled singles are rendering traditional dates obsolete. If the purpose of the first date was to learn about someone’s background, education, politics and cultural tastes, Google and Facebook have taken care of that.
  • Dodgy economic prospects facing millennials also help torpedo the old, formal dating rituals. Faced with a lingering recession, a stagnant job market, and mountains of student debt, many young people — particularly victims of the “mancession” — simply cannot afford to invest a fancy dinner or show in someone they may or may not click with.
  • “Maybe there’s still a sense of a man taking care of a woman, but our ideology is aligning with the reality of our finances,” Ms. Rosin said. As a man, you might “convince yourself that dating is passé, a relic of a paternalistic era, because you can’t afford to take a woman to a restaurant.”
  • “A lot of men in their 20s are reluctant to take the girl to the French restaurant, or buy them jewelry, because those steps tend to lead to ‘eventually, we’re going to get married,’ ” Mr. Edness, 27, said. In a tight economy, where everyone is grinding away to build a career, most men cannot fathom supporting a family until at least 30 or 35, he said.
  • Even in an era of ingrained ambivalence about gender roles, however, some women keep the old dating traditions alive by refusing to accept anything less. Cheryl Yeoh, a tech entrepreneur in San Francisco, said that she has been on many formal dates of late — plays, fancy restaurants. One suitor even presented her with red roses. For her, the old traditions are alive simply because she refuses to put up with anything less. She generally refuses to go on any date that is not set up a week in advance, involving a degree of forethought. “If he really wants you,” Ms. Yeoh, 29, said, “he has to put in some effort.”
Javier E

Opinion | A New Dark Age Looms - The New York Times - 0 views

  • IMAGINE a future in which humanity’s accumulated wisdom about Earth — our vast experience with weather trends, fish spawning and migration patterns, plant pollination and much more — turns increasingly obsolete. As each decade passes, knowledge of Earth’s past becomes progressively less effective as a guide to the future. Civilization enters a dark age in its practical understanding of our planet.
  • as Earth warms, our historical understanding will turn obsolete faster than we can replace it with new knowledge. Some patterns will change significantly; others will be largely unaffected, though it will be difficult to say what will change, by how much, and when.
  • Until then, farmers will struggle to reliably predict new seasonal patterns and regularly plant the wrong crops. Early signs of major drought will go unrecognized, so costly irrigation systems will be built in the wrong places. Disruptive societal impacts will be widespread.
  • Such a dark age is a growing possibility. In a recent report, the National Academies of Sciences, Engineering and Medicine concluded that human-caused global warming was already altering patterns of some extreme weather events
  • disrupting nature’s patterns could extend well beyond extreme weather, with far more pervasive impacts.
  • Our foundation of Earth knowledge, largely derived from historically observed patterns, has been central to society’s progress.
  • Science has accelerated this learning process through advanced observation methods and pattern discovery techniques. These allow us to anticipate the future with a consistency unimaginable to our ancestors
  • As Earth’s warming stabilizes, new patterns begin to appear. At first, they are confusing and hard to identify. Scientists note similarities to Earth’s emergence from the last ice age. These new patterns need many years — sometimes decades or more — to reveal themselves fully, even when monitored with our sophisticated observing systems
  • The list of possible disruptions is long and alarming. We could see changes to the prevalence of crop and human pests, like locust plagues set off by drought conditions; forest fire frequency; the dynamics of the predator-prey food chain; the identification and productivity of reliably arable land; and the predictability of agricultural output.
  • Historians of the next century will grasp the importance of this decline in our ability to predict the future. They may mark the coming decades of this century as the period during which humanity, despite rapid technological and scientific advances, achieved “peak knowledge” about the planet it occupies
  • The intermediate time period is our big challenge. Without substantial scientific breakthroughs, we will remain reliant on pattern-based methods for time periods between a month and a decade. The problem is, as the planet warms, these patterns will become increasingly difficult to discern.
  • The oceans, which play a major role in global weather patterns, will also see substantial changes as global temperatures rise. Ocean currents and circulation patterns evolve on time scales of decades and longer, and fisheries change in response. We lack reliable, physics-based models to tell us how this occurs
  • Civilization’s understanding of Earth has expanded enormously in recent decades, making humanity safer and more prosperous. As the patterns that we have come to expect are disrupted by warming temperatures, we will face huge challenges feeding a growing population and prospering within our planet’s finite resources. New developments in science offer our best hope for keeping up, but this is by no means guaranteed
  • Our grandchildren could grow up knowing less about the planet than we do today. This is not a legacy we want to leave them. Yet we are on the verge of ensuring this happens.
Javier E

A Million First Dates - Dan Slater - The Atlantic - 0 views

  • The positive aspects of online dating are clear: the Internet makes it easier for single people to meet other single people with whom they might be compatible, raising the bar for what they consider a good relationship. But what if online dating makes it too easy to meet someone new? What if it raises the bar for a good relationship too high? What if the prospect of finding an ever-more-compatible mate with the click of a mouse means a future of relationship instability, in which we keep chasing the elusive rabbit around the dating track?
  • the rise of online dating will mean an overall decrease in commitment.
  • “I often wonder whether matching you up with great people is getting so efficient, and the process so enjoyable, that marriage will become obsolete.”
  • “Historically,” says Greg Blatt, the CEO of Match.com’s parent company, “relationships have been billed as ‘hard’ because, historically, commitment has been the goal. You could say online dating is simply changing people’s ideas about whether commitment itself is a life value.” Mate scarcity also plays an important role in people’s relationship decisions. “Look, if I lived in Iowa, I’d be married with four children by now,” says Blatt, a 40‑something bachelor in Manhattan. “That’s just how it is.”
  • “I think divorce rates will increase as life in general becomes more real-time,” says Niccolò Formai, the head of social-media marketing at Badoo, a meeting-and-dating app with about 25 million active users worldwide. “Think about the evolution of other kinds of content on the Web—stock quotes, news. The goal has always been to make it faster. The same thing will happen with meeting. It’s exhilarating to connect with new people, not to mention beneficial for reasons having nothing to do with romance. You network for a job. You find a flatmate. Over time you’ll expect that constant flow. People always said that the need for stability would keep commitment alive. But that thinking was based on a world in which you didn’t meet that many people.”
  • “You could say online dating allows people to get into relationships, learn things, and ultimately make a better selection,” says Gonzaga. “But you could also easily see a world in which online dating leads to people leaving relationships the moment they’re not working—an overall weakening of commitment.”
  • Explaining the mentality of a typical dating-site executive, Justin Parfitt, a dating entrepreneur based in San Francisco, puts the matter bluntly: “They’re thinking, Let’s keep this fucker coming back to the site as often as we can.” For instance, long after their accounts become inactive on Match.com and some other sites, lapsed users receive notifications informing them that wonderful people are browsing their profiles and are eager to chat. “Most of our users are return customers,” says Match.com’s Blatt.
  • “The market is hugely more efficient … People expect to—and this will be increasingly the case over time—access people anywhere, anytime, based on complex search requests … Such a feeling of access affects our pursuit of love … the whole world (versus, say, the city we live in) will, increasingly, feel like the market for our partner(s). Our pickiness will probably increase.” “Above all, Internet dating has helped people of all ages realize that there’s no need to settle for a mediocre relationship.”
Javier E

How GOP Leaders Must Manage Their Political Lives in the Era of Donald Trump - The Atla... - 0 views

  • The day-in, day-out work of the politician is the management of electoral coalitions: coaxing, cajoling, compelling people to work together who—in the more natural course of things—might have nothing in common
  • Unlike writers and intellectuals, politicians don’t have the freedom to work only with people they like and admire. Unlike writers and intellectuals, they have no duty to speak aloud their inner convictions—their work would become impossible if they did.
  • Politics unfortunately abounds in shams that must be treated reverentially by every politician who would succeed. If you are the sort of man whose stomach revolts against treating shams reverentially, you will be well advised to stay out of politics altogether and set up as a prophet; your prophecies may perhaps sow good seed for some future harvest. But as a politician you would be impotent. For at any given time the bulk of your countrymen believe firmly and devoutly, not only in various things that are worthy of belief, but also in illusions of one kind and another; and they will never submit to have their affairs managed for them by one who appears not to share in their credulity.
  • More F.S. Oliver: Nothing in politics is sadder than the man of sterling character whose genius is so antipathetic to the particular emergency in which he finds himself as to stupefy his thoughts and paralyze his actions. He drifts to disaster, grappling blindfolded with forces which are beyond his comprehension, failing without really fighting.
  • Bad choices over the past decade by Republican political leaders opened the way to Donald Trump, yes. For a decade, Republican voters have signaled they wanted to protect Medicare, cut immigration, fight fewer wars, and nominate no more Bushes. Their party leaders interpreted those signals as demands to cut Medicare, increase immigration, put boots on the ground in Syria, and nominate another Bush.
  • Their task ahead, in the Biblical phrase, is to pluck the brands from the fire—rescue as much of their party as can be rescued—while simultaneously minimizing the damage to party and country by the nominee their rank-and-file has imposed on them. They need to maneuver so that Trump’s defeat is as solitary as possible, and so that he cannot shift the blame for the failure he has earned onto the heads of others
  • Trump’s taught Republican politicians that they’ve neglected the interests and values of their core supporters. He’s demonstrated that much of their party ideology is obsolete, and that their language no longer moves their voters. He’s proven that their party is less culturally conservative than they believed, less hostile to social insurance than they imagined, and more worried about the economic and social costs of mass migration than they realized. Those are valuable lessons that need to be absorbed and pondered.
  • He’s also demonstrated that he himself is a dangerous person
  • To save themselves and their country, Republican politicians will have to rediscover the politician’s arts of deftness, flexibility, and self-preservation—while stealthily hastening Trump toward the defeat that almost certainly awaits him in November.
  • That’s a big job and a hard job, all the harder because they cannot acknowledge what they are doing. They will seem to help Trump win, while actually working to ensure he fails.
  • What Walter Lippmann said of presidents is really true of all politicians: They are not “working through noble institutions to dear ends … but trying to grind out a few crude results from a decadent political machine.”
  • The harms they stop are more important than the good they cannot achieve. What they’re called upon to do is to practice statesmanship without fine phrases; to protect the republic without receiving any credit for it
Javier E

In This Snapchat Campaign, Election News Is Big and Then It's Gone - The New York Times - 1 views

  • Every modern presidential election is at least in part defined by the cool new media breakthrough of its moment.
  • In 2000, there was email, and by golly was that a big change from the fax. The campaigns could get their messages in front of print and cable news reporters — who could still dominate the campaign narrative — at will,
  • Then 2008: Facebook made it that much easier for campaigns to reach millions of people directly,
  • The 2004 campaign was the year of the “Web log,” or blog, when mainstream reporters and campaigns officially began losing any control they may have had over political news
  • The question this year has been whether 2016 will be the “Snapchat election,
  • Snapchat represents a change to something else: the longevity of news, how durably it keeps in our brain cells and our servers.
  • Snapchat is recording the here and the now, playing for today. Tomorrow will bring something new that renders today obsolete. It’s a digital Tibetan sand painting made in the image of the millennial mind.
  • Snapchat executives say they set up the app this way because this is what their tens of millions of younger users want; it’s how they live.
  • They can’t possibly have enough bandwidth to process all the incoming information and still dwell on what already was, can they?
  • Experienced strategists and their candidates, who could always work through their election plans methodically — promoting their candidacies one foot in front of the other, adjusting here and there for the unexpected — suddenly found that they couldn’t operate the way they always did.
  • Marco Rubio’s campaign marched into the election season ready to fight the usual news-cycle-by-news-cycle skirmishes. It was surprised to learn that, lo and behold, “There was no news cycle — everything was one big fire hose,” Alex Conant, a senior Rubio strategist, told me. “News was constantly breaking and at the end of the day hardly anything mattered. Things would happen; 24 hours later, everyone was talking about something else.”
  • Then there was Jeb Bush, expecting to press ahead by presenting what he saw as leading-edge policy proposals that would set off a prolonged back-and-forth. When Mr. Bush rolled out a fairly sweeping plan to upend the college loan system, the poor guy thought this was going to become a big thing.
  • It drew only modest coverage and was quickly buried by the latest bit from Donald Trump.
  • In this “hit refresh” political culture, damaging news does not have to stick around for long, either. The next development, good or bad, replaces it almost immediately.
  • Mr. Miller pointed to a recent episode in which Mr. Trump said a protester at a rally had “ties to ISIS,” after that protester charged the stage. No such ties existed. “He says ‘ISIS is attacking me’; this was debunked in eight minutes by Twitter,” Mr. Miller said. “Cable talked about it for three hours and it went away.”
  • “Hillary Clinton said that she was under sniper fire in Bosnia” — she wasn’t — “and that has stuck with her for 20 years,”
  • Mr. Trump has mastered this era of short attention spans in politics by realizing that if you’re the one regularly feeding the stream, you can forever move past your latest trouble, and hasten the mass amnesia.
  • It was with this in mind that The Washington Post ran an editorial late last week reminding its readers of some of Mr. Trump’s more outlandish statements and policy positions
  • The Post urged its readers to “remember” more than two dozen items from Mr. Trump’s record, including that he promised “to round up 11 million undocumented immigrants and deport them,” and “lied about President Obama’s birth certificate.”
  • as the media habits of the young drive everybody else’s, I’m reminded of that old saw about those who forget history. Now, what was I saying?
Javier E

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.
Javier E

Over the Side With Old Scientific Tenets - NYTimes.com - 0 views

  • Here are some concepts you might consider tossing out with the Christmas wrappings as you get started on the new year: human nature, cause and effect, the theory of everything, free will and evidence-based medicine.
  • Frank Wilczek of M.I.T., a Nobel Prize winner in physics, would retire the distinction between mind and matter, a bedrock notion, at least in the West, since the time of Descartes. We know a lot more about matter and atoms now, Dr. Wilczek says, and about the brain. Matter, he says, “can dance in intricate, dynamic patterns; it can exploit environmental resources, to self-organize and export entropy.”
Javier E

George Packer: Is Amazon Bad for Books? : The New Yorker - 0 views

  • Amazon is a global superstore, like Walmart. It’s also a hardware manufacturer, like Apple, and a utility, like Con Edison, and a video distributor, like Netflix, and a book publisher, like Random House, and a production studio, like Paramount, and a literary magazine, like The Paris Review, and a grocery deliverer, like FreshDirect, and someday it might be a package service, like U.P.S. Its founder and chief executive, Jeff Bezos, also owns a major newspaper, the Washington Post. All these streams and tributaries make Amazon something radically new in the history of American business
  • Amazon is not just the “Everything Store,” to quote the title of Brad Stone’s rich chronicle of Bezos and his company; it’s more like the Everything. What remains constant is ambition, and the search for new things to be ambitious about.
  • It wasn’t a love of books that led him to start an online bookstore. “It was totally based on the property of books as a product,” Shel Kaphan, Bezos’s former deputy, says. Books are easy to ship and hard to break, and there was a major distribution warehouse in Oregon. Crucially, there are far too many books, in and out of print, to sell even a fraction of them at a physical store. The vast selection made possible by the Internet gave Amazon its initial advantage, and a wedge into selling everything else.
  • it’s impossible to know for sure, but, according to one publisher’s estimate, book sales in the U.S. now make up no more than seven per cent of the company’s roughly seventy-five billion dollars in annual revenue.
  • A monopoly is dangerous because it concentrates so much economic power, but in the book business the prospect of a single owner of both the means of production and the modes of distribution is especially worrisome: it would give Amazon more control over the exchange of ideas than any company in U.S. history.
  • “The key to understanding Amazon is the hiring process,” one former employee said. “You’re not hired to do a particular job—you’re hired to be an Amazonian. Lots of managers had to take the Myers-Briggs personality tests. Eighty per cent of them came in two or three similar categories, and Bezos is the same: introverted, detail-oriented, engineer-type personality. Not musicians, designers, salesmen. The vast majority fall within the same personality type—people who graduate at the top of their class at M.I.T. and have no idea what to say to a woman in a bar.”
  • According to Marcus, Amazon executives considered publishing people “antediluvian losers with rotary phones and inventory systems designed in 1968 and warehouses full of crap.” Publishers kept no data on customers, making their bets on books a matter of instinct rather than metrics. They were full of inefficiencies, starting with overpriced Manhattan offices.
  • For a smaller house, Amazon’s total discount can go as high as sixty per cent, which cuts deeply into already slim profit margins. Because Amazon manages its inventory so well, it often buys books from small publishers with the understanding that it can’t return them, for an even deeper discount
  • According to one insider, around 2008—when the company was selling far more than books, and was making twenty billion dollars a year in revenue, more than the combined sales of all other American bookstores—Amazon began thinking of content as central to its business. Authors started to be considered among the company’s most important customers. By then, Amazon had lost much of the market in selling music and videos to Apple and Netflix, and its relations with publishers were deteriorating
  • In its drive for profitability, Amazon did not raise retail prices; it simply squeezed its suppliers harder, much as Walmart had done with manufacturers. Amazon demanded ever-larger co-op fees and better shipping terms; publishers knew that they would stop being favored by the site’s recommendation algorithms if they didn’t comply. Eventually, they all did.
  • Brad Stone describes one campaign to pressure the most vulnerable publishers for better terms: internally, it was known as the Gazelle Project, after Bezos suggested “that Amazon should approach these small publishers the way a cheetah would pursue a sickly gazelle.”
  • Without dropping co-op fees entirely, Amazon simplified its system: publishers were asked to hand over a percentage of their previous year’s sales on the site, as “marketing development funds.”
  • The figure keeps rising, though less for the giant pachyderms than for the sickly gazelles. According to the marketing executive, the larger houses, which used to pay two or three per cent of their net sales through Amazon, now relinquish five to seven per cent of gross sales, pushing Amazon’s percentage discount on books into the mid-fifties. Random House currently gives Amazon an effective discount of around fifty-three per cent.
  • In December, 1999, at the height of the dot-com mania, Time named Bezos its Person of the Year. “Amazon isn’t about technology or even commerce,” the breathless cover article announced. “Amazon is, like every other site on the Web, a content play.” Yet this was the moment, Marcus said, when “content” people were “on the way out.”
  • By 2010, Amazon controlled ninety per cent of the market in digital books—a dominance that almost no company, in any industry, could claim. Its prohibitively low prices warded off competition
  • In 2004, he set up a lab in Silicon Valley that would build Amazon’s first piece of consumer hardware: a device for reading digital books. According to Stone’s book, Bezos told the executive running the project, “Proceed as if your goal is to put everyone selling physical books out of a job.”
  • Lately, digital titles have levelled off at about thirty per cent of book sales.
  • The literary agent Andrew Wylie (whose firm represents me) says, “What Bezos wants is to drag the retail price down as low as he can get it—a dollar-ninety-nine, even ninety-nine cents. That’s the Apple play—‘What we want is traffic through our device, and we’ll do anything to get there.’ ” If customers grew used to paying just a few dollars for an e-book, how long before publishers would have to slash the cover price of all their titles?
  • As Apple and the publishers see it, the ruling ignored the context of the case: when the key events occurred, Amazon effectively had a monopoly in digital books and was selling them so cheaply that it resembled predatory pricing—a barrier to entry for potential competitors. Since then, Amazon’s share of the e-book market has dropped, levelling off at about sixty-five per cent, with the rest going largely to Apple and to Barnes & Noble, which sells the Nook e-reader. In other words, before the feds stepped in, the agency model introduced competition to the market
  • But the court’s decision reflected a trend in legal thinking among liberals and conservatives alike, going back to the seventies, that looks at antitrust cases from the perspective of consumers, not producers: what matters is lowering prices, even if that goal comes at the expense of competition. Barry Lynn, a market-policy expert at the New America Foundation, said, “It’s one of the main factors that’s led to massive consolidation.”
  • Publishers sometimes pass on this cost to authors, by redefining royalties as a percentage of the publisher’s receipts, not of the book’s list price. Recently, publishers say, Amazon began demanding an additional payment, amounting to approximately one per cent of net sales
  • brick-and-mortar retailers employ forty-seven people for every ten million dollars in revenue earned; Amazon employs fourteen.
  • Since the arrival of the Kindle, the tension between Amazon and the publishers has become an open battle. The conflict reflects not only business antagonism amid technological change but a division between the two coasts, with different cultural styles and a philosophical disagreement about what techies call “disruption.”
  • Bezos told Charlie Rose, “Amazon is not happening to bookselling. The future is happening to bookselling.”
  • In Grandinetti’s view, the Kindle “has helped the book business make a more orderly transition to a mixed print and digital world than perhaps any other medium.” Compared with people who work in music, movies, and newspapers, he said, authors are well positioned to thrive. The old print world of scarcity—with a limited number of publishers and editors selecting which manuscripts to publish, and a limited number of bookstores selecting which titles to carry—is yielding to a world of digital abundance. Grandinetti told me that, in these new circumstances, a publisher’s job “is to build a megaphone.”
  • it offers an extremely popular self-publishing platform. Authors become Amazon partners, earning up to seventy per cent in royalties, as opposed to the fifteen per cent that authors typically make on hardcovers. Bezos touts the biggest successes, such as Theresa Ragan, whose self-published thrillers and romances have been downloaded hundreds of thousands of times. But one survey found that half of all self-published authors make less than five hundred dollars a year.
  • The business term for all this clear-cutting is “disintermediation”: the elimination of the “gatekeepers,” as Bezos calls the professionals who get in the customer’s way. There’s a populist inflection to Amazon’s propaganda, an argument against élitist institutions and for “the democratization of the means of production”—a common line of thought in the West Coast tech world
  • “Book publishing is a very human business, and Amazon is driven by algorithms and scale,” Sargent told me. When a house gets behind a new book, “well over two hundred people are pushing your book all over the place, handing it to people, talking about it. A mass of humans, all in one place, generating tremendous energy—that’s the magic potion of publishing. . . . That’s pretty hard to replicate in Amazon’s publishing world, where they have hundreds of thousands of titles.”
  • By producing its own original work, Amazon can sell more devices and sign up more Prime members—a major source of revenue.
  • Like the publishing venture, Amazon Studios set out to make the old “gatekeepers”—in this case, Hollywood agents and executives—obsolete. “We let the data drive what to put in front of customers,” Carr told the Wall Street Journal. “We don’t have tastemakers deciding what our customers should read, listen to, and watch.”
  • book publishers have been consolidating for several decades, under the ownership of media conglomerates like News Corporation, which squeeze them for profits, or holding companies such as Rivergroup, which strip them to service debt. The effect of all this corporatization, as with the replacement of independent booksellers by superstores, has been to privilege the blockbuster.
  • The combination of ceaseless innovation and low-wage drudgery makes Amazon the epitome of a successful New Economy company. It’s hiring as fast as it can—nearly thirty thousand employees last year.
  • the long-term outlook is discouraging. This is partly because Americans don’t read as many books as they used to—they are too busy doing other things with their devices—but also because of the relentless downward pressure on prices that Amazon enforces.
  • The digital market is awash with millions of barely edited titles, most of it dreck.
  • Amazon believes that its approach encourages ever more people to tell their stories to ever more people, and turns writers into entrepreneurs; the price per unit might be cheap, but the higher number of units sold, and the accompanying royalties, will make authors wealthier
  • In Friedman’s view, selling digital books at low prices will democratize reading: “What do you want as an author—to sell books to as few people as possible for as much as possible, or for as little as possible to as many readers as possible?”
  • The real talent, the people who are writers because they happen to be really good at writing—they aren’t going to be able to afford to do it.”
  • Seven-figure bidding wars still break out over potential blockbusters, even though these battles often turn out to be follies. The quest for publishing profits in an economy of scarcity drives the money toward a few big books. So does the gradual disappearance of book reviewers and knowledgeable booksellers, whose enthusiasm might have rescued a book from drowning in obscurity. When consumers are overwhelmed with choices, some experts argue, they all tend to buy the same well-known thing.
  • These trends point toward what the literary agent called “the rich getting richer, the poor getting poorer.” A few brand names at the top, a mass of unwashed titles down below, the middle hollowed out: the book business in the age of Amazon mirrors the widening inequality of the broader economy.
  • “If they did, in my opinion they would save the industry. They’d lose thirty per cent of their sales, but they would have an additional thirty per cent for every copy they sold, because they’d be selling directly to consumers. The industry thinks of itself as Procter & Gamble. What gave publishers the idea that this was some big goddam business? It’s not—it’s a tiny little business, selling to a bunch of odd people who read.”
  • Bezos is right: gatekeepers are inherently élitist, and some of them have been weakened, in no small part, because of their complacency and short-term thinking. But gatekeepers are also barriers against the complete commercialization of ideas, allowing new talent the time to develop and learn to tell difficult truths. When the last gatekeeper but one is gone, will Amazon care whether a book is any good? ♦
Javier E

The Sweet Caress of Cyberspace - NYTimes.com - 0 views

  • “Her” imagines a society in which human beings are so thoroughly marginalized that they’re being edited out of courtship and companionship, because they’re superfluous, messy. It’s a love story as horror story. If we no longer need anyone in the passenger seat, do we need anyone at all?
  • It’s a parable of narcissism in the digital world, which lets you sprint to the foreground of everything, giving you an audience or the illusion of one.
  • But “Her” also traces the flip side of the coin — that with our amassed knowledge and scientific accomplishments, we may be succeeding in rendering ourselves obsolete.
  • I savored a few themes in particular. One is the Internet’s extreme indulgence of the seemingly innate human impulse to contrive a habitat that’s entirely unthreatening, an ego-stroking ecosystem, a sensibility-controlled comfort zone.
Sophia C

Thomas Kuhn: Revolution Against Scientific Realism* - 1 views

  • as such a complex system that nobody believed that it corresponded to the physical reality of the universe. Although the Ptolemaic system accounted for observations-"saved the appearances"-its epicycles and deferents were never intended to be anything more than a mathematical model to use in predicting the position of heavenly bodies. [3]
  • Galileo was told that he was free to continue his work with Copernican theory if he agreed that the theory did not describe physical reality but was merely one of the many potential mathematical models. [10] Galileo continued to work, and while he "formally claimed to prove nothing," [11] he passed his mathematical advances and his observational data to Newton, who would not only invent a new mathematics but would solve the remaining problems posed by Copernicus. [12]
  • Thus without pretending that his method could find the underlying causes of things such as gravity, Newton believed that his method produced theory, based upon empirical evidence, that was a close approximation of physical reality.
  • Medieval science was guided by "logical consistency."
  • The logical empiricist's conception of scientific progress was thus a continuous one; more comprehensive theory replaced compatible, older theory
  • Hempel also believed that science evolved in a continuous manner. New theory did not contradict past theory: "theory does not simply refute the earlier empirical generalizations in its field; rather, it shows that within a certain limited range defined by qualifying conditions, the generalizations hold true in fairly close approximation." [21]
  • New theory is more comprehensive; the old theory can be derived from the newer one and is "one special manifestation" [22] of the more comprehensive new theory.
  • The logical empiricist movement combined induction, based on empiricism, and deduction in the form of logic
  • It was the truth, and the prediction and control that came with it, that was the goal of logical-empirical science.
  • Each successive theory's explanation was closer to the truth than the theory before.
  • The notion of scientific realism held by Newton led to the evolutionary view of the progress of science
  • The entities and processes of theory were believed to exist in nature, and science should discover those entities and processes
  • Particularly disturbing discoveries were made in the area of atomic physics. For instance, Heisenberg's indeterminacy principle, according to historian of science Cecil Schneer, yielded the conclusion that "the world of nature is indeterminate."
  • "even the fundamental principle of causality fail[ed]."
  • It was not until the second half of the twentieth century that the preservers of the evolutionary idea of scientific progress, the logical empiricists, were seriously challenged
  • Kuhn proposed a revolutionary model of scientific change and examined the role of the scientific community in preventing and then accepting change. Kuhn's conception of scientific change occurring through revolutions undermined the traditional scientific goal, finding "truth" in nature
  • Textbooks inform scientists-to-be about this common body of knowledge and understanding.
  • for the world is too huge and complex to be explored randomly.
  • a scientist knows what facts are relevant and can build on past research
  • Normal science, as defined by Kuhn, is cumulative. New knowledge fills a gap of ignorance
  • "One standard product of the scientific enterprise is missing. Normal science does not aim at novelties of fact or theory and, when successful, finds none."
  • Normal science does, however, contain a mechanism that uncovers anomalies, inconsistencies within the paradigm.
  • eventually, details arise that are inconsistent with the current paradigm
  • These inconsistencies are eventually resolved or are ignored.
  • If they concern a topic of central importance, a crisis occurs and normal science comes to a halt
  • A crisis requires that the scientists re-examine the foundations of their science that they had been taking for granted
  • A new paradigm is accepted because it resolves the crisis better than the others, it offers promise for future research, and it is more aesthetic than its competitors. The reasons for converting to a new paradigm are never completely rational.
  • Unlike evolutionary science, in which new knowledge fills a gap of ignorance, in Kuhn's model new knowledge replaces incompatible knowledge.
  • Thus science is not a continuous or cumulative endeavor: when a paradigm shift occurs there is a revolution similar to a political revolution, with fundamental and pervasive changes in method and understanding. Each successive vision about the nature of the universe makes the past vision obsolete; predictions, though more precise, remain similar to the predictions of the past paradigm in their general orientation, but the new explanations do not accommodate the old
  • In a sense, we have circled back to the ancient and medieval practice of separating scientific theory from physical reality; both medieval scientists and Kuhn would agree that no theory corresponds to reality and therefore any number of theories might equally well explain a natural phenomenon. [36] Neither twentieth-century atomic theorists nor medieval astronomers are able to claim that their theories accurately describe physical phenomena. The inability to return to scientific realism suggests a tripartite division of the history of science, with a period of scientific realism fitting between two periods in which there is no insistence that theory correspond to reality. Although both scientific realism and the evolutionary idea of scientific progress appeal to common sense, both existed for only a few hundred years.
Javier E

Lots of Physicists Are Nervous About the Speed of Light - Technology - The Atlantic Wire - 0 views

  • Until subsequent experiments back up these results, it would be foolish, they say, to throw out a foundational theory of physics.
  • "The correct attitude is to ask oneself what went wrong."
  • "The implications could be huge. Particles that move faster than light are essentially moving backwards in time, which could make the phrase cause and effect obsolete."
  • "Modern physics is based on two theories, relativity and the quantum theory, so half of modern physics would have to be replaced by a new theory.
julia rhodes

Brainlike Computers, Learning From Experience - NYTimes.com - 0 views

  • Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head.
  • Not only can it automate tasks that now require painstaking programming — for example, moving a robot’s arm smoothly and efficiently — but it can also sidestep and even tolerate errors, potentially making the term “computer crash” obsolete.
  • The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information.
  • In coming years, the approach will make possible a new generation of artificial intelligence systems that will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control.
  • “We’re moving from engineering computing systems to something that has many of the characteristics of biological computing,” said Larry Smarr
  • The new approach, used in both hardware and software, is being driven by the explosion of scientific knowledge about the brain. Kwabena Boahen, a computer scientist who leads Stanford’s Brains in Silicon research program, said that is also its limitation, as scientists are far from fully understanding how brains function.
  • They are not “programmed.” Rather, the connections between the circuits are “weighted” according to correlations in data that the processor has already “learned.” Those weights are then altered as data flows into the chip, causing them to change their values and to “spike.” That generates a signal that travels to other components and, in reaction, changes the neural network, in essence programming the next actions much the same way that information alters human thoughts and actions. (A minimal code sketch of this mechanism follows these annotations.)
  • Traditional computers are also remarkably energy inefficient, especially when compared to actual brains, which the new neurons are built to mimic. I.B.M. announced last year that it had built a supercomputer simulation of the brain that encompassed roughly 10 billion neurons — more than 10 percent of a human brain. It ran about 1,500 times more slowly than an actual brain. Further, it required several megawatts of power, compared with just 20 watts of power used by the biological brain.
  • Running the program, known as Compass, which attempts to simulate a brain, at the speed of a human brain would require a flow of electricity in a conventional computer that is equivalent to what is needed to power both San Francisco and New York, Dr. Modha said.
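The weight-and-spike mechanism the annotations describe can be made concrete with a toy model. Below is a minimal leaky integrate-and-fire unit in Python; the class name, thresholds, and the Hebbian-style weight update are illustrative assumptions, not the actual designs of the IBM or Stanford chips the article covers.

```python
# Toy sketch only: weighted inputs accumulate until the unit "spikes",
# and weights are then nudged toward correlations in the data, so the
# unit is shaped by what flows through it rather than by explicit code.
import random

class SpikingUnit:
    def __init__(self, n_inputs, threshold=1.0, leak=0.9, lr=0.05):
        self.weights = [random.uniform(0.0, 0.5) for _ in range(n_inputs)]
        self.potential = 0.0        # accumulated "charge"
        self.threshold = threshold  # spike when potential crosses this
        self.leak = leak            # potential decays each step
        self.lr = lr                # learning rate for weight updates

    def step(self, inputs):
        # Leaky integration of the weighted inputs.
        self.potential = self.leak * self.potential + sum(
            w * x for w, x in zip(self.weights, inputs))
        if self.potential < self.threshold:
            return 0  # no spike this step
        # Spike: reset, then strengthen weights on the inputs that fired,
        # "programming" the unit from data instead of from instructions.
        self.potential = 0.0
        self.weights = [w + self.lr * x for w, x in zip(self.weights, inputs)]
        return 1

unit = SpikingUnit(n_inputs=3)
pattern = [1, 1, 0]  # a recurring correlation in the input data
for t in range(20):
    if unit.step(pattern):
        print(f"step {t}: spike; weights {[round(w, 2) for w in unit.weights]}")
```

Real neuromorphic hardware does this with analog or digital circuitry and far richer dynamics; the point of the sketch is only that behavior emerges from weighted connections and spikes rather than from a stored program.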
Javier E

How to Raise a University's Profile: Pricing and Packaging - NYTimes.com - 0 views

  • I talked to a half-dozen of Hugh Moren’s fellow students. A highly indebted senior who was terrified of the weak job market described George Washington, where he had invested considerable time getting and doing internships, as “the world’s most expensive trade school.” Another mentioned the abundance of rich students whose parents were giving them a fancy-sounding diploma the way they might a new car. There are serious students here, he acknowledged, but: “You can go to G.W. and essentially buy a degree.”
  • A recent study from the Organization for Economic Cooperation and Development found that, on average, American college graduates score well below college graduates from most other industrialized countries in mathematics. In literacy (“understanding, evaluating, using and engaging with written text”), scores are just average. This comes on the heels of Richard Arum and Josipa Roksa’s “Academically Adrift,” a study that found “limited or no learning” among many college students.
  • Instead of focusing on undergraduate learning, numerous colleges have been engaged in the kind of building spree I saw at George Washington. Recreation centers with world-class workout facilities and lazy rivers rise out of construction pits even as students and parents are handed staggeringly large tuition bills. Colleges compete to hire famous professors even as undergraduates wander through academic programs that often lack rigor or coherence. Campuses vie to become the next Harvard — or at least the next George Washington — while ignoring the growing cost and suspect quality of undergraduate education.
  • ...58 more annotations...
  • Mr. Trachtenberg understood the centrality of the university as a physical place. New structures were a visceral sign of progress. They told visitors, donors and civic leaders that the institution was, like beams and scaffolding rising from the earth, ascending. He added new programs, recruited more students, and followed the dictate of constant expansion.
  • the American research university had evolved into a complicated and somewhat peculiar organization. It was built to be all things to all people: to teach undergraduates, produce knowledge, socialize young men and women, train workers for jobs, anchor local economies, even put on weekend sports events. And excellence was defined by similarity to old, elite institutions. Universities were judged by the quality of their scholars, the size of their endowments, the beauty of their buildings and the test scores of their incoming students.
  • John Silber embarked on a huge building campaign while bringing luminaries like Saul Bellow and Elie Wiesel on board to teach and lend their prestige to the B.U. name, creating a bigger, more famous and much more costly institution. He had helped write a game plan for the aspiring college president.
  • GWU is, for all intents and purposes, a for-profit organization. Best example: study abroad. Its top program, a partnership with Sciences Po, costs each student (30 of them, on a program with 'prestige' status?) a full semester's tuition. It costs GW, according to the Sciences Po website, €1000. A neat $20,000 profit per student (who is digging her/himself deeper and deeper into debt). Moreover, the school takes a $500 admin fee for the study abroad application! With no guarantee that all credits transfer. Students often lose a partial semester; GW profits again. Nor does GW offer help with an antiquated, one-shot/no-transfers, tricky registration process. It's tough luck in gay Paris. Just one of many examples. Dorms with extreme mold; off-campus housing impossible for freshmen and sophomores. Required meal plan: Chick-fil-A, etc. Classes with over 300 students (required). This is not Harvard, but it costs the same. Emotional problems? Counselors too few. Suicides continue and are not appropriately addressed. Caring environment? Extension so-and-so, please hold. It's an impressive campus; I'm an alum. If you apply, make sure the DC experience is worth the price: the internships are good, as are a few colleges like the Elliott School, and post-grad. GWU uses undergrad $$ directly for building projects, like the medical center to which students have NO access. (The student health facility is underfunded, outsourced.) Outstanding professors still make a difference. But is that enough?
  • Mr. Trachtenberg, however, understood something crucial about the modern university. It had come to inhabit a market for luxury goods. People don’t buy Gucci bags merely for their beauty and functionality. They buy them because other people will know they can afford the price of purchase. The great virtue of a luxury good, from the manufacturer’s standpoint, isn’t just that people will pay extra money for the feeling associated with a name brand. It’s that the high price is, in and of itself, a crucial part of what people are buying.
  • Mr. Trachtenberg convinced people that George Washington was worth a lot more money by charging a lot more money. Unlike most college presidents, he was surprisingly candid about his strategy. College is like vodka, he liked to explain.
  • The Absolut Rolex plan worked. The number of applicants surged from some 6,000 to 20,000, the average SAT score of students rose by nearly 200 points, and the endowment jumped from $200 million to almost $1 billion.
  • The university became a magnet for the children of new money who didn’t quite have the SATs or family connections required for admission to Stanford or Yale. It also aggressively recruited international students, rich families from Asia and the Middle East who believed, as nearly everyone did, that American universities were the best in the world.
  • U.S. News & World Report now ranks the university at No. 54 nationwide, just outside the “first tier.”
  • The watch and vodka analogies are correct. Personally, I used car analogies when discussing college choices with my kids. We were in the fortunate position of being able to comfortably send our kids to any college in the country and have them leave debt free. Notwithstanding, I told them that they would be going to a state school unless they were able to get into one of about 40 schools that I felt, in whatever arbitrary manner I decided, were worth the extra cost. They both ended up going to state schools. College is by and large a commodity and you get out of it what you put into it. Both of my kids worked hard in college and were involved in school life. They both left the schools better people and the schools better schools for them being there. They are both now successful adults. I believe too many people look for the prestige of a named school, and that is not what college should be primarily about.
  • In 2013, only 14 percent of the university’s 10,000 undergraduates received a grant — a figure on a par with elite schools but far below the national average. The average undergraduate borrower leaves with about $30,800 in debt.
  • When I talk to the best high school students in my state I always stress the benefits of the honors college experience at an affordable public university. For students who won't qualify for a public honors college, the regular public university experience is far preferable to the huge debt of places like GW.
  • Carey would do well to look beyond high-ticket private universities (which after all are still private enterprises) and what he describes as the Olympian heights of higher education (which for some reason seems also to embitter him) and look at the system overall. The withdrawal of public support was never a policy choice; it was a political choice, "packaged and branded" as some tax-cutting palaver all wrapped up in the argument that a free market should decide how much college should cost and how many seats we need. In such an environment, trustees at private universities are no more solely responsible for turning their degrees into commodities than the administrations of state universities are for raising the number of out-of-state students in order to offset the loss of support from their legislatures. No doubt, we will hear more about market-based solutions and technology from Mr. Carey
  • I went to GW back in the 60s. It was affordable and it got me away from home in New York. While I was there, Newsweek famously published an article about the DC universities - GW, Georgetown, American and Catholic - dubbing them the Pony League, the schools for the children of wealthy middle-class New Yorkers who couldn't get into the Ivy League. Nobody really complained. But that wasn't me. I went because I wanted to be where the action was in the 60s, and as we used to say, "GW was literally a stone's throw from the White House. And we could prove it." Back then, the two biggest alumni names were Jackie Kennedy, who'd taken some classes there, and J. Edgar Hoover. Now, according to the glossy magazine they send me each month, it's the actress Kerry Washington. There's some sort of progress there, but I'm a GW alum and not properly trained to understand it.
  • This explains a lot of the modern, emerging mentality. It encompasses the culture of enforced grade inflation, cheating and anti-intellectualism in much of higher education. It is consistent with our culture of misleading statistics and information, cronyism and fake quality, the "best and the brightest" being only schemers and glad handers. The wisdom and creativity engendered by an honest, rigorous academic education are replaced by the disingenuous quick fix, the winner-take-all mentality that neglects the common good.
  • I attended nearby Georgetown University and graduated in 1985. Relative to state schools and elite schools, it was expensive then. I took out loans. I had Pell grants. I had work-study and GSL. I paid my debt of $15,000 off in ten years. Would I have done it differently? Yes: I would have continued on to graduate school and not worried about paying off those big loans right after college. My career worked out and I am grateful for the education I received and paid for. But I would not recommend to my nieces and nephews debts north of $100,000 for a BA in liberal arts. Go community. Then go state. Then punch your ticket to Harvard, Yale or Stanford — if you are good enough.
  • American universities appear to have more and more drifted away from educating individuals and citizens to becoming high-priced trade schools and purveyors of occupational licenses. Lost in the process is the concept of expanding a student's ability to appreciate broadly and deeply, as well as the belief that a republican democracy needs an educated citizenry, not a trained citizenry, to function well. Both the Heisman Trophy winner and the producer of a successful tech I.P.O. likely have much in common: a college education whose rewards are limited to the financial. I don't know if I find this more sad on the individual level or more worrisome for the future of America.
  • This is now a consumer world for everything, including institutions once thought to float above the Shakespearean briars of the work-a-day world such as higher education, law and medicine. Students get this. Parents get this. Everything is negotiable: financial aid, a spot in the nicest dorm, tix to the big game. But through all this, there are faculty - lots of 'em - who work away from the fluff to link the ambitions of the students with the reality and rigor of the 21st century. The job of the student is to get beyond the visible hype of the surroundings and find those faculty members. They will make sure your investment is worth it
  • My experience in managing or working with GW alumni in their 20's or 30's has not been good. Virtually all have been mentally lazy and/or had a stunning sense of entitlement. Basically they've been all talk and no results. That's been quite a contrast to the graduates from VA/MD state universities.
  • More and more, I notice what my debt-financed contributions to the revenue streams of my vendors earn them, not me. My banks earned enough to pay ridiculous bonuses to employees for reckless risk-taking. My satellite tv operator earned enough to overpay ESPN for sports programming that I never watch--and that, in turn, overpays these idiotic pro athletes and college sports administrators. My health insurer earned enough to defeat one-payor insurance; to enable the opaque, inefficient billing practices of hospitals and other providers; and to feed the behemoth pharmaceutical industry. My church earned enough to buy the silence of sex abuse victims and oppose progressive political candidates. And my govt earned enough to continue ag subsidies, inefficient defense spending, and obsolete transportation and energy policies.
  • As the parent of a GWU freshman I am grateful for every opportunity afforded her. She has a generous merit scholarship, is in the honors program with some small classes, and has access to internships that can be done while at school. GWU also gave her AP credits to advance her to sophomore status. Had she attended the state flagship school (where she was accepted into that exclusive honors program) she would have a great education but little else. It's not possible to do a foreign affairs-related internship far from D.C. or Manhattan. She went to a very competitive high school where, for the one or two Ivy League schools in which she was interested, she didn't have the same level of connections or wealth as many of her peers. Whether because of the Common Application or other factors, getting into a good school with financial help is difficult for a middle-class student like my daughter, who had a 4.0 GPA and 2300 on the SAT. She also worked after school. The bottom line - GWU offered more money than perceived "higher tier" universities, and brought tuition to almost that of our state school system. And by the way, I think she is also getting a very good education.
  • This article reinforces something I have learned during my daughter's college application process. Most students choose a school based on emotion (reputation) and not value. This luxury good analogy holds up.
  • The entire education problem can be solved by MOOCs, lots and lots of them, plus a few closely monitored tests and personal interviews with people. Of course, many, many people make MONEY off of our entirely inefficient way of "educating" -- are we even really doing that? -- and getting a degree does NOT mean one is actually educated
  • As a first-generation college graduate I entered GW ambitious but left saddled with debt, and crestfallen at the hard-hitting realization that my four undergraduate years were an aberration from what life is actually like post-college: not as simple as getting an [unpaid] internship with a fancy-titled institution, as most Colonials do. I knew how to get into college, but what do you do after the recess of life ends? I learned more about networking, resume plumping (designated responses to constituents...errr....replied to emails), and elevator pitches than actual theory, economic principles, strong writing skills, critical thinking, analysis, and philosophy. While it is relatively easy to get a job after graduating (for many with a GW degree this is sadly not the case), sustaining one and excelling in it is much harder. It's never enough just to be able to open a new door; you also need to be prepared to navigate your way through that next opportunity.
  • this is a very telling article. Aimless and directionless high school graduates are matched only by aimless and directionless institutes of higher learning. Each child and each parent should start with a goal - before handing over their hard earned tuition dollars, and/or leaving a trail of broken debt in the aftermath of a substandard, unfocused education.
  • it is no longer the most expensive university in America. It is the 46th. Others have been implementing the Absolut Rolex Plan. John Sexton turned New York University into a global higher-education player by selling the dream of downtown living to students raised on “Sex and the City.” Northeastern followed Boston University up the ladder. Under Steven B. Sample, the University of Southern California became a U.S. News top-25 university. Washington University in St. Louis did the same.
  • I currently attend GW, and I have to say, this article completely misrepresents the situation. I have yet to meet a single person who is paying the full $60k tuition - I myself am paying $30k, because the school gave me $30k in grants. As for the quality of education, Foreign Policy rated GW the #8 best school in the world for undergraduate education in international affairs, Princeton Review ranks it as one of the best schools for political science, and U.S. News ranks the law school #20. The author also ignores the role that an expanding research profile plays in growing a university's prestige and educational power.
  • And in hundreds of regional universities and community colleges, presidents and deans and department chairmen have watched this spectacle of ascension and said to themselves, “That could be me.” Agricultural schools and technical institutes are lobbying state legislatures for tuition increases and Ph.D. programs, fitness centers and arenas for sport. Presidents and boards are drawing up plans to raise tuition, recruit “better” students and add academic programs. They all want to go in one direction — up! — and they are all moving with a single vision of what they want to be.
  • this is the same playbook used by hospitals the past 30 years or so. It is how Hackensack Hospital became Hackensack Medical Center and McComb Hospital became Southwest Mississippi Regional Medical Center. No wonder the results have been the same in healthcare and higher education; both have priced themselves out of reach for average Americans.
  • a world where a college is rated not by the quality of its output, but instead by the quality of its inputs. A world where there is practically no work to be done by the administration because the college's reputation is made before the first class even begins! This is insanity! But this is the swill that the mammoth college marketing departments nationwide have shoved down America's throat. Colleges are ranked not by the quality of their graduates, but rather by the test scores of their incoming students!
  • The Pew Foundation has been doing surveys on what students learn, how much homework they do, how much time they spend with professors, etc. All good stuff to know before a student chooses a school. It is called the National Survey of Student Engagement (NSSE - called Nessy). It turns out that the higher-ranked schools do NOT allow their information to be released to the public. It is SECRET. Why do you think that is?
  • The article blames "the standard university organizational model left teaching responsibilities to autonomous academic departments and individual faculty members, each of which taught and tested in its own way." This is the view of someone who has never taught at a university, nor thought much about how education there actually happens. Once undergraduates get beyond the general requirements, their educations _have_ to depend on "autonomous departments" because only those departments know what the requirements for a given degree can be, and can grant the necessary accreditation of a given student. The idea that some administrator could know what's necessary for degrees in everything from engineering to fiction writing is nonsense, except that's what the people who only know the theory of education (but not its practice) actually seem to think. In the classroom itself, you have tremendously talented people, who nevertheless have their own particular strengths and approaches. Don't you think it's a good idea to let them do what they do best rather than trying to make everyone teach the same way? Don't you think supervision of young teachers by older colleagues, who actually know their field and its pedagogy, rather than by some administrator, who knows nothing of the subject, is a good idea?
  • it makes me very sad to see how expensive some public schools have become. Used to be you could work your way through a public school without loans, but not any more. Like you, I had the advantage of a largely-scholarship paid undergraduate education at a top private college. However, I was also offered a virtually free spot in my state university's (then new) honors college
  • My daughter attended a good community college for a couple of classes during her senior year of high school and I could immediately see how such places are laboratories for failure. They seem like high schools in atmosphere and appearance. Students rush in by car and rush out again when the class is over. The four-year residency college creates a completely different feel. On arrival, you get the sense that you are engaging in something important, something apart and one that will require your full attention. I don't say this is for everyone or that the model is not flawed in some ways (students actually only spend 2 1/2 yrs. on campus to get the four yr. degree). College is supposed to be a 60-hour-per-week job. Anything less than that and the student is shortchanging himself or herself
  • This. Is. STUNNING. I have always wondered, especially as my kids have approached college age, why American colleges have felt justified in raising tuition at a rate that has well exceeded inflation, year after year after year. (Nobody needs a dorm with luxury suites and a lazy river pool at college!) And as it turns out, they did it to become luxury brands. Just that simple. Incredible. I don't even blame this guy at GWU for doing what he did. He wasn't made responsible for all of American higher ed. But I do think we all need to realize what happened, and why. This is front page stuff.
  • I agree with you, but, unfortunately, given the choice between low tuition, primitive dorms, and no athletic center VS expensive & luxurious, the customers (and their parents) are choosing the latter. As long as this is the case, there is little incentive to provide bare-bones and cheap education.
  • Wesleyan University in CT is one school that is moving down the rankings. Syracuse University is another. Reed College is a third. Why? Because these schools try hard to stay out of the marketing game. (With its new president, Syracuse has jumped back into the game.) Bryn Mawr College, outside Philadelphia hasn't fared well over the past few decades in the rankings, which is true of practically every women's college. Wellesley is by far the highest ranked women's college, but even there the acceptance rate is significantly higher than one finds at comparable coed liberal arts colleges like Amherst & Williams. University of Chicago is another fascinating case for Mr. Carey to study (I'm sure he does in his forthcoming book, which I look forward to reading). Although it has always enjoyed an illustrious academic reputation, until recently Chicago's undergraduate reputation paled in comparison to peer institutions on the two coasts. A few years ago, Chicago changed its game plan to more closely resemble Harvard and Stanford in undergraduate amenities, and lo and behold, its rankings shot up. It was a very cynical move on the president's part to reassemble the football team, but it was a shrewd move because athletics draw more money than academics ever can (except at engineering schools like Cal Tech & MIT), and more money draws richer students from fancier secondary schools with higher test scores, which lead to higher rankings - and the beat goes on.
  • College INDUSTRY is out of control. Sorry, NYU, GW, BU are not worth the price. Are state schools any better? We have the University of Michigan, which is really not a state school, but a university that gives a discount to people who live in Michigan. Why? When you have an undergraduate body 40+% out-of-state that pays tuition of over $50K/year, you tell me? Perhaps the solution is two years of community college followed by two at places like U of M or Michigan State - get the same diploma at the end for much less and beat the system.
  • In one recent yr., the majority of undergrad professors at Harvard, according to Boston.com, were adjuncts. That means low pay, no benefits, no office, temp workers. Harvard. Easily available student loans fueled this arms race of amenities and frills in which colleges now engage. They moved the cost of education onto the backs of people, kids, who don't understand what they are doing. Students in colleges these days are customers, and the customers must be able to get through. If it requires dumbing things down, so be it. On top of tuition, G.W.U. is known by its students as the land of added fees on top of added fees. The joke around campus was that they would soon be installing pay toilets in the student union. No one was laughing.
  • You could have written the same story about my alma mater, American University. The place reeked of ambition and upward mobility decades ago and still does. Whoever's running it now must look at its measly half-billion-dollar endowment and compare it to GWU's $1.5 billion and seethe with envy, while GWU's president sets his sights on an Ivy League-size endowment. And both get back to their real jobs: 24/7 fundraising, which is what university presidents are all about these days. Money - including million-dollar salaries for themselves (GWU's president made more than Harvard's in 2011) - pride, cachet, power, a mansion, first-class all the way. They should just be honest about it and change their university's motto to Ostende mihi pecuniam! (please excuse my questionable Latin.) Whether the students are actually learning anything is up to them, I guess - if they do, it's thanks to the professors, adjuncts and the administrative staff, who do the actual work of educating and keep the school running.
  • When I was in HS (70s), many of my richer friends went to GW and I was then of the impression that GW was a 'good' school. As I age, I have come to realize that this place is just another façade to the emptiness that has become America. All too often we are faced with a dilemma: damned if we do, damned if we don't. Yep, 'education' has become a trap for all too many of our citizens.
  • I transferred to GWU from a state school. I am forever grateful that I did. I wanted to get a good rigorous education and go to one of the best International Affairs schools in the world. Even though the state school I went to was dirt-cheap, the education and the faculty were awful. I transferred to GW and was amazed at the professors at that university. An ambassador or a prominent IA scholar taught every class. GW is an expensive school, but that is the free market. If you want a good education you need to be willing to pay for it or join the military. I did the latter and my school was completely free, with no debt, and I received an amazing education. If young people aren't willing to make some sort of sacrifice to get ahead, or just expect everything to be given to them, then our country is in a sad state. We need to stop blaming universities like GWU that strive to attract better students, better professors, and better infrastructure. They are doing what is expected in America, to better oneself.
  • "Whether the students are actually learning anything is up to them, I guess." How could it possibly be otherwise??? I am glad that you are willing to give credit to teachers and administrators, but it is not they who "do the actual work of educating." From this fallacy comes its corollary, that we should blame teachers first for "under-performing schools". This long-running show of scapegoating may suit the wallets and vanity of American parents, but it is utterly senseless. When, if ever, American culture stops reeking of arrogance, greed and anti-intellectualism, things may improve, and we may resume the habit of bothering to learn. Until then, nothing doing.
  • Universities sell knowledge and grade students on how much they have learned. Fundamentally, there is a conflict of interest in this setup. Moreover, students who are poorly educated, even if they know this, will not criticize their school, because doing so would make it harder for them to have a career. As such, many problems with higher education remain unexposed to the public.
  • I've lectured and taught in at least five different countries on three continents, and the shortest perusal of what goes on abroad would totally undermine most of these speculations. For one thing, American universities are unique in their dedication to a broad-based liberal arts type of education. In France, Italy or Germany, for example, you select a major like mathematics or physics, and then in your four years you will not take even one course in another subject. The amount of work that you do that is critically evaluated by an instructor is a tiny fraction of what is done in an American university. While half-educated critics, on the basis of profoundly incomplete research, write criticism like this, universities in Germany, Italy, the Netherlands, South Korea and Japan, as well as France, have appointed committees and made studies to explain why the American system of higher education so drastically outperforms their own. Elsewhere students do get a rather nice dose of general education, but it ends in secondary school and it has the narrowness and formulaic quality that we would normally associate with that. The character who wrote this article probably never set foot on a "campus" of the University of Paris or Rome.
  • The university is part of a complex economic system and it is responding to the demands of that system. For example, students and parents choose universities that have beautiful campuses and buildings. So universities build beautiful campuses. State support of universities has greatly declined, and this decline in funding is the greatest cause of increased tuition. Therefore universities must compete for dollars and must build to attract students and parents. Also, universities are not ranked based on how they educate students -- that's difficult to measure so it is not measured. Instead universities are ranked on research publications. So while universities certainly put much effort into teaching, research has to have a priority in order for the university to survive. Also, universities do not force students and parents to attend high-price institutions. Reasonably priced state institutions and community colleges are available to every student. Community colleges have an advantage because they are funded by property taxes. Finally, learning requires good teaching, but it also requires students who come to the university funded, prepared, and engaged. This often does not happen. Conclusion: universities have to participate in profile-raising actions in order to survive. The day that funding is provided for college, ranking is based on education, and students choose campuses with simple buildings, then things will change at the university.
  • This is the inevitable result of privatizing higher education. In the not-so-distant past, we paid for great state universities through our taxes, not tuition. Then the states shifted funding to prisons, and the Federal government radically cut research support and the GI Bill. Instead, today we expect universities to support themselves through tuition, and to the extent that we offer students support, it is through non-dischargeable loans. To make matters worse, the interest rates on those loans are far above the government's cost of funds -- so in effect the loans are an excise tax on education (most of which is used to support a handful of for-profit institutions that account for the most student defaults). This "consumer sovereignty" privatized model of funding education works no better than privatizing California's electrical system did in the era of Enron, or our privatized funding of medical services, or our increasingly privatized prison system: it drives up costs at the same time that it replaces quality with marketing.
  • There are data in some instances on student learning, but the deeper problem, as I suspect the author already knows, is that there is nothing like a consensus on how to measure that learning, or even on when is the proper end point to emphasize (a lot of what I teach -- I know this from what students have told me -- tends to come into sharp focus years after graduation).
  • Michael (Baltimore) has hit the nail on the head. Universities are increasingly corporatized institutions in the credentialing business. Knowledge, for those few who care about it (often not those paying for the credentials) is available freely because there's no profit in it. Like many corporate entities, it is increasingly run by increasingly highly paid administrators, not faculty.
  • GWU has not defined itself in any unique way, it has merely embraced the bland, but very expensive, accoutrements of American private education: luxury dorms, food courts, spa-like gyms, endless extracurricular activities, etc. But the real culprit for this bloat that students have to bear financially is the college ranking system by US News, Princeton Review, etc. An ultimately meaningless exercise in competition that has nevertheless pushed colleges and universities to be more like one another. A sad state of affairs, and an extremely expensive one for students
  • It is long past time to realize the failure of the Reaganomics-neoliberal private-profits-over-public-good program. In education, we need to return to public institutions publicly funded. Just as we need to recognize that Medicare, Social Security, the post office, public utilities, fire departments, the interstate highway system, Veterans Administration hospitals and the GI Bill are models to be improved and expanded, not destroyed.
  • George Washington is actually not a Rolex watch, it is a counterfeit Rolex. The real Rolexes of higher education -- places like Hopkins, Georgetown, Duke, the Ivies etc. -- have real endowments and real financial aid. No middle-class kid is required to borrow $100,000 to get a degree from those schools, because they offer generous need-based financial aid in the form of grants, not loans. The tuition at the real Rolexes is really a sticker price that only the wealthy pay -- everybody else is on a sliding scale. For middle-class kids who are fortunate enough to get in, Penn actually ends up costing considerably less than a state university. The fake Rolexes -- BU, NYU, Drexel in Philadelphia -- don't have the sliding scale. They bury middle-class students in debt. And really, though it is foolish to borrow $100,000 or $120,000 for an undergraduate degree, I don't find the transaction morally wrong. What is morally wrong is our federal government making that loan non-dischargeable in bankruptcy, so many of these kids will be having their wages garnished for the REST OF THEIR LIVES. There is a very simple solution to this, by the way. Cap the amount of non-dischargeable student loan debt at, say, $50,000
  • The slant of this article is critical of the growth of research universities. Couldn't disagree more. Modern research universities are incredible engines of economic opportunity, not only for the students (who pay the bills) but also for the community, via the creation of blue- and white-collar jobs. Large research universities employ tens of thousands of locals, from custodial and food service workers right up to high-level administrators and specialists in finance, computer services, buildings and facilities management, etc. Johns Hopkins University and the University of Maryland system employ more people than any other industry in Maryland -- including the government. Research universities typically have hospitals providing cutting-edge medical care to the community. Local businesses (from cafes to property rental companies) benefit from a built-in, long-term client base as well as an educated workforce. And of course they are the foundry of new knowledge, which is critical for the future growth of our country. Check out the work of famed economist Dr. Julia Lane on modeling the economic value of the research university. In a nutshell, there are few better investments America can make in herself than research universities. We are the envy of the world in that regard -- and with good reason. How many *industries* (let alone jobs) has Stanford University alone catalyzed?
  • What universities have the monopoly on is the credential. Anyone can learn, from books, from free lectures on the internet, from this newspaper, etc. But only universities can endow you with the cherished degree. For some reason, people are willing to pay more for one of these pieces of paper with a certain name on it -- Ivy League, Stanford, even GW -- than another -- Generic State U -- though there is no evidence one is actually worth more in the marketplace of reality than the other. But, by the laws of economics, these places are actually underpriced: after all, something like 20 times more people are trying to buy a Harvard education than are allowed to purchase one. Usually that means you raise your price.
  • Overall a good article, except for - "This comes on the heels of Richard Arum and Josipa Roksa’s “Academically Adrift,” a study that found “limited or no learning” among many college students." The measure of learning you report was a general thinking-skills exam. That's not a good measure of college gains. Most psychologists and cognitive scientists worth their salt would tell you that improvement in critical thinking skills is going to be limited to specific areas. In other words, learning critical thinking skills in math will make little change in critical thinking about political science or biology. Thus we should not expect huge improvements in general critical thinking skills, but rather improvements in a student's major and other areas of focus, such as a minor. Although who has time for a minor when it is universally acknowledged that the purpose of a university is to please and profit an employer or, if one is lucky, an investor. Finally, improved critical thinking skills are not the end-all and be-all of a college education, even given this profit-centered perspective. Learning and mastering the cumulative knowledge of past generations is arguably the most important thing to be gained, and most universities still tend to excel at that, even with the increasing mandate to run education like a business and cultivate and cull the college "consumer".
  • As for community colleges, there was an article in the Times several years ago that said it much better than I could have said it myself: community colleges are places where dreams are put on hold. Without making the full commitment to study, without leaving the home environment, many, if not most, community college students are caught betwixt and between, trying to balance work responsibilities, caring for a young child or baby and attending classes. For males, the classic "end of the road" in community college is to get a car, a job and a girlfriend, one who is not in college, and that is the end of the dream. Some can make it, but most cannot.
  • As a scientist, I disagree with the claim that undergrad tuition subsidizes basic research. Nearly all lab equipment and research personnel (grad students, technicians, anyone with the title "research scientist" or similar) on campus are paid for through federal grants. Professors often spend all their time outside teaching and administration writing grant proposals, as the limited federal grant funds mean ~85% of proposals must be rejected. What is more, out of each successful grant the university levies a "tax", called "overhead", of 30-40%, nominally to pay for basic operations (utilities, office space, administrators). So in fact one might say research helps fund the university rather than the other way around.
  • It's certainly overrated as a research and graduate-level university. Whether it is good for getting an undergraduate education is unclear, but a big part of the appeal is getting to live in D.C. while attending college instead of living in some small college town in the corn fields.
Javier E

Technology Imperialism, the Californian Ideology, and the Future of Higher Education - 2 views

  • What I hope to make explicit today is how much California – the place, the concept, “the dream machine” – shapes (wants to shape) the future of technology and the future of education.
  • In an announcement on Facebook – of course – Zuckerberg argued that “connectivity is a human right.”
  • As Zuckerberg frames it at least, the “human right” in this case is participation in the global economy
  • ...34 more annotations...
  • This is a revealing definition of “human rights,” I’d argue, particularly as it’s one that never addresses things like liberty, equality, or justice. It never addresses freedom of expression or freedom of assembly or freedom of association.
  • in certain countries, a number of people say they do not use the Internet yet they talk about how much time they spend on Facebook. According to one survey, 11% of Indonesians who said they used Facebook also said they did not use the Internet. A survey in Nigeria had similar results:
  • Evgeny Morozov has described this belief as “Internet-centrism,” an ideology he argues permeates the tech industry, its PR wing the tech blogosphere, and increasingly government policy
  • “Internet-centrism” describes the tendency to see “the Internet” – Morozov uses quotations around the phrase – as a new yet unchanging, autonomous, benevolent, and inevitable socio-technological development. “The Internet” is a master framework for how all institutions will supposedly operate moving forward
  • “The opportunity to connect” as a human right assumes that “connectivity” will hasten the advent of these other rights, I suppose – that the Internet will topple dictatorships, for example, that it will extend participation in civic life to everyone and, for our purposes here at this conference, that it will “democratize education.”
  • Empire is not simply an endeavor of the nation-state – we have empire through technology (that’s not new) and now, the technology industry as empire.
  • Facebook is really just synecdochal here, I should add – just one example of the forces I think are at play, politically, economically, technologically, culturally.
  • it matters at the level of ideology. Infrastructure is ideological, of course. The new infrastructure – “the Internet” if you will – has a particular political, economic, and cultural bent to it. It is not neutral.
  • This infrastructure matters. In this case, this is a French satellite company (Eutelsat). This is an American social network (Facebook). Mark Zuckerberg’s altruistic rhetoric aside, this is their plan – an economic plan – to monetize the world’s poor.
  • The content and the form of “connectivity” perpetuate imperialism, and not only in Africa but in all of our lives. Imperialism at the level of infrastructure – not just cultural imperialism but technological imperialism
  • “The Silicon Valley Narrative,” as I call it, is the story that the technology industry tells about the world – not only the world-as-is but the world-as-Silicon-Valley-wants-it-to-be.
  • To better analyze and assess both technology and education technology requires our understanding of these as ideological, argues Neil Selwyn – “‘a site of social struggle’ through which hegemonic positions are developed, legitimated, reproduced and challenged.”
  • This narrative has several commonly used tropes
  • It often features a hero: the technology entrepreneur. Smart. Independent. Bold. Risk-taking. White. Male
  • “The Silicon Valley narrative” invokes themes like “innovation” and “disruption.” It privileges the new; everything else that can be deemed “old” is viewed as obsolete.
  • It contends that its workings are meritocratic: anyone who hustles can make it.
  • “The Silicon Valley narrative” fosters a distrust of institutions – the government, the university. It is neoliberal. It hates paying taxes.
  • “The Silicon Valley narrative” draws from the work of Ayn Rand; it privileges the individual at all costs; it calls this “personalization.”
  • “The Silicon Valley narrative” does not neatly co-exist with public education. We forget this at our peril. This makes education technology, specifically, an incredibly fraught area.
  • Here’s the story I think we like to hear about ed-tech, about distance education, about “connectivity” and learning: Education technology is supportive, not exploitative. Education technology opens, not forecloses, opportunities. Education technology is driven by a rethinking of teaching and learning, not expanding markets or empire. Education technology meets individual and institutional and community goals.
  • That’s not really what the “Silicon Valley narrative” says about education
  • It is interested in data extraction and monetization and standardization and scale. It is interested in markets and return on investment. “Education is broken,” and technology will fix it
  • If “Silicon Valley” isn’t quite accurate, then I must admit that the word “narrative” is probably inadequate too
  • The better term here is “ideology.”
  • Facebook is “the Internet” for a fairly sizable number of people. They know nothing else – conceptually, experientially. And, let’s be honest, Facebook wants to be “the Internet” for everyone.
  • We tend to not see technology as ideological – its connections to libertarianism, neoliberalism, global capitalism, empire.
  • The California ideology ignores race and labor and the water supply; it is sustained by air and fantasy. It is built upon white supremacy and imperialism.
  • As is the technology sector, which has its own history, of course, in warfare and cryptography.
  • So far this year, some $3.76 billion of venture capital has been invested in education technology – a record-setting figure. That money will change the landscape – that’s its intention. That money carries with it a story about the future; it carries with it an ideology.
  • When a venture capitalist says that “software is eating the world,” we can push back on the inevitability implied in that. We can resist – not in the name of clinging to “the old” as those in educational institutions are so often accused of doing – but we can resist in the name of freedom and justice and a future that isn’t dictated by the wealthiest white men in Hollywood or Silicon Valley.
  • We in education would be naive, I think, to think that the designs that venture capitalists and technology entrepreneurs have for us would be any less radical than creating a new state, like Draper’s proposed state of Silicon Valley, that would be enormously wealthy and politically powerful.
  • When I hear talk of “unbundling” in education – one of the latest gerunds you’ll hear venture capitalists and ed-tech entrepreneurs invoke, meaning the disassembling of institutions into products and services – I can’t help but think of the “unbundling” that Draper wished to do to my state: carving up land and resources, shifting tax revenue and tax burdens, creating new markets, privatizing public institutions, redistributing power and doing so explicitly not in the service of equity or justice.
  • I want to show you this map, a proposal – a failed proposal, thankfully – by venture capitalist Tim Draper to split the state of California into six separate states: Jefferson, North California, Silicon Valley, Central California, West California, and South California. The proposal, which Draper tried to collect enough signatures to get on the ballot in California, would have created the richest state in the US – Silicon Valley would be first in per-capita income. It would also have created the nation’s poorest state, Central California, which would rank even below Mississippi.
  • that’s not all that Silicon Valley really does.
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • ...43 more annotations...
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive. (A toy simulation of this variable-reward schedule appears at the end of this list of annotations.)
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform – after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, Williams said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day”.
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • Williams stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

A New Dark Age Looms - The New York Times - 1 views

  • Picture yourself in our grandchildren’s time, a century hence. Significant global warming has occurred, as scientists predicted. Nature’s longstanding, repeatable patterns — relied on for millenniums by humanity to plan everything from infrastructure to agriculture — are no longer so reliable. Cycles that have been largely unwavering during modern human history are disrupted by substantial changes in temperature and precipitation.
  • As Earth’s warming stabilizes, new patterns begin to appear. At first, they are confusing and hard to identify. Scientists note similarities to Earth’s emergence from the last ice age. These new patterns need many years — sometimes decades or more — to reveal themselves fully, even when monitored with our sophisticated observing systems.
  • Disruptive societal impacts will be widespread.
  • Our foundation of Earth knowledge, largely derived from historically observed patterns, has been central to society’s progress. Early cultures kept track of nature’s ebb and flow, passing improved knowledge about hunting and agriculture to each new generation. Science has accelerated this learning process through advanced observation methods and pattern discovery techniques. These allow us to anticipate the future with a consistency unimaginable to our ancestors.
  • But as Earth warms, our historical understanding will turn obsolete faster than we can replace it with new knowledge. Some patterns will change significantly; others will be largely unaffected.
  • The list of possible disruptions is long and alarming.
  • Historians of the next century will grasp the importance of this decline in our ability to predict the future. They may mark the coming decades of this century as the period during which humanity, despite rapid technological and scientific advances, achieved “peak knowledge” about the planet it occupies.
  • One exception to this pattern-based knowledge is the weather, whose underlying physics governs how the atmosphere moves and adjusts. Because we understand the physics, we can replicate the atmosphere with computer models.
  • But farmers need to think a season or more ahead. So do infrastructure planners as they design new energy and water systems.
  • The intermediate time period is our big challenge. Without substantial scientific breakthroughs, we will remain reliant on pattern-based methods for time periods between a month and a decade. The problem is, as the planet warms, these patterns will become increasingly difficult to discern.
  • The oceans, which play a major role in global weather patterns, will also see substantial changes as global temperatures rise. Ocean currents and circulation patterns evolve on time scales of decades and longer, and fisheries change in response. We lack reliable, physics-based models to tell us how this occurs.
  • Our grandchildren could grow up knowing less about the planet than we do today. This is not a legacy we want to leave them. Yet we are on the verge of ensuring this happens.
Javier E

How Calls for Privacy May Upend Business for Facebook and Google - The New York Times - 0 views

  • People detailed their interests and obsessions on Facebook and Google, generating a river of data that could be collected and harnessed for advertising. The companies became very rich. Users seemed happy. Privacy was deemed obsolete, like bloodletting and milkmen.
  • It has been many months of allegations and arguments that the internet in general and social media in particular are pulling society down instead of lifting it up.
  • That has inspired a good deal of debate about more restrictive futures for Facebook and Google. At the furthest extreme, some dream of the companies becoming public utilities.
  • There are other avenues still, said Jascha Kaykas-Wolff, the chief marketing officer of Mozilla, the nonprofit organization behind the popular Firefox browser, including advertisers and large tech platforms collecting vastly less user data and still effectively customizing ads to consumers.
  • The greatest likelihood is that the internet companies, frightened by the tumult, will accept a few more rules and work a little harder for transparency.
  • The Cambridge Analytica case, said Vera Jourova, the European Union commissioner for justice, consumers and gender equality, was not just a breach of private data. “This is much more serious, because here we witness the threat to democracy, to democratic plurality,” she said.
  • Although many people had a general understanding that free online services used their personal details to customize the ads they saw, the latest controversy starkly exposed the machinery.
  • Consumers’ seemingly benign activities — their likes — could be used to covertly categorize and influence their behavior. And not just by unknown third parties. Facebook itself has worked directly with presidential campaigns on ad targeting, describing its services in a company case study as “influencing voters.”
  • “If your personal information can help sway elections, which affects everyone’s life and societal well-being, maybe privacy does matter after all.”
  • Some trade group executives also warned that any attempt to curb the use of consumer data would put the business model of the ad-supported internet at risk.
  • “You’re undermining a fundamental concept in advertising: reaching consumers who are interested in a particular product.”
  • If suspicion of Facebook and Google is a relatively new feeling in the United States, it has been embedded in Europe for historical and cultural reasons that date back to the Nazi Gestapo, the Soviet occupation of Eastern Europe and the Cold War.
  • “We’re at an inflection point, when the great wave of optimism about tech is giving way to growing alarm,” said Heather Grabbe, director of the Open Society European Policy Institute. “This is the moment when Europeans turn to the state for protection and answers, and are less likely than Americans to rely on the market to sort out imbalances.”
  • In May, the European Union is instituting a comprehensive new privacy law, called the General Data Protection Regulation. The new rules treat personal data as proprietary, owned by an individual, and any use of that data must be accompanied by permission — opting in rather than opting out — after receiving a request written in clear language, not legalese.
  • The protection rules will have more teeth than the current 1995 directive. For example, a company experiencing a data breach involving individuals must notify the data protection authority within 72 hours and would be subject to fines of up to 20 million euros or 4 percent of its annual revenue.
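  • For the most serious infringements the cap is whichever of those two figures is higher, which is what gives the rule teeth against large platforms. A quick sketch of the arithmetic (the revenue figures are hypothetical):

      def gdpr_max_fine(annual_revenue_eur):
          # Upper bound for the most serious infringements: EUR 20 million
          # or 4% of worldwide annual turnover, whichever is higher.
          return max(20_000_000, 0.04 * annual_revenue_eur)

      print(gdpr_max_fine(5_000_000))       # 20000000 -- the flat cap binds
      print(gdpr_max_fine(40_000_000_000))  # 1600000000.0 -- 4% dwarfs the flat cap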
  • “With the new European law, regulators for the first time have real enforcement tools,” said Jeffrey Chester, the executive director of the Center for Digital Democracy, a nonprofit group in Washington. “We now have a way to hold these companies accountable.”
  • Privacy advocates and even some United States regulators have long been concerned about the ability of online services to track consumers and make inferences about their financial status, health concerns and other intimate details to show them behavior-based ads. They warned that such microtargeting could unfairly categorize or exclude certain people.
  • The Do Not Track effort and the privacy bill were both stymied. Industry groups successfully argued that collecting personal details posed no harm to consumers and that efforts to hinder data collection would chill innovation.
  • “If it can be shown that the current situation is actually a market failure and not an individual-company failure, then there’s a case to be made for federal regulation” under certain circumstances.
  • The business practices of Facebook and Google were reinforced by the fact that no privacy flap lasted longer than a news cycle or two. Nor did people flee for other services. That convinced the companies that digital privacy was a dead issue.
  • If the current furor dies down without meaningful change, critics worry that the problems might become even more entrenched. When the tech industry follows its natural impulses, it becomes even less transparent.
  • “To know the real interaction between populism and Facebook, you need to give much more access to researchers, not less,” said Paul-Jasper Dittrich, a German research fellow.
  • There’s another reason Silicon Valley tends to be reluctant to share information about what it is doing. It believes so deeply in itself that it does not even think there is a need for discussion. The technology world’s remedy for any problem is always more technology
Javier E

The Coming Software Apocalypse - The Atlantic - 1 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
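  • A heavily simplified sketch of that failure shape (the names, threshold, and logic below are invented for illustration, not Intrado's actual code): a hard cap is reached, the software dutifully stops routing, and because that branch of the logic performs no corrective action, a perfectly good backup is never engaged.

      CALL_ROUTING_LIMIT = 40_000_000  # an arbitrary cap buried in the logic

      calls_routed = 0

      def send_to_dispatcher(call):
          print(f"routed: {call}")

      def reject(call):
          print(f"rejected: {call}")  # the caller hears a busy signal

      def fail_over_to_backup_router():
          print("switched to backup")  # exists, but nothing below ever calls it

      def route_911_call(call):
          global calls_routed
          if calls_routed >= CALL_ROUTING_LIMIT:
              # The code does exactly what it was told to do. This branch was
              # "not designed to perform any automated corrective actions," so
              # the backup router sits idle while calls are dropped.
              reject(call)
              return
          calls_routed += 1
          send_to_dispatcher(call)

      route_911_call("caller-1")  # routed, until the counter crosses the cap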
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code.
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • What made programming so difficult was that it required you to think like a computer.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated top of his class at the California Institute of Technology for electrical engineering.
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • A nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible.
  • Software experts spent 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around, what’s already there.
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it
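  • The bit-flip finding is easy to make concrete. The variable and values below are illustrative, not Toyota's, but they show how a single flipped bit can turn a small command into a very different one:

      throttle_target = 5                      # intended value, binary 00000101

      corrupted = throttle_target ^ (1 << 7)   # flip a single bit (bit 7), as a
                                               # memory fault or cosmic ray might

      print(throttle_target)  # 5
      print(corrupted)        # 133 -- one bit away, a wildly different command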
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules.
  • In a model-based design tool, you’d represent a rule (say, that an elevator must never move with its door open) with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop.
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
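  • Reduced to its essence, code generated from such a diagram might look like the sketch below (Python for illustration; a real model-based tool like Bantegnie's would emit certified C):

      # The whiteboard diagram as a transition table. The rules are the
      # program: a move that isn't in the table simply doesn't exist.
      TRANSITIONS = {
          ("door_open", "close_door"): "door_closed",
          ("door_closed", "open_door"): "door_open",
          ("door_closed", "move"): "moving",
          ("moving", "stop"): "door_closed",
      }

      def step(state, event):
          next_state = TRANSITIONS.get((state, event))
          if next_state is None:
              raise ValueError(f"illegal event {event!r} in state {state!r}")
          return next_state

      state = "door_open"
      for event in ("close_door", "move", "stop", "open_door"):
          state = step(state, event)
          print(state)

      # step("moving", "open_door") would raise: the table itself encodes the
      # rule that the only way to open the door is to stop first.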
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • This practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months were spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
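  • Newcombe's point yields to back-of-the-envelope arithmetic; the request rate and probability below are hypothetical but representative of web scale:

      requests_per_second = 1_000_000
      p_rare = 1e-9                  # a "one in a billion" combination of events

      expected_per_day = requests_per_second * 86_400 * p_rare
      print(expected_per_day)        # ~86.4 -- dozens of "impossible" events a day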
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy
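  • The idea of writing a specification and checking it exhaustively can be sketched in ordinary Python. Below is a toy checker (not TLA+, just an illustration of the concept) that explores every reachable state of an elevator-style model and verifies an invariant in all of them:

      from collections import deque

      def actions(state):
          # The "specification": state is (door, motion); yield every legal
          # next state the system may move to.
          door, motion = state
          if door == "open":
              yield ("closed", "stopped")        # close_door
          if door == "closed" and motion == "stopped":
              yield ("open", "stopped")          # open_door
              yield ("closed", "moving")         # move
          if motion == "moving":
              yield (door, "stopped")            # stop

      def invariant(state):
          door, motion = state
          return not (door == "open" and motion == "moving")

      def check(initial):
          # Breadth-first exploration of *every* reachable state --
          # exhaustive, not merely thorough, which is what testing alone
          # cannot give you.
          seen, frontier = {initial}, deque([initial])
          while frontier:
              state = frontier.popleft()
              assert invariant(state), f"spec violated in {state}"
              for nxt in actions(state):
                  if nxt not in seen:
                      seen.add(nxt)
                      frontier.append(nxt)
          return seen

      print(check(("open", "stopped")))  # all three reachable states pass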
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language.”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • This is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
martinelligi

What the Pandemic Is Doing to Our Brains - The Atlantic - 0 views

  • This is the fog of late pandemic, and it is brutal. In the spring, we joked about the Before Times, but they were still within reach, easily accessible in our shorter-term memories. In the summer and fall, with restrictions loosening and temperatures rising, we were able to replicate some of what life used to be like, at least in an adulterated form: outdoor drinks, a day at the beach. But now, in the cold, dark, featureless middle of our pandemic winter, we can neither remember what life was like before nor imagine what it’ll be like after.
  • The sunniest optimist would point out that all this forgetting is evidence of the resilience of our species. Humans forget a great deal of what happens to us, and we tend to do it pretty quickly—after the first 24 hours or so. “Our brains are very good at learning different things and forgetting the things that are not a priority,” Tina Franklin, a neuroscientist at Georgia Tech, told me. As the pandemic has taught us new habits and made old ones obsolete, our brains have essentially put actions like taking the bus and going to restaurants in deep storage, and placed social distancing and coughing into our elbows near the front of the closet. When our habits change back, presumably so will our recall.
  • The share of Americans reporting symptoms of anxiety disorder, depressive disorder, or both roughly quadrupled from June 2019 to December 2020, according to a Census Bureau study released late last year. What’s more, we simply don’t know the long-term effects of collective, sustained grief. Longitudinal studies of survivors of Chernobyl, 9/11, and Hurricane Katrina show elevated rates of mental-health problems, in some cases lasting for more than a decade