
Dystopias group: items matching "errors" in title, tags, annotations or url


Ed Webb

Does the Digital Classroom Enfeeble the Mind? - NYTimes.com

  • My father would have been unable to “teach to the test.” He once complained about errors in a sixth-grade math textbook, so he had the class learn math by designing a spaceship. My father would have been spat out by today’s test-driven educational regime.
  • A career in computer science makes you see the world in its terms. You start to see money as a form of information display instead of as a store of value. Money flows are the computational output of a lot of people planning, promising, evaluating, hedging and scheming, and those behaviors start to look like a set of algorithms. You start to see the weather as a computer processing bits tweaked by the sun, and gravity as a cosmic calculation that keeps events in time and space consistent. This way of seeing is becoming ever more common as people have experiences with computers. While it has its glorious moments, the computational perspective can at times be uniquely unromantic. Nothing kills music for me as much as having some algorithm calculate what music I will want to hear. That seems to miss the whole point. Inventing your musical taste is the point, isn’t it? Bringing computers into the middle of that is like paying someone to program a robot to have sex on your behalf so you don’t have to. And yet it seems we benefit from shining an objectifying digital light to disinfect our funky, lying selves once in a while. It’s heartless to have music chosen by digital algorithms. But at least there are fewer people held hostage to the tastes of bad radio D.J.’s than there once were. The trick is being ambidextrous, holding one hand to the heart while counting on the digits of the other.
  • The future of education in the digital age will be determined by our judgment of which aspects of the information we pass between generations can be represented in computers at all. If we try to represent something digitally when we actually can’t, we kill the romance and make some aspect of the human condition newly bland and absurd. If we romanticize information that shouldn’t be shielded from harsh calculations, we’ll suffer bad teachers and D.J.’s and their wares.
  • Some of the top digital designs of the moment, both in school and in the rest of life, embed the underlying message that we understand the brain and its workings. That is false. We don’t know how information is represented in the brain. We don’t know how reason is accomplished by neurons. There are some vaguely cool ideas floating around, and we might know a lot more about these things any moment now, but at this moment, we don’t. You could spend all day reading literature about educational technology without being reminded that this frontier of ignorance lies before us. We are tempted by the demons of commercial and professional ambition to pretend we know more than we do.
  • Outside school, something similar happens. Students spend a lot of time acting as trivialized relays in giant schemes designed for the purposes of advertising and other revenue-minded manipulations. They are prompted to create databases about themselves and then trust algorithms to assemble streams of songs and movies and stories for their consumption. We see the embedded philosophy bloom when students assemble papers as mash-ups from online snippets instead of thinking and composing on a blank piece of screen. What is wrong with this is not that students are any lazier now or learning less. (It is probably even true, I admit reluctantly, that in the presence of the ambient Internet, maybe it is not so important anymore to hold an archive of certain kinds of academic trivia in your head.) The problem is that students could come to conceive of themselves as relays in a transpersonal digital structure. Their job is then to copy and transfer data around, to be a source of statistics, whether to be processed by tests at school or by advertising schemes elsewhere.
  • If students don’t learn to think, then no amount of access to information will do them any good.
  • To the degree that education is about the transfer of the known between generations, it can be digitized, analyzed, optimized and bottled or posted on Twitter. To the degree that education is about the self-invention of the human race, the gargantuan process of steering billions of brains into unforeseeable states and configurations in the future, it can continue only if each brain learns to invent itself. And that is beyond computation because it is beyond our comprehension.
  • Roughly speaking, there are two ways to use computers in the classroom. You can have them measure and represent the students and the teachers, or you can have the class build a virtual spaceship. Right now the first way is ubiquitous, but the virtual spaceships are being built only by tenacious oddballs in unusual circumstances. More spaceships, please.
  • How do we get this right: use the tech for what it can do well, develop our brains for what the tech can't do? Who's up for building a spaceship?
Ed Webb

Can Economists and Humanists Ever Be Friends? | The New Yorker

  • There is something thrilling about the intellectual audacity of thinking that you can explain ninety per cent of behavior in a society with one mental tool.
  • education, which they believe is a form of domestication
  • there is no moral dimension to this economic analysis: utility is a fundamentally amoral concept
  • intellectual overextension is often found in economics, as Gary Saul Morson and Morton Schapiro explain in their wonderful book “Cents and Sensibility: What Economics Can Learn from the Humanities” (Princeton). Morson and Schapiro—one a literary scholar and the other an economist—draw on the distinction between hedgehogs and foxes made by Isaiah Berlin in a famous essay from the nineteen-fifties, invoking an ancient Greek fragment: “The fox knows many things, but the hedgehog one big thing.” Economists tend to be hedgehogs, forever on the search for a single, unifying explanation of complex phenomena. They love to look at a huge, complicated mass of human behavior and reduce it to an equation: the supply-and-demand curves; the Phillips curve, which links unemployment and inflation; or mb=mc, which links a marginal benefit to a marginal cost—meaning that the fourth slice of pizza is worth less to you than the first. These are powerful tools, which can be taken too far. Morson and Schapiro cite the example of Gary Becker, the Nobel laureate in economics in 1992. Becker is a hero to many in the field, but, for all the originality of his thinking, to outsiders he can stand for intellectual overconfidence. He thought that “the economic approach is a comprehensive one that is applicable to all human behavior.” Not some, not most—all
  • Becker analyzed, in his own words, “fertility, education, the uses of time, crime, marriage, social interactions, and other ‘sociological,’ ‘legal,’ and ‘political problems,’ ” before concluding that economics explained everything
  • The issue here is one of overreach: taking an argument that has worthwhile applications and extending it further than it usefully goes. Our motives are often not what they seem: true. This explains everything: not true. After all, it’s not as if the idea that we send signals about ourselves were news; you could argue that there is an entire social science, sociology, dedicated to the subject. Classic practitioners of that discipline study the signals we send and show how they are interpreted by those around us, as in Erving Goffman’s “The Presentation of Self in Everyday Life,” or how we construct an entire identity, both internally and externally, from the things we choose to be seen liking—the argument of Pierre Bourdieu’s masterpiece “Distinction.” These are rich and complicated texts, which show how rich and complicated human difference can be. The focus on signalling and unconscious motives in “The Elephant in the Brain,” however, goes the other way: it reduces complex, diverse behavior to simple rules.
  • “A traditional cost-benefit analysis could easily have led to the discontinuation of a project widely viewed as being among the most successful health interventions in African history.”
  • Another part of me, though, is done with it, with the imperialist ambitions of economics and its tendency to explain away differences, to ignore culture, to exalt reductionism. I want to believe Morson and Schapiro and Desai when they posit that the gap between economics and the humanities can be bridged, but my experience in both writing fiction and studying economics leads me to think that they’re wrong. The hedgehog doesn’t want to learn from the fox. The realist novel is a solemn enemy of equations. The project of reducing behavior to laws and the project of attending to human beings in all their complexity and specifics are diametrically opposed. Perhaps I’m only talking about myself, and this is merely an autobiographical reflection, rather than a general truth, but I think that if I committed any further to economics I would have to give up writing fiction. I told an economist I know about this, and he laughed. He said, “Sounds like you’re maximizing your utility.” 
  • finance is full of “attribution errors,” in which people view their successes as deserved and their failures as bad luck. Desai notes that in business, law, or pedagogy we can gauge success only after months or years; in finance, you can be graded hour by hour, day by day, and by plainly quantifiable measures. What’s more, he says, “the ‘discipline of the market’ shrouds all of finance in a meritocratic haze.” And so people who succeed in finance “are susceptible to developing massively outsized egos and appetites.”
  • one of the things I liked about economics, finance, and the language of money was their lack of hypocrisy. Modern life is full of cant, of people saying things they don’t quite believe. The money guys, in private, don’t go in for cant. They’re more like Mafia bosses. I have to admit that part of me resonates to that coldness.
  • Economics, Morson and Schapiro say, has three systematic biases: it ignores the role of culture, it ignores the fact that “to understand people one must tell stories about them,” and it constantly touches on ethical questions beyond its ken. Culture, stories, and ethics are things that can’t be reduced to equations, and economics accordingly has difficulty with them
  • According to Hanson and Simler, these unschooled workers “won’t show up for work reliably on time, or they have problematic superstitions, or they prefer to get job instructions via indirect hints instead of direct orders, or they won’t accept tasks and roles that conflict with their culturally assigned relative status with co-workers, or they won’t accept being told to do tasks differently than they had done them before.”
  • The idea that Maya Angelou’s career amounts to nothing more than a writer shaking her tail feathers to attract the attention of a dominant male is not just misleading; it’s actively embarrassing.
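  • A worked sketch of the mb=mc rule quoted above (the utility function here is an assumed example for illustration, not from the article): let utility from q slices of pizza be U(q) = ln(1 + q), so the marginal benefit of slice q is MB(q) = U(q) - U(q - 1). Then MB(1) = ln 2 ≈ 0.69 while MB(4) = ln(5/4) ≈ 0.22: the fourth slice really is worth less than the first, and a utility-maximizing eater stops at the last slice whose marginal benefit still covers its marginal cost, i.e. where mb = mc.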
Ed Webb

Narrative Napalm | Noah Kulwin

  • there are books whose fusion of factual inaccuracy and moral sophistry is so total that they can only be written by Malcolm Gladwell
  • Malcolm Gladwell’s decades-long shtick has been to launder contrarian thought and corporate banalities through his positions as a staff writer at The New Yorker and author at Little, Brown and Company. These institutions’ disciplining effect on Gladwell’s prose, getting his rambling mind to conform to clipped sentences and staccato revelations, has belied his sly maliciousness and explosive vacuity: the two primary qualities of Gladwell’s oeuvre.
  • as is typical with Gladwell’s books and with many historical podcasts, interrogation of the actual historical record and the genuine moral dilemmas it poses—not the low-stakes bait that he trots out as an MBA case study in War—is subordinated to fluffy bullshit and biographical color
  • by taking up military history, Gladwell’s half-witted didacticism threatens to convince millions of people that the only solution to American butchery is to continue shelling out for sharper and larger knives
  • Although the phrase “Bomber Mafia” traditionally refers to the pre-World War II staff and graduates of the Air Corps Tactical School, Gladwell’s book expands the term to include both kooky tinkerers and buttoned-down military men. Wild, far-seeing mavericks, they understood that the possibilities of air power had only just been breached. They were also, as Gladwell insists at various points, typical Gladwellian protagonists: secluded oddballs whose technical zealotry and shared mission gave them a sense of community that propelled them beyond any station they could have achieved on their own.
  • Gladwell’s narrative is transmitted as seamlessly as the Wall Street or Silicon Valley koans that appear atop LinkedIn profiles, Clubhouse accounts, and Substack missives.
  • Gladwell has built a career out of making banality seem fresh
  • Drawing a false distinction between the Bomber Mafia and the British and American military leaders who preceded them allows Gladwell to make the case that a few committed brainiacs developed a humane, “tactical” kind of air power that has built the security of the world we live in today.
  • By now, the press cycle for every Gladwell book release is familiar: experts and critics identify logical flaws and factual errors, they are ignored, Gladwell sells a zillion books, and the world gets indisputably dumber for it.
  • “What actually happened?” Gladwell asks of the Blitz. “Not that much! The panic never came,” he answers, before favorably referring to an unnamed “British government film from 1940,” which is in actuality the Academy Award-nominated propaganda short London Can Take It!, now understood to be emblematic of how the myth of the stoic Brit was manufactured.
  • Gladwell goes to great pains to portray Curtis “Bombs Away” LeMay as merely George Patton-like: a prima donna tactician with some masculinity issues. In reality, LeMay bears a closer resemblance to another iconic George C. Scott performance, one that LeMay directly inspired: Dr. Strangelove’s General Buck Turgidson, who at every turn attempts to force World War III and, at the movie’s close, when global annihilation awaits, soberly warns of a “mineshaft gap” between the United States and the Commies. That, as Gladwell might phrase it, was the “real” Curtis LeMay: a violent reactionary who was never killed or tried because he had the luck to wear the brass of the correct country on his uniform. “I suppose if I had lost the war, I would have been tried as a war criminal,” LeMay once told an Air Force cadet. “Fortunately, we were on the winning side.”
  • Why would Malcolm Gladwell, who seems to admire LeMay so much, talk at such great length about the lethality of LeMay’s Japanese firebombing? The answer lies in what this story leaves out. Mentioned only glancingly in Gladwell’s story are the atomic bombs dropped on Japan. The omission allows for a stupid and classically Gladwell argument: that indiscriminate firebombing brought a swift end to the war, and its attendant philosophical innovations continue to envelop us in a blanket of security that has not been adequately appreciated
  • While LeMay’s 1945 firebombing campaign was certainly excessive—and represented the same base indifference to human life that got Nazis strung up at Nuremberg—it did not end the war. The Japanese were not solely holding out because their military men were fanatical in ways that the Americans weren’t, as Gladwell seems to suggest, citing Conrad Crane, an Army staff historian and hagiographer of LeMay’s[1]; they were holding out because they wanted better terms of surrender—terms they had the prospect of negotiating with the Soviet Union. The United States, having already developed an atomic weapon—and having made the Soviet Union aware of it—decided to drop it as it became clear the Soviet Union was readying to invade Japan. On August 6, the United States dropped a bomb on Hiroshima. Three days later, and mere hours after the Soviet Union formally declared war on the morning of August 9, the Americans dropped the second atomic bomb on Nagasaki. An estimated 210,000 people were killed, the majority of them on the days of the bombings. It was the detonation of these bombs that forced the end of the war. The Japanese unconditional surrender to the Americans was announced on August 15 and formalized on the deck of the USS Missouri on September 2. As historians like Martin Sherwin and Tsuyoshi Hasegawa have pointed out, by dropping the bombs, the Truman administration had kept the Communist threat out of Japan. Imperial Japan was staunchly anticommunist, and under American post-war dominion, the country would remain that way. But Gladwell is unequipped to supply the necessary geopolitical context that could meaningfully explain why the American government would force an unconditional surrender when the possibility of negotiation remained totally live.
  • In 1968, [LeMay] would join forces with segregationist George Wallace as the vice-presidential candidate on his “American Independent Party” ticket, a fact literally relegated to a footnote in Gladwell’s book. This kind of omission is par for the course in The Bomber Mafia. While Gladwell constantly reminds the reader that the air force leadership was trying to wage more effective wars so as to end all wars, he cannot help but shove under the rug that which is inconvenient
  • This is truly a lesson for the McKinsey set and passive-income crowd for whom The Bomber Mafia is intended: doing bad things is fine, so long as you privately feel bad about it.
  • The British advocacy group Action on Armed Violence just this month estimated that between 2016 and 2020 in Afghanistan, there were more than 2,100 civilians killed and 1,800 injured by air strikes; 37 percent of those killed were children.
  • An appropriately savage review of Gladwell's foray into military history. Contrast with the elegance of KSR's The Lucky Strike, which actually wrestles with the moral issues.