Home/ TOK Friends/ Group items tagged Subjective


Javier E

The Benefits of 'Binocularity' - NYTimes.com - 0 views

  • Will advances in neuroscience move reasonable people to abandon the idea that criminals deserve to be punished?
  • if the idea of deserving punishment depends upon the idea that criminals freely choose their actions, and if neuroscience reveals that free choice is an illusion, then we can see that the idea of deserving punishment is nonsense
  • “new neuroscience will undermine people’s common sense, libertarian conception of free will and the retributivist thinking that depends on it, both of which have heretofore been shielded by the inaccessibility of sophisticated thinking about the mind and its neural basis.”
  • ...11 more annotations...
  • when university students learn about “the neural basis of behavior” — quite simply, the brain activity underlying human actions — they become less supportive of the idea that criminals deserve to be punished.
  • To see what is right — and wrong — with the notion that neuroscience will transform our idea of just deserts, and, more generally, our idea of what it means to be human, it can help to step back and consider
  • British philosopher Jonathan Glover. He said that if we want to understand what sorts of beings we are in depth, we need to achieve a sort of intellectual “binocularity.”
  • Glover was saying that, just as we need two eyes that integrate slightly different information about one scene to achieve visual depth perception, being able to see ourselves through two fundamentally different lenses, and integrate those two sources of information, can give us a greater depth of understanding of ourselves.
  • Through one lens we see that we are “subjects” (we act) who have minds and can have the experience of making free choices. Through the other we see that we are “objects” or bodies (we are acted upon), and that our experiences or movements are determined by an infinitely long chain of natural and social forces.
  • intellectual binocularity itself is not easy to achieve. While visual binocularity comes naturally, intellectual binocularity requires effort. In fact — and this is one source of the trouble we so often have when we try to talk about the sorts of beings we are — we can’t actually achieve perfect binocular understanding.
  • We can’t actually see ourselves as subjects and as objects at the same time any more than we can see Wittgenstein’s famous duck-rabbit figure as a duck and as a rabbit at once. Rather, we have to accept the necessity of oscillating between the lenses or ways of seeing, fully aware that, not only are we unable to use both at once, but that there is no algorithm for knowing when to use which.
  • When I said in the beginning that there’s something right about the reasoning of those researchers who reject the idea that our choices are “spontaneous” and not determined by prior events, I was referring to their rejection of the idea that our choices are rooted in some God-given, extra-natural, bodyless stuff.
  • My complaint is that they slip from making the reasonable claim that such extra-natural stuff is an illusion to speaking in ways that suggest that free will is an illusion, full stop. To suggest that our experience of choosing is wholly an illusion is as unhelpful as to suggest that, to explain the emergence of that experience, we need to appeal to extra-natural phenomena.
  • Using either lens alone can lead to pernicious mistakes. When we use only the subject lens, we are prone to a sort of inhumanity where we ignore the reality of the natural and social forces that bear down on all of us to make our choices.
  • When we use only the object lens, however, we are prone to a different, but equally noxious sort of inhumanity, where we fail to appreciate the reality of the experience of making choices freely and of knowing that we can deserve punishment — or praise.
summertyler

What Faces Can't Tell Us - NYTimes.com - 0 views

  • CAN you detect someone’s emotional state just by looking at his face?
  • seems like it
  • Hundreds of scientific studies support the idea that the face is a kind of emotional beacon, clearly and universally signaling the full array of human sentiments
  • ...7 more annotations...
  • software to identify consumers’ moods
  • this assumption is wrong
  • human facial expressions, viewed on their own, are not universally understood
  • look at photographs of facial expressions (smiling, scowling and so on) and match them to a limited set of emotion words (happiness, anger and so on) or to stories with phrases like “Her husband recently died.” Most subjects, even those from faraway cultures with little contact with Western civilization, were extremely good at this task, successfully matching the photos most of the time.
  • this research method was flawed
  • with a preselected set of emotion words, these experiments had inadvertently “primed” the subjects — in effect, hinting at the answers — and thus skewed the results
  • asked to freely describe the emotion on a face (or to view two faces and answer yes or no as to whether they expressed the same emotion). The subjects’ performance plummeted
  • Detecting emotions
Javier E

Is Algebra Necessary? - NYTimes.com - 1 views

  • My aim is not to spare students from a difficult subject, but to call attention to the real problems we are causing by misdirecting precious resources.
  • one in four ninth graders fail to finish high school. In South Carolina, 34 percent fell away in 2008-9, according to national data released last year; for Nevada, it was 45 percent. Most of the educators I’ve talked with cite algebra as the major academic reason.
  • Algebra is an onerous stumbling block for all kinds of students: disadvantaged and affluent, black and white. In New Mexico, 43 percent of white students fell below “proficient,” along with 39 percent in Tennessee
  • ...15 more annotations...
  • The depressing conclusion of a faculty report: “failing math at all levels affects retention more than any other academic factor.” A national sample of transcripts found mathematics had twice as many F’s and D’s as other subjects.
  • Of all who embark on higher education, only 58 percent end up with bachelor’s degrees. The main impediment to graduation: freshman math.
  • California’s two university systems, for instance, consider applications only from students who have taken three years of mathematics and in that way exclude many applicants who might excel in fields like art or history. Community college students face an equally prohibitive mathematics wall. A study of two-year schools found that fewer than a quarter of their entrants passed the algebra classes they were required to take.
  • a definitive analysis by the Georgetown Center on Education and the Workforce forecasts that in the decade ahead a mere 5 percent of entry-level workers will need to be proficient in algebra or above.
  • “mathematical reasoning in workplaces differs markedly from the algorithms taught in school.” Even in jobs that rely on so-called STEM credentials — science, technology, engineering, math — considerable training occurs after hiring, including the kinds of computations that will be required.
  • I fully concur that high-tech knowledge is needed to sustain an advanced industrial economy. But we’re deluding ourselves if we believe the solution is largely academic.
  • Nor will just passing grades suffice. Many colleges seek to raise their status by setting a high mathematics bar. Hence, they look for 700 on the math section of the SAT, a height attained in 2009 by only 9 percent of men and 4 percent of women. And it’s not just Ivy League colleges that do this: at schools like Vanderbilt, Rice and Washington University in St. Louis, applicants had best be legacies or athletes if they have scored less than 700 on their math SATs.
  • A January 2012 analysis from the Georgetown center found 7.5 percent unemployment for engineering graduates and 8.2 percent among computer scientists.
  • “Our civilization would collapse without mathematics.” He’s absolutely right.
  • Quantitative literacy clearly is useful in weighing all manner of public policies
  • Mathematics is used as a hoop, a badge, a totem to impress outsiders and elevate a profession’s status.
  • Instead of investing so much of our academic energy in a subject that blocks further attainment for much of our population, I propose that we start thinking about alternatives. Thus mathematics teachers at every level could create exciting courses in what I call “citizen statistics.” This would not be a backdoor version of algebra, as in the Advanced Placement syllabus. Nor would it focus on equations used by scholars when they write for one another. Instead, it would familiarize students with the kinds of numbers that describe and delineate our personal and public lives.
  • This need not involve dumbing down. Researching the reliability of numbers can be as demanding as geometry.
  • I hope that mathematics departments can also create courses in the history and philosophy of their discipline, as well as its applications in early cultures. Why not mathematics in art and music — even poetry — along with its role in assorted sciences? The aim would be to treat mathematics as a liberal art, making it as accessible and welcoming as sculpture or ballet
  • Yes, young people should learn to read and write and do long division, whether they want to or not. But there is no reason to force them to grasp vectorial angles and discontinuous functions. Think of math as a huge boulder we make everyone pull, without assessing what all this pain achieves. So why require it, without alternatives or exceptions? Thus far I haven’t found a compelling answer.
Javier E

Secrets of a Mind-Gamer - NYTimes.com - 0 views

  • “What you have to understand is that even average memories are remarkably powerful if used properly,” Cooke said. He explained to me that mnemonic competitors saw themselves as “participants in an amateur research program” whose aim is to rescue a long-lost tradition of memory training.
  • it wasn’t so long ago that culture depended on individual memories. A trained memory was not just a handy tool but also a fundamental facet of any worldly mind. It was considered a form of character-building, a way of developing the cardinal virtue of prudence and, by extension, ethics. Only through memorizing, the thinking went, could ideas be incorporated into your psyche and their values absorbed.
  • all the other mental athletes I met kept insisting that anyone could do what they do. It was simply a matter of learning to “think in more memorable ways,” using a set of mnemonic techniques almost all of which were invented in ancient Greece. These techniques existed not to memorize useless information like decks of playing cards but to etch into the brain foundational texts and ideas.
  • ...10 more annotations...
  • not only did the brains of the mental athletes appear anatomically indistinguishable from those of the control subjects, but on every test of general cognitive ability, the mental athletes’ scores came back well within the normal range.
  • There was, however, one telling difference between the brains of the mental athletes and those of the control subjects. When the researchers looked at the parts of the brain that were engaged when the subjects memorized, they found that the mental athletes were relying more heavily on regions known to be involved in spatial memory.
  • just about anything could be imprinted upon our memories, and kept in good order, simply by constructing a building in the imagination and filling it with imagery of what needed to be recalled. This imagined edifice could then be walked through at any time in the future. Such a building would later come to be called a memory palace.
  • Memory training was considered a centerpiece of classical education in the language arts, on par with grammar, logic and rhetoric. Students were taught not just what to remember but how to remember it. In a world with few books, memory was sacrosanct.
  • In his essay “First Steps Toward a History of Reading,” Robert Darnton describes a switch from “intensive” to “extensive” reading that occurred as printed books began to proliferate.
  • Until relatively recently, people read “intensively,” Darnton says. “They had only a few books — the Bible, an almanac, a devotional work or two — and they read them over and over again, usually aloud and in groups, so that a narrow range of traditional literature became deeply impressed on their consciousness.” Today we read books “extensively,” often without sustained focus, and with rare exceptions we read each book only once. We value quantity of reading over quality of reading.
  • “Rhetorica ad Herennium” underscores the importance of purposeful attention by making a distinction between natural memory and artificial memory:
  • Our hunter-gatherer ancestors didn’t need to recall phone numbers or word-for-word instructions from their bosses or the Advanced Placement U.S. history curriculum or (because they lived in relatively small, stable groups) the names of dozens of strangers at a cocktail party. What they did need to remember was where to find food and resources and the route home and which plants were edible and which were poisonous
  • What distinguishes a great mnemonist, I learned, is the ability to create lavish images on the fly, to paint in the mind a scene so unlike any other it cannot be forgotten. And to do it quickly.
  • the three stages of acquiring a new skill. During the first phase, known as the cognitive phase, we intellectualize the task and discover new strategies to accomplish it more proficiently.
nolan_delaney

3 ways to use the placebo effect to have a better day - CNN.com - 0 views

    • nolan_delaney
       
      This relates to TOK because of how susceptible our minds are to trickery
  • A survey of more than 400 docs found that a whopping 56% said they'd actually prescribed placebos to their patients
  • those who were told they got quality sleep performed better than those who were told they slept badly.
  • ...2 more annotations...
  • Placebos seem to work in large part "because they are given by authority figures,
    • nolan_delaney
       
      the part about authority figures relates to one of the fallacies we discussed
  • Placebo effect - our minds are susceptible to trickery; the authority-figure fallacy
Javier E

How to Raise a University's Profile: Pricing and Packaging - NYTimes.com - 0 views

  • I talked to a half-dozen of Hugh Moren’s fellow students. A highly indebted senior who was terrified of the weak job market described George Washington, where he had invested considerable time getting and doing internships, as “the world’s most expensive trade school.” Another mentioned the abundance of rich students whose parents were giving them a fancy-sounding diploma the way they might a new car. There are serious students here, he acknowledged, but: “You can go to G.W. and essentially buy a degree.”
  • A recent study from the Organization for Economic Cooperation and Development found that, on average, American college graduates score well below college graduates from most other industrialized countries in mathematics. In literacy (“understanding, evaluating, using and engaging with written text”), scores are just average. This comes on the heels of Richard Arum and Josipa Roksa’s “Academically Adrift,” a study that found “limited or no learning” among many college students.
  • Instead of focusing on undergraduate learning, colleges have been engaged in the kind of building spree I saw at George Washington. Recreation centers with world-class workout facilities and lazy rivers rise out of construction pits even as students and parents are handed staggeringly large tuition bills. Colleges compete to hire famous professors even as undergraduates wander through academic programs that often lack rigor or coherence. Campuses vie to become the next Harvard — or at least the next George Washington — while ignoring the growing cost and suspect quality of undergraduate education.
  • ...58 more annotations...
  • Mr. Trachtenberg understood the centrality of the university as a physical place. New structures were a visceral sign of progress. They told visitors, donors and civic leaders that the institution was, like beams and scaffolding rising from the earth, ascending. He added new programs, recruited more students, and followed the dictate of constant expansion.
  • the American research university had evolved into a complicated and somewhat peculiar organization. It was built to be all things to all people: to teach undergraduates, produce knowledge, socialize young men and women, train workers for jobs, anchor local economies, even put on weekend sports events. And excellence was defined by similarity to old, elite institutions. Universities were judged by the quality of their scholars, the size of their endowments, the beauty of their buildings and the test scores of their incoming students.
  • John Silber embarked on a huge building campaign while bringing luminaries like Saul Bellow and Elie Wiesel on board to teach and lend their prestige to the B.U. name, creating a bigger, more famous and much more costly institution. He had helped write a game plan for the aspiring college president.
  • GWU is, for all intents and purposes, a for-profit organization. Best example: study abroad. Their top program, a partnership with Sciences Po, costs each student (30 of them, on a program with 'prestige' status?) a full semester's tuition. It costs GW, according to the Sciences Po website, €1,000. A neat $20,000 profit per student (who is digging her/himself deeper and deeper into debt). Moreover, the school takes a $500 admin fee for the study abroad application! With no guarantee that all credits transfer. Students often lose a partial semester; GW profits again. Nor does GW offer help with an antiquated, one-shot/no-transfers, tricky registration process. It's tough luck in gay Paris. Just one of many examples. Dorms with extreme mold, off-campus housing impossible for freshmen and sophomores. Required meal plan: Chick-fil-A etc. Classes with over 300 students (required). This is not Harvard, but it costs the same. Emotional problems? Counselors too few. Suicides continue and are not appropriately addressed. Caring environment? Extension so-and-so, please hold. It's an impressive campus; I'm an alum. If you apply, make sure the DC experience is worth the price: the internships are good, as are a few colleges like the Elliott School and post-grad programs. GWU uses undergrad $$ directly for building projects, like the medical center to which students have NO access. (The student health facility is underfunded and outsourced.) Outstanding professors still make a difference. But is that enough?
  • Mr. Trachtenberg, however, understood something crucial about the modern university. It had come to inhabit a market for luxury goods. People don’t buy Gucci bags merely for their beauty and functionality. They buy them because other people will know they can afford the price of purchase. The great virtue of a luxury good, from the manufacturer’s standpoint, isn’t just that people will pay extra money for the feeling associated with a name brand. It’s that the high price is, in and of itself, a crucial part of what people are buying.
  • Mr. Trachtenberg convinced people that George Washington was worth a lot more money by charging a lot more money. Unlike most college presidents, he was surprisingly candid about his strategy. College is like vodka, he liked to explain.
  • The Absolut Rolex plan worked. The number of applicants surged from some 6,000 to 20,000, the average SAT score of students rose by nearly 200 points, and the endowment jumped from $200 million to almost $1 billion.
  • The university became a magnet for the children of new money who didn’t quite have the SATs or family connections required for admission to Stanford or Yale. It also aggressively recruited international students, rich families from Asia and the Middle East who believed, as nearly everyone did, that American universities were the best in the world.
  • U.S. News & World Report now ranks the university at No. 54 nationwide, just outside the “first tier.”
  • The watch and vodka analogies are correct. Personally, I used car analogies when discussing college choices with my kids. We were in the fortunate position of being able to comfortably send our kids to any college in the country and have them leave debt free. Notwithstanding, I told them that they would be going to a state school unless they were able to get into one of about 40 schools that I felt, in whatever arbitrary manner I decided, was worth the extra cost. They both ended up going to state schools. College is by and large a commodity and you get out of it what you put into it. Both of my kids worked hard in college and were involved in school life. They both left the schools better people and the schools better schools for them being there. They are both now successful adults. I believe too many people look for the prestige of a named school and that is not what college should be primarily about.
  • In 2013, only 14 percent of the university’s 10,000 undergraduates received a grant — a figure on a par with elite schools but far below the national average. The average undergraduate borrower leaves with about $30,800 in debt.
  • When I talk to the best high school students in my state I always stress the benefits of the honors college experience at an affordable public university. For students who won't qualify for a public honors college, the regular public university experience is far preferable to the huge debt of places like GW.
  • Carey would do well to look beyond high-ticket private universities (which after all are still private enterprises) and what he describes as the Olympian heights of higher education (which for some reason seems also to embitter him) and look at the system overall. The withdrawal of public support was never a policy choice; it was a political choice, "packaged and branded" as some tax-cutting palaver all wrapped up in the argument that a free market should decide how much college should cost and how many seats we need. In such an environment, trustees at private universities are no more solely responsible for turning their degrees into commodities than the administrations of state universities are for raising the number of out-of-state students in order to offset the loss of support from their legislatures. No doubt, we will hear more about market-based solutions and technology from Mr. Carey.
  • I went to GW back in the 60s. It was affordable and it got me away from home in New York. While I was there, Newsweek famously published an article about the DC universities - GW, Georgetown, American and Catholic - dubbing them the Pony League, the schools for the children of wealthy middle-class New Yorkers who couldn't get into the Ivy League. Nobody really complained. But that wasn't me. I went because I wanted to be where the action was in the 60s, and as we used to say - "GW was literally a stone's throw from the White House. And we could prove it." Back then, the two biggest alumni names were Jackie Kennedy, who'd taken some classes there, and J. Edgar Hoover. Now, according to the glossy magazine they send me each month, it's the actress Kerry Washington. There's some sort of progress there, but I'm a GW alum and not properly trained to understand it.
  • This explains a lot of the modern, emerging mentality. It encompasses the culture of enforced grade inflation, cheating and anti-intellectualism in much of higher education. It is consistent with our culture of misleading statistics and information, cronyism and fake quality, the "best and the brightest" being only schemers and glad handers. The wisdom and creativity engendered by an honest, rigorous academic education are replaced by the disingenuous quick fix, the winner-take-all mentality that neglects the common good.
  • I attended nearby Georgetown University and graduated in 1985. Relative to state schools and elite schools, it was expensive then. I took out loans. I had Pell grants. I had work-study and GSL. I paid my debt of $15,000 off in ten years. Would I have done it differently? Yes: I would have continued on to graduate school and not worried about paying off those big loans right after college. My career worked out and I am grateful for the education I received and paid for. But I would not recommend to my nieces and nephews debts north of $100,000 for a BA in liberal arts. Go community. Then go state. Then punch your ticket to Harvard, Yale or Stanford — if you are good enough.
  • American universities appear to have more and more drifted away from educating individuals and citizens to becoming high-priced trade schools and purveyors of occupational licenses. Lost in the process is the concept of expanding a student's ability to appreciate broadly and deeply, as well as the belief that a republican democracy needs an educated citizenry, not a trained citizenry, to function well. Both the Heisman Trophy winner and the producer of a successful tech I.P.O. likely have much in common, a college education whose rewards are limited to the financial. I don't know if I find this more sad on the individual level or more worrisome for the future of America.
  • This is now a consumer world for everything, including institutions once thought to float above the Shakespearean briars of the work-a-day world such as higher education, law and medicine. Students get this. Parents get this. Everything is negotiable: financial aid, a spot in the nicest dorm, tix to the big game. But through all this, there are faculty - lots of 'em - who work away from the fluff to link the ambitions of the students with the reality and rigor of the 21st century. The job of the student is to get beyond the visible hype of the surroundings and find those faculty members. They will make sure your investment is worth it
  • My experience in managing or working with GW alumni in their 20's or 30's has not been good. Virtually all have been mentally lazy and/or had a stunning sense of entitlement. Basically they've been all talk and no results. That's been quite a contrast to the graduates from VA/MD state universities.
  • More and more, I notice what my debt-financed contributions to the revenue streams of my vendors earn them, not me. My banks earned enough to pay ridiculous bonuses to employees for reckless risk-taking. My satellite tv operator earned enough to overpay ESPN for sports programming that I never watch--and that, in turn, overpays these idiotic pro athletes and college sports administrators. My health insurer earned enough to defeat one-payor insurance; to enable the opaque, inefficient billing practices of hospitals and other providers; and to feed the behemoth pharmaceutical industry. My church earned enough to buy the silence of sex abuse victims and oppose progressive political candidates. And my govt earned enough to continue ag subsidies, inefficient defense spending, and obsolete transportation and energy policies.
  • as the parent of a GWU freshman I am grateful for every opportunity afforded her. She has a generous merit scholarship, is in the honors program with some small classes, and has access to internships that can be done while at school. GWU also gave her AP credits to advance her to sophomore status. Had she attended the state flagship school (where she was accepted into that exclusive honors program) she would have a great education but little else. It's not possible to do a foreign affairs-related internship far from D.C. or Manhattan. She went to a very competitive high school where, for the one or two Ivy League schools in which she was interested, she didn't have the same level of connections or wealth as many of her peers. Whether because of the Common Application or other factors, getting into a good school with financial help is difficult for a middle-class student like my daughter who had a 4.0 GPA and 2300 on the SAT. She also worked after school. The bottom line - GWU offered more money than perceived "higher tier" universities, and brought tuition to almost that of our state school system. And by the way, I think she is also getting a very good education.
  • This article reinforces something I have learned during my daughter's college application process. Most students choose a school based on emotion (reputation) and not value. This luxury good analogy holds up.
  • The entire education problem can be solved by MOOCs, lots and lots of them, plus a few closely monitored tests and personal interviews with people. Of course, many many people make MONEY off of our entirely inefficient way of "educating" -- are we even really doing that? -- getting a degree does NOT mean one is actually educated.
  • As a first-generation college graduate I entered GW ambitious but left saddled with debt, and crestfallen at the hard-hitting realization that my four undergraduate years were an aberration from what life is actually like post-college: not as simple as getting an [unpaid] internship with a fancy-titled institution, as most Colonials do. I knew how to get in to college, but what do you do after the recess of life ends? I learned more about networking, resume plumping (designated responses to constituents...errr....replied to emails), and elevator pitches than actual theory, economic principles, strong writing skills, critical thinking, analysis, and philosophy. While it is relatively easy to get a job after graduating (for many with a GW degree this is sadly not the case), sustaining one and excelling in it is much harder. It's never enough just to be able to open a new door; you also need to be prepared to navigate your way through that next opportunity.
  • this is a very telling article. Aimless and directionless high school graduates are matched only by aimless and directionless institutes of higher learning. Each child and each parent should start with a goal - before handing over their hard earned tuition dollars, and/or leaving a trail of broken debt in the aftermath of a substandard, unfocused education.
  • it is no longer the most expensive university in America. It is the 46th. Others have been implementing the Absolut Rolex Plan. John Sexton turned New York University into a global higher-education player by selling the dream of downtown living to students raised on “Sex and the City.” Northeastern followed Boston University up the ladder. Under Steven B. Sample, the University of Southern California became a U.S. News top-25 university. Washington University in St. Louis did the same.
  • I currently attend GW, and I have to say, this article completely misrepresents the situation. I have yet to meet a single person who is paying the full $60k tuition - I myself am paying $30k, because the school gave me $30k in grants. As for the quality of education, Foreign Policy rated GW the #8 best school in the world for undergraduate education in international affairs, Princeton Review ranks it as one of the best schools for political science, and U.S. News ranks the law school #20. The author also ignores the role that an expanding research profile plays in growing a university's prestige and educational power.
  • And in hundreds of regional universities and community colleges, presidents and deans and department chairmen have watched this spectacle of ascension and said to themselves, “That could be me.” Agricultural schools and technical institutes are lobbying state legislatures for tuition increases and Ph.D. programs, fitness centers and arenas for sport. Presidents and boards are drawing up plans to raise tuition, recruit “better” students and add academic programs. They all want to go in one direction — up! — and they are all moving with a single vision of what they want to be.
  • this is the same playbook used by hospitals the past 30 years or so. It is how Hackensack Hospital became Hackensack Medical Center and McComb Hospital became Southwest Mississippi Regional Medical Center. No wonder the results have been the same in healthcare and higher education; both have priced themselves out of reach for average Americans.
  • a world where a college is rated not by the quality of its output, but instead, by the quality of its inputs. A world where there is practically no work to be done by the administration because the college's reputation is made before the first class even begins! This is insanity! But this is the swill that the mammoth college marketing departments nationwide have shoved down America's throat. Colleges are ranked not by the quality of their graduates, but rather, by the test scores of their incoming students!
  • The Pew Foundation has been doing surveys on what students learn, how much homework they do, how much time they spend with professors etc. All good stuff to know before a student chooses a school. It is called the National Survey of Student Engagement (NSSE - called Nessy). It turns out that the higher-ranked schools do NOT allow their information to be released to the public. It is SECRET. Why do you think that is?
  • The article blames "the standard university organizational model left teaching responsibilities to autonomous academic departments and individual faculty members, each of which taught and tested in its own way." This is the view of someone who has never taught at a university, nor thought much about how education there actually happens. Once undergraduates get beyond the general requirements, their educations _have_ to depend on "autonomous departments" because only those departments know what the requirements for a given degree are, and can grant the necessary accreditation of a given student. The idea that some administrator could know what's necessary for degrees in everything from engineering to fiction writing is nonsense, except that's what the people who only know the theory of education (but not its practice) actually seem to think. In the classroom itself, you have tremendously talented people, who nevertheless have their own particular strengths and approaches. Don't you think it's a good idea to let them do what they do best rather than trying to make everyone teach the same way? Don't you think supervision of young teachers by older colleagues, who actually know their field and its pedagogy, rather than some administrator, who knows nothing of the subject, is a good idea?
  • it makes me very sad to see how expensive some public schools have become. Used to be you could work your way through a public school without loans, but not any more. Like you, I had the advantage of a largely scholarship-paid undergraduate education at a top private college. However, I was also offered a virtually free spot in my state university's (then new) honors college.
  • My daughter attended a good community college for a couple of classes during her senior year of high school and I could immediately see how such places are laboratories for failure. They seem like high schools in atmosphere and appearance. Students rush in by car and rush out again when the class is over. The four-year residential college creates a completely different feel. On arrival, you get the sense that you are engaging in something important, something apart and one that will require your full attention. I don't say this is for everyone or that the model is not flawed in some ways (students actually only spend 2 1/2 yrs. on campus to get the four yr. degree). College is supposed to be a 60 hour per week job. Anything less than that and the student is shortchanging himself or herself.
  • This. Is. STUNNING. I have always wondered, especially as my kids have approached college age, why American colleges have felt justified in raising tuition at a rate that has well exceeded inflation, year after year after year. (Nobody needs a dorm with luxury suites and a lazy river pool at college!) And as it turns out, they did it to become luxury brands. Just that simple. Incredible. I don't even blame this guy at GWU for doing what he did. He wasn't made responsible for all of American higher ed. But I do think we all need to realize what happened, and why. This is front page stuff.
  • I agree with you, but, unfortunately, given the choice between low tuition, primitive dorms, and no athletic center VS expensive & luxurious, the customers (and their parents) are choosing the latter. As long as this is the case, there is little incentive to provide bare-bones and cheap education.
  • Wesleyan University in CT is one school that is moving down the rankings. Syracuse University is another. Reed College is a third. Why? Because these schools try hard to stay out of the marketing game. (With its new president, Syracuse has jumped back into the game.) Bryn Mawr College, outside Philadelphia hasn't fared well over the past few decades in the rankings, which is true of practically every women's college. Wellesley is by far the highest ranked women's college, but even there the acceptance rate is significantly higher than one finds at comparable coed liberal arts colleges like Amherst & Williams. University of Chicago is another fascinating case for Mr. Carey to study (I'm sure he does in his forthcoming book, which I look forward to reading). Although it has always enjoyed an illustrious academic reputation, until recently Chicago's undergraduate reputation paled in comparison to peer institutions on the two coasts. A few years ago, Chicago changed its game plan to more closely resemble Harvard and Stanford in undergraduate amenities, and lo and behold, its rankings shot up. It was a very cynical move on the president's part to reassemble the football team, but it was a shrewd move because athletics draw more money than academics ever can (except at engineering schools like Cal Tech & MIT), and more money draws richer students from fancier secondary schools with higher test scores, which lead to higher rankings - and the beat goes on.
  • College INDUSTRY is out of control. Sorry, NYU, GW, BU are not worth the price. Are state schools any better? We have the University of Michigan, which is really not a state school, but a university that gives a discount to people who live in Michigan. Why? When you have an undergraduate body 40+% out-of-state that pays tuition of over $50K/year, you tell me? Perhaps the solution is two years of community college followed by two at places like U of M or Michigan State - get the same diploma at the end for much less and beat the system.
  • In one recent year, the majority of undergrad professors at Harvard, according to Boston.com, were adjuncts. That means low pay, no benefits, no office, temp workers. Harvard. Easily available student loans fueled this arms race of amenities and frills in which colleges now engage. They moved the cost of education onto the backs of people, kids, who don't understand what they are doing. Students in colleges these days are customers and the customers must be able to get through. If it requires dumbing things down, so be it. On top of tuition, G.W.U. is known by its students as the land of added fees on top of added fees. The joke around campus was that they would soon be installing pay toilets in the student union. No one was laughing.
  • You could have written the same story about my alma mater, American University. The place reeked of ambition and upward mobility decades ago and still does. Whoever's running it now must look at its measly half-billion-dollar endowment and compare it to GWU's $1.5 billion and seethe with envy, while GWU's president sets his sights on an Ivy League-size endowment. And both get back to their real jobs: 24/7 fundraising, which is what university presidents are all about these days. Money - including million-dollar salaries for themselves (GWU's president made more than Harvard's in 2011) - pride, cachet, power, a mansion, first-class all the way. They should just be honest about it and change their university's motto to Ostende mihi pecuniam! (please excuse my questionable Latin) Whether the students are actually learning anything is up to them, I guess - if they do, it's thanks to the professors, adjuncts and the administrative staff, who do the actual work of educating and keep the school running.
  • When I was in HS (70s), many of my richer friends went to GW and I was then of the impression that GW was a 'good' school. As I age, I have come to realize that this place is just another façade over the emptiness that has become America. All too often are we faced with a dilemma: damned if we do, damned if we don't. Yep, 'education' has become a trap for all too many of our citizens.
  • I transferred to GWU from a state school. I am forever grateful that I did. I wanted to get a good rigorous education and go to one of the best International Affairs schools in the world. Even though the state school I went to was dirt-cheap, the education and the faculty were awful. I transferred to GW and was amazed at the professors at that university. An ambassador or a prominent IA scholar taught every class. GW is an expensive school, but that is the free market. If you want a good education you need to be willing to pay for it or join the military. I did the latter and my school was completely free with no debt and I received an amazing education. If young people aren't willing to make some sort of sacrifice to get ahead or just expect everything to be given to them, then our country is in a sad state. We need to stop blaming universities like GWU that strive to attract better students, better professors, and better infrastructure. They are doing what is expected in America, to better oneself.
  • "Whether the students are actually learning anything is up to them, I guess." How could it possibly be otherwise??? I am glad that you are willing to give credit to teachers and administrators, but it is not they who "do the actual work of educating." From this fallacy comes its corollary, that we should blame teachers first for "under-performing schools". This long-running show of scapegoating may suit the wallets and vanity of American parents, but it is utterly senseless. When, if ever, American culture stops reeking of arrogance, greed and anti-intellectualism, things may improve, and we may resume the habit of bothering to learn. Until then, nothing doing.
  • Universities sell knowledge and grade students on how much they have learned. Fundamentally, there is a conflict of interest in this setup. Moreover, students who are poorly educated, even if they know this, will not criticize their school, because doing so would make it harder for them to have a career. As such, many problems with higher education remain unexposed to the public.
  • I've lectured and taught in at least five different countries in three continents and the shortest perusal of what goes on abroad would totally undermine most of these speculations. For one thing American universities are unique in their dedication to a broad based liberal arts type education. In France, Italy or Germany, for example, you select a major like mathematics or physics and then in your four years you will not take even one course in another subject. The amount of work that you do that is critically evaluated by an instructor is a tiny fraction of what is done in an American university. While half-educated critics write criticism like this on the basis of profoundly incomplete research, universities in Germany, Italy, the Netherlands, South Korea and Japan, as well as France, have appointed committees and made studies to explain why the American system of higher education so drastically outperforms their own. Elsewhere students do get a rather nice dose of general education but it ends in secondary school and it has the narrowness and formulaic quality that we would just normally associate with that. The character who wrote this article probably never set foot on a "campus" of the University of Paris or Rome.
  • The university is part of a complex economic system and it is responding to the demands of that system. For example, students and parents choose universities that have beautiful campuses and buildings. So universities build beautiful campuses. State support of universities has greatly declined, and this decline in funding is the greatest cause of increased tuition. Therefore universities must compete for dollars and must build to attract students and parents. Also, universities are not ranked based on how they educate students -- that's difficult to measure so it is not measured. Instead universities are ranked on research publications. So while universities certainly put much effort into teaching, research has to have a priority in order for the university to survive. Also universities do not force students and parents to attend high price institutions. Reasonably priced state institutions and community colleges are available to every student. Community colleges have an advantage because they are funded by property taxes. Finally learning requires good teaching, but it also requires students that come to the university funded, prepared, and engaged. This often does not happen. Conclusion: universities have to participate in profile raising actions in order to survive. The day that funding is provided for college, ranking is based on education, and students choose campuses with simple buildings, then things will change at the university.
  • This is the inevitable result of privatizing higher education. In the not-so-distant past, we paid for great state universities through our taxes, not tuition. Then, the states shifted funding to prisons and the Federal government radically cut research support and the GI bill. Instead, today we expect universities to support themselves through tuition, and to the extent that we offered students support, it is through non-dischargeable loans. To make matters worse, the interest rates on those loans are far above the government's cost of funds -- so in effect the loans are an excise tax on education (most of which is used to support a handful of for-profit institutions that account for the most student defaults). This "consumer sovereignty" privatized model of funding education works no better than privatizing California's electrical system did in the era of Enron, or our privatized funding of medical service, or our increasingly privatized prison system: it drives up costs at the same time that it replaces quality with marketing.
  • There are data in some instances on student learning, but the deeper problem, as I suspect the author already knows, is that there is nothing like a consensus on how to measure that learning, or even on when is the proper end point to emphasize (a lot of what I teach -- I know this from what students have told me -- tends to come into sharp focus years after graduation).
  • Michael (Baltimore) has hit the nail on the head. Universities are increasingly corporatized institutions in the credentialing business. Knowledge, for those few who care about it (often not those paying for the credentials) is available freely because there's no profit in it. Like many corporate entities, it is increasingly run by increasingly highly paid administrators, not faculty.
  • GWU has not defined itself in any unique way, it has merely embraced the bland, but very expensive, accoutrements of American private education: luxury dorms, food courts, spa-like gyms, endless extracurricular activities, etc. But the real culprit for this bloat that students have to bear financially is the college ranking system by US News, Princeton Review, etc. An ultimately meaningless exercise in competition that has nevertheless pushed colleges and universities to be more like one another. A sad state of affairs, and an extremely expensive one for students.
  • It is long past time to realize the failure of the Reaganomics-neoliberal private profits over public good program. In education, we need to return to public institutions publicly funded. Just as we need to recognize that Medicare, Social Security, the post office, public utilities, fire departments, interstate highway system, Veterans Administration hospitals and the GI bill are models to be improved and expanded, not destroyed.
  • George Washington is actually not a Rolex watch, it is a counterfeit Rolex. The real Rolexes of higher education -- places like Hopkins, Georgetown, Duke, the Ivies etc. -- have real endowments and real financial aid. No middle class kid is required to borrow $100,000 to get a degree from those schools, because they offer generous need-based financial aid in the form of grants, not loans. The tuition at the real Rolexes is really a sticker price that only the wealthy pay -- everybody else on a sliding scale. For middle class kids who are fortunate enough to get in, Penn actually ends up costing considerably less than a state university. The fake Rolexes -- BU, NYU, Drexel in Philadelphia -- don't have the sliding scale. They bury middle class students in debt. And really, though it is foolish to borrow $100,000 or $120,000 for an undergraduate degree, I don't find the transaction morally wrong. What is morally wrong is our federal government making that loan non-dischargeable in bankruptcy, so many of these kids will be having their wages garnished for the REST OF THEIR LIVES. There is a very simple solution to this, by the way. Cap the amount of non-dischargeable student loan debt at, say, $50,000.
  • The slant of this article is critical of the growth of research universities. Couldn't disagree more. Modern research universities are incredible engines of economic opportunity not only for the students (who pay the bills) but also for the community via the creation of blue- and white-collar jobs. Large research universities employ tens of thousands of locals from custodial and food service workers right up to high-level administrators and specialists in finance, computer services, buildings and facilities management, etc. Johns Hopkins University and the University of Maryland system employ more people than any other industry in Maryland -- including the government. Research universities typically have hospitals providing cutting-edge medical care to the community. Local business (from cafes to property rental companies) benefit from a built-in, long-term client base as well as an educated workforce. And of course they are the foundry of new knowledge which is critical for the future growth of our country. Check out the work of famed economist Dr. Julia Lane on modeling the economic value of the research university. In a nutshell, there are few better investments America can make in herself than research universities. We are the envy of the world in that regard -- and with good reason. How many *industries* (let alone jobs) have Stanford University alone catalyzed?
  • What universities have the monopoly on is the credential. Anyone can learn, from books, from free lectures on the internet, from this newspaper, etc. But only universities can endow you with the cherished degree. For some reason, people are willing to pay more for one of these pieces of paper with a certain name on it -- Ivy League, Stanford, even GW -- than another -- Generic State U -- though there is no evidence one is actually worth more in the marketplace of reality than the other. But, by the laws of economics, these places are actually underpriced: after all, something like 20 times more people are trying to buy a Harvard education than are allowed to purchase one. Usually that means you raise your price.
  • Overall a good article, except for - "This comes on the heels of Richard Arum and Josipa Roksa’s “Academically Adrift,” a study that found “limited or no learning” among many college students." The measure of learning you report was a general thinking skills exam. That's not a good measure of college gains. Most psychologists and cognitive scientists worth their salt would tell you that improvement in critical thinking skills is going to be limited to specific areas. In other words, learning critical thinking skills in math will make little change in critical thinking about political science or biology. Thus we should not expect huge improvements in general critical thinking skills, but rather improvements in a student's major and other areas of focus, such as a minor. Although who has time for a minor when it is universally acknowledged that the purpose of a university is to please and profit an employer or, if one is lucky, an investor. Finally, improved critical thinking skills are not the be-all and end-all of a college education even given this profit centered perspective. Learning and mastering the cumulative knowledge of past generations is arguably the most important thing to be gained, and most universities still tend to excel at that even with the increasing mandate to run education like a business and cultivate and cull the college "consumer".
  • As for community colleges, there was an article in the Times several years ago that said it much better than I could have said it myself: community colleges are places where dreams are put on hold. Without making the full commitment to study, without leaving the home environment, many, if not most, community college students are caught betwixt and between, trying to balance work responsibilities, caring for a young child or baby and attending classes. For males, the classic "end of the road" in community college is to get a car, a job and a girlfriend, one who is not in college, and that is the end of the dream. Some can make it, but most cannot.
  • as a scientist I disagree with the claim that undergrad tuition subsidizes basic research. Nearly all lab equipment and research personnel (grad students, technicians, anyone with the title "research scientist" or similar) on campus are paid for through federal grants. Professors often spend all their time outside teaching and administration writing grant proposals, as the limited federal grant funds mean ~85% of proposals must be rejected. What is more, out of each successful grant the university levies a "tax", called "overhead", of 30-40%, nominally to pay for basic operations (utilities, office space, administrators). So in fact one might say research helps fund the university rather than the other way around.
  • It's certainly overrated as a research and graduate level university. Whether it is good for getting an undergraduate education is unclear, but a big part of the appeal is getting to live in D.C. while attending college instead of living in some small college town in the corn fields.
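One comment above describes the 30-40% "overhead" tax universities levy on federal grants. A small hypothetical calculation can make the mechanics concrete; the dollar figures and the 35% rate below are illustrative assumptions, not from the article. Indirect-cost rates are conventionally applied to direct costs, so the university's share of the total award works out lower than the headline rate.

```python
# Hypothetical split of a federal grant under an assumed 35% overhead rate.
# All figures are illustrative, not taken from the article above.
direct_costs = 1_000_000      # researcher's budget: salaries, equipment, etc.
overhead_rate = 0.35          # assumed rate, within the 30-40% range cited

overhead = direct_costs * overhead_rate       # the university's "tax"
total_award = direct_costs + overhead         # what the funder actually pays
university_share = overhead / total_award     # overhead as a share of the total

print(f"overhead: ${overhead:,.0f} on a ${total_award:,.0f} award "
      f"({university_share:.1%} of the total)")
```

Under these assumptions the university keeps about a quarter of each award dollar, which is the sense in which, as the commenter puts it, research helps fund the university.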
Javier E

The Joy of Psyching Myself Out­ - The New York Times - 0 views

  • that neat separation is not just unwarranted; it’s destructive
  • Although it’s often presented as a dichotomy (the apparent subjectivity of the writer versus the seeming objectivity of the psychologist), it need not be.
  • IS it possible to think scientifically and creatively at once? Can you be both a psychologist and a writer?
  • ...10 more annotations...
  • “A writer must be as objective as a chemist,” Anton Chekhov wrote in 1887. “He must abandon the subjective line; he must know that dung heaps play a very reasonable part in a landscape.”
  • At the turn of the century, psychology was a field quite unlike what it is now. The theoretical musings of William James were the norm (a wry commenter once noted that William James was the writer, and his brother Henry, the psychologist)
  • Freud was a breed of psychologist that hardly exists anymore: someone who saw the world as both writer and psychologist, and for whom there was no conflict between the two. That boundary melding allowed him to posit the existence of cognitive mechanisms that wouldn’t be empirically proved for decades,
  • Freud got it brilliantly right and brilliantly wrong. The rightness is as good a justification as any of the benefits, the necessity even, of knowing how to look through the eyes of a writer. The wrongness is part of the reason that the distinction between writing and experimental psychology has grown far more rigid than it was a century ago.
  • the signs people associate with liars often have little empirical evidence to support them. Therein lies the psychologist’s distinct role and her necessity. As a writer, you look in order to describe, but you remain free to use that description however you see fit. As a psychologist, you look to describe, yes, but also to verify.
  • Without verification, we can’t always trust what we see — or rather, what we think we see.
  • The desire for the world to be what it ought to be and not what it is permeates experimental psychology as much as writing, though. There’s experimental bias and the problem known in the field as “demand characteristics” — when researchers end up finding what they want to find by cuing participants to act a certain way.
  • IN 1932, when he was in his 70s, Freud gave a series of lectures on psychoanalysis. In his final talk, “A Philosophy of Life,” he focused on clarifying an important caveat to his research: His followers should not be confused by the seemingly internal, and thus possibly subjective, nature of his work. “There is no other source of knowledge of the universe but the intellectual manipulation of carefully verified observations,” he said.
  • That is what both the psychologist and the writer should strive for: a self-knowledge that allows you to look in order to discover, without agenda, without preconception, without knowing or caring if what you’re seeing is wrong or right in your scheme of the world. It’s harder than it sounds. For one thing, you have to possess the self-knowledge that will allow you to admit when you’re wrong.
  • Even with the best intentions, objectivity can prove a difficult companion. I left psychology behind because I found its structural demands overly hampering. I couldn’t just pursue interesting lines of inquiry; I had to devise a set of experiments, see how feasible they were, both technically and financially, consider how they would reflect on my career. That meant that most new inquiries never happened — in a sense, it meant that objectivity was more an ideal than a reality. Each study was selected for a reason other than intrinsic interest.
Javier E

Buddhism Is More 'Western' Than You Think - The New York Times - 0 views

  • Not only have Buddhist thinkers for millenniums been making very much the kinds of claims that Western philosophers and psychologists make — many of these claims are looking good in light of modern Western thought.
  • In fact, in some cases Buddhist thought anticipated Western thought, grasping things about the human mind, and its habitual misperception of reality, that modern psychology is only now coming to appreciate.
  • “Things exist but they are not real.” I agree with Gopnik that this sentence seems a bit hard to unpack. But if you go look at the book it is taken from, you’ll find that the author himself, Mu Soeng, does a good job of unpacking it.
  • ...14 more annotations...
  • It turns out Soeng is explaining an idea that is central to Buddhist philosophy: “not self” — the idea that your “self,” as you intuitively conceive it, is actually an illusion. Soeng writes that the doctrine of not-self doesn’t deny an “existential personality” — it doesn’t deny that there is a you that exists; what it denies is that somewhere within you is an “abiding core,” a kind of essence-of-you that remains constant amid the flux of thoughts, feelings, perceptions and other elements that constitute your experience. So if by “you” we mean a “self” that features an enduring essence, then you aren’t real.
  • In recent decades, important aspects of the Buddhist concept of not-self have gotten support from psychology. In particular, psychology has bolstered Buddhism’s doubts about our intuition of what you might call the “C.E.O. self” — our sense that the conscious “self” is the initiator of thought and action.
  • recognizing that “you” are not in control, that you are not a C.E.O., can help give “you” more control. Or, at least, you can behave more like a C.E.O. is expected to behave: more rationally, more wisely, more reflectively; less emotionally, less rashly, less reactively.
  • Suppose that, via mindfulness meditation, you observe a feeling like anxiety or anger and, rather than let it draw you into a whole train of anxious or angry thoughts, you let it pass away. Though you experience the feeling — and in a sense experience it more fully than usual — you experience it with “non-attachment” and so evade its grip. And you now see the thoughts that accompanied it in a new light — they no longer seem like trustworthy emanations from some “I” but rather as transient notions accompanying transient feelings.
  • Brain-scan studies have produced tentative evidence that this lusting and disliking — embracing thoughts that feel good and rejecting thoughts that feel bad — lies near the heart of certain “cognitive biases.” If such evidence continues to accumulate, the Buddhist assertion that a clear view of the world involves letting go of these lusts and dislikes will have drawn a measure of support from modern science.
  • There’s a broader and deeper sense in which Buddhist thought is more “Western” than stereotype suggests. What, after all, is more Western than science’s emphasis on causality, on figuring out what causes what, and hoping to thus explain why all things do the things they do?
  • the Buddhist idea of “not-self” grows out of the belief undergirding this mission — that the world is pervasively governed by causal laws. The reason there is no “abiding core” within us is that the ever-changing forces that impinge on us — the sights, the sounds, the smells, the tastes — are constantly setting off chain reactions inside of us.
  • Buddhism’s doubts about the distinctness and solidity of the “self” — and of other things, for that matter — rests on a recognition of the sense in which pervasive causality means pervasive fluidity.
  • Buddhism long ago generated insights that modern psychology is only now catching up to, and these go beyond doubts about the C.E.O. self.
  • psychology has lately started to let go of its once-sharp distinction between “cognitive” and “affective” parts of the mind; it has started to see that feelings are so finely intertwined with thoughts as to be part of their very coloration. This wouldn’t qualify as breaking news in Buddhist circles.
  • Note how, in addition to being therapeutic, this clarifies your view of the world. After all, the “anxious” or “angry” trains of thought you avoid probably aren’t objectively true. They probably involve either imagining things that haven’t happened or making subjective judgments about things that have.
  • All we can do is clear away as many impediments to comprehension as possible. Science has a way of doing that — by insisting that entrants in its “competitive storytelling” demonstrate explanatory power in ways that are publicly observable, thus neutralizing, to the extent possible, subjective biases that might otherwise prevail.
  • Buddhism has a different way of doing it: via meditative disciplines that are designed to attack subjective biases at the source, yielding a clearer view of both the mind itself and the world beyond it.
  • The results of these two inquiries converge to a remarkable extent — an extent that can be appreciated only in light of the last few decades of progress in psychology and evolutionary science. At least, that’s my argument.
mshilling1

Isaac Newton's Influence on Modern Science - 0 views

  • Aristotelian thought had dominated mathematics and astronomy for centuries, until revolutionaries like Nicolaus Copernicus and Galileo Galilei challenged those views.
  • The mathematization of physics was a crucial step in the advancement of science. It was realized that the mathematical tools we had at the time weren’t strong enough.
  • By trial and error, Kepler worked and worked until finally he hit upon the shape that worked—elliptical orbits with the Sun at one focus. These turned out to fit the known observations perfectly.
  • ...10 more annotations...
  • They were stunning results, but no one knew why they would be true. Aristotle’s circular orbits had a philosophical basis—the perfection of the aether from which everything out there was made.
  • The basic concepts which ordered the universe and the picture of reality they gave rise to had become wobbly, but had not fallen.
  • So, the first law describes the behavior of an object subjected to no external force. The second law then describes the behavior of an object that is subjected to an external force.
  • And so Newton’s success supercharged an intellectual movement developing around him, the Enlightenment. The picture of reality that emerged from the Enlightenment is one in which the universe is well-ordered according to principles that are accessible to the human mind.
  • Again, if a person is on ice skates and someone pushes them, they accelerate forward because of the force and the other person goes backwards because of it. To every action there is always an equal, but opposite reaction.
  • When these three laws of mechanics and the law of universal gravitation are used together, we suddenly have an explanation for Kepler’s elliptical orbits. Not only that, we can explain the tides, the motion of cannonballs, virtually everything we see in the world around us.
  • When these three laws of mechanics and the law of universal gravitation are used together, it was not only successful in terms of explaining and predicting, but, theoretically, it also undermined the old foundation—Aristotle.
  • Newton’s law of universal gravitation is universal. It applies to everything equally. Aristotle’s worldview was enforced by the centralized power of the Catholic Church. Newton’s worldview came not from authority, but from observing, something anyone could do.
  • The bigger the push, the more the change; the heavier the object, the less the change. An object is either subject to a force or it isn’t, so the first two laws are sufficient to describe the behavior of the object.
  • We live in a world that we can understand. Humans are perfectly rational beings, made to understand the world we inhabit.
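The claim in these notes—that Newton's second law plus universal gravitation reproduces Kepler's ellipses—can be checked numerically. The sketch below is illustrative and not from the source; the units (G·M = 1) and the starting state are assumptions chosen for simplicity.

```python
# Illustrative sketch: integrating Newton's second law with an
# inverse-square gravitational force yields a bound (Keplerian) orbit.
# Units are chosen so that G*M = 1; the initial state is an assumption.
import math

def simulate_orbit(x, y, vx, vy, dt=1e-3, steps=20000):
    """Semi-implicit Euler for a = -GM * r / |r|^3, with GM = 1."""
    for _ in range(steps):
        r3 = (x * x + y * y) ** 1.5
        vx -= x / r3 * dt   # second law: velocity changes with the force
        vy -= y / r3 * dt
        x += vx * dt        # advance position using the updated velocity
        y += vy * dt
    return x, y, vx, vy

# Start at distance 1 with speed below the circular value (1.0),
# so the orbit is a bound ellipse rather than a circle.
x, y, vx, vy = simulate_orbit(1.0, 0.0, 0.0, 0.8)

# Diagnostics: a bound orbit has negative total energy, and a central
# force leaves angular momentum unchanged (Kepler's second law).
energy = 0.5 * (vx ** 2 + vy ** 2) - 1.0 / math.hypot(x, y)
ang_mom = x * vy - y * vx
```

The energy stays near its initial value of -0.68 and the angular momentum stays at 0.8 throughout the run; changing the exponent away from the inverse square breaks the closed orbit, which is the sense in which Newton's law "explains" Kepler's.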
Javier E

How Does Science Really Work? | The New Yorker - 1 views

  • I wanted to be a scientist. So why did I find the actual work of science so boring? In college science courses, I had occasional bursts of mind-expanding insight. For the most part, though, I was tortured by drudgery.
  • I’d found that science was two-faced: simultaneously thrilling and tedious, all-encompassing and narrow. And yet this was clearly an asset, not a flaw. Something about that combination had changed the world completely.
  • “Science is an alien thought form,” he writes; that’s why so many civilizations rose and fell before it was invented. In his view, we downplay its weirdness, perhaps because its success is so fundamental to our continued existence.
  • ...50 more annotations...
  • In school, one learns about “the scientific method”—usually a straightforward set of steps, along the lines of “ask a question, propose a hypothesis, perform an experiment, analyze the results.”
  • That method works in the classroom, where students are basically told what questions to pursue. But real scientists must come up with their own questions, finding new routes through a much vaster landscape.
  • Since science began, there has been disagreement about how those routes are charted. Two twentieth-century philosophers of science, Karl Popper and Thomas Kuhn, are widely held to have offered the best accounts of this process.
  • For Popper, Strevens writes, “scientific inquiry is essentially a process of disproof, and scientists are the disprovers, the debunkers, the destroyers.” Kuhn’s scientists, by contrast, are faddish true believers who promulgate received wisdom until they are forced to attempt a “paradigm shift”—a painful rethinking of their basic assumptions.
  • Working scientists tend to prefer Popper to Kuhn. But Strevens thinks that both theorists failed to capture what makes science historically distinctive and singularly effective.
  • Sometimes they seek to falsify theories, sometimes to prove them; sometimes they’re informed by preëxisting or contextual views, and at other times they try to rule narrowly, based on the evidence.
  • Why do scientists agree to this scheme? Why do some of the world’s most intelligent people sign on for a lifetime of pipetting?
  • Strevens thinks that they do it because they have no choice. They are constrained by a central regulation that governs science, which he calls the “iron rule of explanation.” The rule is simple: it tells scientists that, “if they are to participate in the scientific enterprise, they must uncover or generate new evidence to argue with”; from there, they must “conduct all disputes with reference to empirical evidence alone.”
  • It is “the key to science’s success,” because it “channels hope, anger, envy, ambition, resentment—all the fires fuming in the human heart—to one end: the production of empirical evidence.”
  • Strevens arrives at the idea of the iron rule in a Popperian way: by disproving the other theories about how scientific knowledge is created.
  • The problem isn’t that Popper and Kuhn are completely wrong. It’s that scientists, as a group, don’t pursue any single intellectual strategy consistently.
  • Exploring a number of case studies—including the controversies over continental drift, spontaneous generation, and the theory of relativity—Strevens shows scientists exerting themselves intellectually in a variety of ways, as smart, ambitious people usually do.
  • “Science is boring,” Strevens writes. “Readers of popular science see the 1 percent: the intriguing phenomena, the provocative theories, the dramatic experimental refutations or verifications.” But, he says, behind these achievements . . . are long hours, days, months of tedious laboratory labor. The single greatest obstacle to successful science is the difficulty of persuading brilliant minds to give up the intellectual pleasures of continual speculation and debate, theorizing and arguing, and to turn instead to a life consisting almost entirely of the production of experimental data.
  • Ultimately, in fact, it was good that the geologists had a “splendid variety” of somewhat arbitrary opinions: progress in science requires partisans, because only they have “the motivation to perform years or even decades of necessary experimental work.” It’s just that these partisans must channel their energies into empirical observation. The iron rule, Strevens writes, “has a valuable by-product, and that by-product is data.”
  • Science is often described as “self-correcting”: it’s said that bad data and wrong conclusions are rooted out by other scientists, who present contrary findings. But Strevens thinks that the iron rule is often more important than overt correction.
  • Eddington was never really refuted. Other astronomers, driven by the iron rule, were already planning their own studies, and “the great preponderance of the resulting measurements fit Einsteinian physics better than Newtonian physics.” It’s partly by generating data on such a vast scale, Strevens argues, that the iron rule can power science’s knowledge machine: “Opinions converge not because bad data is corrected but because it is swamped.”
  • Why did the iron rule emerge when it did? Strevens takes us back to the Thirty Years’ War, which concluded with the Peace of Westphalia, in 1648. The war weakened religious loyalties and strengthened national ones.
  • Two regimes arose: in the spiritual realm, the will of God held sway, while in the civic one the decrees of the state were paramount. As Isaac Newton wrote, “The laws of God & the laws of man are to be kept distinct.” These new, “nonoverlapping spheres of obligation,” Strevens argues, were what made it possible to imagine the iron rule. The rule simply proposed the creation of a third sphere: in addition to God and state, there would now be science.
  • Strevens imagines how, to someone in Descartes’s time, the iron rule would have seemed “unreasonably closed-minded.” Since ancient Greece, it had been obvious that the best thinking was cross-disciplinary, capable of knitting together “poetry, music, drama, philosophy, democracy, mathematics,” and other elevating human disciplines.
  • We’re still accustomed to the idea that a truly flourishing intellect is a well-rounded one. And, by this standard, Strevens says, the iron rule looks like “an irrational way to inquire into the underlying structure of things”; it seems to demand the upsetting “suppression of human nature.”
  • Descartes, in short, would have had good reasons for resisting a law that narrowed the grounds of disputation, or that encouraged what Strevens describes as “doing rather than thinking.”
  • In fact, the iron rule offered scientists a more supple vision of progress. Before its arrival, intellectual life was conducted in grand gestures.
  • Descartes’s book was meant to be a complete overhaul of what had preceded it; its fate, had science not arisen, would have been replacement by some equally expansive system. The iron rule broke that pattern.
  • Strevens sees its earliest expression in Francis Bacon’s “The New Organon,” a foundational text of the Scientific Revolution, published in 1620. Bacon argued that thinkers must set aside their “idols,” relying, instead, only on evidence they could verify. This dictum gave scientists a new way of responding to one another’s work: gathering data.
  • it also changed what counted as progress. In the past, a theory about the world was deemed valid when it was complete—when God, light, muscles, plants, and the planets cohered. The iron rule allowed scientists to step away from the quest for completeness.
  • The consequences of this shift would become apparent only with time
  • In 1713, Isaac Newton appended a postscript to the second edition of his “Principia,” the treatise in which he first laid out the three laws of motion and the theory of universal gravitation. “I have not as yet been able to deduce from phenomena the reason for these properties of gravity, and I do not feign hypotheses,” he wrote. “It is enough that gravity really exists and acts according to the laws that we have set forth.”
  • What mattered, to Newton and his contemporaries, was his theory’s empirical, predictive power—that it was “sufficient to explain all the motions of the heavenly bodies and of our sea.”
  • Descartes would have found this attitude ridiculous. He had been playing a deep game—trying to explain, at a fundamental level, how the universe fit together. Newton, by those lights, had failed to explain anything: he himself admitted that he had no sense of how gravity did its work
  • by authorizing what Strevens calls “shallow explanation,” the iron rule offered an empirical bridge across a conceptual chasm. Work could continue, and understanding could be acquired on the other side. In this way, shallowness was actually more powerful than depth.
  • Quantum theory—which tells us that subatomic particles can be “entangled” across vast distances, and in multiple places at the same time—makes intuitive sense to pretty much nobody.
  • Without the iron rule, Strevens writes, physicists confronted with such a theory would have found themselves at an impasse. They would have argued endlessly about quantum metaphysics.
  • Following the iron rule, they can make progress empirically even though they are uncertain conceptually. Individual researchers still passionately disagree about what quantum theory means. But that hasn’t stopped them from using it for practical purposes—computer chips, MRI machines, G.P.S. networks, and other technologies rely on quantum physics.
  • One group of theorists, the rationalists, has argued that science is a new way of thinking, and that the scientist is a new kind of thinker—dispassionate to an uncommon degree.
  • As evidence against this view, another group, the subjectivists, points out that scientists are as hopelessly biased as the rest of us. To this group, the aloofness of science is a smoke screen behind which the inevitable emotions and ideologies hide.
  • At least in science, Strevens tells us, “the appearance of objectivity” has turned out to be “as important as the real thing.”
  • The subjectivists are right, he admits, inasmuch as scientists are regular people with a “need to win” and a “determination to come out on top.”
  • But they are wrong to think that subjectivity compromises the scientific enterprise. On the contrary, once subjectivity is channelled by the iron rule, it becomes a vital component of the knowledge machine. It’s this redirected subjectivity—to come out on top, you must follow the iron rule!—that solves science’s “problem of motivation,” giving scientists no choice but “to pursue a single experiment relentlessly, to the last measurable digit, when that digit might be quite meaningless.”
  • If it really was a speech code that instigated “the extraordinary attention to process and detail that makes science the supreme discriminator and destroyer of false ideas,” then the peculiar rigidity of scientific writing—Strevens describes it as “sterilized”—isn’t a symptom of the scientific mind-set but its cause.
  • The iron rule—“a kind of speech code”—simply created a new way of communicating, and it’s this new way of communicating that created science.
  • Other theorists have explained science by charting a sweeping revolution in the human mind; inevitably, they’ve become mired in a long-running debate about how objective scientists really are
  • In “The Knowledge Machine: How Irrationality Created Modern Science” (Liveright), Michael Strevens, a philosopher at New York University, aims to identify that special something. Strevens is a philosopher of science
  • Compared with the theories proposed by Popper and Kuhn, Strevens’s rule can feel obvious and underpowered. That’s because it isn’t intellectual but procedural. “The iron rule is focused not on what scientists think,” he writes, “but on what arguments they can make in their official communications.”
  • Like everybody else, scientists view questions through the lenses of taste, personality, affiliation, and experience
  • geologists had a professional obligation to take sides. Europeans, Strevens reports, tended to back Wegener, who was German, while scholars in the United States often preferred Simpson, who was American. Outsiders to the field were often more receptive to the concept of continental drift than established scientists, who considered its incompleteness a fatal flaw.
  • Strevens’s point isn’t that these scientists were doing anything wrong. If they had biases and perspectives, he writes, “that’s how human thinking works.”
  • Eddington’s observations were expected to either confirm or falsify Einstein’s theory of general relativity, which predicted that the sun’s gravity would bend the path of light, subtly shifting the stellar pattern. For reasons having to do with weather and equipment, the evidence collected by Eddington—and by his colleague Frank Dyson, who had taken similar photographs in Sobral, Brazil—was inconclusive; some of their images were blurry, and so failed to resolve the matter definitively.
  • it was only natural for intelligent people who were free of the rule’s strictures to attempt a kind of holistic, systematic inquiry that was, in many ways, more demanding. It never occurred to them to ask if they might illuminate more collectively by thinking about less individually.
  • In the single-sphered, pre-scientific world, thinkers tended to inquire into everything at once. Often, they arrived at conclusions about nature that were fascinating, visionary, and wrong.
  • How Does Science Really Work? Science is objective. Scientists are not. Can an “iron rule” explain how they’ve changed the world anyway? By Joshua Rothman, September 28, 2020
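Strevens’s claim that opinions converge “not because bad data is corrected but because it is swamped” can be sketched numerically. This is an illustrative simulation only, not data from the Eddington episode; the true value, the bias, and the noise level are invented for the example.

```python
# Illustrative only: one early biased measurement is never retracted,
# yet the pooled estimate converges as unbiased measurements accumulate.
import random

random.seed(0)
TRUE_VALUE = 1.75          # hypothetical quantity being measured

measurements = [2.6]       # the "bad" early result, left in place
measurements += [random.gauss(TRUE_VALUE, 0.2) for _ in range(200)]

pooled = sum(measurements) / len(measurements)
# The bad point is still in the data set, but its pull on the pooled
# mean has shrunk to (2.6 - 1.75) / 201 -- under half a percent.
```

No referee ever flags the first measurement; the iron rule simply keeps generating new evidence until the outlier no longer matters.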
ilanaprincilus06

How Sleeping Memories Come Back to Life | Time - 0 views

  • It’s almost a good thing that we’ve never been entirely able to figure out how human memory works, because if we did, we’d probably just forget.
    • ilanaprincilus06
       
      Shows just how unreliable our brains truly are
  • Working memories, it seems, are preserved in a latent or hidden state, existing without any evident activation at all until the moment they’re needed.
    • ilanaprincilus06
       
      Reminds me of learning about a new topic for a class and then somehow remembering the concept during an assessment.
  • ...5 more annotations...
  • Instead, however, while there was indeed detectable neural activity for the so-called attended memory item (AMI)—the one that the subjects knew they would need right away—there was none at all for the unattended memory items (UMI), which the subjects might also need, but not until later.
  • All the same, when subjects were asked about a UMI, a peak appeared for it just as it did for an AMI. In both cases, working memory worked just fine, but in one case it did so without the benefit of any visible storage system.
  • unattended memories are maintained in what the researchers called “a privileged state” only as long as they had to be.
  • Whatever the explanation, the work has implications for understanding not just memory but other cognitive functions like perception, attention and goal maintenance.
  • if noninvasive brain stimulation techniques can be used to reactivate and potentially strengthen latent memories”—in other words, recovering information that had been forever lost.
annabaldwin_

How Getting Enough Sleep Can Make You Less Afraid - The Atlantic - 0 views

  • A new study suggests that people who naturally get more REM sleep may be less sensitive to frightening things.
  • For the study, a team of researchers from Rutgers University sent 17 subjects home with sleep-monitoring devices—headbands that monitor their brain waves, wristbands that track arm movements, and sleep logs—and asked them to sleep as they normally would for a week. They were monitoring how much sleep they were getting—especially REM, or rapid-eye-movement sleep.
  • Each night, most people sleep about seven or eight hours, about two hours of which is REM sleep, the stage of sleep in which the body relaxes fully and most dreams occur.
  • ...6 more annotations...
  • The researchers then conditioned the participants to be afraid of certain images by showing them pictures of ordinary-looking rooms lit with lamps of various hues, some of which were paired with a mild shock to the finger. Through the shocks, they were taught to fear the rooms that were lit by certain colors.
  • The subjects with more REM sleep also had less activity in those areas of the brain. That suggests that the more well-rested subjects may not have been hard-wiring those fears into their brains quite as strongly.
  • PTSD is already known to be associated with sleep disturbances, and past studies have shown that sleep-deprived people have more activity in their amygdalae upon being shown upsetting pictures.
  • “REM is very unique because it’s the only time that area of the brain is completely silent,” said Shira Lupkin, one of the study’s authors and a researcher with the Center for Molecular and Behavioral Neuroscience at Rutgers University.
  • Because of that, people who get plenty of REM sleep might be less reactive to emotional stimuli.
  • If the study is replicated, there could be real-world implications for stopping trauma—before it starts.
Javier E

Technopoly-Chs. 4.5--The Broken Defenses - 0 views

  • the rest of us. There is almost no fact, whether actual or imagined, that will surprise us for very long, since we have no comprehensive and consistent picture of the world that would make the fact appear as an unacceptable contradiction.
  • The belief system of a tool-using culture is rather like a brand-new deck of cards. Whether it is a culture of technological simplicity or sophistication, there always exists a more or less comprehensive, ordered world-view, resting on a set of metaphysical or theological assumptions. Ordinary men and women might not clearly grasp how the harsh realities of their lives fit into the grand and benevolent design of the universe, but they have no doubt that there is such a design, and their priests and shamans are well able, by deduction from a handful of principles, to make it, if not wholly rational, at least coherent.
  • From the early seventeenth century, when Western culture undertook to reorganize itself to accommodate the printing press, until the mid-nineteenth century, no significant technologies were introduced that altered the form, volume, or speed of information. As a consequence, Western culture had more than two hundred years to accustom itself to the new information conditions created by the press.
  • ...86 more annotations...
  • That is especially the case with technical facts.
  • as incomprehensible problems mount, as the concept of progress fades, as meaning itself becomes suspect, the Technopolist stands firm in believing that what the world needs is yet more information. It is like the joke about the man who complains that the food he is being served in a restaurant is inedible and also that the portions are too small.
  • The faith of those who believed in Progress was based on the assumption that one could discern a purpose to the human enterprise, even without the theological scaffolding that supported the Christian edifice of belief. Science and technology were the chief instruments of Progress, and the accumulation of reliable information about nature and man would bring ignorance, superstition, and suffering to an end.
  • In Technopoly, we are driven to fill our lives with the quest to "access" information.
  • But the genie that came out of the bottle proclaiming that information was the new god of culture was a deceiver. It solved the problem of information scarcity, the disadvantages of which were obvious. But it gave no warning about the dangers of information glut.
  • The invention of what is called a curriculum was a logical step toward organizing, limiting, and discriminating among available sources of information. Schools became technocracy's first secular bureaucracies, structures for legitimizing some parts of the flow of information and discrediting other parts. Schools were, in short, a means of governing the ecology of information.
  • James Beniger's The Control Revolution, which is among the three or four most important books we have on the subject of the relation of information to culture. In the next chapter, I have relied to a considerable degree on The Control Revolution in my discussion of the breakdown of the control mechanisms,
  • most of the methods by which technocracies have hoped to keep information from running amok are now dysfunctional. Indeed, one way of defining a Technopoly is to say that its information immune system is inoperable.
  • Very early on, it was understood that the printed book had created an information crisis and that something needed to be done to maintain a measure of control.
  • it is why in a Technopoly there can be no transcendent sense of purpose or meaning, no cultural coherence.
  • In 1480, before the information explosion, there were thirty-four schools in all of England. By 1660, there were 444, one school for every twelve square miles.
  • There were several reasons for the rapid growth of the common school, but none was more obvious than that it was a necessary response to the anxieties and confusion aroused by information on the loose.
  • The milieu in which Technopoly flourishes is one in which the tie between information and human purpose has been severed, i.e., information appears indiscriminately, directed at no one in particular, in enormous volume and at high speeds; and disconnected from theory, meaning, or purpose.
  • Abetted by a form of education that in itself has been emptied of any coherent world-view, Technopoly deprives us of the social, political, historical, metaphysical, logical, or spiritual bases for knowing what is beyond belief.
  • It developed new institutions, such as the school and representative government. It developed new conceptions of knowledge and intelligence, and a heightened respect for reason and privacy. It developed new forms of economic activity, such as mechanized production and corporate capitalism, and even gave articulate expression to the possibilities of a humane socialism.
  • There is not a single line written by Jefferson, Adams, Paine, Hamilton, or Franklin that does not take for granted that when information is made available to citizens they are capable of managing it. This is not to say that the Founding Fathers believed information could not be false, misleading, or irrelevant. But they believed that the marketplace of information and ideas was sufficiently ordered so that citizens could make sense of what they read and heard and, through reason, judge its usefulness to their lives. Jefferson's proposals for education, Paine's arguments for self-governance, Franklin's arrangements for community affairs assume coherent, commonly shared principles that allow us to debate such questions as: What are the responsibilities of citizens? What is the nature of education? What constitutes human progress? What are the limitations of social structures?
  • New forms of public discourse came into being through newspapers, pamphlets, broadsides, and books.
  • It is no wonder that the eighteenth century gave us our standard of excellence in the use of reason, as exemplified in the work of Goethe, Voltaire, Diderot, Kant, Hume, Adam Smith, Edmund Burke, Vico, Edward Gibbon, and, of course, Jefferson, Madison, Franklin, Adams, Hamilton, and Thomas Paine.
  • I weight the list with America's "Founding Fathers" because technocratic-typographic America was the first nation ever to be argued into existence in print. Paine's Common Sense and The Rights of Man, Jefferson's Declaration of Independence, and the Federalist Papers were written and printed efforts to make the American experiment appear reasonable to the people, which to the eighteenth-century mind was both necessary and sufficient. To any people whose politics were the politics of the printed page, as Tocqueville said of America, reason and printing were inseparable.
  • The presumed close connection among information, reason, and usefulness began to lose its legitimacy toward the mid-nineteenth century with the invention of the telegraph. Prior to the telegraph, information could be moved only as fast as a train could travel: about thirty-five miles per hour. Prior to the telegraph, information was sought as part of the process of understanding and solving particular problems. Prior to the telegraph, information tended to be of local interest.
  • First Amendment to the United States Constitution stands as a monument to the ideology of print. It says: "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances." In these forty-five words we may find the fundamental values of the literate, reasoning mind as fostered by the print revolution: a belief in privacy, individuality, intellectual freedom, open criticism, and community action.
  • telegraphy created the idea of context-free information: the idea that the value of information need not be tied to any function it might serve in social and political decision-making and action. The telegraph made information into a commodity, a "thing" that could be bought and sold irrespective of its uses or meaning.
  • a new definition of information came into being. Here was information that rejected the necessity of interconnectedness, proceeded without context, argued for instancy against historic continuity, and offered fascination in place of complexity and coherence.
  • The potential of the telegraph to transform information into a commodity might never have been realized except for its partnership with the penny press, which was the first institution to grasp the significance of the annihilation of space and the saleability of irrelevant information.
  • the fourth stage of the information revolution occurred, broadcasting. And then the fifth, computer technology. Each of these brought with it new forms of information, unprecedented amounts of it, and increased speeds
  • photography was invented at approximately the same time as telegraphy, and initiated the third stage of the information revolution. Daniel Boorstin has called it "the graphic revolution," because the photograph and other iconographs brought on a massive intrusion of images into the symbolic environment:
  • The new imagery, with photography at its forefront, did not merely function as a supplement to language but tended to replace it as our dominant means for construing, understanding, and testing reality.
  • By the beginning of the seventeenth century, an entirely new information environment had been created by print
  • It is an improbable world. It is a world in which the idea of human progress, as Bacon expressed it, has been replaced by the idea of technological progress.
  • The aim is not to reduce ignorance, superstition, and suffering but to accommodate ourselves to the requirements of new technologies.
  • Technopoly is a state of culture. It is also a state of mind. It consists in the deification of technology, which means that the culture seeks its authorization in technology, finds its satisfactions in technology, and takes its orders from technology.
  • We proceed under the assumption that information is our friend, believing that cultures may suffer grievously from a lack of information, which, of course, they do. It is only now beginning to be understood that cultures may also suffer grievously from information glut, information without meaning, information without control mechanisms.
  • Those who feel most comfortable in Technopoly are those who are convinced that technical progress is humanity's supreme achievement and the instrument by which our most profound dilemmas may be solved. They also believe that information is an unmixed blessing, which through its continued and uncontrolled production and dissemination offers increased freedom, creativity, and peace of mind.
  • The relationship between information and the mechanisms for its control is fairly simple to describe: technology increases the available supply of information. As the supply is increased, control mechanisms are strained. Additional control mechanisms are needed to cope with new information. When additional control mechanisms are themselves technical, they in turn further increase the supply of information. When the supply of information is no longer controllable, a general breakdown in psychic tranquillity and social purpose occurs. Without defenses, people have no way of finding meaning in their experiences, lose their capacity to remember, and have difficulty imagining reasonable futures.
  • any decline in the force of institutions makes people vulnerable to information chaos. To say that life is destabilized by weakened institutions is merely to say that information loses its use and therefore becomes a source of confusion rather than coherence.
  • To call a culture a Technopoly, then, is to say it is what happens to society when the defenses against information glut have broken down.
  • Social institutions sometimes do their work simply by denying people access to information, but principally by directing how much weight and, therefore, value one must give to information. Social institutions are concerned with the meaning of information and can be quite rigorous in enforcing standards of admission.
  • It is what happens when a culture, overcome by information generated by technology, tries to employ technology itself as a means of providing clear direction and humane purpose. The effort is mostly doomed to failure
  • although legal theory has been taxed to the limit by new information from diverse sources (biology, psychology, and sociology, among them), the rules governing relevance have remained fairly stable. This may account for Americans' overuse of the courts as a means of finding coherence and stability. As other institutions become unusable as mechanisms for the control of wanton information, the courts stand as a final arbiter of truth.
  • the school as a mechanism for information control. What its standards are can usually be found in a curriculum or, with even more clarity, in a course catalogue. A college catalogue lists courses, subjects, and fields of study that, taken together, amount to a certified statement of what a serious student ought to think about.
  • The Republican Party represented the interests of the rich, who, by definition, had no concern for us.
  • More to the point, in what is omitted from a catalogue, we may learn what a serious student ought not to think about. A college catalogue, in other words, is a formal description of an information management program; it defines and categorizes knowledge, and in so doing systematically excludes, demeans, labels as trivial (in a word, disregards) certain kinds of information.
  • In the West, the family as an institution for the management of nonbiological information began with the ascendance of print. As books on every conceivable subject became available, parents were forced into the roles of guardians, protectors, nurturers, and arbiters of taste and rectitude. Their function was to define what it means to be a child by excluding from the family's domain information that would undermine its purpose.
  • all theories are oversimplifications, or at least lead to oversimplification. The rule of law is an oversimplification. A curriculum is an oversimplification. So is a family's conception of a child. That is the function of theories: to oversimplify, and thus to assist believers in organizing, weighting, and excluding information. Therein lies the power of theories.
  • That the family can no longer do this is, I believe, obvious to everyone.
  • Their weakness is that precisely because they oversimplify, they are vulnerable to attack by new information. When there is too much information to sustain any theory, information becomes essentially meaningless.
  • The political party is another.
  • As a young man growing up in a Democratic household, I was provided with clear instructions on what value to assign to political events and commentary.
  • The most imposing institutions for the control of information are religion and the state. They do their work in a somewhat more abstract way than do courts, schools, families, or political parties. They manage information through the creation of myths and stories that express theories about fundamental questions: why are we here, where have we come from, and where are we headed?
  • They followed logically from theory, which was, as I remember it, as follows: Because people need protection, they must align themselves with a political organization. The Democratic Party was entitled to our loyalty because it represented the social and economic interests of the working class, of which our family, relatives, and neighbors were members
  • the Bible also served as an information control mechanism, especially in the moral domain. The Bible gives manifold
  • any educational institution, if it is to function well in the management of information, must have a theory about its purpose and meaning, must have the means to give clear expression to its theory, and must do so, to a large extent, by excluding information.
  • instructions on what one must do and must not do, as well as guidance on what language to avoid (on pain of committing blasphemy), what ideas to avoid (on pain of committing heresy), what symbols to avoid (on pain of committing idolatry). Necessarily but perhaps unfortunately, the Bible also explained how the world came into being in such literal detail that it could not accommodate new information produced by the telescope and subsequent technologies.
  • in observing God's laws, and the detailed requirements of their enactment, believers receive guidance about what books they should not read, about what plays and films they should not see, about what music they should not hear, about what subjects their children should not study, and so on. For strict fundamentalists of the Bible, the theory and what follows from it seal them off from unwanted information, and in that way their actions are invested with meaning, clarity, and, they believe, moral authority.
  • Those who reject the Bible's theory and who believe, let us say, in the theory of Science are also protected from unwanted information. Their theory, for example, instructs them to disregard information about astrology, dianetics, and creationism, which they usually label as medieval superstition or subjective opinion.
  • Their theory fails to give any guidance about moral information and, by definition, gives little weight to information that falls outside the constraints of science. Undeniably, fewer and fewer people are bound in any serious way to Biblical or other religious traditions as a source of compelling attention and authority, the result of which is that they make no moral decisions, only practical ones. This is still another way of defining Technopoly. The term is aptly used for a culture whose available theories do not offer guidance about what is acceptable information in the moral domain.
  • thought-world that functions not only without a transcendent narrative to provide moral underpinnings but also without strong social institutions to control the flood of information produced by technology.
  • In the case of the United States, the great eighteenth-century revolution was not indifferent to commodity capitalism but was nonetheless infused with profound moral content. The United States was not merely an experiment in a new form of governance; it was the fulfillment of God's plan. True, Adams, Jefferson, and Paine rejected the supernatural elements in the Bible, but they never doubted that their experiment had the imprimatur of Providence. People were to be free but for a purpose. Their God-given rights implied obligations and responsibilities, not only to God but to other nations, to which the new republic would be a guide and a showcase of what is possible when reason and spirituality commingle.
  • American Technopoly must rely, to an obsessive extent, on technical methods to control the flow of information. Three such means merit special attention.
  • The first is bureaucracy, which James Beniger in The Control Revolution ranks as "foremost among all technological solutions to the crisis of control."
  • It is an open question whether or not "liberal democracy" in its present form can provide a thought-world of sufficient moral substance to sustain meaningful lives.
  • Vaclav Havel, then newly elected as president of Czechoslovakia, posed in an address to the U.S. Congress. "We still don't know how to put morality ahead of politics, science, and economics," he said. "We are still incapable of understanding that the only genuine backbone of our actions-if they are to be moral-is responsibility. Responsibility to something higher than my family, my country, my firm, my success." What Havel is saying is that it is not enough for his nation to liberate itself from one flawed theory; it is necessary to find another, and he worries that Technopoly provides no answer.
  • Francis Fukuyama is wrong. There is another ideological conflict to be fought: between "liberal democracy" as conceived in the eighteenth century, with all its transcendent moral underpinnings, and Technopoly, a twentieth-century
  • in attempting to make the most rational use of information, bureaucracy ignores all information and ideas that do not contribute to efficiency
  • bureaucracy has no intellectual, political, or moral theory, except for its implicit assumption that efficiency is the principal aim of all social institutions and that other goals are essentially less worthy, if not irrelevant. That is why John Stuart Mill thought bureaucracy a "tyranny" and C. S. Lewis identified it with Hell.
  • in principle a bureaucracy is simply a coordinated series of techniques for reducing the amount of information that requires processing.
  • The transformation of bureaucracy from a set of techniques designed to serve social institutions to an autonomous meta-institution that largely serves itself came as a result of several developments in the mid- and late-nineteenth century: rapid industrial growth, improvements in transportation and communication, the extension of government into ever-larger realms of public and business affairs, and the increasing centralization of governmental structures.
  • extent that the decision will affect the efficient operations of the bureaucracy, and takes no responsibility for its human consequences.
  • Along the way, it ceased to be merely a servant of social institutions and became their master. Bureaucracy now not only solves problems but creates them. More important, it defines what our problems are, and they are always, in the bureaucratic view, problems of efficiency.
  • expertise is a second important technical means by which Technopoly strives furiously to control information.
  • the expert in Technopoly has two characteristics that distinguish him or her from experts of the past. First, Technopoly's experts tend to be ignorant about any matter not directly related to their specialized area.
  • Technopoly's experts claim dominion not only over technical matters but also over social, psychological, and moral affairs.
  • "bureaucrat" has come to mean a person who by training, commitment, and even temperament is indifferent to both the content and the totality of a human problem. The bureaucrat considers the implications of a decision only to the
  • Technical machinery is essential to both the bureaucrat and the expert, and may be regarded as a third mechanism of information control.
  • I have in mind "softer" technologies such as IQ tests, SATs, standardized forms, taxonomies, and opinion polls. Some of these I discuss in detail in chapter eight, "Invisible Technologies," but I mention them here because their role in reducing the types and quantity of information admitted to a system often goes unnoticed, and therefore their role in redefining traditional concepts also goes unnoticed. There is, for example, no test that can measure a person's intelligence
  • The role of the expert is to concentrate on one field of knowledge, sift through all that is available, eliminate that which has no bearing on a problem, and use what is left to assist in solving a problem.
  • the expert relies on our believing in the reality of technical machinery, which means we will reify the answers generated by the machinery. We come to believe that our score is our intelligence, or our capacity for creativity or love or pain. We come to believe that the results of opinion polls are what people believe, as if our beliefs can be encapsulated in such sentences as "I approve" and "I disapprove."
  • it is disastrous when applied to situations that cannot be solved by technical means and where efficiency is usually irrelevant, such as in education, law, family life, and problems of personal maladjustment.
  • perceptions and judgment declines, bureaucracies, expertise, and technical machinery become the principal means by which Technopoly hopes to control information and thereby provide itself with intelligibility and order. The rest of this book tells the story of why this cannot work, and of the pain and stupidity that are the consequences.
  • Institutions can make decisions on the basis of scores and statistics, and there certainly may be occasions where there is no reasonable alternative. But unless such decisions are made with profound skepticism (that is, acknowledged as being made for administrative convenience), they are delusionary.
  • In Technopoly, the delusion is sanctified by our granting inordinate prestige to experts who are armed with sophisticated technical machinery. Shaw once remarked that all professions are conspiracies against the laity. I would go further: in Technopoly, all experts are invested with the charisma of priestliness
  • The god they serve does not speak of righteousness or goodness or mercy or grace. Their god speaks of efficiency, precision, objectivity. And that is why such concepts as sin and evil disappear in Technopoly. They come from a moral universe that is irrelevant to the theology of expertise. And so the priests of Technopoly call sin "social deviance," which is a statistical concept, and they call evil "psychopathology," which is a medical concept. Sin and evil disappear because they cannot be measured and objectified, and therefore cannot be dealt with by experts.
  • As the power of traditional social institutions to organize
Javier E

Data on inbred nobles support a leader-driven theory of history | The Economist - 0 views

  • a recent working paper by Nico Voigtländer and Sebastian Ottinger of the University of California at Los Angeles argues that leaders’ impact can indeed be isolated—thanks to the genomes of kings like Charles.
  • In theory, each round of inbreeding should have made monarchs slightly stupider—and thus worse at their jobs. This yields a natural experiment. Assuming that countries’ propensity for incest did not vary based on their political fortunes, the periods in which they had highly inbred (and probably dim-witted) leaders occurred at random intervals.
  • The authors analysed 331 European monarchs between 990 and 1800. They first calculated how inbred each ruler was, and then assessed countries’ success during their reigns using two measures: historians’ subjective scores, and the change in land area controlled by each monarch. The authors only compared each ruler against their own country’s historical averages.
  • The change in their land areas tended to be about 24 percentage points greater under their least inbred rulers than under their most inbred ones.
  • Sure enough, Spain’s tailspin under Charles was predictable. Countries tended to endure their darkest periods under their most inbred monarchs, and enjoy golden ages during the reigns of their most genetically diverse leaders.
  • the study’s finding—rulers who preside over setbacks tend to be relatively unintelligent—has timeless implications.
Javier E

Cognitive Biases and the Human Brain - The Atlantic - 1 views

  • Present bias shows up not just in experiments, of course, but in the real world. Especially in the United States, people egregiously undersave for retirement—even when they make enough money to not spend their whole paycheck on expenses, and even when they work for a company that will kick in additional funds to retirement plans when they contribute.
  • When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”)
  • If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view
  • Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
  • The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman
  • Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman
  • Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions.
  • In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.
  • “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”
  • that’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”
  • At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception
  • Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length.
  • Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves
  • Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative.
  • what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?
  • Over an apple pastry and tea with milk, he told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”
  • I met with Kahneman
  • “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.
  • In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments tantamount to the Müller-Lyer line illusion
  • he most effective check against them, as Kahneman says, is from the outside: Others can perceive our errors more readily than we can.
  • “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems,”
  • A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.
  • “My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition.
  • Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.
  • Kahneman describes an even earlier Nisbett article that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)
  • over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy.
  • Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.
  • When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
  • about half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable
  • When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high.
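The law-of-large-numbers answer to the baseball question can be checked with a quick simulation (a sketch I have added for illustration; it is not part of Nisbett's survey, and the .270 "true skill" figure is an assumption). Every simulated batter has identical ability, yet the best observed average after 30 at-bats dwarfs the best after a full season of 500:

```python
import random

random.seed(42)

TRUE_AVG = 0.270   # identical true skill for every simulated batter (assumed value)
N_BATTERS = 100

def best_observed_average(at_bats: int) -> float:
    """Return the highest batting average observed across all batters
    after a given number of at-bats."""
    best = 0.0
    for _ in range(N_BATTERS):
        hits = sum(random.random() < TRUE_AVG for _ in range(at_bats))
        best = max(best, hits / at_bats)
    return best

early = best_observed_average(30)    # early season: small sample, wild outliers
season = best_observed_average(500)  # full season: large sample, averages converge

print(f"best average after  30 at-bats: {early:.3f}")
print(f"best average after 500 at-bats: {season:.3f}")
```

Sampling noise alone manufactures early-season "phenoms," and as at-bats accumulate the outliers regress toward the shared .270 mean.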
  • we’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”
  • , “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”
  • Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”
  • Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.
  • The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.
  • When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads
  • It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases
  • I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”
  • Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases
  • For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”
  • The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”
  • Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits
  • One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics
  • In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (IARPA), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, IARPA initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).
  • most promising are a handful of video games. Their genesis was in the Iraq War
  • Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.
  • “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”
  • he said he saw the results as supporting the research and insights of Richard Nisbett. “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias,
  • even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”
  • Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.
  • In the future, I will monitor my thoughts and reactions as best I can
Javier E

The Data Vigilante - Christopher Shea - The Atlantic - 0 views

  • He is, on the contrary, seized by the conviction that science is beset by sloppy statistical maneuvering and, in some cases, outright fraud. He has therefore been moonlighting as a fraud-buster, developing techniques to help detect doctored data in other people’s research. Already, in the space of less than a year, he has blown up two colleagues’ careers.
  • In a paper called “False-Positive Psychology,” published in the prestigious journal Psychological Science, he and two colleagues—Leif Nelson, a professor at the University of California at Berkeley, and Wharton’s Joseph Simmons—showed that psychologists could all but guarantee an interesting research finding if they were creative enough with their statistics and procedures.
  • By going on what amounted to a fishing expedition (that is, by recording many, many variables but reporting only the results that came out to their liking); by failing to establish in advance the number of human subjects in an experiment; and by analyzing the data as they went, so they could end the experiment when the results suited them, they produced a howler of a result, a truly absurd finding. They then ran a series of computer simulations using other experimental data to show that these methods could increase the odds of a false-positive result—a statistical fluke, basically—to nearly two-thirds.
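The optional-stopping trick in particular is easy to demonstrate. Below is a minimal sketch (my own illustration, not the authors' code; the batch size and stopping rule are assumptions). Both groups are drawn from the same distribution, so any "effect" is a fluke, yet peeking at the data every ten subjects and stopping at the first significant z-test inflates the false-positive rate well above the nominal 5 percent:

```python
import random
from statistics import NormalDist, mean, stdev

random.seed(1)
Z_CRIT = NormalDist().inv_cdf(0.975)  # two-sided 5% threshold, about 1.96

def peeking_experiment(max_n: int = 100, batch: int = 10) -> bool:
    """Add `batch` subjects per group at a time and stop as soon as a
    two-sample z-test looks 'significant'. Both groups are pure noise,
    so every True returned is a false positive."""
    a, b = [], []
    while len(a) < max_n:
        a += [random.gauss(0, 1) for _ in range(batch)]
        b += [random.gauss(0, 1) for _ in range(batch)]
        n = len(a)
        se = ((stdev(a) ** 2 + stdev(b) ** 2) / n) ** 0.5
        if abs(mean(a) - mean(b)) / se > Z_CRIT:
            return True  # "found" an effect that does not exist
    return False

runs = 2000
false_positives = sum(peeking_experiment() for _ in range(runs)) / runs
print(f"false-positive rate with peeking: {false_positives:.0%}")
```

A fixed sample size tested once would come in near 5 percent; peeking alone multiplies that severalfold, and stacking on further flexible choices (many dependent variables, selective reporting) is what pushed the authors' simulated rate toward two-thirds.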
  • “I couldn’t tolerate knowing something was fake and not doing something about it,” he told me. “Everything loses meaning. What’s the point of writing a paper, fighting very hard to get it published, going to conferences?”
  • Simonsohn stressed that there’s a world of difference between data techniques that generate false positives, and fraud, but he said some academic psychologists have, until recently, been dangerously indifferent to both. Outright fraud is probably rare. Data manipulation is undoubtedly more common—and surely extends to other subjects dependent on statistical study, including biomedicine. Worse, sloppy statistics are “like steroids in baseball”: Throughout the affected fields, researchers who are too intellectually honest to use these tricks will publish less, and may perish. Meanwhile, the less fastidious flourish.
Javier E

The Future of Sex - The European - 1 views

  • Consider the most likely scenario for how human sexual behavior will develop over the next hundred years or so in the absence of cataclysm. Here’s what I see if we continue on our current path:
  • Like every other aspect of human life, our sexuality will become increasingly mediated by technology. The technology of pornography will become ever more sophisticated—even if the subject matter of porn itself will remain as primal as ever.
  • As the technology improves, society continues to grow ever more fragmented, and hundreds of millions of Chinese men with no hope of marrying a bona-fide, flesh-and-blood woman come of age, sex robots will become as common and acceptable as dildos and vibrators are today. After all, the safest sex is that which involves no other living things…
  • As our sexuality becomes ever more divorced from emotion and intimacy, a process already well underway, sex will increasingly be seen as simply a matter of provoking orgasm in the most efficient, reliable ways possible.
  • Human sexuality will continue to be subjected to the same commodification and mechanization as other aspects of our lives. Just as the 21st century saw friends replaced by Facebook friends, nature replaced by parks, ocean fisheries replaced by commercially farmed seafood, and sunshine largely supplanted by tanning salons, we’ll see sexual interaction reduced to mechanically provoked orgasm as human beings become ever more dominated by the machines and mechanistic thought processes that developed in our brains and societies like bacteria in a petri dish.
  • Gender identity will fade away as sexual interaction becomes less “human” and we grow less dependent upon binary interactions with other people. As more and more of our interactions take place with non-human partners, others’ expectations and judgments will become less relevant to the development of sexual identity, leading to greater fluidity and far less urgency and passion concerning sexual expression.
  • the collapse of western civilization may well be the best thing that could happen for human sexuality. Following the collapse of the consumerist, competitive mind-set that now dominates so much of human thought, we’d possibly be free to rebuild a social world more in keeping with our preagricultural origins, characterized by economies built upon sharing rather than hoarding, a politics of respect rather than of power, and a sexuality of intimacy rather than alienation.
Javier E

The Faulty Logic of the 'Math Wars' - NYTimes.com - 0 views

  • The American philosopher Wilfrid Sellars was challenging this assumption when he spoke of “material inferences.” Sellars was interested in inferences that we can only recognize as valid if we possess certain bits of factual knowledge.
  • That the use of standard algorithms isn’t merely mechanical is not by itself a reason to teach them. It is important to teach them because, as we already noted, they are also the most elegant and powerful methods for specific operations. This means that they are our best representations of connections among mathematical concepts. Math instruction that does not teach both that these algorithms work and why they do is denying students insight into the very discipline it is supposed to be about.
  • according to Wittgenstein, is why it is wrong to understand algorithm-based calculations as expressions of nothing more than “mental mechanisms.” Far from being genuinely mechanical, such calculations involve a distinctive kind of thought.
  • If we make room for such material inferences, we will be inclined to reject the view that individuals can reason well without any substantial knowledge of, say, the natural world and human affairs. We will also be inclined to regard the specifically factual content of subjects such as biology and history as integral to a progressive education.
  • There is a moral here for progressive education that reaches beyond the case of math. Even if we sympathize with progressivists in wanting schools to foster independence of mind, we shouldn’t assume that it is obvious how best to do this. Original thought ranges over many different domains, and it imposes divergent demands as it does so. Just as there is good reason to believe that in biology and history such thought requires significant factual knowledge, there is good reason to believe that in mathematics it requires understanding of and facility with the standard algorithms.
  • there is also good reason to believe that when we examine further areas of discourse we will come across yet further complexities. The upshot is that it would be naïve to assume that we can somehow promote original thinking in specific areas simply by calling for subject-related creative reasoning
Emily Horwitz

News from The Associated Press - 0 views

  • If you saw the film "Argo," no, you didn't miss this development, which is recounted in Mendez's book about the real-life operation. It wasn't there because director Ben Affleck and screenwriter Chris Terrio replaced it with an even more dramatic scenario, involving canceled flight reservations, suspicious Iranian officials who call the Hollywood office of the fake film crew (a call answered just in time), and finally a heart-pounding chase on the tarmac just as the plane's wheels lift off, seconds from catastrophe.
  • they've caught some flak for the liberties they took in the name of entertainment.
  • And they aren't alone - two other high-profile best-picture nominees this year, Kathryn Bigelow's "Zero Dark Thirty" and Steven Spielberg's "Lincoln," have also been criticized for different sorts of factual issues.
  • But because these three major films are in contention, the issue has come to the forefront of this year's Oscar race, and with it a thorny cultural question: Does the audience deserve the truth, the whole truth and nothing but? Surely not, but just how much fiction is OK?
  • In response to a complaint by a Connecticut congressman, Kushner acknowledged he'd changed the details for dramatic effect, having two Connecticut congressmen vote against the amendment when, in fact, all four voted for it. (The names of those congressmen were changed, to avoid changing the vote of specific individuals.)
  • Kushner said he had "adhered to time-honored and completely legitimate standards for the creation of historical drama, which is what `Lincoln' is. I hope nobody is shocked to learn that I also made up dialogue and imagined encounters and invented characters."
  • "Maybe changing the vote went too far," says Richard Walter, chairman of screenwriting at the University of California, Los Angeles. "Maybe there was another way to do it. But really, it's not terribly important. People accept that liberties will be taken. A movie is a movie. People going for a history lesson are going to the wrong place."
  • Walter says he always tells his students: "Go for the feelings. Because the only thing that's truly real in the movies are the feelings that people feel when they watch."
  • No subject or individual's life is compelling and dramatic enough by itself, he says, that it neatly fits into a script with three acts, subplots, plot twists and a powerful villain.
  • Reeves, who actually gave the "Lincoln" script a negative review because he thought it was too heavy on conversation and lacking action. He adds, though, that when the subject is as famous as Lincoln, one has a responsibility to be more faithful to the facts.
  • "This is fraught territory," he says. "You're always going to have to change something, and you're always going to get in some sort of trouble, with somebody," he says.
  • Futterman also doesn't begrudge the "Argo" filmmakers, because he feels they use a directorial style that implies some fun is being had with the story. "All the inside joking about Hollywood - tonally, you get a sense that something is being played with," he says.
  • Futterman says he was sympathetic to those concerns and would certainly have addressed them in the script, had he anticipated them.
  • Of the three Oscar-nominated films in question, "Zero Dark Thirty" has inspired the most fervent debate. The most intense criticism, despite acclaim for the filmmaking craft involved, has been about its depictions of interrogations, with some, including a group of senators, saying the film misleads viewers by suggesting that torture provided information that helped the CIA find Osama bin Laden.
  • have been questions about the accuracy of the depiction of the main character, a CIA officer played by Jessica Chastain; the real person - or even combination of people, according to some theories - that she plays remains anonymous.
  • screenwriters have a double responsibility: to the material and to the audience.
  • The debate over "Argo" has been much less intense, though there has been some grumbling from former officials in Britain and New Zealand that their countries were portrayed incorrectly in the film as offering no help at all to the six Americans, whereas actually, as Mendez writes, they did provide some help.
  • "When I am hungry and crave a tuna fish sandwich, I don't go to a hardware store," he says. "When I seek a history lesson, I do not go to a movie theater. I loved `Argo' even though I know there was no last-minute turn-around via a phone call from President Carter, nor were there Iranian police cars chasing the plane down the tarmac as it took off. So what? These conceits simply make the movie more exciting."
  • This article reaffirmed my feelings that we can't trust everything that we see or hear through the media, because it is often skewed to better captivate the target audience. As the article stated, there appears to be a fine line between catering to the attention span of the audience and respecting the known facts of a given event portrayed by a movie.
Javier E

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 0 views

  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • neuroscience for the last couple hundred years has been on the wrong track. There's a fairly recent book by a very good cognitive neuroscientist, Randy Gallistel and King, arguing -- in my view, plausibly -- that neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
  • in general what he argues is that if you take a look at animal cognition, human too, it's computational systems. Therefore, you want to look at the units of computation. Think about a Turing machine, say, which is the simplest form of computation, you have to find units that have properties like "read", "write" and "address." That's the minimal computational unit, so you got to look in the brain for those. You're never going to find them if you look for strengthening of synaptic connections or field properties, and so on. You've got to start by looking for what's there and what's working and you see that from Marr's highest level.
  • it's basically in the spirit of Marr's analysis. So when you're studying vision, he argues, you first ask what kind of computational tasks is the visual system carrying out. And then you look for an algorithm that might carry out those computations and finally you search for mechanisms of the kind that would make the algorithm work. Otherwise, you may never find anything.
  • AI and robotics got to the point where you could actually do things that were useful, so it turned to the practical applications and somewhat, maybe not abandoned, but put to the side, the more fundamental scientific questions, just caught up in the success of the technology and achieving specific goals.
  • "Good Old Fashioned AI," as it's labeled now, made strong use of formalisms in the tradition of Gottlob Frege and Bertrand Russell, mathematical logic for example, or derivatives of it, like nonmonotonic reasoning and so on. It's interesting from a history of science perspective that even very recently, these approaches have been almost wiped out from the mainstream and have been largely replaced -- in the field that calls itself AI now -- by probabilistic and statistical models. My question is, what do you think explains that shift and is it a step in the right direction?
  • The approximating unanalyzed data kind is sort of a new approach, not totally, there's things like it in the past. It's basically a new approach that has been accelerated by the existence of massive memories, very rapid processing, which enables you to do things like this that you couldn't have done by hand. But I think, myself, that it is leading subjects like computational cognitive science into a direction of maybe some practical applicability... in engineering? Chomsky: ...But away from understanding.
  • I was very skeptical about the original work. I thought it was first of all way too optimistic, it was assuming you could achieve things that required real understanding of systems that were barely understood, and you just can't get to that understanding by throwing a complicated machine at it.
  • if success is defined as getting a fair approximation to a mass of chaotic unanalyzed data, then it's way better to do it this way than to do it the way the physicists do, you know, no thought experiments about frictionless planes and so on and so forth. But you won't get the kind of understanding that the sciences have always been aimed at -- what you'll get at is an approximation to what's happening.
  • Suppose you want to predict tomorrow's weather. One way to do it is okay I'll get my statistical priors, if you like, there's a high probability that tomorrow's weather here will be the same as it was yesterday in Cleveland, so I'll stick that in, and where the sun is will have some effect, so I'll stick that in, and you get a bunch of assumptions like that, you run the experiment, you look at it over and over again, you correct it by Bayesian methods, you get better priors. You get a pretty good approximation of what tomorrow's weather is going to be. That's not what meteorologists do -- they want to understand how it's working. And these are just two different concepts of what success means, of what achievement is.
  • take a concrete example of a new field in neuroscience, called Connectomics, where the goal is to find the wiring diagram of very complex organisms, find the connectivity of all the neurons in say human cerebral cortex, or mouse cortex. This approach was criticized by Sidney Brenner, who in many ways is [historically] one of the originators of the approach. Advocates of this field don't stop to ask if the wiring diagram is the right level of abstraction -- maybe it's not.
  • the right approach is to try to see if you can understand what the fundamental principles are that deal with the core properties, and recognize that in the actual usage, there's going to be a thousand other variables intervening -- kind of like what's happening outside the window, and you'll sort of tack those on later on if you want better approximations, that's a different approach.
  • if you get more and more data, and better and better statistics, you can get a better and better approximation to some immense corpus of text, like everything in The Wall Street Journal archives -- but you learn nothing about the language.
  • if you went to MIT in the 1960s, or now, it's completely different. No matter what engineering field you're in, you learn the same basic science and mathematics. And then maybe you learn a little bit about how to apply it. But that's a very different approach. And it resulted maybe from the fact that really for the first time in history, the basic sciences, like physics, had something really to tell engineers. And besides, technologies began to change very fast, so not very much point in learning the technologies of today if it's going to be different 10 years from now. So you have to learn the fundamental science that's going to be applicable to whatever comes along next. And the same thing pretty much happened in medicine.
  • that's the kind of transition from something like an art, that you learn how to practice -- an analog would be trying to match some data that you don't understand, in some fashion, maybe building something that will work -- to science, what happened in the modern period, roughly Galilean science.
  • it turns out that there actually are neural circuits which are reacting to particular kinds of rhythm, which happen to show up in language, like syllable length and so on. And there's some evidence that that's one of the first things that the infant brain is seeking -- rhythmic structures. And going back to Gallistel and Marr, it's got some computational system inside which is saying "okay, here's what I do with these things" and say, by nine months, the typical infant has rejected -- eliminated from its repertoire -- the phonetic distinctions that aren't used in its own language.
  • people like Shimon Ullman discovered some pretty remarkable things like the rigidity principle. You're not going to find that by statistical analysis of data. But he did find it by carefully designed experiments. Then you look for the neurophysiology, and see if you can find something there that carries out these computations. I think it's the same in language, the same in studying our arithmetical capacity, planning, almost anything you look at. Just trying to deal with the unanalyzed chaotic data is unlikely to get you anywhere, just like as it wouldn't have gotten Galileo anywhere.
  • with regard to cognitive science, we're kind of pre-Galilean, just beginning to open up the subject
  • You can invent a world -- I don't think it's our world -- but you can invent a world in which nothing happens except random changes in objects and selection on the basis of external forces. I don't think that's the way our world works, I don't think it's the way any biologist thinks it is. There are all kind of ways in which natural law imposes channels within which selection can take place, and some things can happen and other things don't happen. Plenty of things that go on in the biology in organisms aren't like this. So take the first step, meiosis. Why do cells split into spheres and not cubes? It's not random mutation and natural selection; it's a law of physics. There's no reason to think that laws of physics stop there, they work all the way through. Well, they constrain the biology, sure. Chomsky: Okay, well then it's not just random mutation and selection. It's random mutation, selection, and everything that matters, like laws of physics.
  • What I think is valuable is the history of science. I think we learn a lot of things from the history of science that can be very valuable to the emerging sciences. Particularly when we realize that in say, the emerging cognitive sciences, we really are in a kind of pre-Galilean stage. We don't know what we're looking for any more than Galileo did, and there's a lot to learn from that.
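Chomsky's corpus-approximation point can be made concrete with the simplest statistical language model. A hypothetical toy sketch (the corpus and names are invented for illustration): a bigram chain fitted to a few sentences will generate increasingly corpus-like strings as data grows, while representing nothing about grammar — the "approximation without understanding" he describes.

```python
import random
from collections import Counter, defaultdict

# A toy training corpus; a real model would use millions of words.
corpus = ("the dog chased the cat . the cat chased the mouse . "
          "the mouse feared the cat .").split()

# Count bigram transitions: P(next | current) estimated by raw frequency.
follows = defaultdict(Counter)
for w, nxt in zip(corpus, corpus[1:]):
    follows[w][nxt] += 1

def generate(start="the", length=8, rng=random.Random(0)):
    """Sample a word sequence from the bigram chain."""
    out = [start]
    for _ in range(length - 1):
        nxts = follows[out[-1]]
        out.append(rng.choices(list(nxts), weights=list(nxts.values()))[0])
    return " ".join(out)

print(generate())
```

The output reads superficially like the training text, yet the model is nothing but transition counts; it contains no representation of syntax, and more data only sharpens the surface approximation.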