
TOK Friends: Group items tagged "selection"


katherineharron

FBI arrests spotlight lessons learned after Charlottesville (opinion) - CNN - 0 views

  • On Thursday, the FBI arrested three men, Patrik J. Mathews, 27, Brian M. Lemley Jr., 33, and William G. Bilbrough IV, 19, on firearms charges; the men, an official said, had planned to attend a Virginia pro-gun rally. This followed Virginia Gov. Ralph Northam's declaration of a temporary state of emergency after authorities learned that extremists hoped to use the anti-gun-control rally planned for next Monday -- Martin Luther King, Jr. Day -- to incite a violent clash.
  • These arrests add to mounting evidence that a decades-old and violent white-power movement is alive and well, perhaps even gaining strength. White power is a social movement that has united neo-Nazis, Klansmen, skinheads, and militiamen around a shared fear of racial annihilation and cultural change. Since 1983, when movement leaders declared war on the federal government, members of such groups have worked together to bring about a race war.
  • ...4 more annotations...
  • Silver linings aside, it will take many, many more instances of coordinated response to stop a movement generations in the making. In more than a decade of studying the earlier white power movement, I have become familiar with the themes of underground activity that are today clearly drawing from the earlier movement. In the absence of decisive action across multiple institutions, a rich record of criminal activity and violence will continue to provide these activists with a playbook for further chaos.
  • The Base, furthermore, is what experts call "accelerationist," meaning that its members hope to provoke what they see as an inevitable race war. They have conducted paramilitary training in the Pacific Northwest. Both of these strategies date back to the 1980s, when the Order trained in those forests with hopes of provoking the same race war.
  • One of the men arrested Thursday was formerly a reservist in the Canadian Army, where he received training in explosives and demolition, according to the New York Times. This kind of preparation, too, is common among extremists like these. To take just a few representative examples, in the 1960s, Bobby Frank Cherry, a former Marine trained in demolition, helped fellow members of the United Klans of America to bomb the 16th Street Birmingham Baptist Church, killing four black girls.
  • This news out of Virginia shows that there is a real social benefit when people direct their attention to these events -- and sustain the public conversation about the presence of a renewed white-power movement and what it means for our society.
huffem4

How to Use Critical Thinking to Separate Fact From Fiction Online | by Simon Spichak | ... - 2 views

  • Critical thinking helps us frame everyday problems, teaches us to ask the correct questions, and points us towards intelligent solutions.
  • Critical thinking is a continuing practice that involves an open mind and methods for synthesizing and evaluating the quality of knowledge and evidence, as well as an understanding of human errors.
  • Step 1. What We Believe Depends on How We Feel
  • ...33 more annotations...
  • One of the first things I ask myself when I read a headline or find a claim about a product is if the phrase is emotionally neutral. Some headlines generate outrage or fear, indicating that there is a clear bias. When we read something that exploits our emotions, we must be careful.
  • misinformation tends to play on our emotions a lot better than factual reporting or news.
  • When I’m trying to figure out whether a claim is factual, there are a few questions I always ask myself: Does the headline, article, or information evoke fear, anger, or other strong negative emotions? Where did you hear about the information? Does it cite any direct evidence? What is the expert consensus on this information?
  • Step 2. Evidence Synthesis and Evaluation. Sometimes I’m still uncertain whether there’s any truth to a claim. Even after taking into account the emotions it evokes, I need to find the evidence for the claim and evaluate its quality.
  • Often, the information that I want to check is either political or scientific. There are different questions I ask myself, depending on the nature of these claims.
  • Political claims
  • Looking at multiple different outlets, each with its own unique biases, helps us get a picture of the issue.
  • I use multiple websites specializing in fact-checking. They provide primary sources of evidence for different types of claims. Here is a list of websites where I do my fact-checking:
  • Snopes, Politifact, FactCheck, and Media Bias/Fact Check (a bias assessor for fact-checking websites). Simply type in some keywords from the claim to find out if it’s verified with primary sources, misleading, false, or unproven.
  • Science claims
  • Often we tout science as the process by which we uncover absolute truths about the universe. Once many scientists agree on something, it gets disseminated in the news. Confusion arises once this science changes or evolves, as happened throughout the coronavirus pandemic. In addition to fear and misinformation, we have to address a fundamental misunderstanding of the way science works when practicing critical thinking.
  • It is confusing to hear about certain drugs found to cure the coronavirus one moment, followed by many other scientists and researchers saying that they don’t. How do we collect and assess these scientific claims when there are discrepancies?
  • A big part of these scientific findings is difficult to access for the public
  • Sometimes the distinction between scientific coverage and scientific articles isn’t clear. When this difference is clear, we might still find findings in different academic journals that disagree with each other. Sometimes, research that isn’t peer-reviewed receives plenty of coverage in the media
  • Correlation and causation: Sometimes a claim might present two factors that appear correlated. Consider recent misinformation about 5G Towers and the spread of coronavirus. While there might appear to be associations, it doesn’t necessarily mean that there is a causative relationship
  • To practice critical thinking with these kinds of claims, we must ask the following questions: Does this claim emerge from a peer-reviewed scientific article? Has this paper been retracted? Does this article appear in a reputable journal? What is the expert consensus on this article?
  • The next examples I want to bring up refer to retracted articles from peer-reviewed journals. Since science is a self-correcting process, rather than a decree of absolutes, mistakes and fraud are corrected.
  • Briefly, I will show you exactly how to tell if the resource you are reading is an actual, peer-reviewed scientific article.
  • How does science go from experiments to the news?
  • researchers outline exactly how they conducted their experiments so other researchers can replicate them, build upon them, or provide quality assurance for them. This scientific report does not go straight to the nearest science journalist. Websites and news outlets like Scientific American or The Atlantic do not publish scientific articles.
  • Here is a quick checklist that will help you figure out if you’re viewing a scientific paper.
  • Once it’s written up, researchers send this manuscript to a journal. Other experts in the field then provide comments, feedback, and critiques. These peer reviewers ask researchers for clarification or even more experiments to strengthen their results. Peer review often takes months or sometimes years.
  • Some peer-reviewed scientific journals are Science and Nature; other scientific articles are searchable through the PubMed database. If you’re curious about a topic, search for scientific papers.
  • Peer-review is crucial! If you’re assessing the quality of evidence for claims, peer-reviewed research is a strong indicator
  • Finally, there are platforms for scientists to review research even after publication in a peer-reviewed journal. Although most scientists conduct experiments and interpret their data objectively, they may still make errors. Many scientists use Twitter and PubPeer to perform a post-publication review
  • Step 3. Are You Practicing Objectivity?
  • To finish off, I want to discuss common cognitive errors that we tend to make. Finally, there are some framing questions to ask at the end of our research to help us with assessing any information that we find.
  • Dunning-Kruger effect: Why do we rely on experts? In 1999, David Dunning and Justin Kruger published “Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments.” They found that the less a person understands about a topic, the more confident of their abilities or knowledge they will be
  • How does this relate to critical thinking? If you’re reading a claim sourced or written by somebody who lacks expertise in a field, they are underestimating its complexity. Whenever possible, look for an authoritative source when synthesizing and evaluating evidence for a claim.
  • Survivorship bias: Ever heard someone argue that we don’t need vaccines or seatbelts? After all, they grew up without either of them and are still alive and healthy! These arguments are appealing at first, but they don’t account for any cases of failure. They project a misplaced sense of optimism and safety by ignoring the deaths that resulted from a lack of vaccinations and seatbelts.
  • When you’re still unsure, follow the consensus of the experts within the field. Scientists pointed out flaws within this pre-print article leading to its retraction. The pre-print was removed from the server because it did not hold up to proper scientific standards or scrutiny.
  • Now, with all the evidence we’ve gathered, we ask ourselves some final questions. There are plenty more questions you will come up with yourself, case by case: Who is making the original claim? Who supports these claims? What are their qualifications? What is the evidence used for these claims? Where is this evidence published? How was the evidence gathered? Why is it important?
  • “even if some data is supporting a claim, does it make sense?” Some claims are deceptively true but fall apart when accounting for this bias.
carolinewren

Researchers at Brown University shattered an electron wave function | Motherboard - 1 views

  • When we say some element of the quantum world occupies many states at once, what’s really being referred to is the element’s wave function. A wave function can be viewed as a space occupied simultaneously by many different possibilities or degrees of freedom.
  • Even what we’d normally (deterministically) consider empty space has a wave function and, as such, contains very real possibilities of not being empty.
  • Visually, we might imagine a particle in its undisturbed state looking more like a cloud than a point in space.
  • ...15 more annotations...
  • a bunch of particles can share these states at the same time, effectively becoming instances of the same particle. And so: entanglement.
  • possible to strip away all of this indeterminateness
  • wave functions are very fragile, subject to a “collapse” in which all of those possibilities become just a single particle at a single point at a single time.
  • physicists have observed a very peculiar behavior of electrons in supercooled baths of helium. When an electron enters the bath, it acts to
  • two probabilities can be isolated from each other, cordoned off like quantum crime scenes
  • it’s possible to take a wave function and isolate it into different parts. So, if our electron has some probability of being in position (x1,y1,z1) and another probability of being in position (x2,y2,z2), those two probabilities can be isolated from each other, cordoned off like quantum crime scenes
  • when a macroscopic human attempts to measure a quantum mechanical system: The wave drops away and all that’s left is a boring, well-defined thing.
  • trapping the chance of finding the electron, not pieces of the electron
  • using tiny bubbles of helium as physical "traps."
  • repel the surrounding helium atoms, forming its own little bubble or cavity in the process.
  • That an electron (or other particle) can be in many places at the same time is strange enough, but the notion that those possibilities can be captured and shuttled away adds a whole new twist.
  • wave function isn’t a physical thing. It’s mathematics that describe a phenomenon.
  • The electron, upon measurement, will be in precisely one bubble.
  • “No one is sure what actually constitutes a measurement,”
  • Is consciousness required? We don’t really know.”
Javier E

How to Remember Everything You Want From Non-Fiction Books | by Eva Keiffenheim, MSc | ... - 0 views

  • A Bachelor’s degree taught me how to learn to ace exams. But it didn’t teach me how to learn to remember.
  • 65% to 80% of students answered “no” to the question “Do you study the way you do because somebody taught you to study that way?”
  • the most-popular Coursera course of all time: Dr. Barbara Oakley’s free course on “Learning how to Learn.” So did I. And while this course taught me about chunking, recalling, and interleaving
  • ...66 more annotations...
  • I learned something more useful: the existence of non-fiction literature that can teach you anything.
  • something felt odd. Whenever a conversation revolved around a serious non-fiction book I read, such as ‘Sapiens’ or ‘Thinking Fast and Slow,’ I could never remember much. Turns out, I hadn’t absorbed as much information as I’d believed. Since I couldn’t remember much, I felt as though reading wasn’t an investment in knowledge but mere entertainment.
  • When I opened up about my struggles, many others confessed they also can’t remember most of what they read, as if forgetting is a character flaw. But it isn’t.
  • It’s the way we work with books that’s flawed.
  • there’s a better way to read. Most people rely on techniques like highlighting, rereading, or, worst of all, completely passive reading, which are highly ineffective.
  • Since I started applying evidence-based learning strategies to reading non-fiction books, many things have changed. I can explain complex ideas during dinner conversations. I can recall interesting concepts and link them in my writing or podcasts. As a result, people come to me for all kinds of advice.
  • What’s the Architecture of Human Learning and Memory?
  • Human brains don’t work like recording devices. We don’t absorb information and knowledge by reading sentences.
  • we store new information in terms of its meaning to our existing memory
  • we give new information meaning by actively participating in the learning process — we interpret, connect, interrelate, or elaborate
  • To remember new information, we not only need to know it but also to know how it relates to what we already know.
  • Learning is dependent on memory processes because previously-stored knowledge functions as a framework in which newly learned information can be linked.”
  • Human memory works in three stages: acquisition, retention, and retrieval. In the acquisition phase, we link new information to existing knowledge; in the retention phase, we store it, and in the retrieval phase, we get information out of our memory.
  • Retrieval, the third stage, is cue dependent. This means the more mental links you’re generating during stage one, the acquisition phase, the easier you can access and use your knowledge.
  • we need to understand that the three phases interrelate
  • creating durable and flexible access to to-be-learned information is partly a matter of achieving a meaningful encoding of that information and partly a matter of exercising the retrieval process.”
  • Next, we’ll look at the learning strategies that work best for our brains (elaboration, retrieval, spaced repetition, interleaving, self-testing) and see how we can apply those insights to reading non-fiction books.
  • The strategies that follow are rooted in research from professors of Psychological & Brain Science around Henry Roediger and Mark McDaniel. Both scientists spent ten years bridging the gap between cognitive psychology and education fields. Harvard University Press published their findings in the book ‘Make It Stick.
  • #1 Elaboration
  • “Elaboration is the process of giving new material meaning by expressing it in your own words and connecting it with what you already know.”
  • Why elaboration works: Elaborative rehearsal encodes information into your long-term memory more effectively. The more details and the stronger you connect new knowledge to what you already know, the better because you’ll be generating more cues. And the more cues they have, the easier you can retrieve your knowledge.
  • How I apply elaboration: Whenever I read an interesting section, I pause and ask myself about the real-life connection and potential application. The process is invisible, and my inner monologues sound like: “This idea reminds me of…, This insight conflicts with…, I don’t really understand how…, ” etc.
  • For example, when I learned about A/B testing in ‘The Lean Startup,’ I thought about applying this method to my startup. I added a note on the site stating we should try it in user testing next Wednesday. Thereby the book had an immediate application benefit to my life, and I will always remember how the methodology works.
  • How you can apply elaboration: Elaborate while you read by asking yourself meta-learning questions like “How does this relate to my life? In which situation will I make use of this knowledge? How does it relate to other insights I have on the topic?”
  • While pausing and asking yourself these questions, you’re generating important memory cues. If you take some notes, don’t transcribe the author’s words but try to summarize, synthesize, and analyze.
  • #2 Retrieval
  • With retrieval, you try to recall something you’ve learned in the past from your memory. While retrieval practice can take many forms — take a test, write an essay, do a multiple-choice test, practice with flashcards
  • the authors of ‘Make It Stick’ state: “While any kind of retrieval practice generally benefits learning, the implication seems to be that where more cognitive effort is required for retrieval, greater retention results.”
  • Whatever you settle for, be careful not to copy/paste the words from the author. If you don’t do the brain work yourself, you’ll skip the learning benefits of retrieval
  • Retrieval strengthens your memory and interrupts forgetting; as other researchers have replicated, as a learning event, the act of retrieving information is considerably more potent than is an additional study opportunity, particularly in terms of facilitating long-term recall.
  • How I apply retrieval: I retrieve a book’s content from my memory by writing a book summary for every book I want to remember. I ask myself questions like: “How would you summarize the book in three sentences? Which concepts do you want to keep in mind or apply? How does the book relate to what you already know?”
  • I then publish my summaries on Goodreads or write an article about my favorite insights
  • How you can apply retrieval: You can come up with your own questions or use mine. If you don’t want to publish your summaries in public, you can write a summary into your journal, start a book club, create a private blog, or initiate a WhatsApp group for sharing book summaries.
  • a few days after we learn something, forgetting sets in
  • #3 Spaced Repetition
  • With spaced repetition, you repeat the same piece of information across increasing intervals.
  • The harder it feels to recall the information, the stronger the learning effect. “Spaced practice, which allows some forgetting to occur between sessions, strengthens both the learning and the cues and routes for fast retrieval,”
  • Why it works: It might sound counterintuitive, but forgetting is essential for learning. Spacing out practice might feel less productive than rereading a text because you’ll realize what you forgot. Your brain has to work harder to retrieve your knowledge, which is a good indicator of effective learning.
  • How I apply spaced repetition: After some weeks, I revisit a book and look at the summary questions (see #2). I try to come up with my answer before I look up my actual summary. I can often only remember a fraction of what I wrote and have to look at the rest.
  • “Knowledge trapped in books neatly stacked is meaningless and powerless until applied for the betterment of life.”
  • How you can apply spaced repetition: You can revisit your book summary medium of choice and test yourself on what you remember. What were your action points from the book? Have you applied them? If not, what hindered you?
  • By testing yourself in varying intervals on your book summaries, you’ll strengthen both learning and cues for fast retrieval.
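The expanding-interval idea above can be sketched as a tiny scheduler (the doubling rule and the 60-day cap are assumptions for illustration, not rules from the article): each successful recall lengthens the gap before the next review, while a failed recall resets it.

```python
def next_review(last_interval_days: int, recalled: bool) -> int:
    """Return the next review interval in days.

    A successful recall doubles the interval (capped at 60 days);
    a failed recall resets it to 1 day so the item comes back soon.
    """
    if not recalled:
        return 1
    return min(last_interval_days * 2, 60)

# Example: testing yourself on a book summary, starting 1 day after reading.
interval = 1
schedule = []
for recalled in [True, True, True, False, True]:
    interval = next_review(interval, recalled)
    schedule.append(interval)

print(schedule)  # [2, 4, 8, 1, 2]
```

The reset on failure is the point: the reviews concentrate exactly where recall is hardest, which is where the learning effect is strongest.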
  • Why interleaving works: Alternating between different problems feels more difficult because it, again, facilitates forgetting.
  • How I apply interleaving: I read different books at the same time.
  • 1) Highlight everything you want to remember
  • #5 Self-Testing
  • While reading often tricks us into a false sense of mastery, testing shows us whether we truly mastered the subject at hand. Self-testing helps you identify knowledge gaps and brings weak areas to light
  • “It’s better to solve a problem than to memorize a solution.”
  • Why it works: Self-testing helps you overcome the illusion of knowledge. “One of the best habits a learner can instill in herself is regular self-quizzing to recalibrate her understanding of what she does and does not know.”
  • How I apply self-testing: I explain the key lessons from non-fiction books I want to remember to others. Thereby, I test whether I really got the concept. Often, I didn’t
  • instead of feeling frustrated, I came to realize through cognitive science that identifying knowledge gaps is a desirable and necessary effect for long-term remembering.
  • How you can apply self-testing: Teaching your lessons learned from a non-fiction book is a great way to test yourself. Before you explain a topic to somebody, you have to combine several mental tasks: filter relevant information, organize this information, and articulate it using your own vocabulary.
  • Now that I’ve discovered how to use my Kindle as a learning device, I wouldn’t trade it for a paper book anymore. Here are the four steps it takes to enrich your e-reading experience
  • How you can apply interleaving: Your brain can handle reading different books simultaneously, and it’s effective to do so. You can start a new book before you finish the one you’re reading. Starting again into a topic you partly forgot feels difficult first, but as you know by now, that’s the effect you want to achieve.
  • it won’t surprise you that researchers proved highlighting to be ineffective. It’s passive and doesn’t create memory cues.
  • 2) Cut down your highlights in your browser
  • After you finished reading the book, you want to reduce your highlights to the essential part. Visit your Kindle Notes page to find a list of all your highlights. Using your desktop browser is faster and more convenient than editing your highlights on your e-reading device.
  • Now, browse through your highlights, delete what you no longer need, and add notes to the ones you really like. By adding notes to the highlights, you’ll connect the new information to your existing knowledge
  • 3) Use software to practice spaced repetition. This part is the main reason e-books beat printed books. While you can do all of the above with a little extra time on your physical books, there’s no way to systemize your repetition practice.
  • Readwise is the best software to combine spaced repetition with your e-books. It’s an online service that connects to your Kindle account and imports all your Kindle highlights. Then, it creates flashcards of your highlights and allows you to export your highlights to your favorite note-taking app.
  • Common Learning Myths Debunked: While reading and studying evidence-based learning techniques, I also came across some things I wrongly believed to be true.
  • #2 Effective learning should feel easy. We think learning works best when it feels productive. That’s why we continue to use ineffective techniques like rereading or highlighting. But learning works best when it feels hard, or as the authors of ‘Make It Stick’ write: “Learning that’s easy is like writing in sand, here today and gone tomorrow.”
  • In Conclusion
  • I developed and adjusted these strategies over two years, and they’re still a work in progress.
  • Try all of them but don’t force yourself through anything that doesn’t feel right for you. I encourage you to do your own research, add further techniques, and skip what doesn’t serve you
  • “In the case of good books, the point is not to see how many of them you can get through, but rather how many can get through to you.”— Mortimer J. Adler
Javier E

The decline effect and the scientific method : The New Yorker - 3 views

  • The test of replicability, as it’s known, is the foundation of modern research. Replicability is how the community enforces itself. It’s a safeguard against the creep of subjectivity. Most of the time, scientists know what results they want, and that can influence the results they get. The premise of replicability is that the scientific community can correct for these flaws.
  • But now all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable.
  • This phenomenon doesn’t yet have an official name, but it’s occurring across a wide range of fields, from psychology to ecology.
  • ...39 more annotations...
  • If replication is what separates the rigor of science from the squishiness of pseudoscience, where do we put all these rigorously validated findings that can no longer be proved? Which results should we believe?
  • Schooler demonstrated that subjects shown a face and asked to describe it were much less likely to recognize the face when shown it later than those who had simply looked at it. Schooler called the phenomenon “verbal overshadowing.”
  • The most likely explanation for the decline is an obvious one: regression to the mean. As the experiment is repeated, that is, an early statistical fluke gets cancelled out. The extrasensory powers of Schooler’s subjects didn’t decline—they were simply an illusion that vanished over time.
  • yet Schooler has noticed that many of the data sets that end up declining seem statistically solid—that is, they contain enough data that any regression to the mean shouldn’t be dramatic. “These are the results that pass all the tests,” he says. “The odds of them being random are typically quite remote, like one in a million. This means that the decline effect should almost never happen. But it happens all the time!
  • this is why Schooler believes that the decline effect deserves more attention: its ubiquity seems to violate the laws of statistics
  • In 2001, Michael Jennions, a biologist at the Australian National University, set out to analyze “temporal trends” across a wide range of subjects in ecology and evolutionary biology. He looked at hundreds of papers and forty-four meta-analyses (that is, statistical syntheses of related studies), and discovered a consistent decline effect over time, as many of the theories seemed to fade into irrelevance.
  • Jennions admits that his findings are troubling, but expresses a reluctance to talk about them
  • publicly. “This is a very sensitive issue for scientists,” he says. “You know, we’re supposed to be dealing with hard facts, the stuff that’s supposed to stand the test of time. But when you see these trends you become a little more skeptical of things.”
  • Jennions, similarly, argues that the decline effect is largely a product of publication bias, or the tendency of scientists and scientific journals to prefer positive data over null results, which is what happens when no effect is found. The bias was first identified by the statistician Theodore Sterling, in 1959, after he noticed that ninety-seven per cent of all published psychological studies with statistically significant data found the effect they were looking for.
  • Sterling saw that if ninety-seven per cent of psychology studies were proving their hypotheses, either psychologists were extraordinarily lucky or they published only the outcomes of successful experiments.
  • While publication bias almost certainly plays a role in the decline effect, it remains an incomplete explanation. For one thing, it fails to account for the initial prevalence of positive results among studies that never even get submitted to journals. It also fails to explain the experience of people like Schooler, who have been unable to replicate their initial data despite their best efforts.
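Publication bias is easy to simulate (assumed numbers: a true effect of 0.2 and a "publishable" threshold of 0.4, not figures from the article): if only strong-looking results get published, the published literature overstates the effect, and unfiltered replications then look like a decline.

```python
import random

random.seed(42)

def run_study(true_effect=0.2, noise=1.0, n=20):
    """Observed mean effect of one small study: truth plus sampling noise."""
    return true_effect + random.gauss(0, noise / n ** 0.5)

# "Publish" only studies whose observed effect looks impressively large,
# mimicking the preference for positive, significant results.
published = [e for e in (run_study() for _ in range(1000)) if e > 0.4]

# Replications of the published findings face no such filter.
replications = [run_study() for _ in published]

def avg(xs):
    return sum(xs) / len(xs)

print(round(avg(published), 2), round(avg(replications), 2))
# The published average is inflated well above the true 0.2; the
# replication average reverts toward it -- an apparent "decline."
```

Nothing about the underlying effect changed between the two rounds; only the filter did, which is exactly why a filtered first wave followed by honest replication mimics a decaying truth.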
  • One of his most cited papers has a deliberately provocative title: “Why Most Published Research Findings Are False.”
  • suspects that an equally significant issue is the selective reporting of results—the data that scientists choose to document in the first place. Palmer’s most convincing evidence relies on a statistical tool known as a funnel graph. When a large number of studies have been done on a single subject, the data should follow a pattern: studies with a large sample size should all cluster around a common value—the true result—whereas those with a smaller sample size should exhibit a random scattering, since they’re subject to greater sampling error. This pattern gives the graph its name, since the distribution resembles a funnel.
  • after Palmer plotted every study of fluctuating asymmetry, he noticed that the distribution of results with smaller sample sizes wasn’t random at all but instead skewed heavily toward positive results. Palmer has since documented a similar problem in several other contested subject areas. “Once I realized that selective reporting is everywhere in science, I got quite depressed,” Palmer told me. “As a researcher, you’re always aware that there might be some nonrandom patterns, but I had no idea how widespread it is.”
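The funnel-graph logic can also be sketched numerically (assumed true effect of 0.3 and noise level, not Palmer's data): observed effects from small studies fan out widely, while large studies converge on the true value; a one-sided skew among the small studies is the signature of selective reporting.

```python
import random

random.seed(0)

def observed_effects(n, studies=500, true_effect=0.3, noise=1.0):
    """Simulate `studies` studies of sample size n; return observed means."""
    return [true_effect + random.gauss(0, noise / n ** 0.5)
            for _ in range(studies)]

def spread(xs):
    return max(xs) - min(xs)

small = observed_effects(n=10)   # the funnel's wide mouth
large = observed_effects(n=400)  # the funnel's narrow spout

print(round(spread(small), 2), round(spread(large), 2))
# Small-n results scatter widely around 0.3; large-n results hug it.
# If the small-n scatter appeared only on the positive side, that
# asymmetry would indicate selective reporting.
```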
  • Palmer summarized the impact of selective reporting on his field: “We cannot escape the troubling conclusion that some—perhaps many—cherished generalities are at best exaggerated in their biological significance and at worst a collective illusion nurtured by strong a-priori beliefs often repeated.”
  • Palmer emphasizes that selective reporting is not the same as scientific fraud. Rather, the problem seems to be one of subtle omissions and unconscious misperceptions, as researchers struggle to make sense of their results. Stephen Jay Gould referred to this as the “shoehorning” process.
  • “A lot of scientific measurement is really hard,” Simmons told me. “If you’re talking about fluctuating asymmetry, then it’s a matter of minuscule differences between the right and left sides of an animal. It’s millimetres of a tail feather. And so maybe a researcher knows that he’s measuring a good male”—an animal that has successfully mated—“and he knows that it’s supposed to be symmetrical. Well, that act of measurement is going to be vulnerable to all sorts of perception biases. That’s not a cynical statement. That’s just the way human beings work.”
  • One of the classic examples of selective reporting concerns the testing of acupuncture in different countries. While acupuncture is widely accepted as a medical treatment in various Asian countries, its use is much more contested in the West. These cultural differences have profoundly influenced the results of clinical trials.
  • John Ioannidis, an epidemiologist at Stanford University, argues that such distortions are a serious issue in biomedical research. “These exaggerations are why the decline effect is so common,” he says. “It’d be really great if the initial studies gave us an accurate summary of things. But they don’t. And so what happens is we waste a lot of money treating millions of patients and doing lots of follow-up studies on other themes based on results that are misleading.”
  • In 2005, Ioannidis published an article in the Journal of the American Medical Association that looked at the forty-nine most cited clinical-research studies in three major medical journals.
  • the data Ioannidis found were disturbing: of the thirty-four claims that had been subject to replication, forty-one per cent had either been directly contradicted or had their effect sizes significantly downgraded.
  • the most troubling fact emerged when he looked at the test of replication: out of four hundred and thirty-two claims, only a single one was consistently replicable. “This doesn’t mean that none of these claims will turn out to be true,” he says. “But, given that most of them were done badly, I wouldn’t hold my breath.”
  • According to Ioannidis, the main problem is that too many researchers engage in what he calls “significance chasing,” or finding ways to interpret the data so that it passes the statistical test of significance—the ninety-five-per-cent boundary invented by Ronald Fisher.
  • For Simmons, the steep rise and slow fall of fluctuating asymmetry is a clear example of a scientific paradigm, one of those intellectual fads that both guide and constrain research: after a new paradigm is proposed, the peer-review process is tilted toward positive results. But then, after a few years, the academic incentives shift—the paradigm has become entrenched—so that the most notable results are now those that disprove the theory.
  • The problem of selective reporting is rooted in a fundamental cognitive flaw, which is that we like proving ourselves right and hate being wrong.
  • “It feels good to validate a hypothesis,” Ioannidis said. “It feels even better when you’ve got a financial interest in the idea or your career depends upon it. And that’s why, even after a claim has been systematically disproven”—he cites, for instance, the early work on hormone replacement therapy, or claims involving various vitamins—“you still see some stubborn researchers citing the first few studies
  • That’s why Schooler argues that scientists need to become more rigorous about data collection before they publish. “We’re wasting too much time chasing after bad studies and underpowered experiments,”
  • The current “obsession” with replicability distracts from the real problem, which is faulty design.
  • “Every researcher should have to spell out, in advance, how many subjects they’re going to use, and what exactly they’re testing, and what constitutes a sufficient level of proof. We have the tools to be much more transparent about our experiments.”
  • Schooler recommends the establishment of an open-source database, in which researchers are required to outline their planned investigations and document all their results. “I think this would provide a huge increase in access to scientific work and give us a much better way to judge the quality of an experiment,”
  • scientific research will always be shadowed by a force that can’t be curbed, only contained: sheer randomness. Although little research has been done on the experimental dangers of chance and happenstance, the research that exists isn’t encouraging.
  • The disturbing implication of the Crabbe study is that a lot of extraordinary scientific data are nothing but noise. The hyperactivity of those coked-up Edmonton mice wasn’t an interesting new fact—it was a meaningless outlier, a by-product of invisible variables we don’t understand.
  • The problem, of course, is that such dramatic findings are also the most likely to get published in prestigious journals, since the data are both statistically significant and entirely unexpected
  • This suggests that the decline effect is actually a decline of illusion. While Karl Popper imagined falsification occurring with a single, definitive experiment—Galileo refuted Aristotelian mechanics in an afternoon—the process turns out to be much messier than that.
  • Many scientific theories continue to be considered true even after failing numerous experimental tests.
  • Even the law of gravity hasn’t always been perfect at predicting real-world phenomena. (In one test, physicists measuring gravity by means of deep boreholes in the Nevada desert found a two-and-a-half-per-cent discrepancy between the theoretical predictions and the actual data.)
  • Such anomalies demonstrate the slipperiness of empiricism. Although many scientific ideas generate conflicting results and suffer from falling effect sizes, they continue to get cited in the textbooks and drive standard medical practice. Why? Because these ideas seem true. Because they make sense. Because we can’t bear to let them go. And this is why the decline effect is so troubling. Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren’t surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.)
  • The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe. ♦
Javier E
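The selective-reporting dynamic Palmer and Ioannidis describe is easy to reproduce numerically. The sketch below is illustrative only; the sample size, effect size, and number of studies are assumptions, not figures from the article. It simulates many small studies of a weak true effect and then “publishes” only the statistically significant positive ones:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

TRUE_EFFECT = 0.2   # true standardized mean difference (assumed, weak)
N_PER_GROUP = 20    # small samples, as in the asymmetry literature
N_STUDIES = 2000

all_effects = []    # every study's observed effect
reported = []       # only the studies that clear p < 0.05 with a positive result

for _ in range(N_STUDIES):
    control = rng.normal(0.0, 1.0, N_PER_GROUP)
    treated = rng.normal(TRUE_EFFECT, 1.0, N_PER_GROUP)
    t, p = stats.ttest_ind(treated, control)
    d = treated.mean() - control.mean()   # observed effect (population sd = 1)
    all_effects.append(d)
    if p < 0.05 and d > 0:                # selective reporting: positive "hits" only
        reported.append(d)

print(f"true effect:              {TRUE_EFFECT:.2f}")
print(f"mean across ALL studies:  {np.mean(all_effects):.2f}")
print(f"mean of REPORTED studies: {np.mean(reported):.2f}")
```

Because only results clearing Fisher’s significance bar get reported, the “published” mean overstates the true effect several-fold. That inflated first wave is exactly what later, larger replications then appear to decline from.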

Do Political Experts Know What They're Talking About? | Wired Science | Wired... - 1 views

  • I often joke that every cable news show should be forced to display a disclaimer, streaming in a loop at the bottom of the screen. The disclaimer would read: “These talking heads have been scientifically proven to not know what they are talking about. Their blather is for entertainment purposes only.” The viewer would then be referred to Tetlock’s most famous research project, which began in 1984.
  • He picked a few hundred political experts – people who made their living “commenting or offering advice on political and economic trends” – and began asking them to make predictions about future events. He had a long list of pertinent questions. Would George Bush be re-elected? Would there be a peaceful end to apartheid in South Africa? Would Quebec secede from Canada? Would the dot-com bubble burst? In each case, the pundits were asked to rate the probability of several possible outcomes. Tetlock then interrogated the pundits about their thought process, so that he could better understand how they made up their minds.
  • Most of Tetlock’s questions had three possible answers; the pundits, on average, selected the right answer less than 33 percent of the time. In other words, a dart-throwing chimp would have beaten the vast majority of professionals. These results are summarized in his excellent Expert Political Judgment.
  • ...5 more annotations...
  • Some experts displayed a top-down style of reasoning: politics as a deductive art. They started with a big-idea premise about human nature, society, or economics and applied it to the specifics of the case. They tended to reach more confident conclusions about the future. And the positions they reached were easier to classify ideologically: that is the Keynesian prediction and that is the free-market fundamentalist prediction and that is the worst-case environmentalist prediction and that is the best case technology-driven growth prediction etc. Other experts displayed a bottom-up style of reasoning: politics as a much messier inductive art. They reached less confident conclusions and they are more likely to draw on a seemingly contradictory mix of ideas in reaching those conclusions (sometimes from the left, sometimes from the right). We called the big-idea experts “hedgehogs” (they know one big thing) and the more eclectic experts “foxes” (they know many, not so big things).
  • The most consistent predictor of consistently more accurate forecasts was “style of reasoning”: experts with the more eclectic, self-critical, and modest cognitive styles tended to outperform the big-idea people (foxes tended to outperform hedgehogs).
  • Lehrer: Can non-experts do anything to encourage a more effective punditocracy?
  • Tetlock: Yes, non-experts can encourage more accountability in the punditocracy. Pundits are remarkably skillful at appearing to go out on a limb in their claims about the future, without actually going out on one. For instance, they often “predict” continued instability and turmoil in the Middle East (predicting the present) but they virtually never get around to telling you exactly what would have to happen to disconfirm their expectations. They are essentially impossible to pin down. If pundits felt that their public credibility hinged on participating in level playing field forecasting exercises in which they must pit their wits against an extremely difficult-to-predict world, I suspect they would learn, quite quickly, to be more flexible and foxlike in their policy pronouncements.
Javier E
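Tetlock scored his experts’ probability forecasts with Brier scores (lower is better). The toy simulation below uses made-up numbers, not his data, but it shows why a dart-throwing chimp beats an overconfident pundit who picks the right answer less than a third of the time on three-option questions:

```python
import random

random.seed(42)

N_QUESTIONS = 10_000
OPTIONS = 3

def brier(probs, outcome):
    """Multi-category Brier score: squared error vs. the 0/1 outcome vector."""
    return sum((p - (1.0 if i == outcome else 0.0)) ** 2
               for i, p in enumerate(probs))

chimp_scores, pundit_scores = [], []
for _ in range(N_QUESTIONS):
    truth = random.randrange(OPTIONS)

    # "Dart-throwing chimp": uniform probability on each of the three outcomes.
    chimp_scores.append(brier([1 / 3] * OPTIONS, truth))

    # Overconfident pundit: 80% on one option, right only ~30% of the time
    # (Tetlock's experts chose the correct answer less than a third of the time).
    pick = truth if random.random() < 0.30 else (truth + 1) % OPTIONS
    probs = [0.1] * OPTIONS
    probs[pick] = 0.8
    pundit_scores.append(brier(probs, truth))

print(f"chimp mean Brier:  {sum(chimp_scores) / N_QUESTIONS:.3f}")   # ≈ 0.667
print(f"pundit mean Brier: {sum(pundit_scores) / N_QUESTIONS:.3f}")  # ≈ 1.04, worse
```

The hedged, uniform forecast wins not because it knows more but because confident wrong answers are punished quadratically, which is the statistical core of the hedgehog-versus-fox result.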

E.O. Wilson on altruism and the New Enlightenment - Slate Magazine - 0 views

  • we need to answer two more fundamental questions. The first is why advanced social life exists in the first place and has occurred so rarely. The second is what are the driving forces that brought it into existence.
  • Eusociality, where some individuals reduce their own reproductive potential to raise others' offspring, is what underpins the most advanced form of social organization and the dominance of social insects and humans.
  • Humans originated by multilevel selection—individual selection interacting with group selection
  • ...10 more annotations...
  • We should consider ourselves as a product of these two interacting and often competing levels of evolutionary selection. Individual versus group selection results in a mix of altruism and selfishness, of virtue and sin, among the members of a society.
  • a pretty straightforward answer as to why conflicted emotions are at the very foundation of human existence
  • why we never seem to be able to work things out satisfactorily, particularly internationally.
  • religious strife is not the consequence of differences among people. It's about
  • our tribalistic tendencies to form groups, occupy territories and react fiercely to any intrusion or threat to ourselves, our tribe and our special creation story
  • Such intense instincts could arise in evolution only by group selection—tribe competing against tribe. For me, the peculiar qualities of faith are a logical outcome of this level of biological organization.
  • I see no way out of the problems that organized religion and tribalism create other than humans just becoming more honest and fully aware of themselves. Right now we're living in what Carl Sagan correctly termed a demon-haunted world. We have created a Star Wars civilization but we have Paleolithic emotions
  • I'm devoted to the kind of environmentalism that is particularly geared towards the conservation of the living world, the rest of life on Earth, the place we came from. We need to put a lot more attention into that as something that could unify people. Surely one moral precept we can agree on is to stop destroying our birthplace, the only home humanity will ever have.
  • we ought to have another go at the Enlightenment and use that as a common goal to explain and understand ourselves, to take that self-understanding which we so sorely lack as a foundation for what we do in the moral and political realm
  • I would like to see us improving education worldwide and putting a lot more emphasis—as some Asian and European countries have—on science and technology as part of basic education
Javier E
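Wilson’s tension between individual and group selection can be illustrated with a toy two-group model; the payoff parameters below are arbitrary assumptions, not anything from the interview. Altruists lose ground inside every group, yet gain ground in the population as a whole, because altruist-heavy groups out-reproduce selfish-heavy ones:

```python
B, C, BASE = 5.0, 1.0, 10.0   # group benefit, altruist's cost, baseline payoff

def next_generation(altruists, selfish):
    """Offspring counts when payoff determines reproduction."""
    n = altruists + selfish
    shared = B * altruists / n              # benefit made by altruists, shared by all
    return (altruists * (BASE + shared - C),  # altruists pay the cost...
            selfish * (BASE + shared))        # ...the selfish free-ride

# One altruist-heavy group, one selfish-heavy group.
groups = [(90, 10), (10, 90)]
new_groups = [next_generation(a, s) for a, s in groups]

for (a0, s0), (a1, s1) in zip(groups, new_groups):
    print(f"within group: altruist share {a0/(a0+s0):.3f} -> {a1/(a1+s1):.3f}")

tot_a = sum(a for a, _ in new_groups)
tot_all = tot_a + sum(s for _, s in new_groups)
before = sum(a for a, _ in groups) / sum(a + s for a, s in groups)
print(f"whole population: {before:.3f} -> {tot_a/tot_all:.3f}")  # 0.500 -> 0.546
```

The altruist share falls within each group while rising overall, a Simpson’s-paradox effect: selection pulls in opposite directions at the two levels, which is Wilson’s “mix of altruism and selfishness” in miniature.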

E. O. Wilson's Theory of Everything - Magazine - The Atlantic - 0 views

  • Wilson told me the new proposed evolutionary model pulls the field “out of the fever swamp of kin selection,” and he confidently predicted a coming paradigm shift that would promote genetic research to identify the “trigger” genes that have enabled a tiny number of cases, such as the ant family, to achieve complex forms of cooperation.
  • In the book, he proposes a theory to answer what he calls “the great unsolved problem of biology,” namely how roughly two dozen known examples in the history of life—humans, wasps, termites, platypodid ambrosia beetles, bathyergid mole rats, gall-making aphids, one type of snapping shrimp, and others—made the breakthrough to life in highly social, complex societies. Eusocial species, Wilson noted, are by far “the most successful species in the history of life.”
  • Summarizing parts of it for me, Wilson was particularly unsparing of organized religion, likening the Book of Revelation, for example, to the ranting of “a paranoid schizophrenic who was allowed to write down everything that came to him.” Toward philosophy, he was only slightly kinder. Generation after generation of students have suffered trying to “puzzle out” what great thinkers like Socrates, Plato, and Descartes had to say on the great questions of man’s nature, Wilson said, but this was of little use, because philosophy has been based on “failed models of the brain.”
  • ...6 more annotations...
  • His theory draws upon many of the most prominent views of how humans emerged. These range from our evolution of the ability to run long distances to our development of the earliest weapons, which involved the improvement of hand-eye coordination. Dramatic climate change in Africa over the course of a few tens of thousands of years also may have forced Australopithecus and Homo to adapt rapidly. And over roughly the same span, humans became cooperative hunters and serious meat eaters, vastly enriching our diet and favoring the development of more-robust brains. By themselves, Wilson says, none of these theories is satisfying. Taken together, though, all of these factors pushed our immediate prehuman ancestors toward what he called a huge pre-adaptive step: the formation of the earliest communities around fixed camps.
  • “Within groups, the selfish are more likely to succeed,” Wilson told me in a telephone conversation. “But in competition between groups, groups of altruists are more likely to succeed. In addition, it is clear that groups of humans proselytize other groups and accept them as allies, and that that tendency is much favored by group selection.” Taking in newcomers and forming alliances had become a fundamental human trait, he added, because “it is a good way to win.”
  • “The humans become consistent with all the others,” he said, and the evolutionary steps were likely similar—beginning with the formation of groups within a freely mixing population, followed by the accumulation of pre-adaptations that make eusociality more likely, such as the invention of campsites. Finally comes the rise to prevalence of eusocial alleles—one of two or more alternative forms of a gene that arise by mutation, and are found at the same place on a chromosome—which promote novel behaviors (like communal child care) or suppress old, asocial traits. Now it is up to geneticists, he adds, to “determine how many genes are involved in crossing the eusociality threshold, and to go find those genes.”
  • Wilson posits that two rival forces drive human behavior: group selection and what he calls “individual selection”—competition at the level of the individual to pass along one’s genes—with both operating simultaneously. “Group selection,” he said, “brings about virtue, and—this is an oversimplification, but—individual selection, which is competing with it, creates sin. That, in a nutshell, is an explanation of the human condition.
  • “When humans started having a camp—and we know that Homo erectus had campsites—then we know they were heading somewhere,” he told me. “They were a group progressively provisioned, sending out some individuals to hunt and some individuals to stay back and guard the valuable campsite. They were no longer just wandering through territory, emitting calls. They were on long-term campsites, maybe changing from time to time, but they had come together. They began to read intentions in each other’s behavior, what each other are doing. They started to learn social connections more solidly.”
  • If Wilson is right, the human impulse toward racism and tribalism could come to be seen as a reflection of our genetic nature as much as anything else—but so could the human capacity for altruism, and for coalition- and alliance-building. These latter possibilities may help explain Wilson’s abiding optimism—about the environment and many other matters. If these traits are indeed deeply written into our genetic codes, we might hope that we can find ways to emphasize and reinforce them, to build problem-solving coalitions that can endure, and to identify with progressively larger and more-inclusive groups over time.
sandrine_h

Darwin's Influence on Modern Thought - Scientific American - 0 views

  • Great minds shape the thinking of successive historical periods. Luther and Calvin inspired the Reformation; Locke, Leibniz, Voltaire and Rousseau, the Enlightenment. Modern thought is most dependent on the influence of Charles Darwin
  • one needs schooling in the physicist’s style of thought and mathematical techniques to appreciate Einstein’s contributions in their fullness. Indeed, this limitation is true for all the extraordinary theories of modern physics, which have had little impact on the way the average person apprehends the world.
  • The situation differs dramatically with regard to concepts in biology.
  • ...10 more annotations...
  • Many biological ideas proposed during the past 150 years stood in stark conflict with what everybody assumed to be true. The acceptance of these ideas required an ideological revolution. And no biologist has been responsible for more—and for more drastic—modifications of the average person’s worldview than Charles Darwin
  • Evolutionary biology, in contrast with physics and chemistry, is a historical science—the evolutionist attempts to explain events and processes that have already taken place. Laws and experiments are inappropriate techniques for the explication of such events and processes. Instead one constructs a historical narrative, consisting of a tentative reconstruction of the particular scenario that led to the events one is trying to explain.
  • The discovery of natural selection, by Darwin and Alfred Russel Wallace, must itself be counted as an extraordinary philosophical advance
  • The concept of natural selection had remarkable power for explaining directional and adaptive changes. Its nature is simplicity itself. It is not a force like the forces described in the laws of physics; its mechanism is simply the elimination of inferior individuals
  • A diverse population is a necessity for the proper working of natural selection
  • Because of the importance of variation, natural selection should be considered a two-step process: the production of abundant variation is followed by the elimination of inferior individuals
  • By adopting natural selection, Darwin settled the several-thousandyear- old argument among philosophers over chance or necessity. Change on the earth is the result of both, the first step being dominated by randomness, the second by necessity
  • Another aspect of the new philosophy of biology concerns the role of laws. Laws give way to concepts in Darwinism. In the physical sciences, as a rule, theories are based on laws; for example, the laws of motion led to the theory of gravitation. In evolutionary biology, however, theories are largely based on concepts such as competition, female choice, selection, succession and dominance. These biological concepts, and the theories based on them, cannot be reduced to the laws and theories of the physical sciences
  • Despite the initial resistance by physicists and philosophers, the role of contingency and chance in natural processes is now almost universally acknowledged. Many biologists and philosophers deny the existence of universal laws in biology and suggest that all regularities be stated in probabilistic terms, as nearly all so-called biological laws have exceptions. Philosopher of science Karl Popper’s famous test of falsification therefore cannot be applied in these cases.
  • To borrow Darwin’s phrase, there is grandeur in this view of life. New modes of thinking have been, and are being, evolved. Almost every component in modern man’s belief system is somehow affected by Darwinian principles
Javier E
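Mayr’s two-step picture of selection, abundant variation followed by elimination of inferior individuals, is the skeleton of any evolutionary algorithm. A minimal sketch, in which the trait values, optimum, and mutation size are all arbitrary assumptions:

```python
import random

random.seed(1)

TARGET = 10.0           # the environment's optimum for some trait (assumed)
POP, GENS = 200, 60

def fitness(trait):
    return -abs(trait - TARGET)   # closer to the optimum is fitter

population = [random.gauss(0.0, 1.0) for _ in range(POP)]

for _ in range(GENS):
    # Step 1: abundant variation — each individual leaves two mutated offspring.
    offspring = [t + random.gauss(0.0, 0.3) for t in population for _ in range(2)]
    # Step 2: elimination of inferior individuals — only the fittest half survive.
    offspring.sort(key=fitness, reverse=True)
    population = offspring[:POP]

mean_trait = sum(population) / POP
print(f"mean trait after selection: {mean_trait:.2f} (optimum {TARGET})")
```

Note that neither step aims anywhere: mutation is random and elimination is blind, yet the population mean climbs from about 0 to the optimum, illustrating Mayr’s point that the first step is dominated by chance and the second by necessity.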

The "missing law" of nature was here all along | Salon.com - 0 views

  • recently published scientific article proposes a sweeping new law of nature, approaching the matter with dry, clinical efficiency that still reads like poetry.
  • “Evolving systems are asymmetrical with respect to time; they display temporal increases in diversity, distribution, and/or patterned behavior,” they continue, mounting their case from the shoulders of Charles Darwin, extending it toward all things living and not. 
  • To join the known physics laws of thermodynamics, electromagnetism and Newton’s laws of motion and gravity, the nine scientists and philosophers behind the paper propose their “law of increasing functional information.”
  • ...27 more annotations...
  • In short, a complex and evolving system — whether that’s a flock of gold finches or a nebula or the English language — will produce ever more diverse and intricately detailed states and configurations of itself.
  • Some of these more diverse and intricate configurations, the scientists write, are shed and forgotten over time. The configurations that persist are ones that find some utility or novel function in a process akin to natural selection, but a selection process driven by the passing-on of information rather than just the sowing of biological genes
  • Have they finally glimpsed, I wonder, the connectedness and symbiotic co-evolution of their own scientific ideas with those of the world’s writers
  • Have they learned to describe in their own quantifying language that cradle from which both our disciplines have emerged and the firmament on which they both stand — the hearing and telling of stories in order to exist?
  • Have they quantified the quality of all existent matter, living and not: that all things inherit a story in data to tell, and that our stories are told by the very forms we take to tell them? 
  • “Is there a universal basis for selection? Is there a more quantitative formalism underlying this conjectured conceptual equivalence—a formalism rooted in the transfer of information?,” they ask of the world’s disparate phenomena. “The answer to both questions is yes.”
  • In her Pulitzer-winning “Pilgrim at Tinker Creek,” nature writer Annie Dillard explains plainly that evolution is the vehicle of such intricacy in the natural world, as much as it is in our own thoughts and actions. 
  • The principle of complexity evolving at its own pace when left to its own devices, independent of time but certainly in a dance with it, is nothing new. Not in science, nor in its closest humanities kin, science and nature writing. Give things time and nourishing environs, protect them from your own intrusions and — living organisms or not — they will produce abundant enlacement of forms.
  • This is how poetry was born from the same larynxes and phalanges that tendered nuclear equations: We featherless bipeds gave language our time and delighted attendance until its forms were so multivariate that they overflowed with inevitable utility.
  • Yes. They’ve glimpsed it, whether they know it or not. Sing to me, O Muse, of functional information and its complex diversity.
  • “The stability of simple forms is the sturdy base from which more complex, stable forms might arise, forming in turn more complex forms,” she explains, drawing on the undercap frills of mushrooms and filament-fine filtering tubes inside human kidneys to illustrate her point. 
  • “Utility to the creature is evolution’s only aesthetic consideration. Form follows function in the created world, so far as I know, and the creature that functions, however bizarre, survives to perpetuate its form,” writes Dillard.
  • Or, as the Mishna would have it, “the creations were all made in generic form, and they gradually expanded.” 
  • She notes that, of all forms of life we’ve ever known to exist, only about 10% are still alive. What extravagant multiplicity. 
  • “Intricacy is that which is given from the beginning, the birthright, and in the intricacy is the hardiness of complexity that ensures against the failures of all life,” Dillard writes. “The wonder is — given the errant nature of freedom and the burgeoning texture of time — the wonder is that all the forms are not monsters, that there is beauty at all, grace gratuitous.”
  • “This paper, and the reason why I'm so proud of it, is because it really represents a connection between science and the philosophy of science that perhaps offers a new lens into why we see everything that we see in the universe,” lead scientist Michael Wong told Motherboard in a recent interview. 
  • Wong is an astrobiologist and planetary scientist at the Carnegie Institute for Science. In his team’s paper, that bridge toward scientific philosophy is not only preceded by a long history of literary creativity but directly theorizes about the creative act itself.  
  • “The creation of art and music may seem to have very little to do with the maintenance of society, but their origins may stem from the need to transmit information and create bonds among communities, and to this day, they enrich life in innumerable ways,” Wong’s team writes.  
  • “Perhaps, like eddies swirling off of a primary flow field, selection pressures for ancillary functions can become so distant from the core functions of their host systems that they can effectively be treated as independently evolving systems,” the authors add, pointing toward the elaborate mating dance culture observed in birds of paradise.
  • “Perhaps it will be humanity’s ability to learn, invent, and adopt new collective modes of being that will lead to its long-term persistence as a planetary phenomenon. In light of these considerations, we suspect that the general principles of selection and function discussed here may also apply to the evolution of symbolic and social systems.”
  • The Mekhilta teaches that all Ten Commandments were pronounced in a single utterance. Similarly, the Maharsha says the Torah’s 613 mitzvoth are only perceived as a plurality because we’re time-bound humans, even though they together form a singular truth which is indivisible from He who expressed it. 
  • “Of the multiplicity of forms, I know nothing. Except that, apparently, anything goes. This holds for forms of behavior as well as design — the mantis munching her mate, the frog wintering in mud.” 
  • Like swirling eddies off of a primary flow field.
  • “O Lord, how manifold are thy works!,” cried out David in his psalm. “In wisdom hast thou made them all: the earth is full of thy riches. So is this great and wide sea, wherein are things creeping innumerable, both small and great beasts.” 
  • Because, whether wittingly or not, science is singing the tune of the humanities. And whether expressed in algebraic logic or ancient Greek hymn, its chorus is the same throughout the universe: Be fruitful and multiply. 
  • In all things, then — from poetic inventions, to rare biodiverse ecosystems, to the charted history of our interstellar equations — it is best if we conserve our world’s intellectual and physical diversity, for both the study and testimony of its immeasurable multiplicity.
  • Both intricate configurations of art and matter arise and fade according to their shared characteristic, long-known by students of the humanities: each have been graced with enough time to attend to the necessary affairs of their most enduring pleasures. 
Javier E

Matt Ridley on Evolution by Sexual Selection | Mind & Matter - WSJ.com - 0 views

  • the evolutionary psychologist Geoffrey Miller in his book "The Mating Mind" explored the notion that since human males woo their mates with art, poetry, music and humor, as well as with brawn, much of the expansion of our brain may have been sexually selected.
  • Other researchers have argued that sexual selection explains civilization itself. They mathematically explored the possibility that "as females prefer males who conspicuously consume, an increasing proportion of males engage in innovation, labor and other productive activities in order to engage in conspicuous consumption. These activities contribute to technological progress and economic growth."
  • Michael Shermer, in his book "The Mind of the Market," argues that you can trace anticapitalist egalitarianism to sexual selection. Back in the hunter-gatherer Paleolithic, inequality had reproductive consequences. The successful hunter, providing valuable protein for females, got a lot more mating opportunities than the unsuccessful.
  • ...1 more annotation...
  • this might explain why it is relative, rather than absolute, inequality that matters so much to people today. In modern Western society, when even relatively poor people have access to transport, refrigeration, entertainment, shoes and plentiful food, you might expect that inequality would be less resented than a century ago—when none of those things might come within the reach of a poor person. What does it matter if there are people who can afford private jets and designer dresses? But clearly that isn't how people think. They resent inequality in luxuries just as much if not more than inequality in necessities. They dislike (and envy) conspicuous consumption, even if it impinges on them not at all. What hurts is not that somebody is rich, but that he is richer.
Javier E

Darwin Was Wrong About Dating - NYTimes.com - 2 views

  • no fossilized record can really tell us how people behaved or thought back then, much less why they behaved or thought as they did. Nonetheless, something funny happens when social scientists claim that a behavior is rooted in our evolutionary past. Assumptions about that behavior take on the immutability of a physical trait — they come to seem as biologically rooted as opposable thumbs or ejaculation.
  • a new batch of scientists began applying Darwinian doctrine to the conduct of mating, and specifically to three assumptions that endure to this day: men are less selective about whom they’ll sleep with; men like casual sex more than women; and men have more sexual partners over a lifetime.
  • In 1972, Robert L. Trivers, a graduate student at Harvard, addressed that first assumption in one of evolutionary psychology’s landmark studies, “Parental Investment and Sexual Selection.” He argued that women are more selective about whom they mate with because they’re biologically obliged to invest more in offspring. Given the relative paucity of ova and plenitude of sperm, as well as the unequal feeding duties that fall to women, men invest less in children. Therefore, men should be expected to be less discriminating and more aggressive in competing for females.
  • if evolution didn’t determine human behavior, what did? The most common explanation is the effect of cultural norms. That, for instance, society tends to view promiscuous men as normal and promiscuous women as troubled outliers, or that our “social script” requires men to approach women while the pickier women do the selecting. Over the past decade, sociocultural explanations have gained steam.
  • In her study, when men and women considered offers of casual sex from famous people, or offers from close friends whom they were told were good in bed, the gender differences in acceptance of casual-sex proposals evaporated nearly to zero.
  • Everyone has always assumed — and early research had shown — that women desired fewer sexual partners over a lifetime than men.
  • In 2009, another long-assumed gender difference in mating — that women are choosier than men — also came under siege
  • in 2003, two behavioral psychologists, Michele G. Alexander and Terri D. Fisher, published the results of a study that used a “bogus pipeline” — a fake lie detector. When asked about actual sexual partners, rather than just theoretical desires, the participants who were not attached to the fake lie detector displayed typical gender differences. Men reported having had more sexual partners than women. But when participants believed that lies about their sexual history would be revealed by the fake lie detector, gender differences in reported sexual partners vanished. In fact, women reported slightly more sexual partners (a mean of 4.4) than did men (a mean of 4.0).
  • the fact that some gender differences can be manipulated, if not eliminated, by controlling for cultural norms suggests that the explanatory power of evolution can’t sustain itself when applied to mating behavior.
  • “Some sexual features are deeply rooted in evolutionary heritage, such as the sex response and how quickly it takes men and women to become aroused,” said Paul Eastwick, a co-author of the speed-dating study. “However, if you’re looking at features such as how men and women regulate themselves in society to achieve specific goals, I believe those features are unlikely to have evolved sex differences. I consider myself an evolutionary psychologist. But many evolutionary psychologists don’t think this way. They think these features are getting shaped and honed by natural selection all the time.” How far does Darwin go in explaining human behavior?
Javier E

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 0 views

  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • neuroscience for the last couple hundred years has been on the wrong track. There's a fairly recent book by a very good cognitive neuroscientist, Randy Gallistel (with Adam King), arguing -- in my view, plausibly -- that neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
  • in general what he argues is that if you take a look at animal cognition, human too, it's computational systems. Therefore, you want to look for the units of computation. Think about a Turing machine, say, which is the simplest form of computation: you have to find units that have properties like "read", "write" and "address." That's the minimal computational unit, so you've got to look in the brain for those. You're never going to find them if you look for strengthening of synaptic connections or field properties, and so on. You've got to start by looking for what's there and what's working, and you see that from Marr's highest level.
  • it's basically in the spirit of Marr's analysis. So when you're studying vision, he argues, you first ask what kind of computational tasks is the visual system carrying out. And then you look for an algorithm that might carry out those computations and finally you search for mechanisms of the kind that would make the algorithm work. Otherwise, you may never find anything.
  • "Good Old Fashioned AI," as it's labeled now, made strong use of formalisms in the tradition of Gottlob Frege and Bertrand Russell, mathematical logic for example, or derivatives of it, like nonmonotonic reasoning and so on. It's interesting from a history of science perspective that even very recently, these approaches have been almost wiped out from the mainstream and have been largely replaced -- in the field that calls itself AI now -- by probabilistic and statistical models. My question is, what do you think explains that shift and is it a step in the right direction?
  • AI and robotics got to the point where you could actually do things that were useful, so it turned to the practical applications and somewhat, maybe not abandoned, but put to the side, the more fundamental scientific questions, just caught up in the success of the technology and achieving specific goals.
  • The approximating unanalyzed data kind is sort of a new approach, not totally, there's things like it in the past. It's basically a new approach that has been accelerated by the existence of massive memories, very rapid processing, which enables you to do things like this that you couldn't have done by hand. But I think, myself, that it is leading subjects like computational cognitive science into a direction of maybe some practical applicability... in engineering? Chomsky: ...But away from understanding.
  • I was very skeptical about the original work. I thought it was first of all way too optimistic, it was assuming you could achieve things that required real understanding of systems that were barely understood, and you just can't get to that understanding by throwing a complicated machine at it.
  • if success is defined as getting a fair approximation to a mass of chaotic unanalyzed data, then it's way better to do it this way than to do it the way the physicists do, you know, no thought experiments about frictionless planes and so on and so forth. But you won't get the kind of understanding that the sciences have always been aimed at -- what you'll get at is an approximation to what's happening.
  • Suppose you want to predict tomorrow's weather. One way to do it is okay I'll get my statistical priors, if you like, there's a high probability that tomorrow's weather here will be the same as it was yesterday in Cleveland, so I'll stick that in, and where the sun is will have some effect, so I'll stick that in, and you get a bunch of assumptions like that, you run the experiment, you look at it over and over again, you correct it by Bayesian methods, you get better priors. You get a pretty good approximation of what tomorrow's weather is going to be. That's not what meteorologists do -- they want to understand how it's working. And these are just two different concepts of what success means, of what achievement is.
  • if you get more and more data, and better and better statistics, you can get a better and better approximation to some immense corpus of text, like everything in The Wall Street Journal archives -- but you learn nothing about the language.
  • the right approach, is to try to see if you can understand what the fundamental principles are that deal with the core properties, and recognize that in the actual usage, there's going to be a thousand other variables intervening -- kind of like what's happening outside the window, and you'll sort of tack those on later on if you want better approximations, that's a different approach.
  • take a concrete example of a new field in neuroscience, called Connectomics, where the goal is to find the wiring diagram of very complex organisms, find the connectivity of all the neurons in say human cerebral cortex, or mouse cortex. This approach was criticized by Sidney Brenner, who in many ways is [historically] one of the originators of the approach. Advocates of this field don't stop to ask if the wiring diagram is the right level of abstraction -- maybe it's not.
  • if you went to MIT in the 1960s, or now, it's completely different. No matter what engineering field you're in, you learn the same basic science and mathematics. And then maybe you learn a little bit about how to apply it. But that's a very different approach. And it resulted maybe from the fact that really for the first time in history, the basic sciences, like physics, had something really to tell engineers. And besides, technologies began to change very fast, so not very much point in learning the technologies of today if it's going to be different 10 years from now. So you have to learn the fundamental science that's going to be applicable to whatever comes along next. And the same thing pretty much happened in medicine.
  • that's the kind of transition from something like an art, that you learn how to practice -- an analog would be trying to match some data that you don't understand, in some fashion, maybe building something that will work -- to science, what happened in the modern period, roughly Galilean science.
  • it turns out that there actually are neural circuits which are reacting to particular kinds of rhythm, which happen to show up in language, like syllable length and so on. And there's some evidence that that's one of the first things that the infant brain is seeking -- rhythmic structures. And going back to Gallistel and Marr, it's got some computational system inside which is saying "okay, here's what I do with these things," and, say, by nine months the typical infant has rejected -- eliminated from its repertoire -- the phonetic distinctions that aren't used in its own language.
  • people like Shimon Ullman discovered some pretty remarkable things like the rigidity principle. You're not going to find that by statistical analysis of data. But he did find it by carefully designed experiments. Then you look for the neurophysiology, and see if you can find something there that carries out these computations. I think it's the same in language, the same in studying our arithmetical capacity, planning, almost anything you look at. Just trying to deal with the unanalyzed chaotic data is unlikely to get you anywhere, just as it wouldn't have gotten Galileo anywhere.
  • with regard to cognitive science, we're kind of pre-Galilean, just beginning to open up the subject
  • You can invent a world -- I don't think it's our world -- but you can invent a world in which nothing happens except random changes in objects and selection on the basis of external forces. I don't think that's the way our world works, I don't think it's the way any biologist thinks it is. There are all kind of ways in which natural law imposes channels within which selection can take place, and some things can happen and other things don't happen. Plenty of things that go on in the biology in organisms aren't like this. So take the first step, meiosis. Why do cells split into spheres and not cubes? It's not random mutation and natural selection; it's a law of physics. There's no reason to think that laws of physics stop there, they work all the way through. Well, they constrain the biology, sure. Chomsky: Okay, well then it's not just random mutation and selection. It's random mutation, selection, and everything that matters, like laws of physics.
  • What I think is valuable is the history of science. I think we learn a lot of things from the history of science that can be very valuable to the emerging sciences. Particularly when we realize that in, say, the emerging cognitive sciences, we really are in a kind of pre-Galilean stage. We don't know what we're looking for any more than Galileo did, and there's a lot to learn from that.
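Chomsky's distinction between approximating a corpus and understanding a language can be made concrete. The sketch below is purely illustrative (the tiny corpus and function names are invented for this example, not taken from the interview): a bigram model predicts the next word from raw co-occurrence counts, and with more text the estimates sharpen, yet the model encodes nothing about grammatical structure, only frequencies.

```python
from collections import Counter, defaultdict

# A bigram model in the "approximating unanalyzed data" style Chomsky
# describes: predict the next word purely from co-occurrence counts.
corpus = "the dog chased the cat the cat chased the mouse".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word_probs(word):
    """Estimate P(next | word) from counts alone -- no syntax, no meaning."""
    total = sum(counts[word].values())
    return {w: c / total for w, c in counts[word].items()}

# In this corpus "the" is followed by dog once, cat twice, mouse once:
print(next_word_probs("the"))  # {'dog': 0.25, 'cat': 0.5, 'mouse': 0.25}
```

Scaling this up (n-gram models fit to something like The Wall Street Journal archives) yields better and better approximations of the text, which is exactly the kind of "success" Chomsky argues teaches you nothing about the language.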
Javier E

Humans, Version 3.0 § SEEDMAGAZINE.COM - 0 views

  • Where are we humans going, as a species? If science fiction is any guide, we will genetically evolve like in X-Men, become genetically engineered as in Gattaca, or become cybernetically enhanced like General Grievous in Star Wars.
  • There is, however, another avenue for human evolution, one mostly unappreciated in both science and fiction. It is this unheralded mechanism that will usher in the next stage of human, giving future people exquisite powers we do not currently possess, powers worthy of natural selection itself. And, importantly, it doesn’t require us to transform into cyborgs or bio-engineered lab rats. It merely relies on our natural bodies and brains functioning as they have for millions of years. This mystery mechanism of human transformation is neuronal recycling, coined by neuroscientist Stanislas Dehaene, wherein the brain’s innate capabilities are harnessed for altogether novel functions.
  • The root of these misconceptions is the radical underappreciation of the design engineered by natural selection into the powers implemented by our bodies and brains, something central to my 2009 book, The Vision Revolution. For example, optical illusions (such as the Hering) are not examples of the brain’s poor hardware design, but, rather, consequences of intricate evolutionary software for generating perceptions that correct for neural latencies in normal circumstances.
  • Like all animal brains, human brains are not general-purpose universal learning machines, but, instead, are intricately structured suites of instincts optimized for the environments in which they evolved. To harness our brains, we want to let the brain’s brilliant mechanisms run as intended—i.e., not to be twisted. Rather, the strategy is to twist Y into a shape that the brain does know how to process.
  • there is a very good reason to be optimistic that the next stage of human will come via the form of adaptive harnessing, rather than direct technological enhancement: It has already happened. We have already been transformed via harnessing beyond what we once were. We’re already Human 2.0, not the Human 1.0, or Homo sapiens, that natural selection made us. We Human 2.0’s have, among many powers, three that are central to who we take ourselves to be today: writing, speech, and music (the latter perhaps being the pinnacle of the arts). Yet these three capabilities, despite having all the hallmarks of design, were not a result of natural selection, nor were they the result of genetic engineering or cybernetic enhancement to our brains. Instead, and as I argue in both The Vision Revolution and my forthcoming Harnessed, these are powers we acquired by virtue of harnessing, or neuronal recycling.
  • Although the step from Human 1.0 to 2.0 was via cultural selection, not via explicit human designers, does the transformation to Human 3.0 need to be entirely due to a process like cultural evolution, or might we have any hope of purposely guiding our transformation? When considering our future, that’s probably the most relevant question we should be asking ourselves.
  • One of my reasons for optimism is that nature-harnessing technologies (like writing, speech, and music) must mimic fundamental ecological features in nature, and that is a much easier task for scientists to tackle than emulating the exorbitantly complex mechanisms of the brain
carolinewren

Book Review: 'A New History of Life' by Peter Ward and Joe Kirschvink - WSJ - 0 views

  • I imagine that physicists are similarly deluged with revelations about how to build a perpetual-motion machine or about the hitherto secret truth behind relativity. And so I didn’t view the arrival of “A New History of Life” with great enthusiasm.
  • subtitle breathlessly promises “radical new discoveries about the origins and evolution of life on earth,” while the jacket copy avers that “our current paradigm for understanding the history of life on Earth dates back to Charles Darwin’s time, yet scientific advances of the last few decades have radically reshaped that aging picture.”
  • authors Peter Ward and Joe Kirschvink are genuine scientists—paleontologists, to be exact. And they can write.
  • even genuine scientists are human and as such susceptible to the allure of offering up new paradigms (as the historian of science Thomas Kuhn put it)
  • paleontologist Stephen Jay Gould insisted that his conception of “punctuated equilibria” (a kind of Marxist biology that blurred the lines between evolution and revolution), which he developed along with fellow paleontologist Niles Eldredge, upended the traditional Darwinian understanding of how natural selection works.
  • This notion doesn’t constitute a fundamental departure from plain old evolution by natural selection; it simply italicizes that sometimes the process is comparatively rapid, other times slower.
  • In addition, they have long had a peculiar perspective on evolution, because of the limitations of the fossil record
  • Darwin was a pioneering geologist as well as the greatest of all biologists, and his insights were backgrounded by the key concept of uniformitarianism, as advocated by Charles Lyell, his friend and mentor
  • previously regnant paradigm among geologists had been “catastrophism
  • fossil record was therefore seen as reflecting the creation and extinction of new species by an array of dramatic and “unnatural” dei ex machina.
  • Of late, however, uniformitarianism has been on a losing streak. Catastrophism is back, with a bang . . . or a flood, or a burst of extraterrestrial radiation, or an onslaught of unpleasant, previously submerged chemicals
  • This emphasis on catastrophes is the first of a triad of novelties on which “A New History of Life” is based. The second involves an enhanced role for some common but insufficiently appreciated inorganic molecules, notably carbon dioxide, oxygen and hydrogen sulfide.
  • Life didn’t so much unfold smoothly over hundreds of millions of years as lurch chaotically in response to diverse crises and opportunities: too much oxygen, too little carbon dioxide, too little oxygen, too much carbon dioxide, too hot, too cold
  • So far, so good, except that in their eagerness to emphasize what is new and different, the authors teeter on the verge of the same trap as Gould: exaggerating the novelty of their own ideas.
  • Things begin to unravel when it comes to the third leg of Messrs. Ward and Kirschvink’s purported paradigmatic novelty: a supposed role for ecosystems—rain forests, deserts, rivers, coral reefs, deep-sea vents—as units of evolutionary change
  • “While the history of life may be populated by species,” they write, “it has been the evolution of ecosystems that has been the most influential factor in arriving at the modern-day assemblage of life. . . . [W]e know that on occasion in the deep past entirely new ecosystems appear, populated by new kinds of life.” True enough, but it is those “new kinds of life,” not whole ecosystems, upon which natural selection acts.
  • One of the most common popular misconceptions about evolution is that it proceeds “for the good of the species.”
  • The problem is that smaller, nimbler units are far more likely to reproduce differentially than are larger, clumsier, more heterogeneous ones. Insofar as ecosystems are consequential for evolution—and doubtless they are—it is because, like occasional catastrophes, they provide the immediate environment within which something not-so-new is acted out.
  • This is natural selection doing its same-old, same-old thing: acting by a statistically potent process of variation combined with selective retention and differential reproduction, a process that necessarily operates within the particular ecosystem that a given lineage occupies.
ilanaprincilus06

Half Of The Jury In The Chauvin Trial Is Nonwhite. That's Only Part Of The Story : Live... - 0 views

  • The jury chosen for the trial of former Minneapolis police officer Derek Chauvin, charged with murder in the death of George Floyd, is notable because it is significantly less white than Minneapolis itself.
  • three Black men, one Black woman and two jurors who identify as multiracial.
  • 50% of the panel that will vote on Chauvin's fate will be Black or multiracial.
  • Hennepin County, where the trial is being held, is only 17% Black or multiracial, while it is 74% white.
  • The jury's racial makeup will assuage some of the concerns that activists and others had expressed as jury selection got underway two weeks ago.
  • An insufficiently diverse jury, they believed, would undercut people's faith in the legitimacy of a trial seen as a critical moment in the racial justice movement that Floyd's killing helped reenergize last spring.
  • Two of the Black men on the jury are not African Americans but, rather, Black immigrants. During questioning, they expressed the kind of moderate views on policing and race relations
  • None of the Black jurors ultimately chosen for the panel spoke extensively about personal experiences with racism or about having had overtly negative interactions with police. Several said they had a healthy respect for law enforcement.
  • The fate of Juror 76 highlighted a tension that often exists in jury selection, especially in cases in which issues of race loom large. The experiences that come with being Black in America are often enough to get jurors struck from a case
  • That did not seem to be the case during jury selection for the Chauvin trial. Several jurors who expressed at least some support for the movement were seated on the jury — a sign of progress, Chakravarti said.
  • On one hand, that the defense would strike people with negative views of police is understandable, given Nelson's responsibility to seat a jury favorable to his client.
  • She said his fate was a reminder that the jury selection process should be reformed to ensure more African Americans have a fair shot to serve on juries."We should start," she wrote, "by recognizing that their lived experiences with racism are not justification to excuse them."
Javier E

The Selfish Gene turns 40 | Science | The Guardian - 0 views

  • The idea was this: genes strive for immortality, and individuals, families, and species are merely vehicles in that quest. The behaviour of all living things is in service of their genes; hence, metaphorically, they are selfish.
  • Before this, it had been proposed that natural selection was honing the behaviour of living things to promote the continuance through time of the individual creature, or family, or group or species. But in fact, Dawkins said, it was the gene itself that was trying to survive, and it just so happened that the best way for it to survive was in concert with other genes in the impermanent husk of an individual
  • This gene-centric view of evolution also began to explain one of the oddities of life on Earth – the behaviour of social insects. What is the point of a drone bee, doomed to remain childless and in the service of a totalitarian queen? Suddenly it made sense that, with the gene itself steering evolution, the fact that the drone shared its DNA with the queen meant that its servitude guarantees not the individual's survival, but the endurance of the genes they share.
  • the subject is taught bafflingly minimally and late in the curriculum even today; evolution by natural selection is crucial to every aspect of the living world. In the words of the Russian scientist Theodosius Dobzhansky: “Nothing in biology makes sense except in the light of evolution.”
  • his true legacy is The Selfish Gene and its profound effect on multiple generations of scientists and lay readers. In a sense, The Selfish Gene and Dawkins himself are bridges, both intellectually and chronologically, between the titans of mid-century biology – Ronald Fisher, Trivers, Hamilton, Maynard Smith and Williams – and our era of the genome, in which the interrogation of DNA dominates the study of evolution.
  • Genes aren’t what they used to be either. In 1976 they were simply stretches of DNA that encoded proteins. We now know about genes made of DNA’s cousin, RNA; we’ve discovered genes that hop from genome to genome
  • Since 1976, our understanding of why life is the way it is has blossomed and changed. Once the gene became the dominant idea in biology in the 1990s there followed a technological goldrush – the Human Genome Project – to find them all.
  • None of the complications of modern genomes erodes the central premise of the selfish gene.
  • Much of the enmity stems from people misunderstanding that selfishness is being used as a metaphor. The irony of these attacks is that the selfish gene metaphor actually explains altruism. We help others who are not directly related to us because we share similar versions of genes with them.
  • In the scientific community, the chief objection maintains that natural selection can operate at the level of a group of animals, not solely on genes or even individuals
  • To my mind, and that of the majority of evolutionary biologists, the gene-centric view of evolution always emerges intact.
  • the premise remains exciting that a gene’s only desire is to reproduce itself, and that the complexity of genomes makes that reproduction more efficient.
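The altruism argument in these excerpts, that helping others pays off in proportion to shared genes, was formalized by W. D. Hamilton (one of the mid-century figures named above) as the rule rB > C. A minimal sketch, with illustrative numbers that are not from the article:

```python
def altruism_favored(relatedness, benefit, cost):
    """Hamilton's rule: a gene for altruism spreads when r * B > C,
    where r is the genetic relatedness of actor and recipient,
    B the recipient's fitness benefit, and C the actor's fitness cost."""
    return relatedness * benefit > cost

# Full siblings share r = 0.5; first cousins r = 0.125. Under
# haplodiploidy a sterile worker bee shares r = 0.75 with full
# sisters, which is how childless servitude can still propagate
# the genes shared with the queen.
print(altruism_favored(0.5, 3.0, 1.0))    # True: helping a sibling
print(altruism_favored(0.125, 3.0, 1.0))  # False: same act for a cousin
```

The same inequality covers the "selfish gene explains altruism" point: the gene's-eye accounting makes self-sacrifice rational whenever the discounted benefit to shared genes exceeds the cost.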
Javier E

Ivy League Schools Are Overrated. Send Your Kids Elsewhere. | New Republic - 1 views

  • a blizzard of admissions jargon that I had to pick up on the fly. “Good rig”: the transcript exhibits a good degree of academic rigor. “Ed level 1”: parents have an educational level no higher than high school, indicating a genuine hardship case. “MUSD”: a musician in the highest category of promise. Kids who had five or six items on their list of extracurriculars—the “brag”—were already in trouble, because that wasn’t nearly enough.
  • With so many accomplished applicants to choose from, we were looking for kids with something special, “PQs”—personal qualities—that were often revealed by the letters or essays. Kids who only had the numbers and the résumé were usually rejected: “no spark,” “not a team-builder,” “this is pretty much in the middle of the fairway for us.” One young person, who had piled up a truly insane quantity of extracurriculars and who submitted nine letters of recommendation, was felt to be “too intense.”
  • On the other hand, the numbers and the résumé were clearly indispensable. I’d been told that successful applicants could either be “well-rounded” or “pointy”—outstanding in one particular way—but if they were pointy, they had to be really pointy: a musician whose audition tape had impressed the music department, a scientist who had won a national award.
  • When I speak of elite education, I mean prestigious institutions like Harvard or Stanford or Williams as well as the larger universe of second-tier selective schools, but I also mean everything that leads up to and away from them—the private and affluent public high schools; the ever-growing industry of tutors and consultants and test-prep courses; the admissions process itself, squatting like a dragon at the entrance to adulthood; the brand-name graduate schools and employment opportunities that come after the B.A.; and the parents and communities, largely upper-middle class, who push their children into the maw of this machine.
  • Our system of elite education manufactures young people who are smart and talented and driven, yes, but also anxious, timid, and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.
  • “Super People,” the writer James Atlas has called them—the stereotypical ultra-high-achieving elite college students of today. A double major, a sport, a musical instrument, a couple of foreign languages, service work in distant corners of the globe, a few hobbies thrown in for good measure: They have mastered them all, and with a serene self-assurance
  • The first thing that college is for is to teach you to think.
  • It was only after 24 years in the Ivy League—college and a Ph.D. at Columbia, ten years on the faculty at Yale—that I started to think about what this system does to kids and how they can escape from it, what it does to our society and how we can dismantle it.
  • I taught many wonderful young people during my years in the Ivy League—bright, thoughtful, creative kids whom it was a pleasure to talk with and learn from. But most of them seemed content to color within the lines that their education had marked out for them. Very few were passionate about ideas. Very few saw college as part of a larger project of intellectual discovery and development. Everyone dressed as if they were ready to be interviewed at a moment’s notice.
  • Look beneath the façade of seamless well-adjustment, and what you often find are toxic levels of fear, anxiety, and depression, of emptiness and aimlessness and isolation. A large-scale survey of college freshmen recently found that self-reports of emotional well-being have fallen to their lowest level in the study’s 25-year history.
  • So extreme are the admission standards now that kids who manage to get into elite colleges have, by definition, never experienced anything but success. The prospect of not being successful terrifies them, disorients them. The cost of falling short, even temporarily, becomes not merely practical, but existential. The result is a violent aversion to risk.
  • There are exceptions, kids who insist, against all odds, on trying to get a real education. But their experience tends to make them feel like freaks. One student told me that a friend of hers had left Yale because she found the school “stifling to the parts of yourself that you’d call a soul.”
  • What no one seems to ask is what the “return” is supposed to be. Is it just about earning more money? Is the only purpose of an education to enable you to get a job? What, in short, is college for?
  • Like so many kids today, I went off to college like a sleepwalker. You chose the most prestigious place that let you in; up ahead were vaguely understood objectives: status, wealth—“success.” What it meant to actually get an education and why you might want one—all this was off the table.
  • College is an opportunity to stand outside the world for a few years, between the orthodoxy of your family and the exigencies of career, and contemplate things from a distance.
  • it is only through the act of establishing communication between the mind and the heart, the mind and experience, that you become an individual, a unique being—a soul. The job of college is to assist you to begin to do that. Books, ideas, works of art and thought, the pressure of the minds around you that are looking for their own answers in their own ways.
  • College is not the only chance to learn to think, but it is the best. One thing is certain: If you haven’t started by the time you finish your B.A., there’s little likelihood you’ll do it later. That is why an undergraduate experience devoted exclusively to career preparation is four years largely wasted.
  • Elite schools like to boast that they teach their students how to think, but all they mean is that they train them in the analytic and rhetorical skills that are necessary for success in business and the professions.
  • Everything is technocratic—the development of expertise—and everything is ultimately justified in technocratic terms.
  • Religious colleges—even obscure, regional schools that no one has ever heard of on the coasts—often do a much better job in that respect.
  • At least the classes at elite schools are academically rigorous, demanding on their own terms, no? Not necessarily. In the sciences, usually; in other disciplines, not so much
  • there is now a thriving sector devoted to producing essay-ready summers
  • higher marks for shoddier work.
  • today’s young people appear to be more socially engaged than kids have been for several decades and that they are more apt to harbor creative or entrepreneurial impulses
  • they tend to be played out within the same narrow conception of what constitutes a valid life: affluence, credentials, prestige.
  • Experience itself has been reduced to instrumental function, via the college essay. From learning to commodify your experiences for the application, the next step has been to seek out experiences in order to have them to commodify
  • professors and students have largely entered into what one observer called a “nonaggression pact.”
  • The SAT is supposed to measure aptitude, but what it actually measures is parental income, which it tracks quite closely
  • what these institutions mean by leadership is nothing more than getting to the top. Making partner at a major law firm or becoming a chief executive, climbing the greasy pole of whatever hierarchy you decide to attach yourself to. I don’t think it occurs to the people in charge of elite colleges that the concept of leadership ought to have a higher meaning, or, really, any meaning.
  • The irony is that elite students are told that they can be whatever they want, but most of them end up choosing to be one of a few very similar things
  • As of 2010, about a third of graduates went into finance or consulting at a number of top schools, including Harvard, Princeton, and Cornell.
  • Whole fields have disappeared from view: the clergy, the military, electoral politics, even academia itself, for the most part, including basic science
  • It’s considered glamorous to drop out of a selective college if you want to become the next Mark Zuckerberg, but ludicrous to stay in to become a social worker. “What Wall Street figured out,” as Ezra Klein has put it, “is that colleges are producing a large number of very smart, completely confused graduates. Kids who have ample mental horsepower, an incredible work ethic and no idea what to do next.”
  • It almost feels ridiculous to have to insist that colleges like Harvard are bastions of privilege, where the rich send their children to learn to walk, talk, and think like the rich. Don’t we already know this? They aren’t called elite colleges for nothing. But apparently we like pretending otherwise. We live in a meritocracy, after all.
  • Visit any elite campus across our great nation, and you can thrill to the heart-warming spectacle of the children of white businesspeople and professionals studying and playing alongside the children of black, Asian, and Latino businesspeople and professionals
  • That doesn’t mean there aren’t a few exceptions, but that is all they are. In fact, the group that is most disadvantaged by our current admissions policies are working-class and rural whites, who are hardly present
  • The college admissions game is not primarily about the lower and middle classes seeking to rise, or even about the upper-middle class attempting to maintain its position. It is about determining the exact hierarchy of status within the upper-middle class itself.
  • This system is exacerbating inequality, retarding social mobility, perpetuating privilege, and creating an elite that is isolated from the society that it’s supposed to lead. The numbers are undeniable. In 1985, 46 percent of incoming freshmen at the 250 most selective colleges came from the top quarter of the income distribution. By 2000, it was 55 percent
  • The major reason for the trend is clear. Not increasing tuition, though that is a factor, but the ever-growing cost of manufacturing children who are fit to compete in the college admissions game
  • Wealthy families start buying their children’s way into elite colleges almost from the moment they are born: music lessons, sports equipment, foreign travel (“enrichment” programs, to use the all-too-perfect term)—most important, of course, private-school tuition or the costs of living in a place with top-tier public schools.
  • Is there anything that I can do, a lot of young people have written to ask me, to avoid becoming an out-of-touch, entitled little shit? I don’t have a satisfying answer, short of telling them to transfer to a public university. You cannot cogitate your way to sympathy with people of different backgrounds, still less to knowledge of them. You need to interact with them directly, and it has to be on an equal footing
  • Elite private colleges will never allow their students’ economic profile to mirror that of society as a whole. They can’t afford to—they need a critical mass of full payers and they need to tend to their donor base—and it’s not even clear that they’d want to.
  • Elite colleges are not just powerless to reverse the movement toward a more unequal society; their policies actively promote it.
  • To be a high-achieving student is to constantly be urged to think of yourself as a future leader of society.
  • U.S. News and World Report supplies the percentage of freshmen at each college who finished in the highest 10 percent of their high school class. Among the top 20 universities, the number is usually above 90 percent. I’d be wary of attending schools like that. Students determine the level of classroom discussion; they shape your values and expectations, for good and ill. It’s partly because of the students that I’d warn kids away from the Ivies and their ilk. Kids at less prestigious schools are apt to be more interesting, more curious, more open, and far less entitled and competitive.
  • The best option of all may be the second-tier—not second-rate—colleges, like Reed, Kenyon, Wesleyan, Sewanee, Mount Holyoke, and others. Instead of trying to compete with Harvard and Yale, these schools have retained their allegiance to real educational values.
  • Not being an entitled little shit is an admirable goal. But in the end, the deeper issue is the situation that makes it so hard to be anything else. The time has come, not simply to reform that system top to bottom, but to plot our exit to another kind of society altogether.
  • The education system has to act to mitigate the class system, not reproduce it. Affirmative action should be based on class instead of race, a change that many have been advocating for years. Preferences for legacies and athletes ought to be discarded. SAT scores should be weighted to account for socioeconomic factors. Colleges should put an end to résumé-stuffing by imposing a limit on the number of extracurriculars that kids can list on their applications. They ought to place more value on the kind of service jobs that lower-income students often take in high school and that high achievers almost never do. They should refuse to be impressed by any opportunity that was enabled by parental wealth
  • More broadly, they need to rethink their conception of merit. If schools are going to train a better class of leaders than the ones we have today, they’re going to have to ask themselves what kinds of qualities they need to promote. Selecting students by GPA or the number of extracurriculars more often benefits the faithful drudge than the original mind.
  • reforming the admissions process. That might address the problem of mediocrity, but it won’t address the greater one of inequality
  • The problem is the Ivy League itself. We have contracted the training of our leadership class to a set of private institutions. However much they claim to act for the common good, they will always place their interests first.
  • I’ve come to see that what we really need is to create one where you don’t have to go to the Ivy League, or any private college, to get a first-rate education.
  • High-quality public education, financed with public money, for the benefit of all
  • Everybody gets an equal chance to go as far as their hard work and talent will take them—you know, the American dream. Everyone who wants it gets to have the kind of mind-expanding, soul-enriching experience that a liberal arts education provides.
  • We recognize that free, quality K–12 education is a right of citizenship. We also need to recognize—as we once did and as many countries still do—that the same is true of higher education. We have tried aristocracy. We have tried meritocracy. Now it’s time to try democracy.
Javier E

In Defense of Facts - The Atlantic - 1 views

  • over 13 years, he has published a series of anthologies—of the contemporary American essay, of the world essay, and now of the historical American essay—that misrepresents what the essay is and does, that falsifies its history, and that contains, among its numerous selections, very little one would reasonably classify within the genre. And all of this to wide attention and substantial acclaim
  • D’Agata’s rationale for his “new history,” to the extent that one can piece it together from the headnotes that preface each selection, goes something like this. The conventional essay, nonfiction as it is, is nothing more than a delivery system for facts. The genre, as a consequence, has suffered from a chronic lack of critical esteem, and thus of popular attention. The true essay, however, deals not in knowing but in “unknowing”: in uncertainty, imagination, rumination; in wandering and wondering; in openness and inconclusion
  • Every piece of this is false in one way or another.
  • There are genres whose principal business is fact—journalism, history, popular science—but the essay has never been one of them. If the form possesses a defining characteristic, it is that the essay makes an argument
  • That argument can rest on fact, but it can also rest on anecdote, or introspection, or cultural interpretation, or some combination of all these and more
  • what makes a personal essay an essay and not just an autobiographical narrative is precisely that it uses personal material to develop, however speculatively or intuitively, a larger conclusion.
  • Nonfiction is the source of the narcissistic injury that seems to drive him. “Nonfiction,” he suggests, is like saying “not art,” and if D’Agata, who has himself published several volumes of what he refers to as essays, desires a single thing above all, it is to be known as a maker of art.
  • D’Agata tells us that the term has been in use since about 1950. In fact, it was coined in 1867 by the staff of the Boston Public Library and entered widespread circulation after the turn of the 20th century. The concept’s birth and growth, in other words, did coincide with the rise of the novel to literary preeminence, and nonfiction did long carry an odor of disesteem. But that began to change at least as long ago as the 1960s, with the New Journalism and the “nonfiction novel.”
  • What we really seem to get in D’Agata’s trilogy, in other words, is a compendium of writing that the man himself just happens to like, or that he wants to appropriate as a lineage for his own work.
  • What it’s like is abysmal: partial to trivial formal experimentation, hackneyed artistic rebellion, opaque expressions of private meaning, and modish political posturing
  • If I bought a bag of chickpeas and opened it to find that it contained some chickpeas, some green peas, some pebbles, and some bits of goat poop, I would take it back to the store. And if the shopkeeper said, “Well, they’re ‘lyric’ chickpeas,” I would be entitled to say, “You should’ve told me that before I bought them.”
  • when he isn’t cooking quotes or otherwise fudging the record, he is simply indifferent to issues of factual accuracy, content to rely on a mixture of guesswork, hearsay, and his own rather faulty memory.
  • His rejoinders are more commonly a lot more hostile—not to mention juvenile (“Wow, Jim, your penis must be so much bigger than mine”), defensive, and in their overarching logic, deeply specious. He’s not a journalist, he insists; he’s an essayist. He isn’t dealing in anything as mundane as the facts; he’s dealing in “art, dickhead,” in “poetry,” and there are no rules in art.
  • D’Agata replies that there is something between history and fiction. “We all believe in emotional truths that could never hold water, but we still cling to them and insist on their relevance.” The “emotional truths” here, of course, are D’Agata’s, not Presley’s. If it feels right to say that tae kwon do was invented in ancient India (not modern Korea, as Fingal discovers it was), then that is when it was invented. The term for this is truthiness.
  • D’Agata clearly wants to have it both ways. He wants the imaginative freedom of fiction without relinquishing the credibility (and for some readers, the significance) of nonfiction. He has his fingers crossed, and he’s holding them behind his back. “John’s a different kind of writer,” an editor explains to Fingal early in the book. Indeed he is. But the word for such a writer isn’t essayist. It’s liar.
  • The point of all this nonsense, and a great deal more just like it, is to advance an argument about the essay and its history. The form, D’Agata’s story seems to go, was neglected during the long ages that worshiped “information” but slowly emerged during the 19th and 20th centuries as artists learned to defy convention and untrammel their imaginations, coming fully into its own over the past several decades with the dawning recognition of the illusory nature of knowledge.
  • Most delectable is when he speaks about “the essay’s traditional ‘five-paragraph’ form.” I almost fell off my chair when I got to that one. The five-paragraph essay—introduction, three body paragraphs, conclusion; stultifying, formulaic, repetitive—is the province of high-school English teachers. I have never met one outside of a classroom, and like any decent college writing instructor, I never failed to try to wean my students away from them. The five-paragraph essay isn’t an essay; it’s a paper.
  • When he refers to his selections as essays, he does more than falsify the essay as a genre. He also effaces all the genres that they do belong to: not only poetry, fiction, journalism, and travel, but, among his older choices, history, parable, satire, the sermon, and more—genres that possess their own particular traditions, conventions, and expectations
  • —by ignoring the actual contexts of his selections, and thus their actual intentions—D’Agata makes the familiar contemporary move of imposing his own conceits and concerns upon the past. That is how ethnography turns into “song,” Socrates into an essayist, and the whole of literary history into a single man’s “emotional truth.”
  • The history of the essay is indeed intertwined with “facts,” but in a very different way than D’Agata imagines. D’Agata’s mind is Manichaean. Facts bad, imagination good
  • What he fails to understand is that facts and the essay are not antagonists but siblings, offspring of the same historical moment
  • one needs to recognize that facts themselves have a history.
  • Facts are not just any sort of knowledge, such as also existed in the ancient and medieval worlds. A fact is a unit of information that has been established through uniquely modern methods
  • Fact, etymologically, means “something done”—that is, an act or deed
  • It was only in the 16th century—an age that saw the dawning of a new empirical spirit, one that would issue not only in modern science, but also in modern historiography, journalism, and scholarship—that the word began to signify our current sense of “real state of things.”
  • It was at this exact time, and in this exact spirit, that the essay was born. What distinguished Montaigne’s new form—his “essays” or attempts to discover and publish the truth about himself—was not that it was personal (precursors like Seneca also wrote personally), but that it was scrupulously investigative. Montaigne was conducting research into his soul, and he was determined to get it right.
  • His famous motto, Que sais-je?—“What do I know?”—was an expression not of radical doubt but of the kind of skepticism that fueled the modern revolution in knowledge.
  • It is no coincidence that the first English essayist, Galileo’s contemporary Francis Bacon, was also the first great theorist of science.
  • That knowledge is problematic—difficult to establish, labile once created, often imprecise and always subject to the limitations of the human mind—is not the discovery of postmodernism. It is a foundational insight of the age of science, of fact and information, itself.
  • The point is not that facts do not exist, but that they are unstable (and are becoming more so as the pace of science quickens). Knowledge is always an attempt. Every fact was established by an argument—by observation and interpretation—and is susceptible to being overturned by a different one
  • A fact, you might say, is nothing more than a frozen argument, the place where a given line of investigation has come temporarily to rest.
  • Sometimes those arguments are scientific papers. Sometimes they are news reports, which are arguments with everything except the conclusions left out (the legwork, the notes, the triangulation of sources—the research and the reasoning).
  • When it comes to essays, though, we don’t refer to those conclusions as facts. We refer to them as wisdom, or ideas
  • the essay draws its strength not from separating reason and imagination but from putting them in conversation. A good essay moves fluidly between thought and feeling. It subjects the personal to the rigors of the intellect and the discipline of external reality. The truths it finds are more than just emotional.
Javier E

How Did Consciousness Evolve? - The Atlantic - 0 views

  • Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it?
  • The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions.
  • The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence
  • Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition.
  • It coordinates something called overt attention – aiming the satellite dishes of the eyes, ears, and nose toward anything important.
  • Selective enhancement therefore probably evolved sometime between hydras and arthropods—between about 700 and 600 million years ago, close to the beginning of complex, multicellular life
  • The next evolutionary advance was a centralized controller for attention that could coordinate among all senses. In many animals, that central controller is a brain area called the tectum
  • At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing.
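The winner-take-all competition described in this excerpt can be illustrated with a toy computation. This is a minimal sketch with made-up numbers, not anything from the article: mutually inhibiting signals compete, the strongest is enhanced, and the rest are suppressed.

```python
import numpy as np

def selective_signal_enhancement(signals, gain=3.0):
    """Toy winner-take-all: competing signals inhibit one another;
    the strongest rises above the noise, the rest are damped."""
    signals = np.asarray(signals, dtype=float)
    winner = int(np.argmax(signals))
    out = signals * 0.1            # lateral inhibition suppresses everyone...
    out[winner] = signals[winner] * gain  # ...except the winner, which is boosted
    return out, winner

# Hypothetical sensory inputs: [touch, sound, light]
boosted, idx = selective_signal_enhancement([0.2, 0.9, 0.4])
print(idx)  # → 1 (sound wins and dominates behavior)
```

Only the winning signal ends up strong enough to influence downstream processing, which is the sense in which, without this mechanism, "a nervous system can do almost nothing."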
  • With the evolution of reptiles around 350 to 300 million years ago, a new brain structure began to emerge – the wulst. Birds inherited a wulst from their reptile ancestors. Mammals did too, but our version is usually called the cerebral cortex and has expanded enormously
  • According to fossil and genetic evidence, vertebrates evolved around 520 million years ago. The tectum and the central control of attention probably evolved around then, during the so-called Cambrian Explosion when vertebrates were tiny wriggling creatures competing with a vast range of invertebrates in the sea.
  • The tectum is a beautiful piece of engineering. To control the head and the eyes efficiently, it constructs something called an internal model, a feature well known to engineers. An internal model is a simulation that keeps track of whatever is being controlled and allows for predictions and planning.
  • The tectum’s internal model is a set of information encoded in the complex pattern of activity of the neurons. That information simulates the current state of the eyes, head, and other major body parts, making predictions about how these body parts will move next and about the consequences of their movement
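The engineering idea of an internal model, as the excerpt describes it, can be sketched in a few lines. This is a hypothetical illustration, not the tectum's actual computation: a controller tracks one variable (say, eye angle), predicts its next state, and corrects the prediction against feedback.

```python
class InternalModel:
    """Minimal sketch of an internal model: track a controlled variable,
    predict where it will be next, and correct from observed feedback."""

    def __init__(self, position=0.0, velocity=0.0):
        self.position = position
        self.velocity = velocity

    def predict(self, dt=1.0):
        # Forward simulation: where will the eye be after dt?
        return self.position + self.velocity * dt

    def update(self, observed, dt=1.0, gain=0.5):
        # Blend the prediction with actual sensory feedback
        predicted = self.predict(dt)
        self.position = predicted + gain * (observed - predicted)
        return self.position

eye = InternalModel(position=0.0, velocity=2.0)
print(eye.predict())      # → 2.0 (predicted next position)
print(eye.update(3.0))    # → 2.5 (prediction corrected toward observation)
```

The point of the simulation is exactly what the excerpt says: it lets the controller plan and predict rather than merely react.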
  • In fish and amphibians, the tectum is the pinnacle of sophistication and the largest part of the brain. A frog has a pretty good simulation of itself.
  • All vertebrates—fish, reptiles, birds, and mammals—have a tectum. Even lampreys have one, and they appeared so early in evolution that they don’t even have a lower jaw. But as far as anyone knows, the tectum is absent from all invertebrates
  • The cortex also takes in sensory signals and coordinates movement, but it has a more flexible repertoire. Depending on context, you might look toward, look away, make a sound, do a dance, or simply store the sensory event in memory in case the information is useful for the future.
  • The most important difference between the cortex and the tectum may be the kind of attention they control. The tectum is the master of overt attention—pointing the sensory apparatus toward anything important. The cortex ups the ante with something called covert attention. You don’t need to look directly at something to covertly attend to it. Even if you’ve turned your back on an object, your cortex can still focus its processing resources on it
  • The cortex needs to control that virtual movement, and therefore like any efficient controller it needs an internal model. Unlike the tectum, which models concrete objects like the eyes and the head, the cortex must model something much more abstract. According to the AST, it does so by constructing an attention schema—a constantly updated set of information that describes what covert attention is doing moment-by-moment and what its consequences are
  • Covert attention isn’t intangible. It has a physical basis, but that physical basis lies in the microscopic details of neurons, synapses, and signals. The brain has no need to know those details. The attention schema is therefore strategically vague. It depicts covert attention in a physically incoherent way, as a non-physical essence
  • this, according to the theory, is the origin of consciousness. We say we have consciousness because deep in the brain, something quite primitive is computing that semi-magical self-description.
  • I’m reminded of Teddy Roosevelt’s famous quote, “Do what you can with what you have where you are.” Evolution is the master of that kind of opportunism. Fins become feet. Gill arches become jaws. And self-models become models of others. In the AST, the attention schema first evolved as a model of one’s own covert attention. But once the basic mechanism was in place, according to the theory, it was further adapted to model the attentional states of others, to allow for social prediction. Not only could the brain attribute consciousness to itself, it began to attribute consciousness to others.
  • In the AST’s evolutionary story, social cognition begins to ramp up shortly after the reptilian wulst evolved. Crocodiles may not be the most socially complex creatures on earth, but they live in large communities, care for their young, and can make loyal if somewhat dangerous pets.
  • If AST is correct, 300 million years of reptilian, avian, and mammalian evolution have allowed the self-model and the social model to evolve in tandem, each influencing the other. We understand other people by projecting ourselves onto them. But we also understand ourselves by considering the way other people might see us.
  • The cortical networks in the human brain that allow us to attribute consciousness to others overlap extensively with the networks that construct our own sense of consciousness.
  • Language is perhaps the most recent big leap in the evolution of consciousness. Nobody knows when human language first evolved. Certainly we had it by 70 thousand years ago when people began to disperse around the world, since all dispersed groups have a sophisticated language. The relationship between language and consciousness is often debated, but we can be sure of at least this much: once we developed language, we could talk about consciousness and compare notes
  • Maybe partly because of language and culture, humans have a hair-trigger tendency to attribute consciousness to everything around us. We attribute consciousness to characters in a story, puppets and dolls, storms, rivers, empty spaces, ghosts and gods. Justin Barrett called it the Hyperactive Agency Detection Device, or HADD
  • the HADD goes way beyond detecting predators. It’s a consequence of our hyper-social nature. Evolution turned up the amplitude on our tendency to model others and now we’re supremely attuned to each other’s mind states. It gives us our adaptive edge. The inevitable side effect is the detection of false positives, or ghosts.