Long Game: Group items tagged lesswrong


Theory of Knowledge (rationality outreach)

  • It's called Theory of Knowledge, and it's offered at 2,307 schools worldwide as part of the IB Diploma Program.
  • For the record, I'm not convinced the IB Diploma Program is a good thing. It doesn't really solve any of the problems with public schools, it shares the frustrating focus on standardized testing and password-guessing instead of real learning, etc. But I think Theory of Knowledge is a huge opportunity to spread the ideas of rationality.
  • There isn't much in the way of standards for a curriculum, and in the entire last semester we covered less content than I learn from any given top-level LessWrong post.
  • In retrospect, I think the best thing that could have been added would have been a discussion up front about how not to be confused about words. Some combo of the material in Disputing Definitions and Conceptual Analysis and Moral Theory. After that, something to undermine reliance on introspection and intuition more generally, perhaps in the context of presenting basic cognitive biases.
  • There are a lot of ways to make ToK good, and some of them don't look like LessWrong.
  • "The consensus seems to be that a class teaching the basic principles of thinking would be a huge step towards raising the sanity waterline, but that it will never happen. Well, my school has one. It's called Theory of Knowledge, and it's offered at 2,307 schools worldwide as part of the IB Diploma Program."

What Cost for Irrationality?

  • Status Quo bias is a general human tendency to prefer the default state, regardless of whether the default is actually good or not.
  • Reporting on a study of 700 mutual funds during 1998-2001, financial reporter Jason Zweig noted that "to a remarkable degree, investors underperformed their funds' reported returns - sometimes by as much as 75 percentage points per year."
  • But when group A had 100 children, each with an 80 percent chance of surviving when transplanted, and group B had 100 children, each with a 20 percent chance of surviving when transplanted, people still chose the equal allocation method even if this caused the unnecessary deaths of 30 children (the expected-survivor arithmetic is checked in the sketch after this entry).
  • It was only when the question was framed as "group A versus group B" that people suddenly felt they didn't want to abandon group B entirely.
  • End up falsely accused or imprisoned
  • Researchers have estimated that over 300 more people died in the last months of 2001 because they drove instead of flying
  • In 1997, one half of Albania's adult population had fallen victim to Ponzi schemes. In a Ponzi scheme, the investment itself isn't actually making any money; rather, early investors are paid off with the money from late investors, and eventually the system has to collapse when no new investors can be recruited.
  • By Kaj Sotala at Less Wrong on July 1, 2010
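The "unnecessary deaths of 30 children" figure above follows from simple expected values, assuming 100 organs are available for the 200 children (the assumption under which the excerpt's arithmetic works out). A minimal check in Python:

```python
# Expected survivors under the two allocation policies described in the excerpt.
# Assumes 100 organs are available for the 200 children (50/50 split vs. all to group A).
p_a, p_b = 0.80, 0.20                  # per-child survival chance in group A and group B
all_to_group_a = 100 * p_a             # all organs to group A: 80 expected survivors
equal_split = 50 * p_a + 50 * p_b      # equal allocation: 40 + 10 = 50 expected survivors

print(all_to_group_a - equal_split)    # 30.0 -- the "unnecessary deaths"
```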

What I've learned from Less Wrong

  • It’s much easier to be egalitarian and respect everyone when you can always say "Well, I suppose that might be right -- you never know!"
  • in a desire to be exceptional, I naïvely reasoned that believing similar things to other smart people would probably get me the same boring life outcomes that many of them seemed to be getting
  • it no longer makes sense to be a meme collecting, universal egalitarian the same way I was before
  • This is because induction is the only way to reliably find candidate hypotheses which deserve attention.
  • Not only is the free will problem solved, but it turns out it was easy. 
  • philosophers failing to uniformly mark this as "settled" and move on is not because this is a questionable result... they’re just in a world where most philosophers are still having trouble figuring out if god exists or not.
  • "I've been compiling a list of the top things I've learned from Less Wrong in the past few months. If you're new here or haven't been here since the beginning of this blog, perhaps my personal experience from reading the back-log of articles known as the sequences can introduce you to some of the more useful insights you might get from reading and using Less Wrong."

Your intuitions are not magic

  • As a formal system, pure math exists only inside our heads. We can try to apply it to the real world, but if we are misapplying it, nothing in the system itself will tell us that we're making a mistake.
  • When someone says "correlation", they are most commonly talking about Pearson's correlation coefficient, which seeks to gauge whether there's a linear relationship between two variables (a small numeric illustration of this limitation follows this entry).
  • A person who doesn't stop to consider the assumptions of the techniques she's using is, in effect, thinking that her techniques are magical.
  • Our brains keep track of countless pieces of information that we will not usually even think about. Few people will explicitly keep track of the number of different restaurants they've seen.
    • anonymous: This should probably read: "Our brains keep track of countless pieces of mis-remembered information"...
  • But like explicit statistical techniques, the brain makes numerous assumptions when building its models of the world.
    • anonymous: Which this alludes to. :)
  • Thus, people asked to estimate the frequency of different causes of death underestimate the frequency of those that are underreported in the media, and overestimate the ones that are overreported.
  • like the person who was naively misapplying her statistical tools, the process which generates the answers is a black box to you.
  • The science seems absurd and unintuitive; our intuitions seem firm and clear. And indeed, sometimes there's a flaw in the science, and we are right to trust our intuitions. But on other occasions, our intuitions are wrong.
  • And what is ironic is that we persist on holding onto them exactly because we do not know how they work, because we cannot see their insides and all the things inside them that could go wrong.
  • That is why we need to study the cognitive sciences, figure out the way our intuitions work and how we might correct for mistakes.
  • By Kaj Sotala at Less Wrong on June 10, 2010.
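To make the point about hidden assumptions concrete: Pearson's correlation coefficient only detects linear association, so a perfectly deterministic but non-linear relationship can score close to zero. The data and the pearson_r helper below are my own illustration, not taken from the post:

```python
# Pearson's r measures *linear* association; a clean non-linear relationship can score 0.
def pearson_r(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

xs = [-3, -2, -1, 0, 1, 2, 3]
print(pearson_r(xs, [2 * x + 1 for x in xs]))   # 1.0 -- linear, perfectly "correlated"
print(pearson_r(xs, [x ** 2 for x in xs]))      # 0.0 -- y is fully determined by x, yet r = 0
```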

What Do We Mean By "Rationality"?

  • Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory.  The art of obtaining beliefs that correspond to reality as closely as possible.  This correspondence is commonly termed "truth" or "accuracy", and we're happy to call it that.
  • First, the Bayesian formalisms in their full form are computationally intractable on most real-world problems.  No one can actually calculate and obey the math, any more than you can predict the stock market by calculating the movements of quarks.
  • we have to learn our own flaws, overcome our biases, prevent ourselves from self-deceiving, get ourselves into good emotional shape to confront the truth and do what needs doing, etcetera etcetera and so on
  • Second, sometimes the meaning of the math itself is called into question.  The exact rules of probability theory are called into question by e.g. anthropic problems in which the number of observers is uncertain. 
  • We aren't interested in probability theory because it is the holy word handed down from Laplace.  We're interested in Bayesian-style belief-updating (with Occam priors) because we expect that this style of thinking gets us systematically closer to, you know, accuracy, the map that reflects the territory. (A minimal worked update is sketched after this entry.)
  • How can you improve your conception of rationality?  Not by saying to yourself, “It is my duty to be rational.”  By this you only enshrine your mistaken conception.  Perhaps your conception of rationality is that it is rational to believe the words of the Great Teacher, and the Great Teacher says, “The sky is green,” and you look up at the sky and see blue.  If you think:  “It may look like the sky is blue, but rationality is to believe the words of the Great Teacher,” you lose a chance to discover your mistake.
  • You cannot change reality, or prove the thought, by manipulating which meanings go with which words.
  • Instrumental rationality: achieving your values.  Not necessarily "your values" in the sense of being selfish values or unshared values: "your values" means anything you care about.  The art of choosing actions that steer the future toward outcomes ranked higher in your preferences.  On LW we sometimes refer to this as "winning".
  • By Eliezer Yudkowsky at Less Wrong on March 16, 2009.
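For readers new to the term, "Bayesian-style belief-updating" just means reweighting a hypothesis by how well it predicted the observed evidence. Here is a minimal worked update in Python; the scenario and all three numbers are illustrative assumptions, not taken from the post:

```python
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
prior_h = 0.01           # prior probability of hypothesis H           (assumed)
p_e_given_h = 0.90       # probability of the evidence if H is true    (assumed)
p_e_given_not_h = 0.09   # probability of the evidence if H is false   (assumed)

p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
posterior_h = p_e_given_h * prior_h / p_e

print(round(posterior_h, 3))   # 0.092 -- strong-looking evidence moves a 1% prior only to ~9%
```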

Applied Bayes' Theorem: Reading People

  • 1. Start with the person's most striking traits, and as you gather more information see if his other traits are consistent or inconsistent. (A toy odds-form update along these lines appears after this entry.)
  • 2. Consider each characteristic in light of the circumstances, not in isolation.
  • 3. Look for extremes. The importance of a trait or characteristic may be a matter of degree.
  • 4. Identify deviations from the pattern.
  • 5. Ask yourself if what you're seeing reflects a temporary state of mind or a permanent quality.
  • 6. Distinguish between elective and nonelective traits. Some things you control; other things control you.
  • By Kaj Sotala at Less Wrong on June 30, 2010.
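The checklist above is essentially informal Bayesian updating: each observed trait shifts the odds of some hypothesis about the person. The sketch below shows those mechanics in odds form; the hypothesis ("stable trait rather than temporary state") and every likelihood ratio are invented purely for illustration:

```python
# Sequential updating in odds form: posterior odds = prior odds * product of likelihood ratios.
prior_odds = 1.0   # 1:1 -- no opinion yet on whether the behaviour reflects a stable trait (assumed)

# How much more likely each observation is under "stable trait" than "temporary state" (all invented).
likelihood_ratios = {
    "same behaviour in a second, unrelated context": 3.0,
    "friends describe the person the same way": 2.0,
    "person was under unusual stress at the time": 0.5,   # evidence pointing the other way
}

odds = prior_odds
for observation, lr in likelihood_ratios.items():
    odds *= lr
    print(f"{observation}: odds now {odds:.1f}:1")

print(f"posterior probability of a stable trait: {odds / (1 + odds):.2f}")   # 0.75 here
```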

Einstein's Arrogance

  • To assign more than 50% probability to the correct candidate from a pool of 100,000,000 possible hypotheses, you need at least 27 bits of evidence (or thereabouts); see the quick check after this entry.
  • The Traditional phrasing implies that you start out with a hunch, or some private line of reasoning that leads you to a suggested hypothesis, and then you have to gather "evidence" to confirm it - to convince the scientific community, or justify saying that you believe in your hunch.
  • But from a Bayesian perspective, you need an amount of evidence roughly equivalent to the complexity of the hypothesis just to locate the hypothesis in theory-space.
  • "In 1919, Sir Arthur Eddington led expeditions to Brazil and to the island of Principe, aiming to observe solar eclipses and thereby test an experimental prediction of Einstein's novel theory of General Relativity. A journalist asked Einstein what he would do if Eddington's observations failed to match his theory. Einstein famously replied: 'Then I would feel sorry for the good Lord. The theory is correct.'" By Eliezer Yudkowsky at Less Wrong on September 25, 2007.

The Psychological Diversity of Mankind

  • By Kaj_Sotala at Less Wrong on May 9, 2010. Cochran and Harpending's basic thesis is that the notion of a psychological unity is most likely false. Different human populations are likely for biological reasons to have slightly different minds, shaped by selection pressures in the specific regions the populations happened to live in. Hat tip to Robin Hanson at Overcoming Bias.