TOK Friends: Group items tagged Galileo

caelengrubb

8 Things You May Not Know About Galileo - HISTORY - 0 views

  • When he was 16, Galileo enrolled at the University of Pisa to study medicine, at his father’s urging. Instead, he became interested in mathematics and shifted his focus to that subject.
  • Galileo left the school in 1585 without earning a degree.
  • Galileo didn’t invent the telescope—Dutch eyeglass maker Hans Lippershey is generally credited with its creation—but he was the first person to use the optical instrument to systematically study the heavens.
  • In 1609, Galileo learned about the device and developed one of his own, significantly improving its design. That fall, he pointed it at the moon and discovered it had craters and mountains, debunking the common belief that the moon’s surface was smooth.
  • Galileo soon went on to make other findings with his telescope, including that there were four moons orbiting Jupiter and that Venus went through a complete set of phases (indicating the planet traveled around the sun).
  • Galileo had three children with a woman named Marina Gamba, whom he never married. In 1613, he placed his two daughters, Virginia, born in 1600, and Livia, born in 1601, in a convent near Florence, where they remained for the rest of their lives, despite their father’s eventual troubles with the Catholic Church.
  • Copernicus’ heliocentric theory challenged the widely accepted belief, espoused by the astronomer Ptolemy in the second century, that the Earth sat at the center of the solar system.
  • Galileo received permission from the Church to continue investigating Copernicus’ ideas, as long as he didn’t hold or defend them.
  • As a result, the following year Galileo was ordered to stand trial before the Inquisition in Rome
  • After being found guilty of heresy, Galileo was forced to publicly repent and sentenced to life in prison.
  • Although Galileo was given life behind bars, his sentence soon was changed to house arrest. He lived out his final years at Villa Il Gioiello (“the Jewel”), his home in the town of Arcetri, near Florence
  • In 1979, Pope John Paul II initiated an investigation into the Catholic Church’s condemnation of Galileo.
  • Thirteen years later, and 359 years after Galileo was tried by the Inquisition, the pope officially closed the investigation and issued a formal apology in the case, acknowledging that errors were made by the judges during the trial.
caelengrubb

Copernicus, Galileo, and the Church: Science in a Religious World - Inquiries Journal - 0 views

  • During most of the 16th and 17th centuries, fear of heretics spreading teachings and opinions that contradicted the Bible dominated the Catholic Church
  • A kind of war between science and religion was underway, and there would be more casualties on the side of science.
  • Nicholas Copernicus and Galileo Galilei were two scientists who published books that were later banned.
  • Copernicus faced no persecution when he was alive because he died shortly after publishing his book. Galileo, on the other hand, was tried by the Inquisition after his book was published
  • As the contents of the Bible were taken literally, the publishing of these books proved, to the Church, that Copernicus and Galileo were sinners; they preached, through their writing, that the Bible was wrong.
  • The Master of the Sacred Palace ordered Galileo to submit the manuscript to a reviewer of the Master’s choosing, to ensure it was fit for publication.
  • After his death, the Church was heavily occupied with the Council of Trent (1545 to 1563) and other matters. Thus, Revolutions escaped prohibition for many years and eventually influenced Galileo Galilei, who read it and wrote on the subject himself.
  • In 1616, Galileo was issued an injunction not to “hold, defend, or teach” heliocentrism
  • By writing in this fashion, Copernicus could deny that he himself believed in heliocentrism: he had phrased it as nothing more than a hypothesis, and as a result it could slip past the Church's dislike of heliocentrism.
  • Also, a title referring to the sea (Galileo had framed the book around his tidal argument) might have made the Church suspect that Galileo was supporting heliocentrism, which would have resulted in Galileo being charged with heresy.
  • With that decision, it was determined that Galileo would be tried by the Inquisition. The Inquisition did not need to decide if Galileo was innocent or guilty; they already knew he was guilty. What the Inquisition wanted to determine was Galileo's intentions. Galileo tried to delay going to Rome for the trial, most likely due to the Inquisition's infamous methods.
caelengrubb

How Galileo Changed Your Life - Biography - 0 views

  • Galileo’s contributions to the fields of astronomy, physics, mathematics, and philosophy have led many to call him the father of modern science.
  • But his controversial theories, which impacted how we see and understand the solar system and our place within it, led to serious conflict with the Catholic Church and the long-time suppression of his achievements
  • Galileo developed one of the first telescopes: Galileo didn’t invent the telescope (it was invented by Dutch eyeglass makers), but he made significant improvements to it.
  • His innovations brought him both professional and financial success. He was given a lifetime tenure position at the University of Padua, where he had been teaching for several years, at double his salary.
  • And he received a contract to produce his telescopes for a group of Venetian merchants, eager to use them as a navigational tool.
  • He helped create modern astronomy: Galileo turned his new, high-powered telescope to the sky. In early 1610, he made the first in a remarkable series of discoveries.
  • While the scientific doctrine of the day held that the heavens were a perfect, unchanging environment created by God, Galileo’s telescope helped change that view.
  • His studies and drawings showed the Moon had a rough, uneven surface that was pockmarked in some places, and was actually an imperfect sphere
  • He was also one of the first people to observe the phenomena known as sunspots, thanks to his telescope which allowed him to view the sun for extended periods of time without damaging the eye.
  • Galileo helped prove that the Earth revolved around the sun: In 1610, Galileo published his new findings in the book Sidereus Nuncius, or Starry Messenger, which was an instant success.
  • He became close with a number of other leading scientists, including Johannes Kepler. The work of Kepler, a German astronomer and mathematician, helped lay the foundations for the later discoveries of Isaac Newton and others.
  • Kepler’s experiments had led him to support the idea that the planets, Earth included, revolved around the sun. This heliocentric theory, as well as the idea of Earth’s daily rotational turning, had been developed by Polish astronomer Nicolaus Copernicus half a century earlier
  • Their belief that the Sun, and not the Earth, was the gravitational center of the universe upended almost 2,000 years of scientific thinking, dating back to theories about the fixed, unchanging universe put forth by the Greek philosopher and scientist Aristotle.
  • Galileo had been testing Aristotle’s theories for years, including an experiment in the late 16th century in which he dropped two items of different masses from the Leaning Tower of Pisa, disproving Aristotle’s belief that objects would fall at differing speeds based on their weight (Newton later improved upon this work).
  • Galileo paid a high price for his contributions: Challenging the Aristotelian or Ptolemaic theories about the Earth’s role in the universe was dangerous stuff.
  • Geocentrism was, in part, a theoretical underpinning of the Roman Catholic Church. Galileo’s work brought him to the attention of Church authorities, and in 1615, he was called before the Roman Inquisition, accused of heresy for beliefs which contradicted Catholic scripture.
  • The following year, the Church banned all works that supported Copernicus’ theories and forbade Galileo from publicly discussing his works.
  • In 1632, after the election of a new pope whom he considered more liberal, he published another book, Dialogue on the Two Chief World Systems, Ptolemaic and Copernican, which argued both sides of the scientific (and religious) debate but fell squarely on the side of Copernicus’ heliocentrism.
  • Galileo was once again summoned to Rome. In 1633, following a trial, he was found guilty of suspected heresy, forced to recant his views and sentenced to house arrest until his death in 1642.
  • It took nearly 200 years after Galileo’s death for the Catholic Church to drop its opposition to heliocentrism.
  • In 1992, after a decade-long process and 359 years after his heresy conviction, Pope John Paul II formally expressed the Church’s regret over Galileo’s treatment.
caelengrubb

February 2016: 400 Years Ago the Catholic Church Prohibited Copernicanism | Origins: Cu... - 0 views

  • In February-March 1616, the Catholic Church issued a prohibition against the Copernican theory of the earth’s motion.
  • This led later (1633) to the Inquisition trial and condemnation of Galileo Galilei (1564-1642) as a suspected heretic, which generated a controversy that continues to our day.
  • In 1543, Polish astronomer Nicolaus Copernicus (1473-1543) published On the Revolutions of the Heavenly Spheres. This book elaborated the (geokinetic and heliocentric) idea that the earth rotates daily on its own axis and revolves yearly around the sun
  • Since antiquity, this idea had been considered but rejected in favor of the traditional (geostatic and geocentric) thesis that the earth stands still at the center of the universe.
  • The objections to the geokinetic and heliocentric idea involved astronomical observations, the physics of motion, biblical passages, and epistemological principles (e.g., the reliability of human senses, which reveal a stationary earth)
  • The Inquisition launched an investigation. Galileo’s writings were evaluated and other witnesses interrogated. The charges against Galileo were unsubstantiated. However, the officials started worrying about the status of heliocentrism and consulted a committee of experts.
  • These discoveries did not conclusively prove Copernicanism, but provided new evidence in its favor and refutations of some old objections.
  • Galileo became more explicit in his pursuit of heliocentrism, and this soon got him into trouble.
  • In February-March 1615, one Dominican friar filed a written complaint against him, and another one testified in person in front of the Roman Inquisition. They accused Galileo of heresy, for believing in the earth’s motion, which contradicted Scripture, e.g., the miracle in Joshua 10:12-13.
  • Copernicus did not really refute these objections, but he elaborated a novel and important astronomical argument. Thus, Copernicanism attracted few followers. At first, Galileo himself was not one of them, although he was interested because his new physics enabled him to answer the mechanical objections.
  • On February 24, 1616, the consultants unanimously reported the assessment that heliocentrism was philosophically (i.e., scientifically) false and theologically heretical or at least erroneous.
  • The following day, the Inquisition, presided over by Pope Paul V, considered the case. Although it did not endorse the heresy recommendation, it accepted the judgments of scientific falsity and theological error, and decided to prohibit the theory.
  • the Church was going to declare the idea of the earth’s motion false and contrary to Scripture, and so this theory could not be held or defended. Galileo agreed to comply.
  • Without mentioning Galileo, it publicly declared the earth’s motion false and contrary to Scripture. It prohibited the reading of Copernicus’s Revolutions, and banned a book published in 1615 by Paolo Antonio Foscarini; he had argued that the earth’s motion was probably true, and certainly compatible with Scripture.
  • The 1616 condemnation of Copernicanism was bad enough for the relationship between science and religion, but the problems were compounded by Galileo’s trial 17 years later.
  • Galileo kept quiet until 1623, when a new pope was elected, Urban VIII, who was a great admirer of Galileo.
  • The Inquisition summoned him to Rome, and the trial proceedings lasted from April to June 1633. He was found guilty of suspected heresy, for defending the earth’s motion, and thus denying the authority of Scripture.
  • “Suspected heresy” was not as serious a religious crime as “formal heresy,” and so his punishment was not death by being burned at the stake, but rather house arrest and the banning of the Dialogue.
  • The Church’s condemnation of Copernicanism and Galileo became the iconic illustration of the problematic relationship between science and religion.
  • This controversy will probably not end any time soon. This may be seen from Pope Francis’s 2015 encyclical Laudato Si’, with its focus on climate change. Whatever its merits, it could be criticized for having failed to learn, from the Galileo affair, the lesson that the Church should be wary of interfering in scientific matters.
caelengrubb

How Galileo Galilei's discoveries helped create modern science - 0 views

  • Few people in history can claim as large a contribution to how we conduct and think about science as Galileo. His work revolutionized our entire outlook on what it means to study nature (and got him in some very hot water with the Roman Inquisition)
  • He is perhaps best known for his championing of Copernicus’ heliocentric model (the one that says the Earth and other planets orbit the Sun), but that is by no means the full extent of his legacy. Far, far from it.
  • Galileo earned himself a place among the stars as Europe’s global navigation satellite system bears his name
  • Galileo is certainly among the titans of science — in many ways, he’s one of its ‘founders’. His legacy includes contributions to the fields of physics, astronomy, math, engineering, and the application of the scientific method
  • He was an accomplished mathematician and inventor, designing (among other things) several military compasses and the thermoscope. He was also the one to pick up the torch of modern astronomy from Copernicus, cementing the foundations of this field of study by proving Copernicus’ theories right.
  • Showing others what science can do, and how one should go about it, is Galileo’s most important achievement. Its effects still ripple through the lives of every researcher to this day
  • Since the days of Aristotle, scholars in Europe believed that heavier objects fall faster than lighter ones. Galileo showed that this wasn’t the case, using balls of the same materials but different weights and sizes. In one of his most famous experiments, he dropped two such balls from the top of the Leaning Tower of Pisa to show that objects of different weights accelerate just as fast towards the ground (air resistance notwithstanding).
  • The truth is that Galileo’s experiments in this area used a more reliable but less flashy apparatus: inclined planes down which he rolled balls.
  • His interest in motion and falling objects was tightly linked to his interest in planets, stars, and the solar system.
  • Apart from his theoretical pursuits, Galileo was also an accomplished engineer — meaning he could also turn his knowledge to the solving of practical problems. Most of these, historical accounts tell us, were attempts by Galileo to earn a little bit of extra cash in order to support his extended family after his father passed away.
  • Among his creations are a set of military compasses (sectors) that were simple enough for artillery crews and surveyors to use.
  • He was also an early builder and user of telescopes and microscopes. Galileo was among the first, in 1609, to use a refracting telescope as an instrument to observe heavenly bodies.
  • His fascination with celestial bodies and defense of the heliocentric model is what eventually led to the Inquisition cracking down on him and his works.
Javier E

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 0 views

  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • neuroscience for the last couple hundred years has been on the wrong track. There's a fairly recent book by a very good cognitive neuroscientist, Randy Gallistel and King, arguing -- in my view, plausibly -- that neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
  • in general what he argues is that if you take a look at animal cognition, human too, it's computational systems. Therefore, you want to look at the units of computation. Think about a Turing machine, say, which is the simplest form of computation: you have to find units that have properties like "read", "write" and "address." That's the minimal computational unit, so you've got to look in the brain for those. You're never going to find them if you look for strengthening of synaptic connections or field properties, and so on. You've got to start by looking for what's there and what's working, and you see that from Marr's highest level. (A minimal Turing-machine sketch appears after this list.)
  • it's basically in the spirit of Marr's analysis. So when you're studying vision, he argues, you first ask what kind of computational tasks is the visual system carrying out. And then you look for an algorithm that might carry out those computations and finally you search for mechanisms of the kind that would make the algorithm work. Otherwise, you may never find anything.
  • "Good Old Fashioned AI," as it's labeled now, made strong use of formalisms in the tradition of Gottlob Frege and Bertrand Russell, mathematical logic for example, or derivatives of it, like nonmonotonic reasoning and so on. It's interesting from a history of science perspective that even very recently, these approaches have been almost wiped out from the mainstream and have been largely replaced -- in the field that calls itself AI now -- by probabilistic and statistical models. My question is, what do you think explains that shift and is it a step in the right direction?
  • AI and robotics got to the point where you could actually do things that were useful, so it turned to the practical applications and somewhat, maybe not abandoned, but put to the side, the more fundamental scientific questions, just caught up in the success of the technology and achieving specific goals.
  • The approximating-unanalyzed-data kind is sort of a new approach, not totally, there's things like it in the past. It's basically a new approach that has been accelerated by the existence of massive memories, very rapid processing, which enables you to do things like this that you couldn't have done by hand. But I think, myself, that it is leading subjects like computational cognitive science into a direction of maybe some practical applicability... [Interviewer: ...in engineering?] Chomsky: ...But away from understanding.
  • I was very skeptical about the original work. I thought it was first of all way too optimistic, it was assuming you could achieve things that required real understanding of systems that were barely understood, and you just can't get to that understanding by throwing a complicated machine at it.
  • if success is defined as getting a fair approximation to a mass of chaotic unanalyzed data, then it's way better to do it this way than to do it the way the physicists do, you know, no thought experiments about frictionless planes and so on and so forth. But you won't get the kind of understanding that the sciences have always been aimed at -- what you'll get at is an approximation to what's happening.
  • Suppose you want to predict tomorrow's weather. One way to do it is okay I'll get my statistical priors, if you like, there's a high probability that tomorrow's weather here will be the same as it was yesterday in Cleveland, so I'll stick that in, and where the sun is will have some effect, so I'll stick that in, and you get a bunch of assumptions like that, you run the experiment, you look at it over and over again, you correct it by Bayesian methods, you get better priors. You get a pretty good approximation of what tomorrow's weather is going to be. That's not what meteorologists do -- they want to understand how it's working. And these are just two different concepts of what success means, of what achievement is.
  • if you get more and more data, and better and better statistics, you can get a better and better approximation to some immense corpus of text, like everything in The Wall Street Journal archives -- but you learn nothing about the language. (A toy bigram model after this list illustrates the point.)
  • the right approach, is to try to see if you can understand what the fundamental principles are that deal with the core properties, and recognize that in the actual usage, there's going to be a thousand other variables intervening -- kind of like what's happening outside the window, and you'll sort of tack those on later on if you want better approximations, that's a different approach.
  • take a concrete example of a new field in neuroscience, called Connectomics, where the goal is to find the wiring diagram of very complex organisms, find the connectivity of all the neurons in say human cerebral cortex, or mouse cortex. This approach was criticized by Sidney Brenner, who in many ways is [historically] one of the originators of the approach. Advocates of this field don't stop to ask if the wiring diagram is the right level of abstraction -- maybe it's not.
  • if you went to MIT in the 1960s, or now, it's completely different. No matter what engineering field you're in, you learn the same basic science and mathematics. And then maybe you learn a little bit about how to apply it. But that's a very different approach. And it resulted maybe from the fact that really for the first time in history, the basic sciences, like physics, had something really to tell engineers. And besides, technologies began to change very fast, so not very much point in learning the technologies of today if it's going to be different 10 years from now. So you have to learn the fundamental science that's going to be applicable to whatever comes along next. And the same thing pretty much happened in medicine.
  • that's the kind of transition from something like an art, that you learn how to practice -- an analog would be trying to match some data that you don't understand, in some fashion, maybe building something that will work -- to science, what happened in the modern period, roughly Galilean science.
  • it turns out that there actually are neural circuits which are reacting to particular kinds of rhythm, which happen to show up in language, like syllable length and so on. And there's some evidence that that's one of the first things that the infant brain is seeking -- rhythmic structures. And going back to Gallistel and Marr, it's got some computational system inside which is saying "okay, here's what I do with these things" and say, by nine months, the typical infant has rejected -- eliminated from its repertoire -- the phonetic distinctions that aren't used in its own language.
  • people like Shimon Ullman discovered some pretty remarkable things like the rigidity principle. You're not going to find that by statistical analysis of data. But he did find it by carefully designed experiments. Then you look for the neurophysiology, and see if you can find something there that carries out these computations. I think it's the same in language, the same in studying our arithmetical capacity, planning, almost anything you look at. Just trying to deal with the unanalyzed chaotic data is unlikely to get you anywhere, just like as it wouldn't have gotten Galileo anywhere.
  • with regard to cognitive science, we're kind of pre-Galilean, just beginning to open up the subject
  • You can invent a world -- I don't think it's our world -- but you can invent a world in which nothing happens except random changes in objects and selection on the basis of external forces. I don't think that's the way our world works, I don't think it's the way any biologist thinks it is. There are all kinds of ways in which natural law imposes channels within which selection can take place, and some things can happen and other things don't happen. Plenty of things that go on in the biology in organisms aren't like this. So take the first step, meiosis. Why do cells split into spheres and not cubes? It's not random mutation and natural selection; it's a law of physics. There's no reason to think that laws of physics stop there, they work all the way through. [Interviewer: Well, they constrain the biology, sure.] Chomsky: Okay, well then it's not just random mutation and selection. It's random mutation, selection, and everything that matters, like laws of physics.
  • What I think is valuable is the history of science. I think we learn a lot of things from the history of science that can be very valuable to the emerging sciences. Particularly when we realize that in say, the emerging cognitive sciences, we really are in a kind of pre-Galilean stage. We don't know what we're looking for any more than Galileo did, and there's a lot to learn from that.
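
The "read/write/address" vocabulary above is concrete enough to code. Below is a minimal sketch, in Python, of the kind of unit Chomsky means: a machine that only reads a symbol, writes a symbol, and moves its head. The binary-increment rule table is a hypothetical example, not anything from the interview.

```python
# A minimal Turing machine: the whole computation reduces to read, write,
# and address (head movement). This particular rule table increments a
# binary number by one.

def run_turing_machine(tape, rules, state="start"):
    """rules maps (state, symbol) -> (new_state, new_symbol, move)."""
    tape, pos = list(tape), len(tape) - 1                 # head at rightmost cell
    while state != "halt":
        symbol = tape[pos] if 0 <= pos < len(tape) else "_"   # read
        state, new_symbol, move = rules[(state, symbol)]
        if pos < 0:                                       # grow the tape on demand
            tape.insert(0, "_")
            pos = 0
        tape[pos] = new_symbol                            # write
        pos += 1 if move == "R" else -1                   # address: move the head
    return "".join(tape).strip("_")

# Binary increment: flip trailing 1s to 0 until a 0 (or blank) becomes 1.
rules = {
    ("start", "1"): ("start", "0", "L"),
    ("start", "0"): ("halt", "1", "L"),
    ("start", "_"): ("halt", "1", "L"),
}

print(run_turing_machine("1011", rules))  # prints 1100 (11 + 1 = 12)
```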
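And a toy version of the statistical-approximation point: a bigram model fitted to a corpus gets better and better at mimicking its word sequences while learning nothing about syntax. A minimal sketch; the tiny corpus is made up for illustration.

```python
# A bigram language model: count which word follows which, then sample.
# The output approximates the corpus statistically, without any grammar.
import random
from collections import Counter, defaultdict

corpus = "the dog saw the cat . the cat saw the dog .".split()

bigrams = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[w1][w2] += 1                  # how often w2 follows w1

def generate(start="the", n=8):
    out, word = [start], start
    for _ in range(n):
        followers = bigrams[word]
        word = random.choices(list(followers), weights=followers.values())[0]
        out.append(word)
    return " ".join(out)

print(generate())  # fluent-looking word salad: approximation, not understanding
```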
catbclark

Why Do Many Reasonable People Doubt Science? - National Geographic Magazine - 0 views

  • Actually fluoride is a natural mineral that, in the weak concentrations used in public drinking water systems, hardens tooth enamel and prevents tooth decay—a cheap and safe way to improve dental health for everyone, rich or poor, conscientious brusher or not. That’s the scientific and medical consensus.
  • when Galileo claimed that the Earth spins on its axis and orbits the sun, he wasn’t just rejecting church doctrine. He was asking people to believe something that defied common sense
  • all manner of scientific knowledge—from the safety of fluoride and vaccines to the reality of climate change—faces organized and often furious opposition.
  • Empowered by their own sources of information and their own interpretations of research, doubters have declared war on the consensus of experts.
  • Our lives are permeated by science and technology as never before. For many of us this new world is wondrous, comfortable, and rich in rewards—but also more complicated and sometimes unnerving. We now face risks we can’t easily analyze.
  • The world crackles with real and imaginary hazards, and distinguishing the former from the latter isn’t easy.
  • In this bewildering world we have to decide what to believe and how to act on that. In principle that’s what science is for.
  • “Science is not a body of facts,” says geophysicist Marcia McNutt,
  • “Science is a method for deciding whether what we choose to believe has a basis in the laws of nature or not.”
  • The scientific method leads us to truths that are less than self-evident, often mind-blowing, and sometimes hard to swallow.
  • We don’t believe you.
  • Galileo was put on trial and forced to recant. Two centuries later Charles Darwin escaped that fate. But his idea that all life on Earth evolved from a primordial ancestor and that we humans are distant cousins of apes, whales, and even deep-sea mollusks is still a big ask for a lot of people. So is another 19th-century notion: that carbon dioxide, an invisible gas that we all exhale all the time and that makes up less than a tenth of one percent of the atmosphere, could be affecting Earth’s climate.
  • we intellectually accept these precepts of science, we subconsciously cling to our intuitions
  • Shtulman’s research indicates that as we become scientifically literate, we repress our naive beliefs but never eliminate them entirely. They lurk in our brains, chirping at us as we try to make sense of the world.
  • Most of us do that by relying on personal experience and anecdotes, on stories rather than statistics.
  • We have trouble digesting randomness; our brains crave pattern and meaning.
  • we can deceive ourselves.
  • Even for scientists, the scientific method is a hard discipline. Like the rest of us, they’re vulnerable to what they call confirmation bias—the tendency to look for and see only evidence that confirms what they already believe. But unlike the rest of us, they submit their ideas to formal peer review before publishing them
  • other scientists will try to reproduce them
  • Scientific results are always provisional, susceptible to being overturned by some future experiment or observation. Scientists rarely proclaim an absolute truth or absolute certainty. Uncertainty is inevitable at the frontiers of knowledge.
  • Many people in the United States—a far greater percentage than in other countries—retain doubts about that consensus or believe that climate activists are using the threat of global warming to attack the free market and industrial society generally.
  • news media give abundant attention to such mavericks, naysayers, professional controversialists, and table thumpers. The media would also have you believe that science is full of shocking discoveries made by lone geniuses
  • science tells us the truth rather than what we’d like the truth to be. Scientists can be as dogmatic as anyone else—but their dogma is always wilting in the hot glare of new research.
  • But industry PR, however misleading, isn’t enough to explain why only 40 percent of Americans, according to the most recent poll from the Pew Research Center, accept that human activity is the dominant cause of global warming.
  • “science communication problem,”
  • yielded abundant new research into how people decide what to believe—and why they so often don’t accept the scientific consensus.
  • higher literacy was associated with stronger views—at both ends of the spectrum. Science literacy promoted polarization on climate, not consensus. According to Kahan, that’s because people tend to use scientific knowledge to reinforce beliefs that have already been shaped by their worldview.
  • “egalitarian” and “communitarian” mind-set are generally suspicious of industry and apt to think it’s up to something dangerous that calls for government regulation; they’re likely to see the risks of climate change.
  • “hierarchical” and “individualistic” mind-set respect leaders of industry and don’t like government interfering in their affairs; they’re apt to reject warnings about climate change, because they know what accepting them could lead to—some kind of tax or regulation to limit emissions.
  • For a hierarchical individualist, Kahan says, it’s not irrational to reject established climate science: Accepting it wouldn’t change the world, but it might get him thrown out of his tribe.
  • Science appeals to our rational brain, but our beliefs are motivated largely by emotion, and the biggest motivation is remaining tight with our peers.
  • organizations funded in part by the fossil fuel industry have deliberately tried to undermine the public’s understanding of the scientific consensus by promoting a few skeptics.
  • Internet makes it easier than ever for climate skeptics and doubters of all kinds to find their own information and experts
  • Internet has democratized information, which is a good thing. But along with cable TV, it has made it possible to live in a “filter bubble” that lets in only the information with which you already agree.
  • How to convert climate skeptics? Throwing more facts at them doesn’t help.
  • people need to hear from believers they can trust, who share their fundamental values.
  • We believe in scientific ideas not because we have truly evaluated all the evidence but because we feel an affinity for the scientific community.
  • “Believing in evolution is just a description about you. It’s not an account of how you reason.”
  • evolution actually happened. Biology is incomprehensible without it. There aren’t really two sides to all these issues. Climate change is happening. Vaccines really do save lives. Being right does matter—and the science tribe has a long track record of getting things right in the end. Modern society is built on things it got right.
  • Doubting science also has consequences.
  • In the climate debate the consequences of doubt are likely global and enduring. In the U.S., climate change skeptics have achieved their fundamental goal of halting legislative action to combat global warming.
  • “That line between science communication and advocacy is very hard to step back from,”
  • It’s their very detachment, what you might call the cold-bloodedness of science, that makes science the killer app.
  • that need to fit in is so strong that local values and local opinions are always trumping science.
  • not a sin to change your mind when the evidence demands it.
  • for the best scientists, the truth is more important than the tribe.
  • Students come away thinking of science as a collection of facts, not a method.
  • Shtulman’s research has shown that even many college students don’t really understand what evidence is.
  • “Everybody should be questioning,” says McNutt. “That’s a hallmark of a scientist. But then they should use the scientific method, or trust people using the scientific method, to decide which way they fall on those questions.”
  • science has made us the dominant organisms,
  • incredibly rapid change, and it’s scary sometimes. It’s not all progress.
  • But the notion of a vaccine-autism connection has been endorsed by celebrities and reinforced through the usual Internet filters. (Anti-vaccine activist and actress Jenny McCarthy famously said on the Oprah Winfrey Show, “The University of Google is where I got my degree from.”)
    • catbclark
       
      Power of celebrities, the internet as a source
  • The scientific method doesn’t come naturally—but if you think about it, neither does democracy. For most of human history neither existed. We went around killing each other to get on a throne, praying to a rain god, and for better and much worse, doing things pretty much as our ancestors did.
  • We need to get a lot better at finding answers, because it’s certain the questions won’t be getting any simpler.
  • That the Earth is round has been known since antiquity—Columbus knew he wouldn’t sail off the edge of the world—but alternative geographies persisted even after circumnavigations had become common
  • We live in an age when all manner of scientific knowledge—from climate change to vaccinations—faces furious opposition. Some even have doubts about the moon landing.
  • Why Do Many Reasonable People Doubt Science?
  • science doubt itself has become a pop-culture meme.
  • Flat-Earthers held that the planet was centered on the North Pole and bounded by a wall of ice, with the sun, moon, and planets a few hundred miles above the surface. Science often demands that we discount our direct sensory experiences—such as seeing the sun cross the sky as if circling the Earth—in favor of theories that challenge our beliefs about our place in the universe.
  • Yet just because two things happened together doesn’t mean one caused the other, and just because events are clustered doesn’t mean they’re not still random.
  • Sometimes scientists fall short of the ideals of the scientific method. Especially in biomedical research, there’s a disturbing trend toward results that can’t be reproduced outside the lab that found them, a trend that has prompted a push for greater transparency about how experiments are conducted
  • “Science will find the truth,” Collins says. “It may get it wrong the first time and maybe the second time, but ultimately it will find the truth.” That provisional quality of science is another thing a lot of people have trouble with.
  • scientists love to debunk one another
  • they will continue to trump science, especially when there is no clear downside to ignoring science.”
Javier E

Is Science Kind of a Scam? - The New Yorker - 1 views

  • No well-tested scientific concept is more astonishing than the one that gives its name to a new book by the Scientific American contributing editor George Musser, “Spooky Action at a Distance.”
  • The ostensible subject is the mechanics of quantum entanglement; the actual subject is the entanglement of its observers.
  • his question isn’t so much how this weird thing can be true as why, given that this weird thing had been known about for so long, so many scientists were so reluctant to confront it. What keeps a scientific truth from spreading?
  • it is as if two magic coins, flipped at different corners of the cosmos, always came up heads or tails together. (The spooky action takes place only in the context of simultaneous measurement. The particles share states, but they don’t send signals.) (A toy classical simulation of the coin analogy appears after this list.)
  • fashion, temperament, zeitgeist, and sheer tenacity affected the debate, along with evidence and argument.
  • The certainty that spooky action at a distance takes place, Musser says, challenges the very notion of “locality,” our intuitive sense that some stuff happens only here, and some stuff over there. What’s happening isn’t really spooky action at a distance; it’s spooky distance, revealed through an action.
  • Why, then, did Einstein’s question get excluded for so long from reputable theoretical physics? The reasons, unfolding through generations of physicists, have several notable social aspects,
  • What started out as a reductio ad absurdum became proof that the cosmos is in certain ways absurd. What began as a bug became a feature and is now a fact.
  • “If poetry is emotion recollected in tranquility, then science is tranquility recollected in emotion.” The seemingly neutral order of the natural world becomes the sounding board for every passionate feeling the physicist possesses.
  • Musser explains that the big issue was settled mainly by being pushed aside. Generational imperatives trumped evidentiary ones. The things that made Einstein the lovable genius of popular imagination were also the things that made him an easy object of condescension. The hot younger theorists patronized him,
  • There was never a decisive debate, never a hallowed crucial experiment, never even a winning argument to settle the case, with one physicist admitting, “Most physicists (including me) accept that Bohr won the debate, although like most physicists I am hard pressed to put into words just how it was done.”
  • Arguing about non-locality went out of fashion, in this account, almost the way “Rock Around the Clock” displaced Sinatra from the top of the charts.
  • The same pattern of avoidance and talking-past and taking on the temper of the times turns up in the contemporary science that has returned to the possibility of non-locality.
  • the revival of “non-locality” as a topic in physics may be due to our finding the metaphor of non-locality ever more palatable: “Modern communications technology may not technically be non-local but it sure feels that it is.”
  • Living among distant connections, where what happens in Bangalore happens in Boston, we are more receptive to the idea of such a strange order in the universe.
  • The “indeterminacy” of the atom was, for younger European physicists, “a lesson of modernity, an antidote to a misplaced Enlightenment trust in reason, which German intellectuals in the 1920’s widely held responsible for their country’s defeat in the First World War.” The tonal and temperamental difference between the scientists was as great as the evidence they called on.
  • Science isn’t a slot machine, where you drop in facts and get out truths. But it is a special kind of social activity, one where lots of different human traits—obstinacy, curiosity, resentment of authority, sheer cussedness, and a grudging readiness to submit pet notions to popular scrutiny—end by producing reliable knowledge
  • What was magic became mathematical and then mundane. “Magical” explanations, like spooky action, are constantly being revived and rebuffed, until, at last, they are reinterpreted and accepted. Instead of a neat line between science and magic, then, we see a jumpy, shifting boundary that keeps getting redrawn
  • Real-world demarcations between science and magic, Musser’s story suggests, are like Bugs’s: made on the move and as much a trap as a teaching aid.
  • In the past several decades, certainly, the old lines between the history of astrology and astronomy, and between alchemy and chemistry, have been blurred; historians of the scientific revolution no longer insist on a clean break between science and earlier forms of magic.
  • Where once logical criteria between science and non-science (or pseudo-science) were sought and taken seriously—Karl Popper’s criterion of “falsifiability” was perhaps the most famous, insisting that a sound theory could, in principle, be proved wrong by one test or another—many historians and philosophers of science have come to think that this is a naïve view of how the scientific enterprise actually works.
  • They see a muddle of coercion, old magical ideas, occasional experiment, hushed-up failures—all coming together in a social practice that gets results but rarely follows a definable logic.
  • Yet the old notion of a scientific revolution that was really a revolution is regaining some credibility.
  • David Wootton, in his new, encyclopedic history, “The Invention of Science” (Harper), recognizes the blurred lines between magic and science but insists that the revolution lay in the public nature of the new approach.
  • What killed alchemy was the insistence that experiments must be openly reported in publications which presented a clear account of what had happened, and they must then be replicated, preferably before independent witnesses.
  • Wootton, while making little of Popper’s criterion of falsifiability, makes it up to him by borrowing a criterion from his political philosophy. Scientific societies are open societies. One day the lunar tides are occult, the next day they are science, and what changes is the way in which we choose to talk about them.
  • Wootton also insists, against the grain of contemporary academia, that single observed facts, what he calls “killer facts,” really did polish off antique authorities
  • once we agree that the facts are facts, they can do amazing work. Traditional Ptolemaic astronomy, in place for more than a millennium, was destroyed by what Galileo discovered about the phases of Venus. That killer fact “serves as a single, solid, and strong argument to establish its revolution around the Sun, such that no room whatsoever remains for doubt,” Galileo wrote, and Wootton adds, “No one was so foolish as to dispute these claims.”
  • Several things flow from Wootton’s view. One is that “group think” in the sciences is often true think. Science has always been made in a cloud of social networks.
  • There has been much talk in the pop-sci world of “memes”—ideas that somehow manage to replicate themselves in our heads. But perhaps the real memes are not ideas or tunes or artifacts but ways of making them—habits of mind rather than products of mind
  • Is science, then, a club like any other, with fetishes and fashions, with schemers, dreamers, and blackballed applicants? Is there a real demarcation to be made between science and every other kind of social activity?
  • The claim that basic research is valuable because it leads to applied technology may be true but perhaps is not at the heart of the social use of the enterprise. The way scientists do think makes us aware of how we can think
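
The magic-coin analogy invites a few lines of code. Below is a toy classical sketch: two parties share a hidden state fixed when the pair is created, so simultaneous measurements always agree with no signal sent between them. (Real entanglement is stranger; shared classical randomness like this reproduces the perfect agreement but not the Bell-test correlations.)

```python
# Classical "magic coins": correlation from a shared state, not from signaling.
import random

def make_coin_pair():
    shared = random.choice(["heads", "tails"])  # state fixed at the source
    return shared, shared                       # each coin carries it away

alice, bob = make_coin_pair()
print(alice, bob, alice == bob)  # always equal, however far apart they're read
```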
caelengrubb

I'm So Totally Over Newton's Laws of Motion | WIRED - 0 views

  • We don't need to be stuck with the traditions of the past if we want students to understand physics.
  • Newton's First Law: An object in motion stays in motion unless acted on by a force; an object at rest stays at rest unless acted on by a force. Newton's Second Law: The magnitude of an object's acceleration is proportional to the net force and inversely proportional to the mass of the object. Newton's Third Law: For every force there is an equal and opposite force. (I've already complained about the way most books talk about this one.)
  • Newton's First Law Is Really About Aristotle
  • Remember that before Galileo and Newton, people looked to Aristotle for ideas about physics
  • Yes, it's true that Aristotle wasn't a scientist since he didn't really do any experiments. However, that didn't stop him from becoming a huge influence on the way people think about physics.
  • Do I think that we should ban Newton's Laws? No. There is still a place to talk about the historical development of the interaction between forces and matter, and Newton played a large role here (but so did Aristotle and Galileo).
  • Let's write down Newton's Second Law in its common form as an equation: F = ma. Although this is a very useful model, it doesn't always work. If you take a proton moving at half the speed of light and push on it with a force, you cannot use this to find the new velocity of the proton---but it's still a great model. So, maybe we shouldn't call it a Law. (A numerical sketch of where the model breaks down follows this list.)
  • Science is all about models. If there is one thing I've tried to be consistent about---it's that we build models in science. These models could be conceptual, physical, or mathematical
  • Since Newton's ideas are Laws, does that mean that they are true? No---there is no truth in science, there are just models. Some models work better than others, and some models are wrong but still useful
  • Just because most physics textbooks (but not all) have been very explicit about Newton's Laws of Motion, this doesn't mean that is the best way for students to learn.
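
To make the proton example concrete, here is a minimal numerical sketch. The force and duration are made-up values chosen so the Newtonian prediction visibly fails; only the comparison matters.

```python
# Push a proton moving at 0.5c with a constant force: F = ma predicts a
# speed above c, while the relativistic calculation (force changes
# momentum p = gamma*m*v) always stays below c.
import math

c = 2.998e8      # speed of light, m/s
m = 1.673e-27    # proton mass, kg
F = 1e-15        # constant force, N (hypothetical)
t = 1e-3         # duration of the push, s

v0 = 0.5 * c

# Newtonian model: v = v0 + F*t/m, with no built-in speed limit.
v_newton = v0 + F * t / m

# Relativistic model: integrate momentum, then solve p = gamma*m*v for v.
p0 = m * v0 / math.sqrt(1 - (v0 / c) ** 2)
p = p0 + F * t
v_rel = (p / m) / math.sqrt(1 + (p / (m * c)) ** 2)

print(f"Newtonian:     {v_newton / c:.2f} c")  # about 2.5 c -- impossible
print(f"Relativistic:  {v_rel / c:.4f} c")     # below 1 c, as it must be
```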
caelengrubb

Universe Is Created, According to Kepler - HISTORY - 0 views

  • On April 27, 4977 B.C., the universe is created, according to German mathematician and astronomer Johannes Kepler, considered a founder of modern science
  • Kepler is best known for his theories explaining the motion of planets.
  • Kepler’s main project was to investigate the orbit of Mars.
  • When Brahe died the following year, Kepler took over his job and inherited Brahe’s extensive collection of astronomy data, which had been painstakingly observed by the naked eye
  • Over the next decade, Kepler learned about the work of Italian physicist and astronomer Galileo Galilei (1564-1642), who had invented a telescope with which he discovered lunar mountains and craters, the largest four satellites of Jupiter and the phases of Venus, among other things
  • In 1609, Kepler published the first two of his three laws of planetary motion, which held that planets move around the sun in ellipses, not circles (as had been widely believed up to that time), and that planets speed up as they approach the sun and slow down as they move away. (The three laws are stated compactly after this list.)
  • Kepler’s research was slow to gain widespread traction during his lifetime, but it later served as a key influence on the English mathematician Sir Isaac Newton (1643-1727) and his law of gravitational force
  • Additionally, Kepler did important work in the fields of optics, including demonstrating how the human eye works, and math.
  • As for Kepler’s calculation about the universe’s birthday, scientists in the 20th century developed the Big Bang theory, which showed that his calculations were off by about 13.7 billion years.
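
For reference, a compact statement of Kepler's three laws; the third, published later in 1619, is included for completeness. The LaTeX below is an added summary, not part of the original article.

```latex
\begin{enumerate}
  \item Each planet moves on an ellipse with the Sun at one focus:
        $r(\theta) = \dfrac{a(1 - e^{2})}{1 + e\cos\theta}$.
  \item The Sun--planet line sweeps out equal areas in equal times:
        $\dfrac{dA}{dt} = \tfrac{1}{2}\, r^{2}\dot{\theta} = \text{constant}$.
  \item The square of the orbital period scales as the cube of the
        semi-major axis: $T^{2} \propto a^{3}$.
\end{enumerate}
```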
Sean Kirkpatrick

Conflicts between science and religion - 0 views

  •  
    I found another article that highlighted the clear conflict between science and religion as the author reflects on the work of all the scientists we studied, including Copernicus, Galileo, and Newton, three of the most famous scientists in history. The author highlights the struggle between the church and the findings of these scientists. For example, in the Stephen Hawking movie, the narrator talked about how Newton's discovery of gravity and his laws of motion went against what the church believed. In the article, the author highlights this conflict when he says, "Interestingly, this led to two diametrically opposed inferences. On the one hand, many people saw the success of Newton (and many people see the continued success of physics to the present day) as an argument for atheism. If God is not needed to explain the behavior of the world, and if the cosmos, like a giant clock, operates on mechanical principles alone, then one has no reason to suppose that God even exists. There are no explanatory gaps left for God to fill. Newton himself would have rejected this. He considered God to have a vital role in setting up the initial conditions for the universe."
Javier E

The decline effect and the scientific method : The New Yorker - 3 views

  • The test of replicability, as it’s known, is the foundation of modern research. Replicability is how the community enforces itself. It’s a safeguard against the creep of subjectivity. Most of the time, scientists know what results they want, and that can influence the results they get. The premise of replicability is that the scientific community can correct for these flaws.
  • But now all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable.
  • This phenomenon doesn’t yet have an official name, but it’s occurring across a wide range of fields, from psychology to ecology.
  • If replication is what separates the rigor of science from the squishiness of pseudoscience, where do we put all these rigorously validated findings that can no longer be proved? Which results should we believe?
  • Schooler demonstrated that subjects shown a face and asked to describe it were much less likely to recognize the face when shown it later than those who had simply looked at it. Schooler called the phenomenon “verbal overshadowing.”
  • The most likely explanation for the decline is an obvious one: regression to the mean. As the experiment is repeated, that is, an early statistical fluke gets cancelled out. The extrasensory powers of Schooler’s subjects didn’t decline—they were simply an illusion that vanished over time. (A toy simulation of this mechanism appears after this list.)
  • yet Schooler has noticed that many of the data sets that end up declining seem statistically solid—that is, they contain enough data that any regression to the mean shouldn’t be dramatic. “These are the results that pass all the tests,” he says. “The odds of them being random are typically quite remote, like one in a million. This means that the decline effect should almost never happen. But it happens all the time!
  • this is why Schooler believes that the decline effect deserves more attention: its ubiquity seems to violate the laws of statistics
  • In 2001, Michael Jennions, a biologist at the Australian National University, set out to analyze “temporal trends” across a wide range of subjects in ecology and evolutionary biology. He looked at hundreds of papers and forty-four meta-analyses (that is, statistical syntheses of related studies), and discovered a consistent decline effect over time, as many of the theories seemed to fade into irrelevance.
  • Jennions admits that his findings are troubling, but expresses a reluctance to talk about them publicly. “This is a very sensitive issue for scientists,” he says. “You know, we’re supposed to be dealing with hard facts, the stuff that’s supposed to stand the test of time. But when you see these trends you become a little more skeptical of things.”
  • While publication bias almost certainly plays a role in the decline effect, it remains an incomplete explanation. For one thing, it fails to account for the initial prevalence of positive results among studies that never even get submitted to journals. It also fails to explain the experience of people like Schooler, who have been unable to replicate their initial data despite their best efforts.
  • Jennions, similarly, argues that the decline effect is largely a product of publication bias, or the tendency of scientists and scientific journals to prefer positive data over null results, which is what happens when no effect is found. The bias was first identified by the statistician Theodore Sterling, in 1959, after he noticed that ninety-seven per cent of all published psychological studies with statistically significant data found the effect they were looking for
  • Sterling saw that if ninety-seven per cent of psychology studies were proving their hypotheses, either psychologists were extraordinarily lucky or they published only the outcomes of successful experiments.
  • One of his most cited papers has a deliberately provocative title: “Why Most Published Research Findings Are False.”
  • Palmer suspects that an equally significant issue is the selective reporting of results—the data that scientists choose to document in the first place. Palmer’s most convincing evidence relies on a statistical tool known as a funnel graph. When a large number of studies have been done on a single subject, the data should follow a pattern: studies with a large sample size should all cluster around a common value—the true result—whereas those with a smaller sample size should exhibit a random scattering, since they’re subject to greater sampling error. This pattern gives the graph its name, since the distribution resembles a funnel. (A simulation of how selective reporting skews this picture appears after this list.)
  • after Palmer plotted every study of fluctuating asymmetry, he noticed that the distribution of results with smaller sample sizes wasn’t random at all but instead skewed heavily toward positive results. Palmer has since documented a similar problem in several other contested subject areas. “Once I realized that selective reporting is everywhere in science, I got quite depressed,” Palmer told me. “As a researcher, you’re always aware that there might be some nonrandom patterns, but I had no idea how widespread it is.”
  • Palmer summarized the impact of selective reporting on his field: “We cannot escape the troubling conclusion that some—perhaps many—cherished generalities are at best exaggerated in their biological significance and at worst a collective illusion nurtured by strong a-priori beliefs often repeated.”
  • Palmer emphasizes that selective reporting is not the same as scientific fraud. Rather, the problem seems to be one of subtle omissions and unconscious misperceptions, as researchers struggle to make sense of their results. Stephen Jay Gould referred to this as the “shoehorning” process.
  • “A lot of scientific measurement is really hard,” Simmons told me. “If you’re talking about fluctuating asymmetry, then it’s a matter of minuscule differences between the right and left sides of an animal. It’s millimetres of a tail feather. And so maybe a researcher knows that he’s measuring a good male”—an animal that has successfully mated—“and he knows that it’s supposed to be symmetrical. Well, that act of measurement is going to be vulnerable to all sorts of perception biases. That’s not a cynical statement. That’s just the way human beings work.”
  • For Simmons, the steep rise and slow fall of fluctuating asymmetry is a clear example of a scientific paradigm, one of those intellectual fads that both guide and constrain research: after a new paradigm is proposed, the peer-review process is tilted toward positive results. But then, after a few years, the academic incentives shift—the paradigm has become entrenched—so that the most notable results are now those that disprove the theory.
  • John Ioannidis, an epidemiologist at Stanford University, argues that such distortions are a serious issue in biomedical research. “These exaggerations are why the decline has become so common,” he says. “It’d be really great if the initial studies gave us an accurate summary of things. But they don’t. And so what happens is we waste a lot of money treating millions of patients and doing lots of follow-up studies on other themes based on results that are misleading.”
  • In 2005, Ioannidis published an article in the Journal of the American Medical Association that looked at the forty-nine most cited clinical-research studies in three major medical journals.
  • the data Ioannidis found were disturbing: of the thirty-four claims that had been subject to replication, forty-one per cent had either been directly contradicted or had their effect sizes significantly downgraded.
  • the most troubling fact emerged when he looked at the test of replication: out of four hundred and thirty-two claims, only a single one was consistently replicable. “This doesn’t mean that none of these claims will turn out to be true,” he says. “But, given that most of them were done badly, I wouldn’t hold my breath.”
  • According to Ioannidis, the main problem is that too many researchers engage in what he calls “significance chasing,” or finding ways to interpret the data so that it passes the statistical test of significance—the ninety-five-per-cent boundary invented by Ronald Fisher. (A toy demonstration appears after this list.)
  • One of the classic examples of selective reporting concerns the testing of acupuncture in different countries. While acupuncture is widely accepted as a medical treatment in various Asian countries, its use is much more contested in the West. These cultural differences have profoundly influenced the results of clinical trials.
  • The problem of selective reporting is rooted in a fundamental cognitive flaw, which is that we like proving ourselves right and hate being wrong.
  • “It feels good to validate a hypothesis,” Ioannidis said. “It feels even better when you’ve got a financial interest in the idea or your career depends upon it. And that’s why, even after a claim has been systematically disproven”—he cites, for instance, the early work on hormone replacement therapy, or claims involving various vitamins—“you still see some stubborn researchers citing the first few studies.”
  • That’s why Schooler argues that scientists need to become more rigorous about data collection before they publish. “We’re wasting too much time chasing after bad studies and underpowered experiments,”
  • The current “obsession” with replicability distracts from the real problem, which is faulty design.
  • “Every researcher should have to spell out, in advance, how many subjects they’re going to use, and what exactly they’re testing, and what constitutes a sufficient level of proof. We have the tools to be much more transparent about our experiments.” (A sample-size sketch appears after this list.)
  • Schooler recommends the establishment of an open-source database, in which researchers are required to outline their planned investigations and document all their results. “I think this would provide a huge increase in access to scientific work and give us a much better way to judge the quality of an experiment,”
  • scientific research will always be shadowed by a force that can’t be curbed, only contained: sheer randomness. Although little research has been done on the experimental dangers of chance and happenstance, the research that exists isn’t encouraging.
  • The disturbing implication of the Crabbe study is that a lot of extraordinary scientific data are nothing but noise. The hyperactivity of those coked-up Edmonton mice wasn’t an interesting new fact—it was a meaningless outlier, a by-product of invisible variables we don’t understand.
  • The problem, of course, is that such dramatic findings are also the most likely to get published in prestigious journals, since the data are both statistically significant and entirely unexpected
  • This suggests that the decline effect is actually a decline of illusion. While Karl Popper imagined falsification occurring with a single, definitive experiment—Galileo refuted Aristotelian mechanics in an afternoon—the process turns out to be much messier than that.
  • Many scientific theories continue to be considered true even after failing numerous experimental tests.
  • Even the law of gravity hasn’t always been perfect at predicting real-world phenomena. (In one test, physicists measuring gravity by means of deep boreholes in the Nevada desert found a two-and-a-half-per-cent discrepancy between the theoretical predictions and the actual data.)
  • Such anomalies demonstrate the slipperiness of empiricism. Although many scientific ideas generate conflicting results and suffer from falling effect sizes, they continue to get cited in the textbooks and drive standard medical practice. Why? Because these ideas seem true. Because they make sense. Because we can’t bear to let them go. And this is why the decline effect is so troubling. Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren’t surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.)
  • The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe. ♦
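To make the funnel-graph idea in the annotations above concrete, here is a minimal Python sketch (synthetic data; nothing from Palmer's actual asymmetry studies): it simulates hundreds of studies of one true effect, then shows how discarding non-significant results skews the small-sample end of the funnel.

```python
# Hypothetical funnel-graph simulation; all numbers are illustrative.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
true_effect = 0.2                                 # the "true result" studies should cluster around
sample_sizes = rng.integers(10, 500, size=400)    # 400 simulated studies of varying size

estimates, std_errors = [], []
for n in sample_sizes:
    data = rng.normal(true_effect, 1.0, size=n)   # one simulated study
    estimates.append(data.mean())
    std_errors.append(data.std(ddof=1) / np.sqrt(n))  # sampling error shrinks as n grows
estimates, std_errors = np.array(estimates), np.array(std_errors)

significant = np.abs(estimates) / std_errors > 1.96   # the ninety-five-per-cent boundary

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4), sharey=True)
ax1.scatter(estimates, sample_sizes, s=8)
ax1.set(title="All studies: symmetric funnel", xlabel="effect estimate", ylabel="sample size")
ax2.scatter(estimates[significant], sample_sizes[significant], s=8)
ax2.set(title="Only significant studies: skewed funnel", xlabel="effect estimate")
plt.tight_layout()
plt.show()
```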
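A rough illustration of the “significance chasing” described above (my toy construction, not Ioannidis’s analysis): even when no effect exists anywhere, a researcher who tests twenty outcomes and reports only the best one will clear the five-per-cent threshold most of the time.

```python
# Toy significance-chasing simulation; parameters are arbitrary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_studies, n_outcomes, n_subjects = 1000, 20, 30

false_hits = 0
for _ in range(n_studies):
    # Twenty independent comparisons per study, all with zero true effect.
    pvals = [
        stats.ttest_ind(rng.normal(size=n_subjects), rng.normal(size=n_subjects)).pvalue
        for _ in range(n_outcomes)
    ]
    if min(pvals) < 0.05:          # report only the "best" outcome, ignore the rest
        false_hits += 1

print(f"{false_hits / n_studies:.0%} of null studies yield a 'significant' finding")
# Expected: roughly 1 - 0.95**20, i.e. about 64 per cent.
```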
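One concrete form Schooler’s “spell it out in advance” could take (a sketch under assumed numbers, not his actual proposal) is an up-front power calculation: fix the smallest effect worth detecting and the acceptable error rates, and the required sample size follows.

```python
# Pre-specified power analysis for a two-group comparison (illustrative values).
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,   # smallest effect of interest (Cohen's d), chosen before data collection
    alpha=0.05,        # significance threshold
    power=0.80,        # probability of detecting the effect if it is real
)
print(f"Pre-registered sample size: {n_per_group:.0f} subjects per group")  # about 64
```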
Javier E

Eric Kandel's Visions - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • Judith, "barely clothed and fresh from the seduction and slaying of Holofernes, glows in her voluptuousness. Her hair is a dark sky between the golden branches of Assyrian trees, fertility symbols that represent her eroticism. This young, ecstatic, extravagantly made-up woman confronts the viewer through half-closed eyes in what appears to be a reverie of orgasmic rapture," writes Eric Kandel in his new book, The Age of Insight. Wait a minute. Writes who? Eric Kandel, the Nobel-winning neuroscientist who's spent most of his career fixated on the generously sized neurons of sea snails
  • Kandel goes on to speculate, in a bravura paragraph a few hundred pages later, on the exact neurochemical cognitive circuitry of the painting's viewer:
  • "At a base level, the aesthetics of the image's luminous gold surface, the soft rendering of the body, and the overall harmonious combination of colors could activate the pleasure circuits, triggering the release of dopamine. If Judith's smooth skin and exposed breast trigger the release of endorphins, oxytocin, and vasopressin, one might feel sexual excitement. The latent violence of Holofernes's decapitated head, as well as Judith's own sadistic gaze and upturned lip, could cause the release of norepinephrine, resulting in increased heart rate and blood pressure and triggering the fight-or-flight response. In contrast, the soft brushwork and repetitive, almost meditative, patterning may stimulate the release of serotonin. As the beholder takes in the image and its multifaceted emotional content, the release of acetylcholine to the hippocampus contributes to the storing of the image in the viewer's memory. What ultimately makes an image like Klimt's 'Judith' so irresistible and dynamic is its complexity, the way it activates a number of distinct and often conflicting emotional signals in the brain and combines them to produce a staggeringly complex and fascinating swirl of emotions."
  • ...18 more annotations...
  • His key findings on the snail, for which he shared the 2000 Nobel Prize in Physiology or Medicine, showed that learning and memory change not the neuron's basic structure but rather the nature, strength, and number of its synaptic connections. Further, through focus on the molecular biology involved in a learned reflex like Aplysia's gill retraction, Kandel demonstrated that experience alters nerve cells' synapses by changing their pattern of gene expression. In other words, learning doesn't change what neurons are, but rather what they do.
  • In Search of Memory (Norton), Kandel offered what sounded at the time like a vague research agenda for future generations in the budding field of neuroaesthetics, saying that the science of memory storage lay "at the foothills of a great mountain range." Experts grasp the "cellular and molecular mechanisms," he wrote, but need to move to the level of neural circuits to answer the question, "How are internal representations of a face, a scene, a melody, or an experience encoded in the brain?"
  • Since giving a talk on the matter in 2001, he has been piecing together his own thoughts in relation to his favorite European artists
  • The field of neuroaesthetics, says one of its founders, Semir Zeki, of University College London, is just 10 to 15 years old. Through brain imaging and other studies, scholars like Zeki have explored the cognitive responses to, say, color contrasts or ambiguities of line or perspective in works by Titian, Michelangelo, Cubists, and Abstract Expressionists. Researchers have also examined the brain's pleasure centers in response to appealing landscapes.
  • it is fundamental to an understanding of human cognition and motivation. Art isn't, as Kandel paraphrases a concept from the late philosopher of art Denis Dutton, "a byproduct of evolution, but rather an evolutionary adaptation—an instinctual trait—that helps us survive because it is crucial to our well-being." The arts encode information, stories, and perspectives that allow us to appraise courses of action and the feelings and motives of others in a palatable, low-risk way.
  • "as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources—musical and visual—and probably by other sources as well." Specifically, in this "brain-based theory of beauty," the paper says, that faculty is associated with activity in the medial orbitofrontal cortex.
  • It also enables Kandel—building on the work of Gombrich and the psychoanalyst and art historian Ernst Kris, among others—to compare the painters' rendering of emotion, the unconscious, and the libido with contemporaneous psychological insights from Freud about latent aggression, pleasure and death instincts, and other primal drives.
  • Kandel views the Expressionists' art through the powerful multiple lenses of turn-of-the-century Vienna's cultural mores and psychological insights. But then he refracts them further, through later discoveries in cognitive science. He seeks to reassure those who fear that the empirical and chemical will diminish the paintings' poetic power. "In art, as in science," he writes, "reductionism does not trivialize our perception—of color, light, and perspective—but allows us to see each of these components in a new way. Indeed, artists, particularly modern artists, have intentionally limited the scope and vocabulary of their expression to convey, as Mark Rothko and Ad Reinhardt do, the most essential, even spiritual ideas of their art."
  • The author of a classic textbook on neuroscience, he seems here to have written a layman's cognition textbook wrapped within a work of art history.
  • "our initial response to the most salient features of the paintings of the Austrian Modernists, like our response to a dangerous animal, is automatic. ... The answer to James's question of how an object simply perceived turns into an object emotionally felt, then, is that the portraits are never objects simply perceived. They are more like the dangerous animal at a distance—both perceived and felt."
  • If imaging is key to gauging therapeutic practices, it will be key to neuroaesthetics as well, Kandel predicts—a broad, intense array of "imaging experiments to see what happens with exaggeration, distorted faces, in the human brain and the monkey brain," viewers' responses to "mixed eroticism and aggression," and the like.
  • while the visual-perception literature might be richer at the moment, there's no reason that neuroaesthetics should restrict its emphasis to the purely visual arts at the expense of music, dance, film, and theater.
  • although Kandel considers The Age of Insight to be more a work of intellectual history than of science, the book summarizes centuries of research on perception. And so you'll find, in those hundreds of pages between Kandel's introduction to Klimt's "Judith" and the neurochemical cadenza about the viewer's response to it, dossiers on vision as information processing; the brain's three-dimensional-space mapping and its interpretations of two-dimensional renderings; face recognition; the mirror neurons that enable us to empathize and physically reflect the affect and intentions we see in others; and many related topics. Kandel elsewhere describes the scientific evidence that creativity is nurtured by spells of relaxation, which foster a connection between conscious and unconscious cognition.
  • Zeki's message to art historians, aesthetic philosophers, and others who chafe at that idea is twofold. The more diplomatic pitch is that neuroaesthetics is different, complementary, and not oppositional to other forms of arts scholarship. But "the stick," as he puts it, is that if arts scholars "want to be taken seriously" by neurobiologists, they need to take advantage of the discoveries of the past half-century. If they don't, he says, "it's a bit like the guys who said to Galileo that we'd rather not look through your telescope."
  • Matthews, a co-author of The Bard on the Brain: Understanding the Mind Through the Art of Shakespeare and the Science of Brain Imaging (Dana Press, 2003), seems open to the elucidations that science and the humanities can cast on each other. The neural pathways of our aesthetic responses are "good explanations," he says. But "does one [type of] explanation supersede all the others? I would argue that they don't, because there's a fundamental disconnection still between ... explanations of neural correlates of conscious experience and conscious experience" itself.
  • There are, Matthews says, "certain kinds of problems that are fundamentally interesting to us as a species: What is love? What motivates us to anger?" Writers put their observations on such matters into idiosyncratic stories, psychologists conceive their observations in a more formalized framework, and neuroscientists like Zeki monitor them at the level of functional changes in the brain. All of those approaches to human experience "intersect," Matthews says, "but no one of them is the explanation."
  • "Conscious experience," he says, "is something we cannot even interrogate in ourselves adequately. What we're always trying to do in effect is capture the conscious experience of the last moment. ... As we think about it, we have no way of capturing more than one part of it."
  • Kandel sees art and art history as "parent disciplines" and psychology and brain science as "antidisciplines," to be drawn together in an E.O. Wilson-like synthesis toward "consilience as an attempt to open a discussion between restricted areas of knowledge." Kandel approvingly cites Stephen Jay Gould's wish for "the sciences and humanities to become the greatest of pals ... but to keep their ineluctably different aims and logics separate as they ply their joint projects and learn from each other."
Javier E

The Dangers of Certainty: A Lesson From Auschwitz - NYTimes.com - 0 views

  • in 1973, the BBC aired an extraordinary documentary series called “The Ascent of Man,” hosted by one Dr. Jacob Bronowski
  • It was not an account of human biological evolution, but cultural evolution — from the origins of human life in the Rift Valley to the shifts from hunter/gatherer societies,  to nomadism and then settlement and civilization, from agriculture and metallurgy to the rise and fall of empires: Assyria, Egypt, Rome.
  • The tone of the programs was rigorous yet permissive, playful yet precise, and always urgent, open and exploratory. I remember in particular the programs on the trial of Galileo, Darwin’s hesitancy about publishing his theory of evolution and the dizzying consequences of Einstein’s theory of relativity.
  • ...11 more annotations...
  • For Bronowski, science and art were two neighboring mighty rivers that flowed from a common source: the human imagination.
  • For Dr. Bronowski, there was no absolute knowledge and anyone who claims it — whether a scientist, a politician or a religious believer — opens the door to tragedy. All scientific information is imperfect and we have to treat it with humility. Such, for him, was the human condition.
  • This is the condition for what we can know, but it is also, crucially, a moral lesson. It is the lesson of 20th-century painting from Cubism onwards, but also that of quantum physics. All we can do is to push deeper and deeper into better approximations of an ever-evasive reality
  • Errors are inextricably bound up with pursuit of human knowledge, which requires not just mathematical calculation but insight, interpretation and a personal act of judgment for which we are responsible.
  • Dr. Bronowski insisted that the principle of uncertainty was a misnomer, because it gives the impression that in science (and outside of it) we are always uncertain. But this is wrong. Knowledge is precise, but that precision is confined within a certain toleration of uncertainty.
  • The emphasis on the moral responsibility of knowledge was essential for all of Dr. Bronowski’s work. The acquisition of knowledge entails a responsibility for the integrity of what we are as ethical creatures.
  • Pursuing knowledge means accepting uncertainty. Heisenberg’s principle has the consequence that no physical events can ultimately be described with absolute certainty or with “zero tolerance,” as it were. The more we know, the less certain we are. (The relation itself is reproduced after this list.)
  • Our relations with others also require a principle of tolerance. We encounter other people across a gray area of negotiation and approximation. Such is the business of listening and the back and forth of conversation and social interaction.
  • For Dr. Bronowski, the moral consequence of knowledge is that we must never judge others on the basis of some absolute, God-like conception of certainty. All knowledge, all information that passes between human beings, can be exchanged only within what we might call “a play of tolerance,” whether in science, literature, politics or religion.
  • The play of tolerance opposes the principle of monstrous certainty that is endemic to fascism and, sadly, not just fascism but all the various faces of fundamentalism. When we think we have certainty, when we aspire to the knowledge of the gods, then Auschwitz can happen and can repeat itself.
  • The pursuit of scientific knowledge is as personal an act as lifting a paintbrush or writing a poem, and they are both profoundly human. If the human condition is defined by limitedness, then this is a glorious fact because it is a moral limitedness rooted in a faith in the power of the imagination, our sense of responsibility and our acceptance of our fallibility. We always have to acknowledge that we might be mistaken.
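For reference, the Heisenberg relation behind Bronowski’s “toleration of uncertainty,” in its standard textbook form (supplied here, not quoted from the article):

```latex
% Position and momentum cannot both be fixed with arbitrary precision;
% the product of their uncertainties has an irreducible floor.
\[
  \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
\]
```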
Javier E

The Washington Monthly - The Magazine - The Information Sage - 0 views

  • The underlying philosophy behind sparklines—and, really, all of Tufte’s work—is that data, when presented elegantly and with respect, is not confounding but clarifying. (A minimal sparkline sketch appears after this list.)
  • Tufte has shifted how designers approach the job of turning information into understanding.
  • “It’s not about making the complex simple,” Grefe told me. “It’s about making the complex clear.”
  • ...2 more annotations...
  • Envisioning Information, published in 1990, implored readers to think of information design as a discipline that encompassed far more than the charts, tables, and other purely quantitative forms that had traditionally dominated the field
  • Graphics aren’t just useful for displaying numbers, in other words, but for clarifying just about anything one person is trying to tell someone else. The book opens with a print of a visitor’s guide to the Ise shrine in Japan and ends around 120 pages later with Galileo Galilei’s drawing of the rings of Saturn from 1613.
Javier E

The Peril of Knowledge Everywhere - NYTimes.com - 1 views

  • Are there things we should try not to know?
  • IBM says that 2.5 quintillion bytes of data are created each day. That is a number both unimaginable and somewhat unhelpful to real understanding. It’s not just the huge scale of the information, after all, it’s the novel types of data
  • many participants expressed concern about the effects all this data would have on the ability of powerful institutions to control people, from state coercion to product marketing.
  • ...5 more annotations...
  • If we want protection from the world we’re building, perhaps we’re asking that the algorithm wielders choose not to know things, despite their being true. To some, that may be a little like the 1616 order by the Catholic Church that Galileo cease from teaching or discussing the idea that the Earth moves around the sun.
  • one bit here and another there, both innocuous, may reveal something personal that is hidden perhaps even from myself.
  • Since then, we have been living in something closer to the spirit of the 18th-century Enlightenment, when all forms of knowledge were acceptable, and learning was a good in its own right. Regulation has been based on actions, not on knowledge.
  • the situation may be something like a vastly more difficult version of laws against redlining
  • we are also entering a new world where individuals can be as powerful as institutions. That phone gives Big Brother lots of data goodies, but it can also have access to its own pattern-finding algorithms, and publish those findings to the world.
Sophia C

Thomas Kuhn: Revolution Against Scientific Realism* - 1 views

  • The Ptolemaic system was such a complex system that nobody believed that it corresponded to the physical reality of the universe. Although the Ptolemaic system accounted for observations—"saved the appearances"—its epicycles and deferents were never intended to be anything more than a mathematical model to use in predicting the position of heavenly bodies. [3] (A toy epicycle computation appears after this list.)
  • Galileo was told that he was free to continue his work with Copernican theory if he agreed that the theory did not describe physical reality but was merely one of the many potential mathematical models. [10] Galileo continued to work, and while he "formally claimed to prove nothing," [11] he passed his mathematical advances and his observational data to Newton, who would not only invent a new mathematics but would solve the remaining problems posed by Copernicus. [12]
  • Thus without pretending that his method could find the underlying causes of things such as gravity, Newton believed that his method produced theory, based upon empirical evidence, that was a close approximation of physical reality.
  • ...27 more annotations...
  • Medieval science was guided by "logical consistency."
  • The logical empiricist's conception of scientific progress was thus a continuous one; more comprehensive theory replaced compatible, older theory
  • Hempel also believed that science evolved in a continuous manner. New theory did not contradict past theory: "theory does not simply refute the earlier empirical generalizations in its field; rather, it shows that within a certain limited range defined by qualifying conditions, the generalizations hold true in fairly close approximation." [21]
  • New theory is more comprehensive; the old theory can be derived from the newer one and is one special manifestation" [22] of the more comprehensive new theory.
  • The logical empiricist movement combined induction, based on empiricism, and deduction in the form of logic.
  • It was the truth, and the prediction and control that came with it, that was the goal of logical-empirical science.
  • Each successive theory's explanation was closer to the truth than the theory before.
  • The notion of scientific realism held by Newton led to the evolutionary view of the progress of science.
  • The entities and processes of theory were believed to exist in nature, and science should discover those entities and processes.
  • Particularly disturbing discoveries were made in the area of atomic physics. For instance, Heisenberg's indeterminacy principle, according to historian of science Cecil Schneer, yielded the conclusion that "the world of nature is indeterminate."
  • "even the fundamental principle of causality fail[ed] ."
  • It was not until the second half of the twentieth century that the preservers of the evolutionary idea of scientific progress, the logical empiricists, were seriously challenged.
  • Kuhn proposed a revolutionary model of scientific change and examined the role of the scientific community in preventing and then accepting change. Kuhn's conception of scientific change occurring through revolutions undermined the traditional scientific goal, finding "truth" in nature.
  • Textbooks inform scientists-to-be about this common body of knowledge and understanding.
  • Paradigms are necessary, for the world is too huge and complex to be explored randomly.
  • a scientist knows what facts are relevant and can build on past research
  • Normal science, as defined by Kuhn, is cumulative. New knowledge fills a gap of ignorance
  • "[O]ne standard product of the scientific enterprise is missing. Normal science does not aim at novelties of fact or theory and, when successful, finds none."
  • Normal science does contain a mechanism that uncovers anomaly: inconsistencies within the paradigm.
  • eventually, details arise that are inconsistent with the current paradigm
  • These inconsistencies are eventually resolved or are ignored.
  • When the inconsistencies concern a topic of central importance, a crisis occurs and normal science comes to a halt.
  • A crisis requires that the scientists re-examine the foundations of their science that they had been taking for granted.
  • A new paradigm is accepted because it resolves the crisis better than the others, it offers promise for future research, and it is more aesthetic than its competitors. The reasons for converting to a new paradigm are never completely rational.
  • Unlike evolutionary science, in which new knowledge fills a gap of ignorance, in Kuhn's model new knowledge replaces incompatible knowledge.
  • Thus science is not a continuous or cumulative endeavor: when a paradigm shift occurs there is a revolution similar to a political revolution, with fundamental and pervasive changes in method and understanding. Each successive vision about the nature of the universe makes the past vision obsolete; predictions, though more precise, remain similar to the predictions of the past paradigm in their general orientation, but the new explanations do not accommodate the old
  • In a sense, we have circled back to the ancient and medieval practice of separating scientific theory from physical reality; both medieval scientists and Kuhn would agree that no theory corresponds to reality and therefore any number of theories might equally well explain a natural phenomenon. [36] Neither twentieth-century atomic theorists nor medieval astronomers are able to claim that their theories accurately describe physical phenomena. The inability to return to scientific realism suggests a tripartite division of the history of science, with a period of scientific realism fitting between two periods in which there is no insistence that theory correspond to reality. Although both scientific realism and the evolutionary idea of scientific progress appeal to common sense, both existed for only a few hundred years.
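To see how epicycles and deferents could "save the appearances," here is a toy Python computation (my sketch; the radii and speeds are arbitrary): the planet's apparent position is a point on a small circle whose center rides a larger one, which reproduces retrograde loops without any claim about physical reality.

```python
# Toy Ptolemaic model: deferent plus epicycle, purely as a predictive device.
import numpy as np

def ptolemaic_position(t, R=1.0, r=0.3, w_def=1.0, w_epi=12.0):
    """Apparent position at time t: epicycle of radius r riding a deferent of radius R."""
    cx, cy = R * np.cos(w_def * t), R * np.sin(w_def * t)          # epicycle center on the deferent
    return cx + r * np.cos(w_epi * t), cy + r * np.sin(w_epi * t)  # planet on the epicycle

for t in np.linspace(0.0, 6.28, 5):
    x, y = ptolemaic_position(t)
    print(f"t={t:4.2f}  x={x:+.3f}  y={y:+.3f}")
```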
kushnerha

How to Cultivate the Art of Serendipity - The New York Times - 0 views

  • A surprising number of the conveniences of modern life were invented when someone stumbled upon a discovery or capitalized on an accident
  • wonder whether we can train ourselves to become more serendipitous. How do we cultivate the art of finding what we’re not seeking?
  • Croatian has no word to capture the thrill of the unexpected discovery, so the information scientist Sanda Erdelez was delighted when — after moving to the United States on a Fulbright scholarship in the 1980s — she learned the English word “serendipity.”
  • ...12 more annotations...
  • Today we think of serendipity as something like dumb luck. But its original meaning was very different.
  • Horace Walpole suggested that this old tale contained a crucial idea about human genius: “As their highnesses travelled, they were always making discoveries, by accident and sagacity, of things which they were not in quest of.” And he proposed a new word — “serendipity” — to describe this princely talent for detective work. At its birth, serendipity meant a skill rather than a random stroke of good fortune.
  • sees serendipity as something people do. In the mid-1990s, she began a study of about 100 people to find out how they created their own serendipity, or failed to do so.
  • As people dredge the unknown, they are engaging in a highly creative act. What an inventor “finds” is always an expression of him- or herself.
  • subjects fell into three distinct groups. Some she called “non-encounterers”; they saw through a tight focus, a kind of chink hole, and they tended to stick to their to-do lists when searching for information rather than wandering off into the margins. Other people were “occasional encounterers,” who stumbled into moments of serendipity now and then. Most interesting were the “super-encounterers,” who reported that happy surprises popped up wherever they looked.
  • “gathering string” is just another way of talking about super-encountering. After all, “string” is the stuff that accumulates in a journalist’s pocket. It’s the note you jot down in your car after the interview, the knickknack you notice on someone’s shelf, or the anomaly that jumps out at you in Appendix B of an otherwise boring research study.
  • came up with the term super-encounterer to give us a way to talk about the people rather than just the discoveries. Without such words, we tend to become dazzled by the happy accident itself, to think of it as something that exists independent of an observer.
  • We can slip into a twisted logic in which we half-believe the penicillin picked Alexander Fleming to be its emissary, or that the moons of Jupiter wanted to be seen by Galileo. But discoveries are products of the human mind.
  • You become a super-encounterer, according to Dr. Erdelez, in part because you believe that you are one — it helps to assume that you possess special powers of perception
  • One survey of patent holders (the PatVal study of European inventors, published in 2005) found that an incredible 50 percent of patents resulted from what could be described as a serendipitous process. Thousands of survey respondents reported that their idea evolved when they were working on an unrelated project — and often when they weren’t even trying to invent anything.
  • We need to develop a new, interdisciplinary field — call it serendipity studies — that can help us create a taxonomy of discoveries in the chemistry lab, the newsroom, the forest, the classroom, the particle accelerator and the hospital. By observing and documenting the many different “species” of super-encounterers, we might begin to understand their minds.
  • Of course, even if we do organize the study of serendipity, it will always be a whimsical undertaking, given that the phenomenon is difficult to define, amazingly variable and hard to capture in data. The clues will no doubt emerge where we least expect them
katedriscoll

Is the Schrödinger Equation True? - Scientific American - 0 views

  • Hilbert space is inhabited by arrow-shaped abstractions called vectors. Pondering Hilbert space makes me feel like a lump of dumb, decrepit flesh trapped in a squalid, 3-D prison. Far from exploring Hilbert space, I can’t even find a window through which to peer into it. I envision it as an immaterial paradise where luminescent cognoscenti glide to and fro, telepathically swapping witticisms about adjoint operators. (The Schrödinger equation itself is reproduced after this list.)
  • Reality, great sages have assured us, is essentially mathematical. Plato held that we and other things of this world are mere shadows of the sublime geometric forms that constitute reality. Galileo declared that “the great book of nature is written in mathematics.” We’re part of nature, aren’t we? So why does mathematics, once we get past natural numbers and basic arithmetic, feel so alien to most of us?
  • Physicists’ theories work. They predict the arc of planets and the flutter of electrons, and they have spawned smartphones, H-bombs and—well, what more do we need? But scientists, and especially physicists, aren’t just seeking practical advances. They’re after Truth. They want to believe that their theories are correct—exclusively correct—representations of nature. Physicists share this craving with religious folk, who need to believe that their path to salvation is the One True Path.
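For reference, the equation the article’s title asks about, in its standard time-dependent form (supplied here, not quoted from the piece): the Hamiltonian operator drives the evolution of the state vector in Hilbert space.

```latex
% Time-dependent Schroedinger equation.
\[
  i\hbar \, \frac{\partial}{\partial t} \Psi(x,t) \;=\; \hat{H} \, \Psi(x,t)
\]
```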