TOK Friends / Group items tagged: prize


Javier E

Beyond Billboards - The Daily Dish | By Andrew Sullivan - 0 views

  • at the force behind all that exists actually intervened in the consciousness of humankind in the form of a man so saturated in godliness that merely being near him healed people of the weight of the world's sins.
Javier E

UK mathematician wins richest prize in academia | Mathematics | The Guardian - 0 views

  • Martin Hairer, an Austrian-British researcher at Imperial College London, is the winner of the 2021 Breakthrough prize for mathematics, an annual $3m (£2.3m) award that has come to rival the Nobels in terms of kudos and prestige.
  • Hairer landed the prize for his work on stochastic analysis, a field that describes how random effects turn the maths of things like stirring a cup of tea, the growth of a forest fire, or the spread of a water droplet that has fallen on a tissue into a fiendishly complex problem.
  • His major work, a 180-page treatise that introduced the world to “regularity structures”, so stunned his colleagues that one suggested it must have been transmitted to Hairer by a more intelligent alien civilisation.
  • ...3 more annotations...
  • After dallying with physics at university, Hairer moved into mathematics. The realisation that ideas in theoretical physics can be overturned and swiftly consigned to the dustbin did not appeal. “I wouldn’t really want to put my name to a result that could be superseded by something else three years later,” he said. “In mathematics, if you obtain a result then that is it. It’s the universality of mathematics, you discover absolute truths.”
  • Hairer’s expertise lies in stochastic partial differential equations, a branch of mathematics that describes how randomness throws disorder into processes such as the movement of wind in a wind tunnel or the creeping boundary of a water droplet landing on a tissue. When the randomness is strong enough, solutions to the equations get out of control. “In some cases, the solutions fluctuate so wildly that it is not even clear what the equation meant in the first place,” he said.
  • With the invention of regularity structures, Hairer showed how the infinitely jagged noise that threw his equations into chaos could be reframed and tamed.
Emily Freilich

Higgs Boson Gets Nobel Prize, But Physicists Still Don't Know What It's Telling Them - ... - 2 views

  • This morning, two physicists who 50 years ago theorized the existence of this particle, which is responsible for conferring mass to all other known particles in the universe, got the Nobel, the highest prize in science.
  • left physicists without a clear roadmap of where to go next
  • No one is sure which of these models, if any, will eventually describe reality
  • ...6 more annotations...
  • Some of them look at the data and say that we need to throw out speculative ideas such as supersymmetry and the multiverse, models that look elegant mathematically but are unprovable from an experimental perspective. Others look at the exact same data and come to the opposite conclusion.
  • we’ve entered a very deep crisis.
  • Though happy to know the Higgs was there, many scientists had hoped it would turn out to be strange, to defy their predictions in some way and give a hint as to which models beyond the Standard Model were correct.
  • One possibility has been brought up that even physicists don’t like to think about. Maybe the universe is even stranger than they think. Like, so strange that even post-Standard Model models can’t account for it. Some physicists are starting to question whether or not our universe is natural.
  • The multiverse idea has two strikes against it, though. First, physicists would refer to it as an unnatural explanation because it simply happened by chance. And second, no real evidence for it exists and we have no experiment that could currently test for it.
  • physicists are still in the dark. We can see vague outlines ahead of us but no one knows what form they will take when we reach them.
Javier E

Nobel Prize in Physics Is Awarded to 3 Scientists for Work Exploring Quantum Weirdness ... - 0 views

  • “We’re used to thinking that information about an object — say that a glass is half full — is somehow contained within the object.” Instead, he says, entanglement means objects “only exist in relation to other objects, and moreover these relationships are encoded in a wave function that stands outside the tangible physical universe.”
  • Einstein, though one of the founders of quantum theory, rejected it, saying famously that God did not play dice with the universe. In a 1935 paper written with Boris Podolsky and Nathan Rosen, he tried to demolish quantum mechanics as an incomplete theory by pointing out that by quantum rules, measuring a particle in one place could instantly affect measurements of the other particle, even if it was millions of miles away.
  • Dr. Clauser, who has a knack for electronics and experimentation and misgivings about quantum theory, was the first to perform Bell’s proposed experiment. He happened upon Dr. Bell’s paper while a graduate student at Columbia University and recognized it as something he could do.
  • ...13 more annotations...
  • In 1972, using duct tape and spare parts in the basement on the campus of the University of California, Berkeley, Dr. Clauser and a graduate student, Stuart Freedman, who died in 2012, endeavored to perform Bell’s experiment to measure quantum entanglement. In a series of experiments, he fired thousands of light particles, or photons, in opposite directions to measure a property known as polarization, which could have only two values — up or down. The result for each detector was always a series of seemingly random ups and downs. But when the two detectors’ results were compared, the ups and downs matched in ways that neither “classical physics” nor Einstein’s laws could explain. Something weird was afoot in the universe. Entanglement seemed to be real.
  • in 2002, Dr. Clauser admitted that he himself had expected quantum mechanics to be wrong and Einstein to be right. “Obviously, we got the ‘wrong’ result. I had no choice but to report what we saw, you know, ‘Here’s the result.’ But it contradicts what I believed in my gut has to be true.” He added, “I hoped we would overthrow quantum mechanics. Everyone else thought, ‘John, you’re totally nuts.’”
  • the correlations only showed up after the measurements of the individual particles, when the physicists compared their results after the fact. Entanglement seemed real, but it could not be used to communicate information faster than the speed of light.
  • In 1982, Dr. Aspect and his team at the University of Paris tried to outfox Dr. Clauser’s loophole by switching the direction along which the photons’ polarizations were measured every 10 nanoseconds, while the photons were already in the air and moving too fast to communicate with each other. He, too, was expecting Einstein to be right.
  • Quantum predictions held true, but there were still more possible loopholes in the Bell experiment that Dr. Clauser had identified
  • For example, the polarization directions in Dr. Aspect’s experiment had been changed in a regular and thus theoretically predictable fashion that could be sensed by the photons or detectors.
  • Anton Zeilinger
  • added even more randomness to the Bell experiment, using random number generators to change the direction of the polarization measurements while the entangled particles were in flight.
  • Once again, quantum mechanics beat Einstein by an overwhelming margin, closing the “locality” loophole.
  • as scientists have done more experiments with entangled particles, entanglement has come to be accepted as one of the main features of quantum mechanics and is being put to work in cryptology, quantum computing and an upcoming “quantum internet”
  • One of its first successes in cryptology is sending messages using entangled pairs, which can carry cryptographic keys in a secure manner — any eavesdropping will destroy the entanglement, alerting the receiver that something is wrong.
  • With quantum mechanics, just because we can use it doesn’t mean our ape brains understand it. The pioneering quantum physicist Niels Bohr once said that anyone who didn’t think quantum mechanics was outrageous hadn’t understood what was being said.
  • In his interview with A.I.P., Dr. Clauser said, “I confess even to this day that I still don’t understand quantum mechanics, and I’m not even sure I really know how to use it all that well. And a lot of this has to do with the fact that I still don’t understand it.”
Javier E

Cognitive Biases and the Human Brain - The Atlantic - 1 views

  • Present bias shows up not just in experiments, of course, but in the real world. Especially in the United States, people egregiously undersave for retirement—even when they make enough money to not spend their whole paycheck on expenses, and even when they work for a company that will kick in additional funds to retirement plans when they contribute.
  • When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”)
  • If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view
  • ...48 more annotations...
  • Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
  • The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman
  • Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman
  • Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions.
  • In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.
  • “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”
  • that’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”
  • At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception
  • Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length.
  • Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves
  • Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative.
  • what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?
  • Over an apple pastry and tea with milk, he told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”
  • I met with Kahneman
  • “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.
  • In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments tantamount to the Müller-Lyer line illusion
  • The most effective check against them, as Kahneman says, is from the outside: Others can perceive our errors more readily than we can.
  • “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems,”
  • A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.
  • “My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition.
  • Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.
  • Kahneman describes an even earlier Nisbett article that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)
  • over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy.
  • Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.
  • When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
  • about half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable. (A quick simulation after this list illustrates the point.)
  • When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high.
  • we’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”
  • “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”
  • Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”
  • Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.
  • The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.
  • When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads
  • It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases
  • I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”
  • Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases
  • For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”
  • The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”
  • Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits
  • One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics
  • In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (IARPA), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, IARPA initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).
  • most promising are a handful of video games. Their genesis was in the Iraq War
  • Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.
  • “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”
  • He said he saw the results as supporting the research and insights of Richard Nisbett. “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias.”
  • even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”
  • Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.
  • In the future, I will monitor my thoughts and reactions as best I can
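A side note on the batting-average example above: the law-of-large-numbers point is easy to check with a toy simulation. The sketch below is a plain-Python illustration under assumed numbers (a hypothetical hitter whose true average is .300, and illustrative at-bat counts); it is not drawn from Nisbett's survey or data, only meant to show how much more often extreme averages appear in small samples.

    import random

    def share_hitting_450(true_avg, at_bats, trials=20_000):
        """Fraction of simulated players whose observed average reaches .450 or better."""
        count = 0
        for _ in range(trials):
            # Each at bat is a hit with probability true_avg.
            hits = sum(random.random() < true_avg for _ in range(at_bats))
            if hits / at_bats >= 0.450:
                count += 1
        return count / trials

    if __name__ == "__main__":
        random.seed(0)
        for n in (20, 100, 500):  # early-season vs. full-season at bats (assumed counts)
            print(f"{n:>3} at bats: {share_hitting_450(0.300, n):.2%} reach .450 or better")

Run as written, the small-sample case produces a .450-or-better average fairly often, while the full-season case essentially never does, which is the regression-to-the-mean answer the Michigan students were asked to supply.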
Javier E

Over the Side With Old Scientific Tenets - NYTimes.com - 0 views

  • Here are some concepts you might consider tossing out with the Christmas wrappings as you get started on the new year: human nature, cause and effect, the theory of everything, free will and evidence-based medicine.
  • Frank Wilczek of M.I.T., a Nobel Prize winner in physics, would retire the distinction between mind and matter, a bedrock notion, at least in the West, since the time of Descartes. We know a lot more about matter and atoms now, Dr. Wilczek says, and about the brain. Matter, he says, “can dance in intricate, dynamic patterns; it can exploit environmental resources, to self-organize and export entropy.”
Javier E

Owner of a Credit Card Processor Is Setting a New Minimum Wage: $70,000 a Year - NYTime... - 1 views

  • Mr. Price surprised his 120-person staff by announcing that he planned over the next three years to raise the salary of even the lowest-paid clerk, customer service representative and salesman to a minimum of $70,000.
  • Mr. Price, who started the Seattle-based credit-card payment processing firm in 2004 at the age of 19, said he would pay for the wage increases by cutting his own salary from nearly $1 million to $70,000 and using 75 to 80 percent of the company’s anticipated $2.2 million in profit this year.
  • his unusual proposal does speak to an economic issue that has captured national attention: The disparity between the soaring pay of chief executives and that of their employees.
  • ...7 more annotations...
  • The United States has one of the world’s largest pay gaps, with chief executives earning nearly 300 times what the average worker makes, according to some economists’ estimates. That is much higher than the 20-to-1 ratio recommended by Gilded Age magnates like J. Pierpont Morgan and the 20th century management visionary Peter Drucker.
  • “The market rate for me as a C.E.O. compared to a regular person is ridiculous, it’s absurd,” said Mr. Price, who said his main extravagances were snowboarding and picking up the bar bill. He drives a 12-year-old Audi
  • Under a financial overhaul passed by Congress in 2010, the Securities and Exchange Commission was supposed to require all publicly held companies to disclose the ratio of C.E.O. pay to the median pay of all other employees, but it has so far failed to put it in effect. Corporate executives have vigorously opposed the idea, complaining it would be cumbersome and costly to implement.
  • Of all the social issues that he felt he was in a position to do something about as a business leader, “that one seemed like a more worthy issue to go after.”
  • The happiness research behind Mr. Price’s announcement on Monday came from Angus Deaton and Daniel Kahneman, a Nobel Prize-winning psychologist. They found that what they called emotional well-being — defined as “the emotional quality of an individual’s everyday experience, the frequency and intensity of experiences of joy, stress, sadness, anger, and affection that make one’s life pleasant or unpleasant” — rises with income, but only to a point. And that point turns out to be about $75,000 a year.
  • Of course, money above that level brings pleasures — there’s no denying the delights of a Caribbean cruise or a pair of diamond earrings — but no further gains on the emotional well-being scale.
  • As Mr. Kahneman has explained it, income above the threshold doesn’t buy happiness, but a lack of money can deprive you of it.
Javier E

A Billionaire Mathematician's Life of Ferocious Curiosity - The New York Times - 0 views

  • James H. Simons likes to play against type. He is a billionaire star of mathematics and private investment who often wins praise for his financial gifts to scientific research and programs to get children hooked on math. But in his Manhattan office, high atop a Fifth Avenue building in the Flatiron district, he’s quick to tell of his career failings. He was forgetful. He was demoted. He found out the hard way that he was terrible at programming computers. “I’d keep forgetting the notation,” Dr. Simons said. “I couldn’t write programs to save my life.” After that, he was fired. His message is clearly aimed at young people: If I can do it, so can you.
  • Down one floor from his office complex is Math for America, a foundation he set up to promote math teaching in public schools. Nearby, on Madison Square Park, is the National Museum of Mathematics, or MoMath, an educational center he helped finance. It opened in 2012 and has had a quarter million visitors.
  • Dr. Simons received his doctorate at 23; advanced code breaking for the National Security Agency at 26; led a university math department at 30; won geometry’s top prize at 37; founded Renaissance Technologies, one of the world’s most successful hedge funds, at 44; and began setting up charitable foundations at 56.
  • ...7 more annotations...
  • With a fortune estimated at $12.5 billion, Dr. Simons now runs a tidy universe of science endeavors, financing not only math teachers but hundreds of the world’s best investigators, even as Washington has reduced its support for scientific research. His favorite topics include gene puzzles, the origins of life, the roots of autism, math and computer frontiers, basic physics and the structure of the early cosmos.
  • In time, his novel approach helped change how the investment world looks at financial markets. The man who “couldn’t write programs” hired a lot of programmers, as well as physicists, cryptographers, computational linguists, and, oh yes, mathematicians. Wall Street experience was frowned on. A flair for science was prized. The techies gathered financial data and used complex formulas to make predictions and trade in global markets.
  • Working closely with his wife, Marilyn, the president of the Simons Foundation and an economist credited with philanthropic savvy, Dr. Simons has pumped more than $1 billion into esoteric projects as well as retail offerings like the World Science Festival and a scientific lecture series at his Fifth Avenue building. Characteristically, it is open to the public.
  • On a wall in Dr. Simons’s office is one of his prides: a framed picture of equations known as Chern-Simons, after a paper he wrote with Shiing-Shen Chern, a prominent geometer. Four decades later, the equations define many esoteric aspects of modern physics, including advanced theories of how invisible fields like those of gravity interact with matter to produce everything from superstrings to black holes.
  • “He’s an individual of enormous talent and accomplishment, yet he’s completely unpretentious,” said Marc Tessier-Lavigne, a neuroscientist who is the president of Rockefeller University. “He manages to blend all these admirable qualities.”
  • Forbes magazine ranks him as the world’s 93rd richest person — ahead of Eric Schmidt of Google and Elon Musk of Tesla Motors, among others — and in 2010, he and his wife were among the first billionaires to sign the Giving Pledge, promising to devote “the great majority” of their wealth to philanthropy.
  • For all his self-deprecations, Dr. Simons does credit himself with a contemplative quality that seems to lie behind many of his accomplishments. “I wasn’t the fastest guy in the world,” Dr. Simons said of his youthful math enthusiasms. “I wouldn’t have done well in an Olympiad or a math contest. But I like to ponder. And pondering things, just sort of thinking about it and thinking about it, turns out to be a pretty good approach.”
Javier E

A Romp Through Theories on Origins of Life - NYTimes.com - 0 views

  • they debated the definition of life — “anything highly statistically improbable, but in a particular direction,” in the words of Richard Dawkins, the evolutionary biologist at Oxford. Or, they wondered if it could be defined at all in the absence of a second example to the Earth’s biosphere — a web of interdependence all based on DNA.
  • The rapid appearance of complex life in some accounts — “like Athena springing from the head of Zeus,” in the words of Dr. McKay — has rekindled interest recently in a theory fancied by Francis Crick, one of the discoverers of the double helix, that life originated elsewhere and floated here through space. These days the favorite candidate for such an extraterrestrial cradle is Mars
  • “If you want to think of it that way, life is a very simple process,” said Sidney Altman, who shared a Nobel Prize in 1989 for showing that RNA had these dual abilities. “It uses energy, it sustains itself and it replicates.” One lesson of the meeting was how finicky are the chemical reactions needed for carrying out these simple-sounding functions. “There might be a reason why amino acids and nucleotides are the way they are,”
carolinewren

Laser-Controlled And See-Through Brains Get Biomedical Prize | Popular Science - 0 views

  • The mouse brain above has undergone a process called CLARITY
  • Through a series of chemical reactions, CLARITY stabilizes organs taken from an animal or human and makes them transparent to the naked eye.
  • allows scientists to look into organs in a whole new way.
  • ...3 more annotations...
  • The rodent at the top of this story is being studied with a technique called optogenetics, which Deisseroth pioneered.
  • genetically engineered the mouse so that its brain cells turn certain genes on or off when scientists shine laser light onto them. The light enters the mouse's brain through that optical fiber you see in the photo.
  • For example, say 20 percent of people with autism don't have Gene A, but scientists aren't sure what Gene A does. They could turn off Gene A in a mouse's brain and see what happens next. The mouse's reaction could provide a clue about what Gene A does in people and why it's missing in certain patients
Javier E

Geology's Timekeepers Are Feuding - The Atlantic - 0 views

  • In 2000, the Nobel Prize-winning chemist Paul Crutzen won permanent fame for stratigraphy. He proposed that humans had so thoroughly altered the fundamental processes of the planet—through agriculture, climate change, nuclear testing, and other phenomena—that a new geological epoch had commenced: the Anthropocene, the age of humans.
  • Zalasiewicz should know. He is the chair of the Anthropocene working group, which the ICS established in 2009 to investigate whether the new epoch deserved a place in stratigraphic time.
  • In 2015, the group announced that the Anthropocene was a plausible new layer and that it should likely follow the Holocene. But the team has yet to propose a “golden spike” for the epoch: a boundary in the sedimentary rock record where the Anthropocene clearly begins.
  • ...12 more annotations...
  • Officially, the Holocene is still running today. You have lived your entire life in the Holocene, and the Holocene has constituted the geological “present” for as long as there have been geologists.But if we now live in a new epoch, the Anthropocene, then the ICS will have to chop the Holocene somewhere. It will have to choose when the Holocene ended, and it will move some amount of time out of the purview of the Holocene working group and into that of the Anthropocene working group.
  • This is politically difficult. And right now, the Anthropocene working group seems intent on not carving too deep into the Holocene. In a paper published earlier this year in Earth-Science Reviews, the Anthropocene working group’s members strongly imply that they will propose starting the new epoch in the mid-20th century.
  • Some geologists argue that the Anthropocene started even earlier: perhaps 4,000 or 6,000 years ago, as farmers began to remake the land surface.“Most of the world’s forests that were going to be converted to cropland and agriculture were already cleared well before 1950,” says Bill Ruddiman, a geology professor at the University of Virginia and an advocate of this extremely early Anthropocene.
  • “Most of the world’s prairies and steppes that were going to be cleared for crops were already gone by then. How can you argue the Anthropocene started in 1950 when all of the major things that affect Earth’s surface were already over?” Van der Pluijm agreed that the Anthropocene working group was picking 1950 for “not very good reasons.” “Agriculture was the revolution that allowed society to develop,” he said. “That was really when people started to force the land to work for them. That massive land movement—it’s like a landslide, except it’s a humanslide. And it is not, of course, as dramatic as today’s motion of land, but it starts the clock.”
  • This muddle had to stop. The Holocene comes up constantly in discussions of modern global warming. Geologists and climate scientists did not make their jobs any easier by slicing it in different ways and telling contradictory stories about it.
  • This process started almost 10 years ago. For this reason, Zalasiewicz, the chair of the Anthropocene working group, said he wasn’t blindsided by the new subdivisions at all. In fact, he voted to adopt them as a member of the Quaternary working group. “Whether the Anthropocene works with a unified Holocene or one that’s in three parts makes for very little difference,” he told me. In fact, it had made the Anthropocene group’s work easier. “It has been useful to compare the scale of the two climate events that mark the new boundaries [within the Holocene] with the kind of changes that we’re assessing in the Anthropocene. It has been quite useful to have the compare and contrast,” he said. “Our view is that some of the changes in the Anthropocene are rather bigger.”
  • Zalasiewicz said that he and his colleagues were going as fast as they could. When the working group began its work in 2009, it was “really starting from scratch,” he told me. While other working groups have a large body of stratigraphic research to consider, the Anthropocene working group had nothing. “We had to spend a fair bit of time deciding whether the Anthropocene was geology at all,” he said. Then they had to decide where its signal could show up. Now, they’re looking for evidence that shows it.
  • This cycle of “glacials” and “interglacials” has played out about 50 times over the last several million years. When the Holocene began, it was only another interglacial—albeit the one we live in. Until recently, glaciers were still on schedule to descend in another 30,000 years or so. Yet geologists still call the Holocene an epoch, even though they do not bestow this term on any of the previous 49 interglacials. It gets special treatment because we live in it.
  • Much of this science is now moot. Humanity’s vast emissions of greenhouse gas have now so warmed the climate that they have offset the next glaciation. They may even knock us out of the ongoing cycle of Ice Ages, sending the Earth hurtling back toward a “greenhouse” climate after the more amenable “icehouse” climate during which humans evolved. For this reason, van der Pluijm wants the Anthropocene to supplant the Holocene entirely. Humans made their first great change to the environment at the close of the last glaciation, when they seem to have hunted the world’s largest mammals—the woolly mammoth, the saber-toothed tiger—to extinction. Why not start the Anthropocene then? He would even rename the pre-1800 period “the Holocene Age” as a consolation prize.
  • Zalasiewicz said he would not start the Anthropocene too early in time, as it would be too work-intensive for the field to rename such a vast swath of time. “The early-Anthropocene idea would crosscut against the Holocene as it’s seen by Holocene workers,” he said. If other academics didn’t like this, they could create their own timescales and start the Anthropocene Epoch where they choose. “We have no jurisdiction over the word Anthropocene,” he said.
  • Ruddiman, the University of Virginia professor who first argued for a very early Anthropocene, now makes an even broader case. He’s not sure it makes sense to formally define the Anthropocene at all. In a paper published this week, he objects to designating the Anthropocene as starting in the 1950s—and then he objects to delineating the Anthropocene, or indeed any new geological epoch, by name. “Keep the use of the term informal,” he told me. “Don’t make it rigid. Keep it informal so people can say the early-agricultural Anthropocene, or the industrial-era Anthropocene.”
  • “This is the age of geochemical dating,” he said. Geologists have stopped looking to the ICS to place each rock sample into the rock sequence. Instead, field geologists use laboratory techniques to get a precise year or century of origin for each rock sample. “The community just doesn’t care about these definitions,” he said.
Javier E

Opinion | The 1619 Chronicles - The New York Times - 0 views

  • The 1619 Project introduced a date, previously obscure to most Americans, that ought always to have been thought of as seminal — and probably now will. It offered fresh reminders of the extent to which Black freedom was a victory gained by courageous Black Americans, and not just a gift obtained from benevolent whites.
  • in a point missed by many of the 1619 Project’s critics, it does not reject American values. As Nikole Hannah-Jones, its creator and leading voice, concluded in her essay for the project, “I wish, now, that I could go back to the younger me and tell her that her people’s ancestry started here, on these lands, and to boldly, proudly, draw the stars and those stripes of the American flag.” It’s an unabashedly patriotic thought.
  • ambition can be double-edged. Journalists are, most often, in the business of writing the first rough draft of history, not trying to have the last word on it. We are best when we try to tell truths with a lowercase t, following evidence in directions unseen, not the capital-T truth of a pre-established narrative in which inconvenient facts get discarded
  • ...25 more annotations...
  • on these points — and for all of its virtues, buzz, spinoffs and a Pulitzer Prize — the 1619 Project has failed.
  • That doesn’t mean that the project seeks to erase the Declaration of Independence from history. But it does mean that it seeks to dethrone the Fourth of July by treating American history as a story of Black struggle against white supremacy — of which the Declaration is, for all of its high-flown rhetoric, supposed to be merely a part.
  • The deleted assertions went to the core of the project’s most controversial goal, “to reframe American history by considering what it would mean to regard 1619 as our nation’s birth year.”
  • She then challenged me to find any instance in which the project stated that “using 1776 as our country’s birth date is wrong,” that it “should not be taught to schoolchildren,” and that the only one “that should be taught” was 1619. “Good luck unearthing any of us arguing that,” she added.
  • I emailed her to ask if she could point to any instances before this controversy in which she had acknowledged that her claims about 1619 as “our true founding” had been merely metaphorical. Her answer was that the idea of treating the 1619 date metaphorically should have been so obvious that it went without saying.
  • Here is an excerpt from the introductory essay to the project by The New York Times Magazine’s editor, Jake Silverstein, as it appeared in print in August 2019 (italics added):
  • “1619. It is not a year that most Americans know as a notable date in our country’s history. Those who do are at most a tiny fraction of those who can tell you that 1776 is the year of our nation’s birth. What if, however, we were to tell you that this fact, which is taught in our schools and unanimously celebrated every Fourth of July, is wrong, and that the country’s true birth date, the moment that its defining contradictions first came into the world, was in late August of 1619?”
  • In his introduction, Silverstein argues that America’s “defining contradictions” were born in August 1619, when a ship carrying 20 to 30 enslaved Africans from what is present-day Angola arrived in Point Comfort, in the English colony of Virginia. And the title page of Hannah-Jones’s essay for the project insists that “our founding ideals of liberty and equality were false when they were written.”
  • What was surprising was that in 1776 a politically formidable “defining contradiction” — “that all men are created equal” — came into existence through the Declaration of Independence. As Abraham Lincoln wrote in 1859, that foundational document would forever serve as a “rebuke and stumbling block to the very harbingers of reappearing tyranny and oppression.”
  • As for the notion that the Declaration’s principles were “false” in 1776, ideals aren’t false merely because they are unrealized, much less because many of the men who championed them, and the nation they created, hypocritically failed to live up to them.
  • These two flaws led to a third, conceptual, error. “Out of slavery — and the anti-Black racism it required — grew nearly everything that has truly made America exceptional,” writes Silverstein.
  • Nearly everything? What about, say, the ideas contained by the First Amendment? Or the spirit of openness that brought millions of immigrants through places like Ellis Island? Or the enlightened worldview of the Marshall Plan and the Berlin airlift? Or the spirit of scientific genius and discovery exemplified by the polio vaccine and the moon landing?
  • On the opposite side of the moral ledger, to what extent does anti-Black racism figure in American disgraces such as the brutalization of Native Americans, the Chinese Exclusion Act or the internment of Japanese-Americans in World War II?
  • The world is complex. So are people and their motives. The job of journalism is to take account of that complexity, not simplify it out of existence through the adoption of some ideological orthodoxy.
  • This mistake goes far to explain the 1619 Project’s subsequent scholarly and journalistic entanglements. It should have been enough to make strong yet nuanced claims about the role of slavery and racism in American history. Instead, it issued categorical and totalizing assertions that are difficult to defend on close examination.
  • It should have been enough for the project to serve as curator for a range of erudite and interesting voices, with ample room for contrary takes. Instead, virtually every writer in the project seems to sing from the same song sheet, alienating other potential supporters of the project and polarizing national debate.
  • James McPherson, the Pulitzer Prize-winning author of “Battle Cry of Freedom” and a past president of the American Historical Association. He was withering: “Almost from the outset,” McPherson told the World Socialist Web Site, “I was disturbed by what seemed like a very unbalanced, one-sided account, which lacked context and perspective.”
  • In particular, McPherson objected to Hannah-Jones’s suggestion that the struggle against slavery and racism and for civil rights and democracy was, if not exclusively then mostly, a Black one. As she wrote in her essay: “The truth is that as much democracy as this nation has today, it has been borne on the backs of Black resistance.”
  • McPherson demurs: “From the Quakers in the 18th century, on through the abolitionists in the antebellum, to the Radical Republicans in the Civil War and Reconstruction, to the N.A.A.C.P., which was an interracial organization founded in 1909, down through the civil rights movements of the 1950s and 1960s, there have been a lot of whites who have fought against slavery and racial discrimination, and against racism,” he said. “And that’s what’s missing from this perspective.”
  • Wilentz’s catalog of the project’s mistakes is extensive. Hannah-Jones’s essay claimed that by 1776 Britain was “deeply conflicted” over its role in slavery. But despite the landmark Somerset v. Stewart court ruling in 1772, which held that slavery was not supported by English common law, it remained deeply embedded in the practices of the British Empire. The essay claimed that, among Londoners, “there were growing calls to abolish the slave trade” by 1776. But the movement to abolish the British slave trade only began about a decade later — inspired, in part, Wilentz notes, by American antislavery agitation that had started in the 1760s and 1770s.
  • Leslie M. Harris, an expert on pre-Civil War African-American life and slavery. “On Aug. 19 of last year,” Harris wrote, “I listened in stunned silence as Nikole Hannah-Jones … repeated an idea that I had vigorously argued against with her fact checker: that the patriots fought the American Revolution in large part to preserve slavery in North America.”
  • The larger problem is that The Times’s editors, however much background reading they might have done, are not in a position to adjudicate historical disputes. That should have been an additional reason for the 1619 Project to seek input from, and include contributions by, an intellectually diverse range of scholarly voices. Yet not only does the project choose a side, it also brooks no doubt.
  • “It is finally time to tell our story truthfully,” the magazine declares on its 1619 cover page. Finally? Truthfully? Is The Times suggesting that distinguished historians, like the ones who have seriously disputed aspects of the project, had previously been telling half-truths or falsehoods?
  • unlike other dates, 1776 uniquely marries letter and spirit, politics and principle: The declaration that something new is born, combined with the expression of an ideal that — because we continue to believe in it even as we struggle to live up to it — binds us to the date.
  • On the other, the 1619 Project has become, partly by its design and partly because of avoidable mistakes, a focal point of the kind of intense national debate that columnists are supposed to cover, and that is being widely written about outside The Times. To avoid writing about it on account of the first scruple is to be derelict in our responsibility toward the second.
caelengrubb

The World's Most Efficient Languages - The Atlantic - 0 views

  • But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together.
  • Other languages occupy still other places on the linguistic axis of “busyness,” from prolix to laconic, and it’s surprising what a language can do without.
  • Moreover, anyone who has sampled Chinese, or Persian, or Finnish, knows that a language can get along just fine with the same word for “he” and “she.”
  • ...6 more annotations...
  • If there were a prize for the busiest language, then a language like Kabardian, also known as Circassian and spoken in the Caucasus, would win
  • The prize for most economical language could go to certain colloquial dialects of Indonesian that are rarely written but represent the daily reality of Indonesian in millions of mouths
  • Experiments have shown that this is often true to a faint, flickering degree a psychologist can detect in the artifice of experimental conditions
  • In a language where final sounds take the accent, such sounds tend to hold on longer because they are so loud and clear—you’re less likely to mumble them and people listening are more likely to hear them
  • When a language seems especially telegraphic, usually another factor has come into play: Enough adults learned it at a certain stage in its history that, given the difficulty of learning a new language after childhood, it became a kind of stripped-down “schoolroom” version of itself
  • Even if languages’ differences in busyness can’t be taken as windows on psychological alertness, the differences remain awesome
manhefnawi

An Axiom of Feeling: Werner Herzog on the Absolute, the Sublime, and Ecstatic Truth - B... - 0 views

  • “The soul of the listener or the spectator… actualizes truth through the experience of sublimity: that is, it completes an independent act of creation.”
  • Nietzsche defined truth as “a movable host of metaphors, metonymies, and anthropomorphisms: in short, a sum of human relations which have been poetically and rhetorically intensified, transferred, and embellished.” Truth, of course, is not reality but a subset of reality, alongside the catalogue of fact and the question of meaning, inside which human consciousness dwells. “Only art penetrates … the seeming realities of this world,” Saul Bellow asserted in his superb Nobel Prize acceptance speech. “There is another reality, the genuine one, which we lose sight of. This other reality is always sending us hints, which without art, we can’t receive.”
Javier E

Ian Hacking, Eminent Philosopher of Science and Much Else, Dies at 87 - The New York Times - 0 views

  • In an academic career that included more than two decades as a professor in the philosophy department of the University of Toronto, following appointments at Cambridge and Stanford, Professor Hacking’s intellectual scope seemed to know no bounds. Because of his ability to span multiple academic fields, he was often described as a bridge builder.
  • “Ian Hacking was a one-person interdisciplinary department all by himself,” Cheryl Misak, a philosophy professor at the University of Toronto, said in a phone interview. “Anthropologists, sociologists, historians and psychologists, as well as those working on probability theory and physics, took him to have important insights for their disciplines.”
  • Professor Hacking wrote several landmark works on the philosophy and history of probability, including “The Taming of Chance” (1990), which was named one of the best 100 nonfiction books of the 20th century by the Modern Library.
  • ...17 more annotations...
  • “I have long been interested in classifications of people, in how they affect the people classified, and how the effects on the people in turn change the classifications,” he wrote in “Making Up People
  • His work in the philosophy of science was groundbreaking: He departed from the preoccupation with questions that had long concerned philosophers. Arguing that science was just as much about intervention as it was about representation, he helped bring experimentation to center stage.
  • Regarding one such question — whether unseen phenomena like quarks and electrons were real or merely the theoretical constructs of physicists — he argued for reality in the case of phenomena that figured in experiments, citing as an example an experiment at Stanford that involved spraying electrons and positrons into a ball of niobium to detect electric charges. “So far as I am concerned,” he wrote, “if you can spray them, they’re real.”
  • His book “The Emergence of Probability” (1975), which is said to have inspired hundreds of books by other scholars, examined how concepts of statistical probability have evolved over time, shaping the way we understand not just arcane fields like quantum physics but also everyday life.
  • “I was trying to understand what happened a few hundred years ago that made it possible for our world to be dominated by probabilities,” he said in a 2012 interview with the journal Public Culture. “We now live in a universe of chance, and everything we do — health, sports, sex, molecules, the climate — takes place within a discourse of probabilities.”
  • Whatever the subject, whatever the audience, one idea that pervades all his work is that “science is a human enterprise,” Ragnar Fjelland and Roger Strand of the University of Bergen in Norway wrote when Professor Hacking won the Holberg Prize. “It is always created in a historical situation, and to understand why present science is as it is, it is not sufficient to know that it is ‘true,’ or confirmed. We have to know the historical context of its emergence.”
  • Hacking often argued that as the human sciences have evolved, they have created categories of people, and that people have subsequently defined themselves as falling into those categories. Thus does human reality become socially constructed.
  • In 2000, he became the first Anglophone to win a permanent position at the Collège de France, where he held the chair in the philosophy and history of scientific concepts until he retired in 2006.
  • “I call this the ‘looping effect,’” he added. “Sometimes, our sciences create kinds of people that in a certain sense did not exist before.”
  • In “Why Race Still Matters,” a 2005 article in the journal Daedalus, he explored how anthropologists developed racial categories by extrapolating from superficial physical characteristics, with lasting effects — including racial oppression. “Classification and judgment are seldom separable,” he wrote. “Racial classification is evaluation.”
  • Similarly, he once wrote, in the field of mental health the word “normal” “uses a power as old as Aristotle to bridge the fact/value distinction, whispering in your ear that what is normal is also right.”
  • In his influential writings about autism, Professor Hacking charted the evolution of the diagnosis and its profound effects on those diagnosed, which in turn broadened the definition to include a greater number of people.
  • Encouraging children with autism to think of themselves that way “can separate the child from ‘normalcy’ in a way that is not appropriate,” he told Public Culture. “By all means encourage the oddities. By no means criticize the oddities.”
  • His emphasis on historical context also illuminated what he called transient mental illnesses, which appear to be so confined to their time that they can vanish when times change.
  • “hysterical fugue” was a short-lived epidemic of compulsive wandering that emerged in Europe in the 1880s, largely among middle-class men who had become transfixed by stories of exotic locales and the lure of travel
  • His intellectual tendencies were unmistakable from an early age. “When he was 3 or 4 years old, he would sit and read the dictionary,” Jane Hacking said. “His parents were completely baffled.”
  • He wondered aloud, the interviewer noted, if the whole universe was governed by nonlocality — if “everything in the universe is aware of everything else.” “That’s what you should be writing about,” he said. “Not me. I’m a dilettante. My governing word is ‘curiosity.’”
Javier E

Opinion | A Nobel Prize for the Economics of Panic - The New York Times - 0 views

  • Obviously, Bernanke, Diamond and Dybvig weren’t the first economists to notice that bank runs happen
  • Diamond and Dybvig provided the first really clear analysis of why they happen — and why, destructive as they are, they can represent rational behavior on the part of bank depositors. Their analysis was also full of implications for financial policy.
  • Bernanke provided evidence on why bank runs matter and, although he avoided saying so directly, why Milton Friedman was wrong about the causes of the Great Depression.
  • Diamond and Dybvig offered a stylized but insightful model of what banks do. They argued that there is always a tension between individuals’ desire for liquidity — ready access to funds — and the economy’s need to make long-term investments that can’t easily be converted into cash.
  • Banks square that circle by taking money from depositors who can withdraw their funds at will — making those deposits highly liquid — and investing most of that money in illiquid assets, such as business loans.
  • So banking is a productive activity that makes the economy richer by reconciling otherwise incompatible desires for liquidity and productive investment. And it normally works because only a fraction of a bank’s depositors want to withdraw their funds at any given time.
  • This does, however, make banks vulnerable to runs. Suppose that for some reason many depositors come to believe that many other depositors are about to cash out, and try to beat the pack by withdrawing their own funds. To meet these demands for liquidity, a bank will have to sell off its illiquid assets at fire sale prices, and doing so can drive an institution that should be solvent into bankruptcy
  • If that happens, people who didn’t withdraw their funds will be left with nothing. So during a panic, the rational thing to do is to panic along with everyone else. (A toy numerical version of this run logic is sketched just after this list.)
  • There was, of course, a huge wave of banking panics in 1930-31. Many banks failed, and those that survived made far fewer business loans than before, holding cash instead, while many families shunned banks altogether, putting their cash in safes or under their mattresses. The result was a diversion of wealth into unproductive uses. In his 1983 paper, Bernanke offered evidence that this diversion played a large role in driving the economy into a depression and held back the subsequent recovery.
  • In the story told by Friedman and Anna Schwartz, the banking crisis of the early 1930s was damaging because it led to a fall in the money supply — currency plus bank deposits. Bernanke asserted that this was at most only part of the story.
  • a government backstop — either deposit insurance, the willingness of the central bank to lend money to troubled banks or both — can short-circuit potential crises.
  • But providing such a backstop raises the possibility of abuse; banks may take on undue risks because they know they’ll be bailed out if things go wrong.
  • So banks need to be regulated as well as backstopped. As I said, the Diamond-Dybvig analysis had remarkably large implications for policy.
  • From an economic point of view, banking is any form of financial intermediation that offers people seemingly liquid assets while using their wealth to make illiquid investments.
  • This insight was dramatically validated in the 2008 financial crisis.
  • By the eve of the crisis, however, the financial system relied heavily on “shadow banking” — banklike activities that didn’t involve standard bank deposits
  • Such arrangements offered a higher yield than conventional deposits. But they had no safety net, which opened the door to an old-style bank run and financial panic.
  • And the panic came. The conventionally measured money supply didn’t plunge in 2008 the way it did in the 1930s — but repo and other money-like liabilities of financial intermediaries did.
  • Fortunately, by then Bernanke was chair of the Federal Reserve. He understood what was going on, and the Fed stepped in on an immense scale to prop up the financial system.
  • a sort of meta point about the Diamond-Dybvig work: Once you’ve understood and acknowledged the possibility of self-fulfilling banking crises, you become aware that similar things can happen elsewhere.
  • Perhaps the most notable case in relatively recent times was the euro crisis of 2010-12. Market confidence in the economies of southern Europe collapsed, leading to huge spreads between the interest rates on, for example, Portuguese bonds and those on German bonds. The conventional wisdom at the time — especially in Germany — was that countries were being justifiably punished for taking on excessive debt
  • the Belgian economist Paul De Grauwe argued that what was actually happening was a self-fulfilling panic — basically a run on the bonds of countries that couldn’t provide a backstop because they no longer had their own currencies.
  • Sure enough, when Mario Draghi, the president of the European Central Bank at the time, finally did provide a backstop in 2012 — he said the magic words “whatever it takes,” implying that the bank would lend money to the troubled governments if necessary — the spreads collapsed and the crisis came to an end.
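The self-fulfilling run mechanism described in the annotations above lends itself to a back-of-the-envelope illustration. The Python sketch below is not the Diamond-Dybvig model itself, only a toy calculation in its spirit: every number in it (deposit size, reserve fraction, long-run return, fire-sale discount) is an invented assumption, chosen to show that the same bank can support two outcomes, one in which nobody runs and one in which running is the rational choice.

```python
# Toy illustration of a self-fulfilling bank run, in the spirit of the
# Diamond-Dybvig setup described above. All parameters are invented for
# illustration; this is not the authors' formal model.

N_DEPOSITORS = 100      # each depositor puts in 1 unit
RESERVE = 0.2           # fraction of deposits kept as cash
LONG_RETURN = 1.5       # payoff per unit of the illiquid asset if held to maturity
FIRE_SALE = 0.6         # payoff per unit if the asset is liquidated early

def payoff_if_waiting(expected_early_withdrawals: int) -> float:
    """Payoff to a depositor who waits, given how many others run."""
    cash = N_DEPOSITORS * RESERVE
    illiquid = N_DEPOSITORS * (1 - RESERVE)
    demand = expected_early_withdrawals          # each runner wants 1 unit now
    patient = N_DEPOSITORS - demand
    if demand <= cash:
        # No fire sale needed: the asset matures and patient depositors share it.
        return (cash - demand + illiquid * LONG_RETURN) / max(patient, 1)
    # Fire sale: illiquid assets are dumped at a loss to meet withdrawals.
    shortfall = demand - cash
    assets_sold = min(illiquid, shortfall / FIRE_SALE)
    remaining = illiquid - assets_sold
    if remaining <= 0:
        return 0.0                               # bank fails; those who waited get nothing
    return remaining * LONG_RETURN / max(patient, 1)

for expected_runners in (5, 60, 95):
    wait = payoff_if_waiting(expected_runners)
    withdraw = 1.0   # simplification: assume you are early in line and recover your deposit
    best = "withdraw (join the run)" if withdraw > wait else "wait"
    print(f"expecting {expected_runners:>2} others to run: "
          f"waiting pays {wait:.2f}, withdrawing pays {withdraw:.2f} -> {best}")
```

With few expected runners, waiting pays more than withdrawing and no run occurs; once enough withdrawals are expected, fire-sale losses make joining the run the rational choice for everyone, which is the pair of equilibria the laureates formalised.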
Javier E

Peter Higgs, physicist who discovered Higgs boson, dies aged 94 | Peter Higgs | The Gua... - 0 views

  • Peter Higgs, the Nobel prize-winning physicist who discovered a new particle known as the Higgs boson, has died. Higgs, 94, was awarded the Nobel prize for physics in 2013 for his work in 1964 showing how the boson helped bind the universe together by giving particles their mass.
  • “A giant of particle physics has left us,” Ellis told the Guardian. “Without his theory, atoms could not exist and radioactivity would be a force as strong as electricity and magnetism.
  • “His prediction of the existence of the particle that bears his name was a deep insight, and its discovery at Cern in 2012 was a crowning moment that confirmed his understanding of the way the Universe works.”
  • The particle that carries his name is perhaps the single most stunning example of how seemingly abstract mathematical ideas can make predictions which turn out to have huge physical consequences.”
  • The Royal Swedish Academy of Sciences, which awards the Nobel, said at the time the standard model of physics which underpins the scientific understanding of the universe “rests on the existence of a special kind of particle: the Higgs particle. This particle originates from an invisible field that fills up all space. Even when the universe seems empty this field is there. Without it, we would not exist, because it is from contact with the field that particles acquire mass. The theory proposed by Englert and Higgs describes this process.”
Emily Horwitz

UK, Japan scientists win Nobel for stem cell breakthroughs | Reuters - 0 views

  • Scientists from Britain and Japan shared a Nobel Prize on Monday for the discovery that adult cells can be transformed back into embryo-like stem cells that may one day regrow tissue in damaged brains, hearts or other organs.
  • discovered ways to create tissue that would act like embryonic cells, without the need to harvest embryos.
  • "These groundbreaking discoveries have completely changed our view of the development and specialization of cells," the Nobel Assembly at Stockholm's Karolinska Institute said.
  • big hope for stem cells is that they can be used to replace damaged tissue in everything from spinal cord injuries to Parkinson's disease.
  • Scientists once thought it was impossible to turn adult tissue back into stem cells, which meant that new stem cells could only be created by harvesting embryos - a practice that raised ethical qualms in some countries and also means that implanted cells might be rejected by the body.
  • The new stem cells are known as "induced pluripotency stem cells", or iPS cells.
  • "We would like to be able to find a way of obtaining spare heart or brain cells from skin or blood cells. The important point is that the replacement cells need to be from the same individual, to avoid problems of rejection and hence of the need for immunosuppression."
  • Thomas Perlmann, Nobel Committee member and professor of Molecular Development Biology at the Karolinska Institute said: "Thanks to these two scientists, we know now that development is not strictly a one-way street."
  • "You can't take out a large part of the heart or the brain or so to study this, but now you can take a cell from for example the skin of the patient, reprogram it, return it to a pluripotent state, and then grow it in a laboratory," he said.
Javier E

The Danger of Making Science Political - Puneet Opal - The Atlantic - 0 views

  • there seems to be a growing gulf between U.S. Republicans and science. Indeed, by some polls only 6 percent of scientists are Republican, and in the recent U.S. presidential election, 68 science Nobel Prize winners endorsed the Democratic nominee Barack Obama over the Republican candidate Mitt Romney.
  • What are the reasons for this apparent tilt?
  • he backs up his statement by suggesting a precedent: the social sciences, he feels, have already received this treatment at the hands of conservatives in government, who have pointed fingers at their funding.
  • Moreover, when scientists attempt to offer their expert knowledge for policy decisions, conservatives will choose to ignore the evidence, claiming a liberal bias.
  • most of the bad news is the potential impact on scientists. Why? Because scientists, he believes -- once perceived by Republicans to be a Democratic interest group -- will lose bipartisan support for federal science funding.
  • this sort of thinking might well be bad for scientists, but is simply dangerous for the country. As professionals, scientists should not be put into a subservient place by politicians and ideologues. They should never be made to feel that their advice might well be attached to carrots or sticks.
  • Political choices can be made after the evidence is presented, but the evidence should stand for what it is. If the evidence itself is rejected by politicians -- as is currently going on -- then the ignorance of the political class should indeed be exposed, and all threats resisted.
  • This might seem to be a diatribe against conservatives. But really this criticism is aimed at all unscientific thinking.
  • there are a number on the left who have their own dogmatic beliefs; the most notable are unscientific theories with regard to the dangers of vaccinations, genetically modified produce, or nuclear energy.
Javier E

Wine-tasting: it's junk science | Life and style | The Observer - 0 views

  • Hodgson approached the organisers of the California State Fair wine competition, the oldest contest of its kind in North America, and proposed an experiment for their annual June tasting sessions. Each panel of four judges would be presented with their usual "flight" of samples to sniff, sip and slurp. But some wines would be presented to the panel three times, poured from the same bottle each time. The results would be compiled and analysed to see whether wine tasting really is scientific.
  • Results from the first four years of the experiment, published in the Journal of Wine Economics, showed a typical judge's scores varied by plus or minus four points over the three blind tastings. A wine deemed to be a good 90 would be rated as an acceptable 86 by the same judge minutes later and then an excellent 94.
  • Hodgson's findings have stunned the wine industry. Over the years he has shown again and again that even trained, professional palates are terrible at judging wine. "The results are disturbing," says Hodgson from the Fieldbrook Winery in Humboldt County, described by its owner as a rural paradise. "Only about 10% of judges are consistent and those judges who were consistent one year were ordinary the next year." "Chance has a great deal to do with the awards that wines win."
  • French academic Frédéric Brochet tested the effect of labels in 2001. He presented the same Bordeaux superior wine to 57 volunteers a week apart and in two different bottles – one for a table wine, the other for a grand cru. The tasters were fooled. When tasting a supposedly superior wine, their language was more positive – describing it as complex, balanced, long and woody. When the same wine was presented as plonk, the critics were more likely to use negatives such as weak, light and flat.
  • In 2011 Professor Richard Wiseman, a psychologist (and former professional magician) at Hertfordshire University, invited 578 people to comment on a range of red and white wines, varying from £3.49 for a claret to £30 for champagne, and tasted blind. People could tell the difference between wines under £5 and those above £10 only 53% of the time for whites and only 47% of the time for reds. Overall they would have been just as successful flipping a coin to guess.
  • Why are ordinary drinkers and the experts so poor at tasting blind? Part of the answer lies in the sheer complexity of wine. For a drink made by fermenting fruit juice, wine is a remarkably sophisticated chemical cocktail. Dr Bryce Rankine, an Australian wine scientist, identified 27 distinct organic acids in wine, 23 varieties of alcohol in addition to the common ethanol, more than 80 esters and aldehydes, 16 sugars, plus a long list of assorted vitamins and minerals that wouldn't look out of place on the ingredients list of a cereal pack. There are even harmless traces of lead and arsenic that come from the soil.
  • "People underestimate how clever the olfactory system is at detecting aromas and our brain is at interpreting them," says Hutchinson."The olfactory system has the complexity in terms of its protein receptors to detect all the different aromas, but the brain response isn't always up to it. But I'm a believer that everyone has the same equipment and it comes down to learning how to interpret it." Within eight tastings, most people can learn to detect and name a reasonable range of aromas in wine
  • People struggle with assessing wine because the brain's interpretation of aroma and bouquet is based on far more than the chemicals found in the drink. Temperature plays a big part. Volatiles in wine are more active when wine is warmer. Serve a New World chardonnay too cold and you'll only taste the overpowering oak. Serve a red too warm and the heady boozy qualities will be overpowering.
  • Colour affects our perceptions too. In 2001 Frédéric Brochet of the University of Bordeaux asked 54 wine experts to test two glasses of wine – one red, one white. Using the typical language of tasters, the panel described the red as "jammy" and commented on its crushed red fruit. The critics failed to spot that both wines were from the same bottle. The only difference was that one had been coloured red with a flavourless dye.
  • Other environmental factors play a role. A judge's palate is affected by what she or he had earlier, the time of day, their tiredness, their health – even the weather.
  • Robert Hodgson is determined to improve the quality of judging. He has developed a test that will determine whether a judge's assessment of a blind-tasted glass in a medal competition is better than chance. The research will be presented at a conference in Cape Town this year. But the early findings are not promising."So far I've yet to find someone who passes," he says.
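Hodgson's proposed test and Wiseman's coin-flip comparison both come down to the same statistical question: is a taster's hit rate distinguishable from chance? The Python sketch below is only an illustration, not Hodgson's actual procedure; the article reports the success percentages (53% and 47%) but not the number of trials, so the trial counts used here are assumptions.

```python
# Back-of-the-envelope "better than chance?" check for blind-tasting results.
# This is an illustrative exact binomial test, not Hodgson's own method.
# The trial counts are assumed; the article only reports the percentages.

from math import comb

def binomial_p_value(successes: int, trials: int, p: float = 0.5) -> float:
    """One-sided exact binomial p-value: P(X >= successes) under pure guessing."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(successes, trials + 1))

# Hypothetical numbers of blind tastings per category (not given in the article).
for label, trials, rate in [("white wines", 300, 0.53),
                            ("red wines", 300, 0.47)]:
    successes = round(trials * rate)
    p = binomial_p_value(successes, trials)
    verdict = "indistinguishable from coin-flipping" if p > 0.05 else "better than chance"
    print(f"{label}: {successes}/{trials} correct -> one-sided p = {p:.3f} ({verdict})")
```

With a few hundred trials, a 53% hit rate is statistically indistinguishable from guessing, which is the article's point; Hodgson's real test would of course use the judges' actual scores and sample sizes.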