
Javier E

Quantum Computing Advance Begins New Era, IBM Says - The New York Times - 0 views

  • While researchers at Google in 2019 claimed that they had achieved “quantum supremacy” — a task performed much more quickly on a quantum computer than a conventional one — IBM’s researchers say they have achieved something new and more useful, albeit more modestly named.
  • “We’re entering this phase of quantum computing that I call utility,” said Jay Gambetta, a vice president of IBM Quantum. “The era of utility.”
  • Present-day computers are called digital, or classical, because they deal with bits of information that are either 1 or 0, on or off. A quantum computer performs calculations on quantum bits, or qubits, that capture a more complex state of information. Just as a thought experiment by the physicist Erwin Schrödinger postulated that a cat could be in a quantum state that is both dead and alive, a qubit can be both 1 and 0 simultaneously.
  • That allows quantum computers to make many calculations in one pass, while digital ones have to perform each calculation separately. By speeding up computation, quantum computers could potentially solve big, complex problems in fields like chemistry and materials science that are out of reach today.
  • When Google researchers made their supremacy claim in 2019, they said their quantum computer performed a calculation in 3 minutes 20 seconds that would take about 10,000 years on a state-of-the-art conventional supercomputer.
  • The IBM researchers in the new study performed a different task, one that interests physicists. They used a quantum processor with 127 qubits to simulate the behavior of 127 atom-scale bar magnets — tiny enough to be governed by the spooky rules of quantum mechanics — in a magnetic field. That is a simple system known as the Ising model, which is often used to study magnetism.
  • This problem is too complex for a precise answer to be calculated even on the largest, fastest supercomputers.
  • On the quantum computer, the calculation took less than a thousandth of a second to complete. Each quantum calculation was unreliable — fluctuations of quantum noise inevitably intrude and induce errors — but each calculation was quick, so it could be performed repeatedly.
  • Indeed, for many of the calculations, additional noise was deliberately added, making the answers even more unreliable. But by varying the amount of noise, the researchers could tease out the specific characteristics of the noise and its effects at each step of the calculation. “We can amplify the noise very precisely, and then we can rerun that same circuit,” said Abhinav Kandala, the manager of quantum capabilities and demonstrations at IBM Quantum and an author of the Nature paper. “And once we have results of these different noise levels, we can extrapolate back to what the result would have been in the absence of noise.” In essence, the researchers were able to subtract the effects of noise from the unreliable quantum calculations, a process they call error mitigation.
  • Altogether, the computer performed the calculation 600,000 times, converging on an answer for the overall magnetization produced by the 127 bar magnets.
  • Although an Ising model with 127 bar magnets is too big, with far too many possible configurations, to fit in a conventional computer, classical algorithms can produce approximate answers, a technique similar to how compression in JPEG images throws away less crucial data to reduce the size of the file while preserving most of the image’s details.
  • Certain configurations of the Ising model can be solved exactly, and both the classical and quantum algorithms agreed on the simpler examples. For more complex but solvable instances, the quantum and classical algorithms produced different answers, and it was the quantum one that was correct.
  • Thus, for other cases where the quantum and classical calculations diverged and no exact solutions are known, “there is reason to believe that the quantum result is more accurate.”
  • Mr. Anand is currently trying to add a version of error mitigation for the classical algorithm, and it is possible that could match or surpass the performance of the quantum calculations.
  • In the long run, quantum scientists expect that a different approach, error correction, will be able to detect and correct calculation mistakes, and that will open the door for quantum computers to speed ahead for many uses.
  • Error correction is already used in conventional computers and data transmission to fix garbled data. But for quantum computers, error correction is likely years away, requiring better processors able to process many more qubits.
  • “This is one of the simplest natural science problems that exists,” Dr. Gambetta said. “So it’s a good one to start with. But now the question is, how do you generalize it and go to more interesting natural science problems?”
  • Those might include figuring out the properties of exotic materials, accelerating drug discovery and modeling fusion reactions.
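The extrapolation step Dr. Kandala describes (run the same circuit at several deliberately amplified noise levels, then fit back to the zero-noise limit) can be sketched numerically. Everything below is a toy stand-in: the exponential decay model, the decay rate, and the shot count are invented for illustration, not taken from IBM's data or method.

```python
# Illustrative zero-noise extrapolation: measure a noisy expectation
# value at several deliberately amplified noise levels, then fit and
# extrapolate back to the zero-noise limit. The exponential decay model
# and all numbers here are hypothetical stand-ins, not IBM's results.
import numpy as np

TRUE_VALUE = 0.8          # the noiseless answer we pretend not to know
DECAY_RATE = 0.3          # hypothetical per-unit-noise damping

def noisy_measurement(noise_level, shots=100_000, rng=None):
    """Return a shot-averaged estimate of an observable damped by noise."""
    rng = rng or np.random.default_rng(0)
    mean = TRUE_VALUE * np.exp(-DECAY_RATE * noise_level)
    return rng.normal(mean, 1.0 / np.sqrt(shots))

# Run the same "circuit" at amplified noise levels (1x, 1.5x, 2x, 2.5x).
levels = np.array([1.0, 1.5, 2.0, 2.5])
rng = np.random.default_rng(42)
estimates = np.array([noisy_measurement(l, rng=rng) for l in levels])

# Fit log(E) = log(E0) - b * level, then extrapolate to level = 0.
slope, intercept = np.polyfit(levels, np.log(estimates), 1)
mitigated = np.exp(intercept)

print(f"raw estimate at 1x noise: {estimates[0]:.3f}")
print(f"zero-noise extrapolation: {mitigated:.3f} (true {TRUE_VALUE})")
```

Raising DECAY_RATE or shrinking the shot count shows how quickly the extrapolation degrades once the noise overwhelms the signal, which is why the technique works only while the amplified runs still carry usable information.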
Javier E

Microsoft Makes Bet Quantum Computing Is Next Breakthrough - NYTimes.com - 0 views

  • Conventional computing is based on a bit that can be either a 1 or a 0, representing a single value in a computation. But quantum computing is based on qubits, which simultaneously represent both zero and one values. If they are placed in an “entangled” state — physically separated but acting as though they are connected — with many other qubits, they can represent a vast number of values simultaneously.
  • In the approach that Microsoft is pursuing, which is described as “topological quantum computing,” precisely controlling the motions of pairs of subatomic particles as they wind around one another would manipulate entangled quantum bits.
  • By weaving the particles around one another, topological quantum computers would generate imaginary threads whose knots and twists would create a powerful computing system. Most important, the mathematics of their motions would correct errors that have so far proved to be the most daunting challenge facing quantum computer designers.
  • Microsoft’s topological approach is generally perceived by scientists as the riskiest, because the type of exotic anyon particle needed to generate qubits has not been definitively proved to exist.
  • Microsoft began supporting the effort after Dr. Freedman, who has won both the Fields Medal and a MacArthur Fellowship and is widely known for his work in the mathematical field of topology, approached Craig Mundie, one of Microsoft’s top executives, and convinced him there was a new path to quantum computing based on ideas in topology originally proposed in 1997 by the physicist Alexei Kitaev.
  • Mr. Mundie said the idea struck him as the kind of gamble the company should be pursuing. “It’s hard to find things that you could say, I know that’s a 20-year problem and would be worth doing,” he said. “But this one struck me as being in that category.”
  • For some time, many thought quantum computers were useful only for factoring huge numbers — good for N.S.A. code breakers but few others. But new algorithms for quantum machines have begun to emerge in areas as varied as searching large amounts of data or modeling drugs. Now many scientists believe that quantum computers could tackle new kinds of problems that have yet to be defined.
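The “vast number of values” that entangled qubits can represent comes down to arithmetic: describing an n-qubit state classically takes 2^n complex amplitudes. A quick back-of-envelope check (the 16-bytes-per-amplitude figure assumes double-precision complex numbers, a common but not universal choice):

```python
# An n-qubit state vector holds 2**n complex amplitudes, so the
# classical memory needed to store it doubles with every added qubit.
# 16 bytes per amplitude assumes one complex number = two 64-bit floats.
BYTES_PER_AMPLITUDE = 16

def state_vector_bytes(n_qubits: int) -> int:
    """Memory required to store a full n-qubit state vector."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (10, 30, 50):
    print(f"{n} qubits -> {state_vector_bytes(n):.3e} bytes")
```

Past roughly 50 qubits the full state vector outstrips the memory of any classical machine, which is the arithmetic behind the claim that entangled qubits can represent a vast number of values simultaneously.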
Javier E

Finding the Higgs Leads to More Puzzles - NYTimes.com - 0 views

  • Taken at face value, the result implies that eventually (in 10^100 years or so) an unlucky quantum fluctuation will produce a bubble of a different vacuum, which will then expand at the speed of light, destroying everything.”
  • The idea is that the Higgs field could someday twitch and drop to a lower energy state, like water freezing into ice, thereby obliterating the workings of reality as we know it. Naturally, we would have no warning. Just blink and it’s over.
  • You might think that finding the Higgs boson, after 50 years and $10 billion or so, would bring clarity to physics and to the cosmos. But just the opposite is true: they may have found the Higgs boson, but they don’t understand it.
  • they don’t understand why it weighs what it does — it is about 125 times as massive as the protons that were collided to make it, not gazillions of times as heavy, as standard quantum mechanical calculations would suggest.
  • For years the preferred solution to this conundrum has been a theory called supersymmetry, which, among other things, predicted the existence of a whole new spectrum of particles, superpartners of the ones we already know, that would cancel out the quantum calculations and keep the Higgs light. One of these particles might also be the dark matter that makes up a quarter of the universe by weight.
  • experiments at CERN’s Large Hadron Collider have already eliminated the simplest versions of supersymmetry.
  • The most talked-about alternative to supersymmetry is the idea of the multiverse, an almost infinite ensemble of universes in which the value of the Higgs — as well as many other crucial parameters — is random. We just happen to live in the one in which the conditions and parameters are fit for us. This is a notion that flows naturally from string theory and modern theories of the Big Bang, but accepting multiple universes means giving up the Einsteinian dream of a single explanation for the cosmos, a painful concession.
  • “Physical science has historically progressed not only by finding precise explanations of natural phenomena, but also by discovering what sorts of things can be precisely explained. These may be fewer than we had thought.”
Javier E

What Oppenheimer really knew about an atomic bomb ending the world - The Washington Post - 0 views

  • In a chilling, existential, bizarrely comic moment, the new movie “Oppenheimer” revives an old question: Did Manhattan Project scientists think there was even a minute possibility that detonating the first atomic bomb on the remote plains of New Mexico could destroy the world?
  • physicists knew it wouldn’t, long before the Trinity test on July 16, 1945, at the Alamogordo Bombing Range, about 210 miles south of the secret Los Alamos, N.M., laboratory.
  • “This thing has been blown out of proportion over the years,” said Richard Rhodes, author of the Pulitzer Prize-winning book “The Making of the Atomic Bomb.” The question on the scientists’ minds before the test, he said, “wasn’t, ‘Is it going to blow up the world?’ It was, ‘Is it going to work at all?’”
  • In the movie, one scene has J. Robert Oppenheimer, director of the laboratory, seeking to reassure his boss, Gen. Leslie Groves, on the eve of the test. Upon investigation, Oppenheimer tells him, physicists have concluded that the chances the test detonation will destroy the world are “near zero.” Realizing the news has alarmed, not reassured, the general, Oppenheimer asks, “What do you want from theory alone?” “Zero would be nice,” the general replies.
  • no physicists or historians interviewed for this story recalled coming across any mention of such a conversation between Oppenheimer and the general in the historical record.
  • Still, the discussions and calculations persisted long after the Trinity test. In 1946, three Manhattan Project scientists, including Teller, who would later become known as the father of the hydrogen bomb, wrote a report concluding that the explosive force of the first atomic bomb wasn’t even close to what would be required to trigger a planet-destroying chain reaction in air. The report was not declassified until 1973.
  • At a conference in the summer of 1942, almost a full year before Los Alamos opened, physicist Edward Teller raised the possibility of atomic bombs igniting Earth’s oceans or atmosphere. According to Rhodes’s account, Hans Bethe, who headed the theoretical division at Los Alamos, “didn’t believe it from the first minute” but nonetheless performed the calculations convincing the other physicists that such a disaster was not a reasonable possibility.
  • “I don’t think any physicists seriously worried about it,” said John Preskill, a professor of theoretical physics at California Institute of Technology.
  • “Did the actual exchange happen at that moment? No, I don’t think so,” said Alex Wellerstein, an associate professor at Stevens Institute of Technology in Hoboken, N.J., and author of the 2021 book, “Restricted Data: The History of Nuclear Secrecy in the United States.” “But were there discussions like that? I believe so,” he added.
  • A 1979 study by scientists at the University of California’s Lawrence Livermore Laboratory examined the question of whether a nuclear explosion might trigger a runaway reaction in the atmosphere or oceans. In page after page of mathematical equations, the scientists described a complex set of factors that made atmospheric ignition effectively impossible.
  • As outlandish as the notion was to many scientists, the nuclear research organization CERN felt obliged to deal with the fear, noting on its website that “some theories suggest that the formation of tiny ‘quantum’ black holes may be possible. The observation of such an event would be thrilling in terms of our understanding of the Universe; and would be perfectly safe.”
  • Dudley’s essay also recounted a story that on the day of the test, “as zero hour approached” Gen. Groves was annoyed to find Manhattan Project physicist and Nobel Prize winner Enrico Fermi making bets with colleagues about whether the bomb would ignite the atmosphere, “and, if so, whether it would destroy only New Mexico ― or the entire world.” (Some experts have suggested Fermi’s actions may have been more of a joke, or an example of gallows humor.)
  • Fascination with this doomsday scenario may stem, at least in part, from a misunderstanding of what physicists mean when they say “near zero.” The branch of physics known as quantum mechanics, which deals with matter and light at the atomic and subatomic scale, does not rule out any possibilities.
  • For example, if a boy tosses a rubber ball at a brick wall, there is an exceedingly remote — but still valid — possibility that instead of watching the ball bounce back, he could see it pass through the wall.
  • Aditi Verma, an assistant professor of nuclear engineering and radiological sciences at the University of Michigan, put it this way: “What a physicist means by ‘near zero’ would be zero to an engineer.”
  • In the 2000s, scientists encountered a similar problem of terminology as they prepared to generate high-speed particle collisions at the Large Hadron Collider in Geneva. Talk surfaced that the activity might generate a black hole that would devour Earth.
  • Probably the easiest to grasp is the fact that, even under the harshest scenarios, far more energy would be lost in the explosion than gained, wiping out any chance to sustain a chain reaction.
  • In other words, any black hole created by the collider would be far too small to pose any risk to the planet. Scientists say such disaster scenarios are sometimes the price of crossing new thresholds of discovery.
  • “You don’t often talk in certainties,” he said. “You talk in probabilities. If you haven’t done the experiment, you are hesitant to say ‘This is impossible. It will never happen.’ … It was good to think it through.”
  • Rhodes added that he hopes the “Oppenheimer” movie will not lead people to doubt the scientists on the Manhattan Project.“They knew what they were doing,” he said. “They were not feeling around in the dark.”
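The “near zero” of quantum mechanics in the rubber-ball example can be put in numbers with a rough WKB-style tunneling estimate. The ball's mass, the barrier energy, and the wall thickness below are invented purely for scale; only the order of magnitude of the answer matters.

```python
# A WKB-style order-of-magnitude estimate of a macroscopic object
# tunneling through a barrier, illustrating what "near zero" means.
# All inputs (ball mass, barrier energy, wall width) are invented.
import math

HBAR = 1.0545718e-34      # reduced Planck constant, J*s
mass = 0.05               # kg, a rubber ball
barrier_energy = 1.0      # J, hypothetical energy deficit at the wall
width = 0.1               # m, brick-wall thickness

# Tunneling probability ~ exp(-2*kappa*width), kappa = sqrt(2*m*V)/hbar
kappa = math.sqrt(2 * mass * barrier_energy) / HBAR
exponent = 2 * kappa * width

print(f"tunneling probability ~ exp(-{exponent:.2e})")
```

The exponent comes out around 10^32, so the probability is exp(-10^32): mathematically nonzero, which is all a physicist's “near zero” means, and exactly zero for every engineering purpose, just as Dr. Verma says.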
Javier E

Planck Satellite Shows Image of Infant Universe - NYTimes.com - 0 views

  • Recorded by the European Space Agency’s Planck satellite, the image is a heat map of the cosmos as it appeared only 370,000 years after the Big Bang, showing space speckled with faint spots from which galaxies would grow over billions of years.
  • is in stunning agreement with the general view of the universe that has emerged over the past 20 years, of a cosmos dominated by mysterious dark energy that seems to be pushing space apart and the almost-as-mysterious dark matter that is pulling galaxies together. It also shows a universe that seems to have endured an explosive burp known as inflation, which was the dynamite in the Big Bang.
  • “The extraordinary quality of Planck’s portrait of the infant universe allows us to peel back its layers to the very foundations, revealing that our blueprint of the cosmos is far from complete.”
  • Analyzing the relative sizes and frequencies of spots and ripples over the years has allowed astronomers to describe the birth of the universe to a precision that would make the philosophers weep. The new data have allowed astronomers to tweak their model a bit. It now seems the universe is 13.8 billion years old, instead of 13.7 billion, and consists by mass of 4.9 percent ordinary matter like atoms, 27 percent dark matter and 68 percent dark energy.
  • “Our ultimate goal would be to construct a new model that predicts the anomalies and links them together. But these are early days; so far, we don’t know whether this is possible and what type of new physics might be needed. And that’s exciting.”
  • The microwaves detected by the Planck satellite date from 370,000 years after the Big Bang, which is as far back as optical or radio telescopes will ever be able to see, cosmologists say. But the patterns within them date from less than a trillionth of a second after the Big Bang, when the universe is said to have undergone a violent burst of expansion known as inflation that set cosmic history on the course it has followed ever since. Those patterns are Planck’s prize.
  • Within the standard cosmological framework, however, the new satellite data underscored the existence of puzzling anomalies that may yet lead theorists back to the drawing board. The universe appears to be slightly lumpier, with bigger and more hot and cold spots in the northern half of the sky as seen from Earth than toward the south, for example. And there is a large, unexplained cool spot in the northern hemisphere.
  • The biggest surprise here, astronomers said, is that the universe is expanding slightly more slowly than previous measurements had indicated. The Hubble constant, which characterizes the expansion rate, is 67 kilometers per second per megaparsec — in the units astronomers use — according to Planck. Recent ground-based measurements combined with the WMAP data gave a value of 69, offering enough of a discrepancy to make cosmologists rerun their computer simulations of cosmic history.
  • a Planck team member from the University of California, Berkeley, said it represents a mismatch between measurements made of the beginning of time and those made more recently, and that it could mean that dark energy, which is speeding up the expansion of the universe, is more complicated than cosmologists thought. He termed the possibility “pretty radical,” adding, “That would be pretty exciting.”
  • The data also offered striking support for the notion of inflation, which has been the backbone of Big Bang theorizing for 30 years. Under the influence of a mysterious force field during the first trillionth of a fraction of a second, what would become the observable universe ballooned by 100 trillion trillion times in size from a subatomic pinprick to a grapefruit in less than a violent eye-blink, so the story first enunciated by Alan Guth of M.I.T. goes.
  • Submicroscopic quantum fluctuations in this force field are what would produce the hot spots in the cosmic microwaves, which in turn would grow into galaxies. According to Planck’s measurements, those fluctuations so far fit the predictions of the simplest model of inflation, invented by Andrei Linde of Stanford, to a T. Dr. Tegmark of M.I.T. said, “We’re homing in on the simplest model.”
  • Cosmologists still do not know what might have caused inflation, but the recent discovery of the Higgs boson has provided evidence that the kinds of fields that can provoke such behavior really exist.
  • another clue to the nature of inflation could come from the anomalies in the microwave data — the lopsided bumpiness, for example — that tend to happen on the largest scales in the universe. By the logic of quantum cosmology, they were the first patterns to be laid down on the emerging cosmos; that is to say, when inflation was just starting.
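The inflation figures quoted above can be sanity-checked: “100 trillion trillion times” is a factor of 10^26, so ending at grapefruit size implies a starting size around 10^-27 meters. The starting size here is back-solved from the article's own numbers, not a value from the Planck results.

```python
# Sanity check of the inflation scale factor quoted in the article:
# "100 trillion trillion times" = 1e26. Growing to grapefruit size
# (~0.1 m) then implies a starting size of ~1e-27 m. The starting
# size is back-solved from the article's numbers, not measured.
expansion_factor = 100 * 1e12 * 1e12   # 100 trillion trillion
grapefruit_m = 0.1                     # rough grapefruit diameter

start_size_m = grapefruit_m / expansion_factor
print(f"implied starting size: {start_size_m:.0e} m")
```

The implied 10^-27 meters is about a trillion times smaller than a proton (roughly 10^-15 m), consistent with the article's “subatomic pinprick.”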
Javier E

Physicists Anxiously Await News of the 'God Particle' - NYTimes.com - 0 views

  • At 8 a.m. Eastern time on Tuesday morning, scientists from CERN, the European Organization for Nuclear Research, are scheduled to give a progress report on the search for the Higgs boson — infamously known as the “God particle” — whose discovery would vindicate the modern theory of how elementary particles get mass.
  • no one thinks the Higgs is the final word about what underlies the Standard Model of particle physics, the theory that describes the most basic elements of matter and the forces through which they interact. Even if the Higgs boson is discovered, the question will still remain of why masses are what they are.
  • According to quantum field theory — the theory that combines quantum mechanics and special relativity — masses would be expected to be ten thousand trillion times bigger. Without some deeper ingredient, a fudge of that size would be required to make it all hang together. No particle physicist believes that.
  • We all expect a richer theory underlying the Standard Model. That’s one reason the mass matters to us. Some theories only accommodate a particular range of masses. Knowing the mass will give us insight into what that deeper underlying theory is.
  • The other possibility is that the answer is not the simple, fundamental particle that the Large Hadron Collider currently is looking for. It could be a more complicated object, or part of a more complex sector, that would take longer.
Javier E

'Oppenheimer,' 'The Maniac' and Our Terrifying Prometheus Moment - The New York Times - 0 views

  • Prometheus was the Titan who stole fire from the gods of Olympus and gave it to human beings, setting us on a path of glory and disaster and incurring the jealous wrath of Zeus. In the modern world, especially since the beginning of the Industrial Revolution, he has served as a symbol of progress and peril, an avatar of both the liberating power of knowledge and the dangers of technological overreach.
  • The consequences are real enough, of course. The bombs dropped on Hiroshima and Nagasaki killed at least 100,000 people. Their successor weapons, which Oppenheimer opposed, threatened to kill everybody else.
  • Annie Dorsen’s theater piece “Prometheus Firebringer,” which was performed at Theater for a New Audience in September, updates the Greek myth for the age of artificial intelligence, using A.I. to weave a cautionary tale that my colleague Laura Collins-Hughes called “forcefully beneficial as an examination of our obeisance to technology.”
  • Something similar might be said about “The Maniac,” Benjamín Labatut’s new novel, whose designated Prometheus is the Hungarian-born polymath John von Neumann, a pioneer of A.I. as well as an originator of game theory.
  • both narratives are grounded in fact, using the lives and ideas of real people as fodder for allegory and attempting to write a new mythology of the modern world.
  • Von Neumann and Oppenheimer were close contemporaries, born a year apart to prosperous, assimilated Jewish families in Budapest and New York. Von Neumann, conversant in theoretical physics, mathematics and analytic philosophy, worked for Oppenheimer at Los Alamos during the Manhattan Project. He spent most of his career at the Institute for Advanced Study, where Oppenheimer served as director after the war.
  • More than most intellectual bastions, the institute is a house of theory. The Promethean mad scientists of the 19th century were creatures of the laboratory, tinkering away at their infernal machines and homemade monsters. Their 20th-century counterparts were more likely to be found at the chalkboard, scratching out our future in charts, equations and lines of code.
  • MANIAC. The name was an acronym for “Mathematical Analyzer, Numerical Integrator and Computer,” which doesn’t sound like much of a threat. But von Neumann saw no limit to its potential. “If you tell me precisely what it is a machine cannot do,” he declared, “then I can always make a machine which will do just that.” MANIAC didn’t just represent a powerful new kind of machine, but “a new type of life.”
  • More than 200 years after the Shelleys, Prometheus is having another moment, one closer in spirit to Mary’s terrifying ambivalence than to Percy’s fulsome gratitude. As technological optimism curdles in the face of cyber-capitalist villainy, climate disaster and what even some of its proponents warn is the existential threat of A.I., that ancient fire looks less like an ember of divine ingenuity than the start of a conflagration. Prometheus is what we call our capacity for self-destruction.
  • Oppenheimer wasn’t a principal author of that theory. Those scientists, among them Niels Bohr, Erwin Schrödinger and Werner Heisenberg, were characters in Labatut’s previous novel, “When We Cease to Understand the World.” That book provides harrowing illumination of a zone where scientific insight becomes indistinguishable from madness or, perhaps, divine inspiration. The basic truths of the new science seem to explode all common sense: A particle is also a wave; one thing can be in many places at once; “scientific method and its object could no longer be prised apart.”
  • Oppenheimer’s designation as Prometheus is precise. He snatched a spark of quantum insight from those divinities and handed it to Harry S. Truman and the U.S. Army Air Forces.
  • Labatut’s account of von Neumann is, if anything, more unsettling than “Oppenheimer.” We had decades to get used to the specter of nuclear annihilation, and since the end of the Cold War it has been overshadowed by other terrors. A.I., on the other hand, seems newly sprung from science fiction, and especially terrifying because we can’t quite grasp what it will become.
  • Von Neumann, who died in 1957, did not teach machines to play Go. But when asked “what it would take for a computer, or some other mechanical entity, to begin to think and behave like a human being,” he replied that “it would have to play, like a child.”
  • the intellectual drama of “Oppenheimer” — as distinct from the dramas of his personal life and his political fate — is about how abstraction becomes reality. The atomic bomb may be, for the soldiers and politicians, a powerful strategic tool in war and diplomacy. For the scientists, it’s something else: a proof of concept, a concrete manifestation of quantum theory.
  • If Oppenheimer took hold of the sacred fire of atomic power, von Neumann’s theft was bolder and perhaps more insidious: He stole a piece of the human essence. He’s not only a modern Prometheus; he’s a second Frankenstein, creator of an all but human, potentially more than human monster.
  • “Technological power as such is always an ambivalent achievement,” Labatut’s von Neumann writes toward the end of his life, “and science is neutral all through, providing only means of control applicable to any purpose, and indifferent to all. It is not the particularly perverse destructiveness of one specific invention that creates danger. The danger is intrinsic. For progress there is no cure.”
Javier E

Fiber Optic Breakthrough to Improve Internet Security Cheaply - NYTimes.com - 1 views

  • Despite their ability to carry prodigious amounts of data, fiber-optic cables are also highly insecure. An eavesdropper needs only to bend a cable and expose the fiber, Dr. Shields said. It is then possible to capture light that leaks from the cable and convert it into digital ones and zeros. “The laws of quantum physics tell us that if someone tries to measure those single photons, that measurement disturbs their state and it causes errors in the information carried by the single photon,” he said. “By measuring the error rate in the secret key, we can determine whether there has been any eavesdropping in the fiber and in that way directly test the secrecy of each key.”
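The eavesdropping test Dr. Shields describes can be modeled with a toy BB84-style simulation: an intercept-resend attacker who guesses the photon's basis wrong disturbs its state, pushing the error rate in the sifted key to roughly 25 percent, the textbook figure. This is a simplified sketch (honest channels are modeled as error-free), not the actual system described in the article.

```python
# Toy model of the eavesdropping test described above. In BB84-style
# quantum key distribution, an intercept-resend attacker who guesses
# the photon's basis wrong disturbs its state, so roughly 25% of the
# sifted key bits flip. Honest channels are taken as error-free here;
# this is a textbook simplification, not the system in the article.
import random

def sifted_key_error_rate(n_photons, eavesdrop, rng):
    errors = kept = 0
    for _ in range(n_photons):
        bit = rng.randint(0, 1)
        send_basis = rng.randint(0, 1)
        recv_basis = rng.randint(0, 1)
        received = bit
        if eavesdrop:
            eve_basis = rng.randint(0, 1)
            if eve_basis != send_basis:
                # Wrong-basis measurement collapses the state: the bit
                # the receiver later reads becomes a coin flip.
                received = rng.randint(0, 1)
        if recv_basis == send_basis:      # sifting: keep matched bases
            kept += 1
            errors += received != bit
    return errors / kept

rng = random.Random(7)
print("no eavesdropper :", sifted_key_error_rate(20_000, False, rng))
print("intercept-resend:", round(sifted_key_error_rate(20_000, True, rng), 3))
```

Comparing the measured error rate against a threshold (well below 25 percent) is exactly the “directly test the secrecy of each key” step in the quote: a clean channel yields near-zero errors, while an intercept-resend attack cannot avoid announcing itself.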
Javier E

All Signs Point to Higgs Boson, but Still Waiting for Scientific Certainty - NYTimes.com - 1 views

  • physicists admit that it will take more work and analysis before they will have the cold numbers that clinch the case that the new particle announced on July 4 last year is in fact the exact boson first predicted by Peter Higgs and others in 1964 to be the arbiter of mass and cosmic diversity
  • What happened in the first instant of the Big Bang? What happens at the middle of a black hole where matter and time blink in or out of existence? What is the dark matter whose gravitational influence, astronomers say, shapes the structures of galaxies, or the dark energy that is forcing the universe apart? Why is the universe full of matter but not antimatter? And what, finally, is the fate of the universe? These are all questions that the Standard Model, the vanilla-sounding set of equations that ruled physics for the last half century, does not answer
  • Some of them could be answered by the unproven theory called supersymmetry, which among other things is needed to explain why whatever mass the Higgs has is low enough to be discovered in the first place and not almost infinite. It predicts a whole new population of elementary particles — called superpartners to the particles physicists already know about — one of which could be the dark matter that pervades the universe. If such particles exist, they would affect the rate at which Higgs bosons decay into other particles, but the CERN teams have yet to record what they consider a convincing deviation from the Standard Model predictions for those decays. Supersymmetry is still at best a beautiful idea.
  • One thing that has hampered progress is that physicists still do not agree on how much the new particle weighs.
  • What does it matter how much a Higgs boson weighs? It could determine the fate of the universe.
  • his colleagues ran the numbers and concluded that the universe was in a precarious condition and could be prone to collapse in the far, far future. The reason lies in the Higgs field, the medium of which the Higgs boson is the messenger and which determines the structure of empty space, i.e., the vacuum.
  • It works like this. The Higgs field, like everything else in nature, is lazy, and, like water running downhill, always seeks to be in the state of lowest energy. Physicists assume that the Higgs field today is in the lowest state possible, but Dr. Giudice found that was not the case. What counts as rock bottom in today’s universe could turn out to be just a plateau. Our universe is like a rock perched precariously on a mountaintop, he explained, in what physicists call a metastable state. The Higgs field could drop to a lower value by a process known as quantum tunneling, although it is not imminent.
  • If that should happen — tomorrow or billions of years from now — a bubble would sweep out through the universe at the speed of light, obliterating the laws of nature as we know them.
  • The calculations assume that the Standard Model is the final word in physics, good for all times and places and energies — something that no physicist really believes. Theories like supersymmetry or string theory could intercede at higher energies and change the outcome.
  • The calculations also depend crucially on the mass of the top quark, the heaviest known elementary particle, as well as the Higgs, neither of which has been weighed precisely enough yet to determine the fate of the universe. If the top quark were just a little lighter or the Higgs a little heavier, at 130 billion electron volts, Dr. Giudice said, the vacuum would in fact be stable.
  • “Why do we happen to live at the edge of collapse?” Dr. Giudice went on, “In my view, the message about near-criticality of the universe is the most important thing we have learned from the discovery of the Higgs boson so far.” Guido Tonelli, of CERN and the University of Pisa, said, “If true, it is somehow magic.” We wouldn’t be having this discussion, he said, if there hadn’t been enough time already for this universe to produce galaxies, stars, planets and “human beings who are attempting to produce a vision of the world.”
  • “So, in some sense, we are here, because we have been lucky, because for this particular universe the lottery produced a certain set of numbers, which allow the universe to have an evolution, which is very long.”
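
The “rock perched on a mountaintop” picture in the excerpts above has a standard schematic form. The toy potential below is illustrative only, not the actual Standard Model effective potential; λ, v and ε are invented parameters, and the decay-rate formula is the generic semiclassical (bounce) result for any metastable field:

```latex
% Tilted double well: for small \epsilon > 0 the minimum near \phi = +v
% sits higher than the one near \phi = -v, so +v is a metastable
% ("false") vacuum and -v the true vacuum.
V(\phi) = \frac{\lambda}{4}\left(\phi^{2} - v^{2}\right)^{2} + \epsilon\,\phi

% Semiclassical vacuum-decay rate per unit volume (Coleman's bounce):
% exponentially suppressed by the Euclidean action S_E of the bounce
% solution, which is why the decay "is not imminent".
\frac{\Gamma}{\mathrm{Vol}} \sim A\, e^{-S_E/\hbar}
```

The exponential suppression is the quantitative content of the claim that tunneling could happen “tomorrow or billions of years from now”: small changes in the potential’s shape (set, in the real calculation, by the top-quark and Higgs masses) move S_E enormously.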
Javier E

The Truth About Harvard - Magazine - The Atlantic - 0 views

  • the professor is not just a disinterested pedagogue. As a dispenser of grades he is a gatekeeper to worldly success. And in that capacity professors face upward pressure from students ("I can't afford a B if I want to get into law school"); horizontal pressure from their colleagues, to which even Mansfield gave way; downward pressure from the administration ("If you want to fail someone, you have to be prepared for a very long, painful battle with the higher echelons," one professor told the Crimson); and perhaps pressure from within, from the part of them that sympathizes with students' careerism.
  • Not every class was so easy. Those that were tended to be in history and English, classics and foreign languages, art and philosophy—in other words, in those departments that provide what used to be considered the meat of a liberal arts education. Humanities students generally did the least work, got the highest grades, and cruised academically.
  • the libertarian philosopher Robert Nozick once hypothesized that most professors oppose capitalism because they consider themselves far smarter than boobish businessmen, and therefore resent the economic system that rewards practical intelligence over their own gifts. I'm inclined to think that such resentment—at least in money-drunk America—increasingly coexists with a deep inferiority complex regarding modern capitalism, and a need, however unconscious, to justify academic life in the face of the fantastic accumulation of wealth that takes place outside the ivory tower.
  • some areas of academic life aren't vulnerable to this crisis of confidence in the importance of one's work. Scientists can rest secure in the knowledge that their labors will help shove along the modern project of advancing health—and wealth.
  • Then there is economics, the new queen of the sciences—a discipline perfectly tailored to the modern market-driven university, and not coincidentally the most popular concentration
  • The humanities have no such reservoirs of confidence. And attempts by humanities professors to ape the rigor of their scientific colleagues have led to a decades-long wade in the marshes of postmodern academic theory, where canons are scorned, books exist only as texts to be deconstructed, and willfully obscure writing is championed over accessible prose. All this has merely reinforced capitalism's insistence that the sciences are the only important academic pursuits, because only they provide tangible, quantifiable (and potentially profitable) results. Far from making the humanities scientific, postmodernism has made them irrelevant.
  • The retreat into irrelevance is visible all across the humanities curriculum. Philosophy departments have largely purged themselves of metaphysicians and moralists; history departments emphasize exhaustive primary research and micro-history. In the field of English there is little pretense that literature is valuable in itself and should be part of every educated person's life, rather than serving as grist for endless academic debates
  • Sure, historians believe in their primary sources, English scholars in their textual debates, philosophers in their logic games. But many of them seem to believe that they have nothing to offer students who don't plan to be historians, or literary theorists, or philosophers. They make no effort to apply their work to what should be the most pressing task of undergraduate education: to provide a general education, a liberal arts education, to future doctors and bankers and lawyers and diplomats.
  • In this environment who can blame professors if, when it comes time to grade their students, they sometimes take the path of least resistance—the path of the gentleman's B-plus?
  • the Core's mission statement asserts, with a touch of smugness, that "the Core differs from other programs of general education. It does not define intellectual breadth as the mastery of a set of Great Books, or the digestion of a specific quantum of information … rather, the Core seeks to introduce students to the major approaches to knowledge in areas that the faculty considers indispensable to undergraduate education."
  • These words, which appear in the course catalogue each year, are the closest that Harvard comes to articulating an undergraduate educational philosophy. They suggest that the difference in importance between, say, "Democracy, Development, and Equality in Mexico" and "Reason and Faith in the West" (both offerings in Historical Study) does not matter. As the introduction to the history courses puts it, both courses offer a "historical" approach to knowledge that is presumably more valuable than mere "facts" about the past. Comprehending history "as a form of inquiry and understanding" trumps learning about actual events. The catalogue contains similarly pat introductions to the other disciplines. In each case the emphasis is squarely on methodology, not material.
  • The few Core classes that are well taught are swamped each year, no matter how obscure the subject matter. The closest thing to a Harvard education—that is, to an intellectual corpus that most Harvard graduates have in common—is probably obtained in such oversubscribed courses as "The Warren Court and the Pursuit of Justice," "First Nights: Five Performance Premieres," and "Fairy Tales, Children's Literature, and the Construction of Childhood."
  • As in a great library ravaged by a hurricane, the essential elements of a liberal arts education lie scattered everywhere at Harvard, waiting to be picked up. But little guidance is given on how to proceed with that task.
  • Harvard never attempted to answer that question—perhaps the most important question facing any incoming freshman. I chose my classes as much by accident as by design. There were times when some of them mattered to me, and even moments when I was intoxicated. But achieving those moments required pulling myself away from Harvard's other demands, whether social, extracurricular, or pre-professional, which took far more discipline than I was usually able to exert.
  • It was hard work to get into Harvard, and then it was hard work competing for offices and honors and extracurriculars with thousands of brilliant and driven young people; hard work keeping our heads in the swirling social world; hard work fighting for law-school slots and investment-banking jobs as college wound to a close … yes, all of that was heavy sledding. But the academics—the academics were another story.
  • What makes our age different is the moment that happened over and over again at Harvard, when we said This is going to be hard and then realized No, this is easy. Maybe it came when we boiled down a three-page syllabus to a hundred pages of exam-time reading, or saw that a paper could be turned in late without the frazzled teaching fellow's docking us, or handed in C-quality work and got a gleaming B-plus. Whenever the moment came, we learned that it wasn't our sloth alone, or our constant pushing for higher grades, that made Harvard easy. No, Harvard was easy because almost no one was pushing back.
Javier E

'Are the Clippers Really Worth $2 Billion?': You're Asking the Wrong Question - Derek T... - 0 views

  • Opportunities for extremely rich people to purchase quantum leaps in their reputation and renown are so rare—and their social, psychological, and emotional rewards so incalculable—that it's impossible to properly use terms like "worth" and "value" when you're looking at these sorts of numbers.
  • It's highly debatable that $2 billion for a basketball team is the best use of money for Los Angeles, or California, or the broader world. In the game of utilitarianism, malaria nets beat alley-oops every day. But it might be the best use of money for Steve Ballmer. As psychologists and economists have written exhaustively in the last few years, happiness is hard to buy, but if you're going to try to do it, buy experiences. Owning an ascendant NBA team in a glamorous city that's ready to hail you as the fabulously un-racist savior of their most exciting professional franchise? That's some experience. For $2 billion out of Steve Ballmer's deep pockets, it's practically a steal.
Javier E

So Bill Gates Has This Idea for a History Class ... - NYTimes.com - 0 views

  • Last month, the University of California system announced that a version of the Big History Project course could be counted in place of a more traditional World History class, paving the way for the state’s 1,300 high schools to offer it.
  • “We didn’t know when the last time was that somebody introduced a new course into high school,” Gates told me. “How does one go about it? What did the guy who liked biology — who did he call and say, ‘Hey, we should have biology in high school?’ It was pretty uncharted territory. But it was pretty cool.”
  • The American high school experience, at least as we now know it, is a relatively recent invention. Attendance did not start to become mandatory until the 1850s, and the notion of a nationwide standardized curriculum didn’t emerge until the turn of the century. But by the early 1900s, most children were taking the same list of classes that remains recognizable to this day: English, math, science and some form of history. For much of the 20th century, this last requirement would usually take the form of Western Civilization, a survey course that focused on European countries from around the rise of Rome through modernity.
  • “I remember the chain of thought,” he said. “I had to do prehistory, so I have to do some archaeology. But to do it seriously, I’m going to talk about how humans evolved, so, yikes, I’m in biology now. I thought: To do it seriously, I have to talk about how mammals evolved, how primates evolved. I have to go back to multicelled organisms, I have to go back to primeval slime. And then I thought: I have to talk about how life was created, how life appeared on earth! I have to talk geology, the history of the planet. And so you can see, this is pushing me back and back and back, until I realized there’s a stopping point — which is the Big Bang.” He paused. “I thought, Boy, would that be exciting to teach a course like this!”
  • by the early ‘70s, as the Vietnam War heightened interest in nations outside Europe, Western Civ was on the decline. In pedagogical circles, a book called “The Rise of the West: A History of the Human Community,” by William Hardy McNeill, a historian at the University of Chicago, persuasively argued that Western Civ was not merely biased against other cultures but also failed to account for the enormous influence that cultures had on one another over the millenniums.
  • In the wake of McNeill’s rebuke, Western Civ was slowly replaced by World History, a more comparative class that stressed broad themes across cultures and disciplines. Over the past 30 years, World History has produced its own formidable academic institutions and journals; these days, three-quarters of all American students take World History.
  • Gates has insisted on tracking this venture as he would any Microsoft product or foundation project. The Big History Project produces reams of data — students and teachers are regularly surveyed, and teachers submit the results from classes, all of which allows his team to track what’s working and what isn’t as the course grows. “Our priority,” he told me from across the table, “was to get it into a form where ambitious teachers could latch onto it.”
  • They have monitored teacher feedback closely and decreased the course in size, from 20 units to 10. True to Christian’s original style, however, the high-school course links insights across subjects into wildly ambitious narratives. The units begin with the Big Bang and shift to lesson plans on the solar system, trade and communications, globalization and, finally, the future. A class on the emergence of life might start with photosynthesis before moving on to eukaryotes and multicellular organisms and the genius of Charles Darwin and James Watson. A lecture on the slave trade might include the history of coffee beans in Ethiopia.
  • “Most kids experience school as one damn course after another; there’s nothing to build connections between the courses that they take,” says Bob Bain, a professor of history and education at the University of Michigan and an adviser to the Big History Project, who has helped devise much of the curriculum. “The average kid has no way to make sense between what happens with their first-period World History class and their second-period algebra class, third-period gym class, fourth-period literature — it’s all disconnected. It’s like if I were to give you a jigsaw puzzle and throw 500 pieces on the table and say, ‘Oh, by the way, I’m not going to show you the box top as to how they fit together.’ ”
  • Christian, who is 67, now travels the world as something of an evangelist for the spread of the Big History Project. (His TED Talk, “The History of Our World in 18 Minutes,” has been viewed more than four million times online.)
  • Few schools had teachers who were willing or able to instruct a hybrid course; some schools wound up requiring that two teachers lead the class together. Gates, who had hoped to avoid bureaucracy, found himself mired in it. “You’ve got to get a teacher in the history department and the science department — they have to be very serious about it, and they have to get their administrative staff to agree. And then you have to get it on the course schedule so kids can sign up,” he said. “So they have to decide, kind of in the spring or earlier, and those teachers have to spend a lot of that summer getting themselves ready for the thing.”
  • Perhaps the largest challenge facing the Big History Project, however, is Gates himself, or at least the specter of him. To his bafflement and frustration, he has become a remarkably polarizing figure in the education world. This owes largely to the fact that Gates, through his foundation, has spent more than $200 million to advocate for the Common Core, something of a third rail in education circles
  • Diane Ravitch, an education historian at New York University who has been a vocal critic of Gates, put it even more starkly: “When I think about history, I think about different perspectives, clashing points of view. I wonder how Bill Gates would treat the robber barons. I wonder how Bill Gates would deal with issues of extremes of wealth and poverty.”
  • “It begins to be a question of: Is this Bill Gates’s history? And should it be labeled ‘Bill Gates’s History’? Because Bill Gates’s history would be very different from somebody else’s who wasn’t worth $50-60 billion.”
  • perhaps, Big History might even become a successor to Western Civ and World History.
  • he also noted that Big History — which is already being offered in South Korea, the Netherlands and, of course, Australia — had significant global potential.
  • Sam Wineburg, a professor of education and history at Stanford, told me that although he sees Big History as “an important intellectual movement,” he did not consider the class to be a suitable replacement for an actual history course. “At certain points, it becomes less history and more of a kind of evolutionary biology or quantum physics. It loses the compelling aspect that is at the heart of the word ‘history.’ ”
  • Wineburg’s deepest concern about the approach was its failure to impart a methodology to students. “What is most pressing for American high-school students right now, in the history-social-studies curriculum, is: How do we read a text? How do we connect our ability to sharpen our intellectual capabilities when we’re evaluating sources and trying to understand human motivation?” he asked. “When we think about history, what are the primary sources of Big History? The original scientific reports of the Big Bang?”
  • Barr, the principal in Brooklyn, however, came to feel that Gates’s course was better than the existing alternative. “If you were to interview many, many progressive social-studies teachers, they would tell you that World History is a completely flawed course. It’s spotty. It’s like fact soup. Kids don’t come out of it really having a sense of global history.”
Javier E

Everything is up for grabs in Schrödinger's Brexit | John Crace | Politics | ... - 0 views

  • The hardline Brexiters were as good as their word. There was no Brexit they could vote for. Bill Cash, Steve Baker, Owen Paterson and John Redwood had been very clear about that. They had devoted their lives to fighting those bastard Johnny Foreigners in Brussels and they weren’t going to let Brexit stop them. Imagine a life with nothing to moan about; nothing to get out of bed for. Without the EU, life was a meaningless void. They were the parasites who couldn’t survive without their host.
  • most MPs have long since said everything they had to say about Brexit. Like Lino, they too are now on repeat. The one exception was Dominic Raab, who stood up to say that you would still need to be insane to support an exit deal as bad as the one the government had negotiated. But because he now realised he was clinically certifiable, he was going to vote for it. It was the first time anyone had ever launched a leadership bid by effectively ending it. His last remaining cohort of Spartans who would never take yes for an answer would never trust him again. A small win on the day.
  • There was just one certainty. By voting with the government, Boris Johnson had traded his principles for his career. But then we had always known he would. Johnson’s untrustworthiness is the only solid thing the country has left to hang on to. A Newtonian rock in a Quantum Brexit. We really are that far up shit creek.
Javier E

How to Prepare for an Automated Future - The New York Times - 0 views

  • We don’t know how quickly machines will displace people’s jobs, or how many they’ll take, but we know it’s happening — not just to factory workers but also to money managers, dermatologists and retail workers.
  • The logical response seems to be to educate people differently, so they’re prepared to work alongside the robots or do the jobs that machines can’t. But how to do that, and whether training can outpace automation, are open questions.
  • Pew Research Center and Elon University surveyed 1,408 people who work in technology and education to find out if they think new schooling will emerge in the next decade to successfully train workers for the future. Two-thirds said yes; the rest said no.
  • People still need to learn skills, the respondents said, but they will do that continuously over their careers. In school, the most important thing they can learn is how to learn.
  • At universities, “people learn how to approach new things, ask questions and find answers, deal with new situations,”
  • Schools will also need to teach traits that machines can’t yet easily replicate, like creativity, critical thinking, emotional intelligence, adaptability and collaboration.
  • these are not necessarily easy to teach.
  • “Many of the ‘skills’ that will be needed are more like personality characteristics, like curiosity, or social skills that require enculturation to take hold.”
  • “I have complete faith in the ability to identify job gaps and develop educational tools to address those gaps,” wrote Danah Boyd, a principal researcher at Microsoft Research and founder of Data and Society, a research institute. “I have zero confidence in us having the political will to address the socioeconomic factors that are underpinning skill training.”
  • Andrew Walls, managing vice president at Gartner, wrote, “Barring a neuroscience advance that enables us to embed knowledge and skills directly into brain tissue and muscle formation, there will be no quantum leap in our ability to ‘up-skill’ people.”
  • many survey respondents said a degree was not enough — or not always the best choice, especially given its price tag.
  • Many of them expect more emphasis on certificates or badges, earned from online courses or workshops, even for college graduates.
  • One potential future, said David Karger, a professor of computer science at M.I.T., would be for faculty at top universities to teach online and for mid-tier universities to “consist entirely of a cadre of teaching assistants who provide support for the students.”
  • Portfolios of work are becoming more important than résumés.
  • “Three-dimensional materials — in essence, job reels that demonstrate expertise — will be the ultimate demonstration of an individual worker’s skills.”
  • Consider it part of your job description to keep learning, many respondents said — learn new skills on the job, take classes, teach yourself new things.
  • Focus on learning how to do tasks that still need humans, said Judith Donath of Harvard’s Berkman Klein Center for Internet & Society: teaching and caregiving; building and repairing; and researching and evaluating
  • The problem is that not everyone is cut out for independent learning, which takes a lot of drive and discipline.
  • People who are suited for it tend to come from privileged backgrounds, with a good education and supportive parents.
  • “The fact that a high degree of self-direction may be required in the new work force means that existing structures of inequality will be replicated in the future,”
  • “The ‘jobs of the future’ are likely to be performed by robots,” said Nathaniel Borenstein, chief scientist at Mimecast, an email company. “The question isn’t how to train people for nonexistent jobs. It’s how to share the wealth in a world where we don’t need most people to work.”
Javier E

Natural Gas, America's No. 1 Power Source, Already Has a New Challenger: Batteries - WSJ - 0 views

  • Vistra Corp. owns 36 natural-gas power plants, one of America’s largest fleets. It doesn’t plan to buy or build any more. Instead, Vistra intends to invest more than $1 billion in solar farms and battery storage units in Texas and California as it tries to transform its business to survive in an electricity industry being reshaped by new technology.
  • A decade ago, natural gas displaced coal as America’s top electric-power source, as fracking unlocked cheap quantities of the fuel. Now, in quick succession, natural gas finds itself threatened with the same kind of disruption, only this time from cost-effective batteries charged with wind and solar energy.
  • Natural-gas-fired electricity represented 38% of U.S. generation in 2019
  • Wind and solar generators have gained substantial market share, and as battery costs fall, batteries paired with that green power are beginning to step into those roles by storing inexpensive green energy and discharging it after the sun falls or the wind dies.
  • President Biden is proposing to extend renewable-energy tax credits to stand-alone battery projects—installations that aren’t part of a generating facility—as part of his $2.3 trillion infrastructure plan, which could add fuel to an already booming market for energy storage.
  • renewables have become increasingly cost-competitive without subsidies in recent years, spurring more companies to voluntarily cut carbon emissions by investing in wind and solar power at the expense of that generated from fossil fuels.
  • the specter of more state and federal regulations to address climate change is accelerating the trend.
  • the combination of batteries and renewable energy is threatening to upend billions of dollars in natural-gas investments, raising concerns about whether power plants built in the past 10 years—financed with the expectation that they would run for decades—will become “stranded assets,” facilities that retire before they pay for themselves.
  • Much of the nation’s gas fleet, on the other hand, is relatively young, increasing the potential for stranded costs if widespread closures occur within the next two decades.
  • most current batteries can deliver power only for several hours before needing to recharge. That makes them nearly useless during extended outages.
  • Duke Energy Corp. , a utility company based in Charlotte, N.C., that supplies electricity and natural gas in parts of seven states, is still looking to build additional gas-fired power plants. But it has started to rethink its financial calculus to reflect that the plants might need to pay for themselves sooner, because they might not be able to operate for as long.
  • To remedy that, Duke in public filings said it is considering shortening the plants’ expected lifespan from about 40 years to 25 years and recouping costs using accelerated depreciation, an accounting measure that would let the company write off more expenses earlier in the plants’ lives
  • It may also consider eventually converting the plants to run on hydrogen, which doesn’t result in carbon emissions when burned.
  • as batteries help wind and solar displace traditional power sources, some investors view the projects with caution, noting that they, too, could become victims of disruption in coming years, if still-other technological advances yield better ways to store energy.
  • Gas plants that supply power throughout the day face the biggest risk of displacement. Such “baseload” plants typically need to run at 60% to 80% capacity to be economically viable, making them vulnerable as batteries help fill gaps in power supplied by solar and wind farms.
  • Today, such plants average 60% capacity in the U.S., according to IHS Markit, a data and analytics firm. By the end of the decade, the firm expects that average to fall to 50%, raising the prospect of bankruptcy and restructuring for the lowest performers.
  • “It’s just coal repeating itself.”
  • It took only a few years for inexpensive fracked gas to begin displacing coal used in power generation. Between 2011, shortly after the start of the fracking boom, and 2020, more than 100 coal plants with 95,000 megawatts of capacity were closed or converted to run on gas, according to the EIA. An additional 25,000 megawatts are slated to close by 2025.
  • Batteries are most often paired with solar farms, rather than wind farms, because of their power’s predictability and because it is easier to secure federal tax credits for that pairing.
  • Already, the cost of discharging a 100-megawatt battery with a two-hour power supply is roughly on par with the cost of generating electricity from the special power plants that operate during peak hours. Such batteries can discharge for as little as $140 a megawatt-hour, while the lowest-cost “peaker” plants—which fire up on demand when supplies are scarce—generate at $151 a megawatt-hour, according to investment bank Lazard.
  • Solar farms paired with batteries, meanwhile, are becoming competitive with gas plants that run all the time. Those types of projects can produce power for as little as $81 a megawatt-hour, according to Lazard, while the priciest of gas plants average $73 a megawatt-hour
  • Even in Texas, a state with a fiercely competitive power market and no emissions mandates, scarcely any gas plants are under construction, while solar farms and batteries are growing fast. Companies are considering nearly 88,900 megawatts of solar, 23,860 megawatts of wind and 30,300 megawatts of battery storage capacity in the state, according to the Electric Reliability Council of Texas. By comparison, only 7,900 megawatts of new gas-fired capacity is under consideration.
  • California last summer experienced the consequences of quickly reducing its reliance on gas plants. In August, during an intense heat wave that swept the West, the California grid operator resorted to rolling blackouts to ease a supply crunch when demand skyrocketed. In a postmortem published jointly with the California Public Utilities Commission and the California Energy Commission, the operator identified the rapid shift to solar and wind power as one of several contributing factors.
  • Mr. Morgan, who has closed a number of Vistra’s coal-fired and gas-fired plants since becoming CEO in 2016, said he anticipates most of the company’s remaining gas plants to operate for the next 20 years.
  • Quantum Energy Partners, a Houston-based private-equity firm, in the last several years sold a portfolio of six gas plants in Texas and three other states upon seeing just how competitive renewable energy was becoming. It is now working to develop more than 8,000 megawatts of wind, solar and battery projects in 10 states.
  • “We pivoted,” said Sean O’Donnell, a partner in the firm who helps oversee the firm’s power investments. “Everything that we had on the conventional power side, we decided to sell, given our outlook of increasing competition and diminishing returns.”
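
The Lazard cost figures quoted in the annotations above can be lined up directly. This is a toy comparison: the dollar values come from the excerpts, while the labels and the script itself are added for illustration.

```python
# Levelized-cost figures ($/megawatt-hour) as quoted from Lazard in the
# article; category labels are paraphrases added here.
costs = {
    "2-hour battery (peak discharge)": 140,
    "lowest-cost gas peaker": 151,
    "solar + battery (low end)": 81,
    "priciest always-on gas plant": 73,
}

# Batteries already undercut the cheapest peaker plants...
battery_vs_peaker = costs["2-hour battery (peak discharge)"] - costs["lowest-cost gas peaker"]
print(f"Battery vs. peaker: {battery_vs_peaker:+d} $/MWh")  # -11, battery cheaper

# ...while solar-plus-storage is close to, but per these figures not yet
# below, the quoted always-on gas cost.
solar_vs_gas = costs["solar + battery (low end)"] - costs["priciest always-on gas plant"]
print(f"Solar+battery vs. baseload gas: {solar_vs_gas:+d} $/MWh")  # +8, gas still cheaper
```

On these numbers the article's framing follows directly: batteries compete first in the peaker niche, where the cost gap has already flipped.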
Javier E

The Failure of Rational Choice Philosophy - NYTimes.com - 0 views

  • According to Hegel, history is idea-driven.
  • Ideas for him are public, rather than in our heads, and serve to coordinate behavior. They are, in short, pragmatically meaningful words.  To say that history is “idea driven” is to say that, like all cooperation, nation building requires a common basic vocabulary.
  • One prominent component of America’s basic vocabulary is ”individualism.”
  • individualism, the desire to control one’s own life, has many variants. Tocqueville viewed it as selfishness and suspected it, while Emerson and Whitman viewed it as the moment-by-moment expression of one’s unique self and loved it.
  • individualism as the making of choices so as to maximize one’s preferences. This differed from “selfish individualism” in that the preferences were not specified: they could be altruistic as well as selfish. It differed from “expressive individualism” in having general algorithms by which choices were made. These made it rational.
  • it was born in 1951 as “rational choice theory.” Rational choice theory’s mathematical account of individual choice, originally formulated in terms of voting behavior, made it a point-for-point antidote to the collectivist dialectics of Marxism
  • Functionaries at RAND quickly expanded the theory from a tool of social analysis into a set of universal doctrines that we may call “rational choice philosophy.” Governmental seminars and fellowships spread it to universities across the country, aided by the fact that any alternative to it would by definition be collectivist.
  • But the real significance of rational choice philosophy lay in ethics. Rational choice theory, being a branch of economics, does not question people’s preferences; it simply studies how they seek to maximize them. Rational choice philosophy seems to maintain this ethical neutrality (see Hans Reichenbach’s 1951 “The Rise of Scientific Philosophy,” an unwitting masterpiece of the genre); but it does not.
  • Today, governments and businesses across the globe simply assume that social reality  is merely a set of individuals freely making rational choices.
  • At home, anti-regulation policies are crafted to appeal to the view that government must in no way interfere with Americans’ freedom of choice.
  • rational choice philosophy moved smoothly on the backs of their pupils into the “real world” of business and government
  • Whatever my preferences are, I have a better chance of realizing them if I possess wealth and power. Rational choice philosophy thus promulgates a clear and compelling moral imperative: increase your wealth and power!
  • Today, institutions which help individuals do that (corporations, lobbyists) are flourishing; the others (public hospitals, schools) are basically left to rot. Business and law schools prosper; philosophy departments are threatened with closure.
  • Hegel, for one, had denied all three of its central claims in his “Encyclopedia of the Philosophical Sciences” over a century before. In that work, as elsewhere in his writings, nature is not neatly causal, but shot through with randomness. Because of this chaos, we cannot know the significance of what we have done until our community tells us; and ethical life correspondingly consists, not in pursuing wealth and power, but in integrating ourselves into the right kinds of community.
  • By 1953, W. V. O. Quine was exposing the flaws in rational choice epistemology. John Rawls, somewhat later, took on its sham ethical neutrality, arguing that rationality in choice includes moral constraints. The neat causality of rational choice ontology, always at odds with quantum physics, was further jumbled by the environmental crisis, exposed by Rachel Carson’s 1962 book “Silent Spring,” which revealed that the causal effects of human actions were much more complex, and so less predictable, than previously thought.
Javier E

Opinion | Dan Coats: The new 'Cold War' between the U.S. and China is a dangerous myth ... - 0 views

  • All this has many observers — even in the White House — speaking of a new “Cold War” between the United States and China. Some even argue that this is desirable, presumably with the belief that our side will naturally emerge victorious.
  • the phrase is a misleading one. It assumes that the terms of the old Cold War between the Soviet Union and the United States, which we fought and won, are relevant, and that the tools used successfully then could be used again now.
  • This conceptual error ignores the many differences between then and now. It is worth recalling that the Soviet Union was not our major trading partner, was not a major holder of our debt and was not tightly interconnected in the supply chains critical to our (and the world’s) economy.
  • ...9 more annotations...
  • The Cold War was fought and won pretty much exclusively on military and cultural terms. The economic side was relevant only because the Soviets' doomed model inhibited any real competition. We were neither competitors nor partners in the economic space. A new Cold War between the United States and China would be something else entirely. It is difficult to see how it could be fought effectively, not to mention successfully.
  • This is by no means to question the need to respond to increasingly aggressive behavior by China. But the U.S. response must be coherent, disciplined and sophisticated. It must balance capabilities and objectives
  • Reverting to a Cold War mentality will drive us toward belligerent posturing that has little or no chance of changing Chinese behavior and could, on the contrary, provoke overreactions and dangerous miscalculations on both sides.
  • China has recognized, far earlier and far more clearly than any of the rest of us, that technology is the determining factor in the decisive battle of this moment in history. Beijing is working hard to create an overwhelming Chinese advantage in this battle.
  • This is very hard work, requiring patience, conviction and broad political support. It also requires the full participation of our allies, both in the region and elsewhere. We must undertake these efforts with the imperative of preventing a downward spiral toward armed conflict.
  • the Chinese are clearly pursuing their foreign policy goals according to a carefully calculated long-term strategy.
  • China’s strategy also aims to encircle the West technologically, dominating all the advanced systems of data collection and manipulation, including artificial intelligence, robotics, aerospace and quantum computing, always taking into account potential military applications
  • Above all, we must create a deliberate strategy that is aimed at managing this great-power conflict rather than vanquishing a foe.
  • Nearly spontaneous and seemingly unconnected irritations such as closing a consulate, imposing sanctions on a few officials, tweaking tariffs or sanctioning individual companies merely provoke countermeasures that will inhibit real management of this immense and complicated problem.
Javier E

Ian Hacking, Eminent Philosopher of Science and Much Else, Dies at 87 - The New York Times - 0 views

  • In an academic career that included more than two decades as a professor in the philosophy department of the University of Toronto, following appointments at Cambridge and Stanford, Professor Hacking’s intellectual scope seemed to know no bounds. Because of his ability to span multiple academic fields, he was often described as a bridge builder.
  • “Ian Hacking was a one-person interdisciplinary department all by himself,” Cheryl Misak, a philosophy professor at the University of Toronto, said in a phone interview. “Anthropologists, sociologists, historians and psychologists, as well as those working on probability theory and physics, took him to have important insights for their disciplines.”
  • Professor Hacking wrote several landmark works on the philosophy and history of probability, including “The Taming of Chance” (1990), which was named one of the best 100 nonfiction books of the 20th century by the Modern Library.
  • ...17 more annotations...
  • In 2000, he became the first Anglophone to win a permanent position at the Collège de France, where he held the chair in the philosophy and history of scientific concepts until he retired in 2006.
  • His work in the philosophy of science was groundbreaking: He departed from the preoccupation with questions that had long concerned philosophers. Arguing that science was just as much about intervention as it was about representation, he helped bring experimentation to center stage.
  • Hacking often argued that as the human sciences have evolved, they have created categories of people, and that people have subsequently defined themselves as falling into those categories. Thus does human reality become socially constructed.
  • His book “The Emergence of Probability” (1975), which is said to have inspired hundreds of books by other scholars, examined how concepts of statistical probability have evolved over time, shaping the way we understand not just arcane fields like quantum physics but also everyday life.
  • “I was trying to understand what happened a few hundred years ago that made it possible for our world to be dominated by probabilities,” he said in a 2012 interview with the journal Public Culture. “We now live in a universe of chance, and everything we do — health, sports, sex, molecules, the climate — takes place within a discourse of probabilities.”
  • Whatever the subject, whatever the audience, one idea that pervades all his work is that “science is a human enterprise,” Ragnar Fjelland and Roger Strand of the University of Bergen in Norway wrote when Professor Hacking won the Holberg Prize. “It is always created in a historical situation, and to understand why present science is as it is, it is not sufficient to know that it is ‘true,’ or confirmed. We have to know the historical context of its emergence.”
  • Regarding one such question — whether unseen phenomena like quarks and electrons were real or merely the theoretical constructs of physicists — he argued for reality in the case of phenomena that figured in experiments, citing as an example an experiment at Stanford that involved spraying electrons and positrons into a ball of niobium to detect electric charges. “So far as I am concerned,” he wrote, “if you can spray them, they’re real.”
  • “I have long been interested in classifications of people, in how they affect the people classified, and how the effects on the people in turn change the classifications,” he wrote in “Making Up People
  • “I call this the ‘looping effect,’” he added. “Sometimes, our sciences create kinds of people that in a certain sense did not exist before.”
  • In “Why Race Still Matters,” a 2005 article in the journal Daedalus, he explored how anthropologists developed racial categories by extrapolating from superficial physical characteristics, with lasting effects — including racial oppression. “Classification and judgment are seldom separable,” he wrote. “Racial classification is evaluation.”
  • Similarly, he once wrote, in the field of mental health the word “normal” “uses a power as old as Aristotle to bridge the fact/value distinction, whispering in your ear that what is normal is also right.”
  • In his influential writings about autism, Professor Hacking charted the evolution of the diagnosis and its profound effects on those diagnosed, which in turn broadened the definition to include a greater number of people.
  • Encouraging children with autism to think of themselves that way “can separate the child from ‘normalcy’ in a way that is not appropriate,” he told Public Culture. “By all means encourage the oddities. By no means criticize the oddities.”
  • His emphasis on historical context also illuminated what he called transient mental illnesses, which appear to be so confined to their time that they can vanish when times change.
  • “hysterical fugue” was a short-lived epidemic of compulsive wandering that emerged in Europe in the 1880s, largely among middle-class men who had become transfixed by stories of exotic locales and the lure of travel
  • His intellectual tendencies were unmistakable from an early age. “When he was 3 or 4 years old, he would sit and read the dictionary,” Jane Hacking said. “His parents were completely baffled.”
  • He wondered aloud, the interviewer noted, if the whole universe was governed by nonlocality — if “everything in the universe is aware of everything else.” “That’s what you should be writing about,” he said. “Not me. I’m a dilettante. My governing word is ‘curiosity.’”
Javier E

They Wanted to Write the History of Modern China. But How? - The New York Times - 0 views

  • this is the key message of Tsu’s book: The story of how linguists, activists, librarians, scholars and ordinary citizens adapted Chinese writing to the modern world is the story of how China itself became modern.
  • Following the history of the script helps explain China’s past, present — and future. “More than a century’s effort at learning how to standardize and transform its language into a modern technology has landed China here,” writes Tsu, a professor of East Asian languages and literature at Yale, “at the beginning — not the end — of becoming a standard setter, from artificial intelligence to quantum natural language processing, automation to machine translation.”
  • With their “ad hoc efforts to retrofit Chinese characters” to typewriters and telegraphs, Chinese inventors sought to resolve the difficulties “that accompanied being late entrants in systems intended for a different kind of written language. But many wondered if the Chinese script itself was the problem.”
  • ...9 more annotations...
  • This book tells the stories of those who decided otherwise.
  • Tsu weaves linguistic analysis together with biographical and historical context — the ravages of imperialism, civil war, foreign invasions, diplomatic successes and disappointments. This approach not only adds background and meaning to the script debate, but also terrific color to what might have otherwise read like a textbook.
  • Could any alphabet account for the tones needed to differentiate among characters?
  • Each step of the way, these innovators had to ask questions like: How can the Chinese script be organized in a rational way? Could the language be written with an alphabet?
  • By examining these questions closely, Tsu helps the novice to Chinese understand both the underlying challenges and how they were conquered.
  • Mao, Tsu notes, “went down in history as, among other things, the political figure who guided the Chinese language through its two greatest transformations in modern history.”
  • With more than 90 percent of the population illiterate, Mao embraced the movement to reduce the number of strokes in more than 2,200 characters to render them easier to learn and write. (Taiwan, rejecting simplification, still sees itself as the guardian of traditional Chinese culture.)
  • Mao also spurred the creation of Pinyin, a phonetic, Romanized Chinese alphabet designed as an auxiliary aid to learning Chinese script, rather than a replacement.
  • in the end, the Chinese script did not die; instead, it flourished. As Tsu writes, “Every technology that has ever confronted the Chinese script, or challenged it, also had to bow before it.”
Javier E

Opinion | Forget the Multiverse. Embrace the Monoverse. - The New York Times - 0 views

  • Capgras syndrome. First defined a century ago, Capgras typically describes a person’s belief that someone close to him or her — a spouse or a child — has been replaced with a duplicate impostor
  • In this case, the patient believed that the whole world — everything she could observe of it — was a duplicate, a fake. I know a little bit how that feels. So do you, probably.
  • It’s easy to see the appeal of the multiverse, even as metaphor: the notion that we’re surrounded by a multitude of parallel selves, one of which might be living in a better timeline than the one we’re stuck in. It’s probably no coincidence that the idea has become so popular during an era of pandemic, climate change and political turmoil, when so many of us have felt helpless and trapped.
  • ...14 more annotations...
  • Like the Capgras patient, we risk becoming detached from the world we can see and touch. Regardless of whether we can prove that the multiverse exists, the idea of it can distract us from doing the work we need to do to make this world better
  • In 1957, a year after Lewis published his last Narnia book, a Princeton doctoral student, Hugh Everett III, published a dissertation bringing the ancient idea of the simultaneous existence of several worlds into the realm of modern science.
  • Everett was trying to solve a seeming paradox in quantum theory: Certain elementary particles (say, a photon) seemed to exist mathematically in many places at once but could be detected at only one location at a time.
  • Perhaps, Everett suggested, the act of detecting the particle splinters reality; perhaps the observer, and indeed the universe, splits into different possible timelines, one for each possible location of the particle. This would become known as the many-worlds interpretation
  • In my 30s, I knew I had to save myself from the enticements of alternate realities. So I envisioned a new cosmology of time
  • I felt a horrible sense of vertigo as I watched the life I’d been expecting to live tilting away from me. In this new timeline, my stepsiblings were no longer my siblings; they would become, instead, just people I knew for a while in high school.
  • For years, I couldn’t stop thinking about other, better timelines where it didn’t happen, where my stepfather was still alive and my family intact. It helped me understand what was missing, but it did not allow me to mourn what I’d lost.
  • And that’s the peril of the multiverse; I was becoming unreal to myself, nostalgic not for a time before the death happened but for a timeline in which it never happened at all.
  • In “Everything Everywhere,” Joy, the character played by Stephanie Hsu, has become aware of every possible timeline. She succumbs to nihilistic despair. If everything is happening, then nothing can matter.
  • We can joke or wonder whether we’re in the wrong timeline. But we can’t lose sight of the fact that this timeline is the only one we’ve got.
  • When I was 12, my mother met a man, and suddenly the family I’d imagined for myself became real. I had an older brother who loved puns and an older sister who wrote poems.But when I was 19, my stepfather died of melanoma; within a few years of recriminations and disputes, our blended family unblended itself.
  • Instead of a linear, branching timeline with multiple, parallel possibilities — so much more vivid than my real life — I tried to imagine time as a sphere always expanding away from me in every direction, like the light leaving a star.
  • In this model of time, instead of the past receding behind me, it expands outward to surround me, always there and always present. The future is at the very center of the sphere, curled up infinitely small inside of me, waiting to be realized. That way, I can believe that there is nothing to come that I do not already contain
  • if we have to believe in something invisible, let me believe in a version of the universe that keeps my focus where it belongs: on the things I can touch and change.