
TOK Friends / Group items tagged: passed


carolinewren

The Reality of Quantum Weirdness - NYTimes.com - 2 views

  • Is there a true story, or is our belief in a definite, objective, observer-independent reality an illusion?
  • Is there a fixed reality apart from our various observations of it? Or is reality nothing more than a kaleidoscope of infinite possibilities?
  • So an electron is a wave, not a particle?
  • ...7 more annotations...
  • each electron somehow acts like a wave interfering with itself, as if it is simultaneously passing through both slits at once.
  • Instead, we see two lumps on the screen, as if the electrons, suddenly aware of being observed, decided to act like little pellets.
  • the electrons go back to their wavelike behavior, and the interference pattern miraculously reappears.
  • For an individual particle like an electron, for example, the wave function provides information about the probabilities that the particle can be observed at particular locations, as well as the probabilities of the results of other measurements of the particle that you can make, such as measuring its momentum. [See the sketch after this list.]
  • If the wave function is merely knowledge-based, then you can explain away odd quantum phenomena by saying that things appear to us this way only because our knowledge of the real state of affairs is insufficient.
  • If there is an objective reality at all, the paper demonstrates, then the wave function is in fact reality-based.
  • We should be careful to recognize that the weirdness of the quantum world does not directly imply the same kind of weirdness in the world of everyday experience. That’s because the nebulous quantum essence of individual elementary particles is known to quickly dissipate in large ensembles of particles (a phenomenon often referred to as “decoherence”).
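    A minimal worked version of the probability rule described in the wave-function annotation above, using standard textbook quantum mechanics rather than equations quoted from the article: the Born rule turns the wave function psi into measurable probabilities, and the double-slit pattern comes from an interference cross term.

        P(x) = |\psi(x)|^2,  with  \int |\psi(x)|^2 \, dx = 1

    With both slits open, the amplitudes add before squaring:

        P(x) = |\psi_1(x) + \psi_2(x)|^2 = |\psi_1(x)|^2 + |\psi_2(x)|^2 + 2\,\mathrm{Re}[\psi_1^*(x)\,\psi_2(x)]

    The final cross term produces the interference stripes; recording which slit the electron passed through removes that term, leaving the two "lumps" the article describes.
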
sissij

A Scar on the Chinese Soul - The New York Times - 1 views

  • It “is that the Chinese, without direct orders, were so cruel to each other.”
  • Cultural Revolution trauma differs from that related to other horrific events, like the Holocaust and the Rwandan genocide, studies have noted, in part because in China, people were persecuted not for “unalterable” characteristics such as ethnicity and race, but for having the wrong frame of mind.
  • neither joining the Red Guards nor believing in Maoism protected someone from suffering long-term trauma.
  • ...3 more annotations...
  • The blurry distinction between perpetrators and victims makes collective healing by confronting the past a thorny project.
  • The idea that life experiences could cause inheritable genetic changes has been identified among children of Holocaust survivors, who have been shown to have an increased likelihood of stress-related illnesses.
  • The possibility of epigenetic inheritance has been raised by Chinese academics regarding the Cultural Revolution, but to research the topic would most certainly invite state punishment.
    • sissij
       
      I think this article is exaggerating a little bit. Nobody is perfect, not even Mao, who saved China from the hands of the Japanese. The title "a scar on the Chinese soul" is just too heavy. It shows only one perspective of the event, and it can be misleading to the general population of America. A media course I took during the winter break showed that cultural elements inevitably influence the media. For America, the key words are "freedom", "supremacy", "heroism", "democracy"... I can see those elements influencing the view of the author in this article. And he is not the only one. There is a book called "Street of Eternal Happiness" that also talks about Chinese suffering. There is confirmation bias in whom the author interviews and what data he uses in support of his argument. --Sissi (1/19/2017)
bennetttony

Google has banned 200 publishers in the three months since it passed a new policy again... - 0 views

  •  
    The company routinely weeds out "bad ads." Now it weeds out more bad ad publishers, too.
katrinaskibicki

TV is killing off so many characters that death is losing its punch - Vox - 0 views

  • TV is drowning in cheap, sloppily executed deaths
  • The spring of 2016 has been marked by death after death after death on TV. Some have proved controversial. Some have passed without commentary. A couple have been largely applauded.
  • But the fact remains: TV is killing major characters at an astonishing rate.
  • ...4 more annotations...
  • That's the allure of a TV death in a nutshell. It's why everybody in the industry keeps chasing it. It raises the show's dramatic stakes. It almost automatically creates lots of conversation. And when done well, it can take your show to another level.
  • The problem with most TV deaths is pretty simple: They're frequently devoid of meaning, inserted into the plot only to create shock and boost a show's profile on Twitter.
  • Since it's hard to simply force audiences to care about the deaths of extreme supporting characters, both The Walking Dead and Game of Thrones have increasingly turned toward sensationalizing the deaths that do occur, making them bigger and bloodier, with mixed results.
  • The problem only compounds itself if you survey television as a whole. In isolation, a death that is adequate, though not particularly stirring, becomes harder to take when there's a whole wave of mediocre deaths around the programming grid. It gives off the appearance of a medium that's turning, increasingly, toward desperation.
Javier E

The Choice Explosion - The New York Times - 0 views

  • the social psychologist Sheena Iyengar asked 100 American and Japanese college students to take a piece of paper. On one side, she had them write down the decisions in life they would like to make for themselves. On the other, they wrote the decisions they would like to pass on to others.
  • The Americans desired choice in four times more domains than the Japanese.
  • Americans now have more choices over more things than any other culture in human history. We can choose between a broader array of foods, media sources, lifestyles and identities. We have more freedom to live out our own sexual identities and more religious and nonreligious options to express our spiritual natures.
  • ...15 more annotations...
  • But making decisions well is incredibly difficult, even for highly educated professional decision makers. As Chip Heath and Dan Heath point out in their book “Decisive,” 83 percent of corporate mergers and acquisitions do not increase shareholder value, 40 percent of senior hires do not last 18 months in their new position, 44 percent of lawyers would recommend that a young person not follow them into the law.
  • It’s becoming incredibly important to learn to decide well, to develop the techniques of self-distancing to counteract the flaws in our own mental machinery. The Heath book is a very good compilation of those techniques.
  • assume positive intent. When in the midst of some conflict, start with the belief that others are well intentioned. It makes it easier to absorb information from people you’d rather not listen to.
  • Suzy Welch’s 10-10-10 rule. When you’re about to make a decision, ask yourself how you will feel about it 10 minutes from now, 10 months from now and 10 years from now. People are overly biased by the immediate pain of some choice, but they can put the short-term pain in long-term perspective by asking these questions.
  • An "explosion" that may also be a "dissolution" or "disintegration," in my view. Unlimited choices. Conduct without boundaries. All of which may be viewed as either "great" or "terrible." The poor suffer when they have no means to pursue choices, which is terrible. The rich seem only to want more and more, wealth without boundaries, which is great for those so able to do. Yes, we need a new decision-making tool, but perhaps one that is also very old: simplify, simplify,simplify by setting moral boundaries that apply to all and which define concisely what our life together ought to be.
  • our tendency to narrow-frame, to see every decision as a binary “whether or not” alternative. Whenever you find yourself asking “whether or not,” it’s best to step back and ask, “How can I widen my options?”
  • deliberate mistakes. A survey of new brides found that 20 percent were not initially attracted to the man they ended up marrying. Sometimes it’s useful to make a deliberate “mistake” — agreeing to dinner with a guy who is not your normal type. Sometimes you don’t really know what you want and the filters you apply are hurting you.
  • It makes you think that we should have explicit decision-making curriculums in all schools. Maybe there should be a common course publicizing the work of Daniel Kahneman, Cass Sunstein, Dan Ariely and others who study the way we mess up and the techniques we can adopt to prevent error.
  • The explosion of choice places extra burdens on the individual. Poorer Americans have fewer resources to master decision-making techniques, less social support to guide their decision-making and less of a safety net to catch them when they err.
  • the stress of scarcity itself can distort decision-making. Those who experienced stress as children often perceive threat more acutely and live more defensively.
  • The explosion of choice means we all need more help understanding the anatomy of decision-making.
  • living in an area of concentrated poverty can close down your perceived options, and comfortably “relieve you of the burden of choosing life.” It’s hard to maintain a feeling of agency when you see no chance of opportunity.
  • In this way the choice explosion has contributed to widening inequality.
  • The relentless all-hour reruns of "Law and Order" in 100 channel cable markets provide direct rebuff to the touted but hollow promise/premise of wider "choice." The small group of personalities debating a pre-framed trivial point of view, over and over, nightly/daily (in video clips), without data, global comparison, historic reference, regional content, or a deep commitment to truth or knowledge of facts has resulted in many choosing narrower limits: streaming music, coffee shops, Facebook--now a "choice" of 1.65 billion users.
  • It’s important to offer opportunity and incentives. But we also need lessons in self-awareness — on exactly how our decision-making tool is fundamentally flawed, and on mental frameworks we can adopt to avoid messing up even more than we do.
Javier E

The Anger Wave That May Just Wipe Out Laissez-Faire Economics - The New York Times - 1 views

  • few would have guessed that the economic order built upon Mr. Reagan’s and Mrs. Thatcher’s common faith in unfettered global markets (and largely accepted by their more liberal successors Bill Clinton and Tony Blair) would be brought down by right-wing populists riding the anger of a working class that has been cast aside in the globalized economy that the two leaders trumpeted 40 years ago.
  • The so-called Brexit vote was driven by an inchoate sense among older white workers with modest education that they have been passed over, condemned by forces beyond their control to an uncertain job for little pay in a world where their livelihoods are challenged not just by cheap Asian workers halfway around the world, but closer to home by waves of immigrants of different faiths and skin tones.
  • It is the same frustration that has buoyed proto-fascist political parties across Europe. It is the same anger fueling the candidacy of Mr. Trump in the United States.
  • ...7 more annotations...
  • Mr. Trump, the bombastic businessman who’s never held office, and Mr. Johnson, the former journalist turned mayor of London, might not put it this way, since they continue to cling to a conservative mantle. But they are riding a revolt of the working class against a 40-year-long project of the political right and its corporate backers that has dominated policy making in the English-speaking world for a generation.
  • The British political scientist Andrew Gamble at the University of Cambridge has argued that Western capitalism has experienced two transformational crises since the end of the 19th century. The first, brought about by the Depression of the 1930s, ended an era in which governments bowed to the gospel of the gold standard and were expected to butt out of the battles between labor and capital, letting markets function on their own, whatever the consequences
  • Mr. Keynes’s views ultimately prevailed, though, providing the basis for a new post-World War II orthodoxy favoring active government intervention in the economy and a robust welfare state. But that era ended when skyrocketing oil prices and economic mismanagement in the 1970s brought about a combination of inflation and unemployment that fatally undermined people’s trust in the state.
  • The Keynesian era ended when Margaret Thatcher and Ronald Reagan rode onto the scene with a version of capitalism based on tax cuts, privatization and deregulation that helped revive their engines of growth but led the workers of the world to the deeply frustrating, increasingly unequal economy of today.
  • After the Brexit vote, Lawrence Summers, former Treasury secretary under President Clinton and one of President Obama’s top economic advisers at the nadir of the Great Recession, laid out an argument for what he called “responsible nationalism,” which focused squarely on the interests of domestic workers.
  • Instead of negotiating more agreements to ease business across borders, governments would focus on deals to improve labor and environmental standards internationally. They might cut deals to prevent cross-border tax evasion.
  • There is, however, little evidence that the world’s leaders will go down that path. Despite the case for economic stimulus, austerity still rules across much of the West. In Europe, most governments have imposed stringent budget cuts — ensuring that all but the strongest economies would stall. In the United States, political polarization has brought fiscal policy — spending and taxes — to a standstill.
Javier E

[Six Questions] | Astra Taylor on The People's Platform: Taking Back Power and Culture ... - 1 views

  • Astra Taylor, a cultural critic and the director of the documentaries Zizek! and Examined Life, challenges the notion that the Internet has brought us into an age of cultural democracy. While some have hailed the medium as a platform for diverse voices and the free exchange of information and ideas, Taylor shows that these assumptions are suspect at best. Instead, she argues, the new cultural order looks much like the old: big voices overshadow small ones, content is sensationalist and powered by advertisements, quality work is underfunded, and corporate giants like Google and Facebook rule. The Internet does offer promising tools, Taylor writes, but a cultural democracy will be born only if we work collaboratively to develop the potential of this powerful resource
  • Most people don’t realize how little information can be conveyed in a feature film. The transcripts of both of my movies are probably equivalent in length to a Harper’s cover story.
  • why should Amazon, Apple, Facebook, and Google get a free pass? Why should we expect them to behave any differently over the long term? The tradition of progressive media criticism that came out of the Frankfurt School, not to mention the basic concept of political economy (looking at the way business interests shape the cultural landscape), was nowhere to be seen, and that worried me. It’s not like political economy became irrelevant the second the Internet was invented.
  • ...15 more annotations...
  • How do we reconcile our enjoyment of social media even as we understand that the corporations who control them aren’t always acting in our best interests?
  • That was because the underlying economic conditions hadn’t been changed or “disrupted,” to use a favorite Silicon Valley phrase. Google has to serve its shareholders, just like NBCUniversal does. As a result, many of the unappealing aspects of the legacy-media model have simply carried over into a digital age — namely, commercialism, consolidation, and centralization. In fact, the new system is even more dependent on advertising dollars than the one that preceded it, and digital advertising is far more invasive and ubiquitous
  • the popular narrative — new communications technologies would topple the establishment and empower regular people — didn’t accurately capture reality. Something more complex and predictable was happening. The old-media dinosaurs weren’t dying out, but were adapting to the online environment; meanwhile the new tech titans were coming increasingly to resemble their predecessors
  • I’m resistant to the tendency to place this responsibility solely on the shoulders of users. Gadgets and platforms are designed to be addictive, with every element from color schemes to headlines carefully tested to maximize clickability and engagement. The recent news that Facebook tweaked its algorithms for a week in 2012, showing hundreds of thousands of users only “happy” or “sad” posts in order to study emotional contagion — in other words, to manipulate people’s mental states — is further evidence that these platforms are not neutral. In the end, Facebook wants us to feel the emotion of wanting to visit Facebook frequently
  • these services aren’t free even if we don’t pay money for them; we pay with our personal data, with our privacy. This feeds into the larger surveillance debate, since government snooping piggybacks on corporate data collection. As I argue in the book, there are also negative cultural consequences (e.g., when advertisers are paying the tab we get more of the kind of culture marketers like to associate themselves with and less of the stuff they don’t) and worrying social costs. For example, the White House and the Federal Trade Commission have both recently warned that the era of “big data” opens new avenues of discrimination and may erode hard-won consumer protections.
  • I use lots of products that are created by companies whose business practices I object to and that don’t act in my best interests, or the best interests of workers or the environment — we all do, since that’s part of living under capitalism. That said, I refuse to invest so much in any platform that I can’t quit without remorse
  • there’s a pervasive and ill-advised faith that technology will promote competition if left to its own devices (“competition is a click away,” tech executives like to say), but that’s not true for a variety of reasons. The paradox of our current media landscape is this: our devices and consumption patterns are ever more personalized, yet we’re simultaneously connected to this immense, opaque, centralized infrastructure. We’re all dependent on a handful of firms that are effectively monopolies — from Time Warner and Comcast on up to Google and Facebook — and we’re seeing increased vertical integration, with companies acting as both distributors and creators of content. Amazon aspires to be the bookstore, the bookshelf, and the book. Google isn’t just a search engine, a popular browser, and an operating system; it also invests in original content
  • That it’s invisible or at least harder to track and prove. We haven’t figured out how to deal with the unique ways prejudice plays out over digital channels, and that’s partly because some folks can’t accept the fact that discrimination persists online. (After all, there is no sign on the door that reads Minorities Not Allowed.)
  • just because the Internet is open doesn’t mean it’s equal; offline hierarchies carry over to the online world and are even amplified there. For the past year or so, there has been a lively discussion taking place about the disproportionate and often outrageous sexual harassment women face simply for entering virtual space and asserting themselves there — research verifies that female Internet users are dramatically more likely to be threatened or stalked than their male counterparts — and yet there is very little agreement about what, if anything, can be done to address the problem.
  • What steps can we take to encourage better representation of independent and non-commercial media? We need to fund it, first and foremost. As individuals this means paying for the stuff we believe in and want to see thrive. But I don’t think enlightened consumption can get us where we need to go on its own. I’m skeptical of the idea that we can shop our way to a better world. The dominance of commercial media is a social and political problem that demands a collective solution, so I make an argument for state funding and propose a reconceptualization of public media. More generally, I’m struck by the fact that we use these civic-minded metaphors, calling Google Books a “library” or Twitter a “town square” — or even calling social media “social” — but real public options are off the table, at least in the United States. We hand the digital commons over to private corporations at our peril.
  • 6. You advocate for greater government regulation of the Internet. Why is this important?
  • I’m for regulating specific things, like Internet access, which is what the fight for net neutrality is ultimately about. We also need stronger privacy protections and restrictions on data gathering, retention, and use, which won’t happen without a fight.
  • I challenge the techno-libertarian insistence that the government has no productive role to play and that it needs to keep its hands off the Internet for fear that it will be “broken.” The Internet and personal computing as we know them wouldn’t exist without state investment and innovation, so let’s be real.
  • social inequalities that exist in the real world remain meaningful online. What are the particular dangers of discrimination on the Internet?
  • So it’s not that the Internet needs to be regulated but that these big tech corporations need to be subject to governmental oversight. After all, they are reaching farther and farther into our intimate lives. They’re watching us. Someone should be watching them.
kushnerha

A new atlas maps word meanings in the brain | PBS NewsHour - 0 views

  • like Google Maps for your cerebral cortex: A new interactive atlas, developed with the help of such unlikely tools as public radio podcasts and Wikipedia, purports to show which bits of your brain help you understand which types of concepts.
  • Hear a word relating to family, loss, or the passing of time — such as “wife,” “month,” or “remarried” — and a ridge called the right angular gyrus may be working overtime. Listening to your contractor talking about the design of your new front porch? Thank a pea-sized spot of brain behind your left ear.
  • The research on the “brain dictionary” has the hallmarks of a big scientific splash: Published on Wednesday in Nature, it’s accompanied by both a video and an interactive website where you can click your way from brain region to brain region, seeing what kinds of words are processed in each. Yet neuroscientists aren’t uniformly impressed.
  • ...9 more annotations...
  • invoked an old metaphor to explain why he isn’t convinced by the analysis: He compared it to establishing a theory of how weather works by pointing a video camera out the window for 7 hours.
  • Indeed, among neuroscientists, the new “comprehensive atlas” of the cerebral cortex is almost as controversial as a historical atlas of the Middle East. That’s because every word has a constellation of meanings and associations — and it’s hard for scientists to agree about how best to study them in the lab.
  • For this study, neuroscientist Jack Gallant and his team at the University of California, Berkeley played more than two hours’ worth of stories from the Moth Radio Hour for seven grad students and postdocs while measuring their cerebral blood flow using functional magnetic resonance imaging. Then, they linked the activity in some 50,000 pea-sized regions of the cortex to the “meaning” of the words being heard at that moment.
  • How, you might ask, did they establish the meaning of words? The neuroscientists pulled all the nouns and verbs from the podcasts. With a computer program, they then looked across millions of pages of text to see how often the words from the podcasts are used near 985 common words taken from Wikipedia’s List of 1,000 Basic Words. “Wolf,” for instance, would presumably be used more often in proximity to “dog” than to, say, “eggplant.” Using that data, the program assigned numbers that approximated the meaning of each individual word from the podcasts — and, with some fancy number crunching, they figured out what areas of the brain were activated when their research subjects heard words with certain meanings. [See the sketch after this list.]
  • Everyone agrees that the research is innovative in its method. After all, linking up the meanings of thousands of words to the second-by-second brain activity in thousands of tiny brain regions is no mean feat. “That’s way more data than any human being can possibly think about,” said Gallant.
  • What they can’t agree on is what it means. “In this study, our goal was not to ask a specific question. Our goal was to map everything so that we can ask questions after that,” said Gallant. “One of the most frequent questions we get is, ‘What does it mean?’ If I gave you a globe, you wouldn’t ask what it means, you’d start using it for stuff. You can look for the smallest ocean or how long it will take to get to San Francisco.”
  • This “data-driven approach” still involves assumptions about how to break up language into different categories of meaning
  • “Of course it’s a very simplified version of how meaning is captured in our minds, but it seems to be a pretty good proxy,” she said.
  • hordes of unanswered questions: “We can map where your brain represents the meaning of a narrative text that is associated with family, but we don’t know why the brain is responding to family at that location. Is it the word ‘father’ itself? Is it your memories of your own father? Is it your own thinking about being a parent yourself?” He hopes that it’s just those types of questions that researchers will ask, using his brain map as a guide.
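    A rough Python sketch of the co-occurrence counting step described in the annotation above; the basis words, window size, and toy corpus are illustrative assumptions rather than details from the study. Each word gets a vector of how often it appears near a fixed list of common words, so "wolf" ends up leaning toward the "dog" dimension rather than the "eggplant" one.

        from collections import Counter

        # Illustrative stand-ins for the 985 common Wikipedia words (assumed list).
        BASIS_WORDS = ["dog", "eggplant", "family", "money", "water"]
        WINDOW = 5  # how many words on each side count as "near" (an assumption)

        def cooccurrence_vector(target, tokens):
            """Count how often `target` occurs within WINDOW words of each basis word."""
            counts = Counter()
            for i, tok in enumerate(tokens):
                if tok != target:
                    continue
                nearby = tokens[max(0, i - WINDOW): i + WINDOW + 1]
                for basis in BASIS_WORDS:
                    counts[basis] += nearby.count(basis)
            return [counts[b] for b in BASIS_WORDS]

        # Toy corpus: "wolf" co-occurs with "dog", not "eggplant".
        corpus = "the wolf chased the dog while the family cooked eggplant".split()
        print(cooccurrence_vector("wolf", corpus))  # -> [1, 0, 0, 0, 0]

    The study's "fancy number crunching" then links vectors like these to second-by-second brain activity; the sketch only illustrates the idea of meaning as a vector of co-occurrence counts.
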
kushnerha

When Was America Greatest? - The New York Times - 0 views

  • The slogan evokes a time when America was stronger and more prosperous. But Mr. Trump doesn’t specify whether he’s expressing nostalgia for the 1950s — or 10 years ago. That vagueness is reflected by his voters, according to the results of a new survey, conducted online by the digital media and polling company Morning Consult.
  • Trump supporters offered a wide range of answers, with no distinct pattern. The most popular choice was the year 2000. But 1955, 1960, 1970 and 1985 were also popular. More than 2 percent of Trump’s supporters picked 2015, when Mr. Trump’s campaign began.
  • Political science research suggests that Americans’ optimism can be influenced by whether their political party is in the White House. So it’s perhaps not surprising that Democrats feel better than Republicans about current circumstances.
  • ...6 more annotations...
  • In March, Pew asked people whether life was better for people like them 50 years ago — and a majority of Republicans answered yes. Trump supporters were the most emphatic, with 75 percent saying things were better in the mid-1960s. Democrats, though, were less enthusiastic about the past. Forty-eight percent said life was better now than it was 50 years ago, while 17 percent of Democrats said it was the same, and only 28 percent said it was worse.
  • In the Morning Consult survey, 44 percent of people over all said America’s greatest years were ahead of it, while 36 percent said those years had already passed. But in an election when America’s past greatness has played such a starring role, we wanted to see more details about just how voters saw the past and the future.
  • So, when was the greatest year? Over all, 2000 was the most popular choice, a preference that cut across political party, candidate preference, gender and age. The year’s popularity may partly reflect people’s fondness for round numbers. But many voters explained their choice by referring to a greater sense of security. The Sept. 11 attacks occurred the following year. (An election year also has something for all partisans to grab onto. Bill Clinton was president that year, but George W. Bush won the election to replace him.)
  • Some people, of course, reached farther back into history. The year the Declaration of Independence was signed, 1776, got a few votes. One person chose 1789, the year the Constitution took effect. One person chose 1800. One chose 1860, the year Southern states began to secede from the Union. But most answers were of a more recent vintage.
  • partisan patterns in views of America’s greatness. Republicans, over all, recall the late 1950s and the mid-1980s most fondly. Sample explanations: “Reagan.” “Economy was booming.” “No wars!” “Life was simpler.” “Strong family values.” The distribution of Trump supporters’ greatest years is somewhat similar to the Republican trend, but more widely dispersed over the last 70 years.
  • Democrats seem to think America’s greatest days were more recent; they were more likely to pick a year in the 1990s, or since 2000. After 2000, their second-most-popular answer was 2016. Sample explanations: “We’re getting better.” “Improving social justice.” “Technology.” Even 2008, a year of financial collapse, was pretty popular, perhaps because President Obama was also elected that year.
sissij

Tesla Passes Ford in Market Value as Investors Bet on the Future - The New York Times - 0 views

  • But there is one exception. Tesla, the electric-vehicle upstart, continues to surge.
  • “Investors want something that is going to go up in orders of magnitude in six months to six years, and Tesla is that story,” said Karl Brauer, a senior editor at Kelley Blue Book. “Nobody thinks Ford or G.M. is going to do that.”
  • Tesla’s chief executive, Elon Musk, has shattered the conventional wisdom that automakers should be viewed as a stable, reliable investment. Instead, he promotes his California-based company as a dynamic vehicle for growth, despite the risks and challenges ahead of it.
  • ...3 more annotations...
  • But neither automaker has convinced Wall Street that it has shed its boom-or-bust reputation tied to broader economic cycles, or is at the forefront of new technology being developed for self-driving vehicles and electric cars.
  • “It’s almost like Tesla is positioned in people’s minds as an energy storage company that happens to put most of its batteries on wheels,” said Andrew Stewart, chief investment officer at Exchange Capital Management, an investment firm in Ann Arbor, Mich.
  • While Tesla may enjoy the favor of investors, it still faces some daunting hurdles to reach its goals.
  •  
    In my research on Tesla, I found it very interesting that Tesla never spreads advertisements the way Ford Motor or other auto companies do. Yet it is very popular and well known. How does Tesla manage to be known by the public if it doesn't run any advertising and its target customers are the elites? One reason I found online for its propaganda strategy is its skill at giving stockholders confidence, so its stock price stays positive and healthy. By generating new ideas, Tesla is able to stay in the newspaper headlines. When I saw this news, my first reaction was that it's Tesla again, and it gave me a very positive image of the future of Tesla. This new way of propaganda is directly related to the new form of economics in society, so I found it very interesting. --Sissi (4/4/2017)
Duncan H

Living in the Material World - NYTimes.com - 0 views

  • on a visit to the Academy of Sciences in Almaty some years ago I was presented with a souvenir meant to assure me that Central Asia was indeed still producing philosophy worthy of note. It was a collectively authored book entitled “The Development of Materialist Dialectics in Kazakhstan,” and I still display it proudly on my shelf. Its rough binding and paper bespeak economic hardship. It is packed with the traces of ideas, yet everything about the book announces its materiality. I had arrived in the Kazakh capital in 1994, just in time to encounter the last of a dying breed: the philosopher as party functionary (they are all by now retired, dead or defenestrated, or have simply given up on what they learned in school). The book, written by committee, was a collection of official talking points, and what passed for conversation there was something much closer to recitation.
  • The philosophical meaning of materialism may in the final analysis be traced back to a religious view of the world. On this view, to focus on the material side of existence is to turn away from the eternal and divine. Here, the category of the material is assimilated to that of sin or evil.
  • Yet in fact this feature of Marxist philosophical classification is one that, with some variations, continues to be shared by all philosophers, even in the West, even today
  • ...9 more annotations...
  • materialism is not the greedy desire for material goods, but rather the belief that the fundamental reality of the world is material;
  • idealism is not the aspiration toward lofty and laudable goals, but rather the belief that the fundamental reality of the world is mental or idea-like. English-speaking philosophers today tend to speak of “physicalism” or “naturalism” rather than materialism (perhaps to avoid confusion with the Wall Street sense of the term). At the same time, Anglo-American historians of philosophy continue to find the distinction between materialism and idealism a useful one in our attempts at categorizing past schools of thought. Democritus and La Mettrie were materialists; Hobbes was pretty close. Berkeley and Kant were idealists; Leibniz may have been.
  • And it was these paradoxes that led the Irish philosopher to conclude that talk of matter was but a case of multiplying entities beyond necessity. For Berkeley, all we can know are ideas, and for this reason it made sense to suppose that the world itself consists in ideas.
  • Soviet and Western Marxists alike, by stark contrast, and before them the French “vulgar” (i.e., non-dialectical) materialists of the 18th century, saw and see the material world as the base and cause of all mental activity, as both bringing ideas into existence, and also determining the form and character of a society’s ideas in accordance with the state of its technology, its methods of resource extraction and its organization of labor. So here to focus on the material is not to become distracted from the true source of being, but rather to zero right in on it.
  • one great problem with the concept of materialism is that it says very little in itself. What is required in addition is an elaboration of what a given thinker takes matter, or ideas, to be. It may not be just the Marxist aftertaste, but also the fact that the old common-sense idea about matter as brute, given stuff has turned out to have so little to do with the way the physical world actually is, that has led Anglo-American philosophers to prefer to associate themselves with the “physical” or the “natural” rather than with the material.  Reality, they want to say, is just what is natural, while everything else is in turn “supernatural” (this distinction has its clarity going for it, but it also seems uncomfortably close to tautology). Not every philosopher has a solid grasp of subatomic physics, but most know enough to grasp that, even if reality is eventually exhaustively accounted for through an enumeration of the kinds of particles and a few basic forces, this reality will still look nothing like what your average person-in-the-street takes reality to be.
  • The 18th-century idealist philosopher George Berkeley strongly believed that matter was only a fiction contrived by philosophers in the first place, for which the real people had no need. For Berkeley, there was never anything common-sensical about matter. We did not need to arrive at the era of atom-splitting and wave-particle duality, then, in order for the paradoxes inherent in matter to make themselves known (is it infinitely divisible or isn’t it?
  • Central to this performance was the concept of  “materialism.” The entire history of philosophy, in fact, was portrayed in Soviet historiography as a series of matches between the materialist home-team and its “idealist” opponents, beginning roughly with Democritus (good) and Plato (bad), and culminating in the opposition between official party philosophy and logical positivism, the latter of which was portrayed as a shrouded variety of idealism. Thus from the “Short Philosophical Dictionary,” published in Moscow in 1951, we learn that the school of logical empiricism represented by Rudolf Carnap, Otto Neurath and others, “is a form of subjective idealism, characteristic of degenerating bourgeois philosophy in the epoch of the decline of capitalism.” Now the Soviet usage of this pair of terms appears to fly in the face of our ordinary, non-philosophical understanding of them (that, for example,  Wall Street values are “materialist,” while the Occupy movement is “idealist”). One might have thought that the communists should be flinging the “materialist” label at their capitalist enemies, rather than claiming it for themselves. One might also have thought that the Bolshevik Revolution and the subsequent failed project of building a workers’ utopia was nothing if not idealistic.
  • Consider money. Though it might sometimes be represented by bank notes or coins, money is an immaterial thing par excellence, and to seek to acquire it is to move on the plane of ideas. Of course, money can also be converted into material things, yet it seems simplistic to suppose that we want money only in order to convert it into the material things we really want, since even these material things aren’t just material either: they are symbolically dense artifacts, and they convey to others certain ideas about their owners. This, principally, is why their owners want them, which is to say that materialists (in the everyday sense) are trading in ideas just as much as anyone else.
  • In the end no one really cares about stuff itself. Material acquisitions — even, or perhaps especially, material acquisitions of things like Rolls Royces and Rolexes — are maneuvers within a universe of materially instantiated ideas. This is human reality, and it is within this reality that mystics, scientists, and philosophers alike are constrained to pursue their various ends, no matter what they might take the ultimate nature of the external world to be.
  •  
    A very interesting article on the contrast between materialism and idealism.
Javier E

What Have We Learned, If Anything? by Tony Judt | The New York Review of Books - 0 views

  • During the Nineties, and again in the wake of September 11, 2001, I was struck more than once by a perverse contemporary insistence on not understanding the context of our present dilemmas, at home and abroad; on not listening with greater care to some of the wiser heads of earlier decades; on seeking actively to forget rather than remember, to deny continuity and proclaim novelty on every possible occasion. We have become stridently insistent that the past has little of interest to teach us. Ours, we assert, is a new world; its risks and opportunities are without precedent.
  • the twentieth century that we have chosen to commemorate is curiously out of focus. The overwhelming majority of places of official twentieth-century memory are either avowedly nostalgo-triumphalist—praising famous men and celebrating famous victories—or else, and increasingly, they are opportunities for the recollection of selective suffering.
  • The problem with this lapidary representation of the last century as a uniquely horrible time from which we have now, thankfully, emerged is not the description—it was in many ways a truly awful era, an age of brutality and mass suffering perhaps unequaled in the historical record. The problem is the message: that all of that is now behind us, that its meaning is clear, and that we may now advance—unencumbered by past errors—into a different and better era.
  • ...19 more annotations...
  • Today, the “common” interpretation of the recent past is thus composed of the manifold fragments of separate pasts, each of them (Jewish, Polish, Serb, Armenian, German, Asian-American, Palestinian, Irish, homosexual…) marked by its own distinctive and assertive victimhood.
  • The resulting mosaic does not bind us to a shared past, it separates us from it. Whatever the shortcomings of the national narratives once taught in school, however selective their focus and instrumental their message, they had at least the advantage of providing a nation with past references for present experience. Traditional history, as taught to generations of schoolchildren and college students, gave the present a meaning by reference to the past: today’s names, places, inscriptions, ideas, and allusions could be slotted into a memorized narrative of yesterday. In our time, however, this process has gone into reverse. The past now acquires meaning only by reference to our many and often contrasting present concerns.
  • the United States thus has no modern memory of combat or loss remotely comparable to that of the armed forces of other countries. But it is civilian casualties that leave the most enduring mark on national memory and here the contrast is piquant indeed
  • Today, the opposite applies. Most people in the world outside of sub-Saharan Africa have access to a near infinity of data. But in the absence of any common culture beyond a small elite, and not always even there, the fragmented information and ideas that people select or encounter are determined by a multiplicity of tastes, affinities, and interests. As the years pass, each one of us has less in common with the fast-multiplying worlds of our contemporaries, not to speak of the world of our forebears.
  • What is significant about the present age of transformations is the unique insouciance with which we have abandoned not merely the practices of the past but their very memory. A world just recently lost is already half forgotten.
  • In the US, at least, we have forgotten the meaning of war. There is a reason for this.
  • Until the last decades of the twentieth century most people in the world had limited access to information; but—thanks to national education, state-controlled radio and television, and a common print culture—within any one state or nation or community people were all likely to know many of the same things.
  • it was precisely that claim, that “it’s torture, and therefore it’s no good,” which until very recently distinguished democracies from dictatorships. We pride ourselves on having defeated the “evil empire” of the Soviets. Indeed so. But perhaps we should read again the memoirs of those who suffered at the hands of that empire—the memoirs of Eugen Loebl, Artur London, Jo Langer, Lena Constante, and countless others—and then compare the degrading abuses they suffered with the treatments approved and authorized by President Bush and the US Congress. Are they so very different?
  • American civilian losses (excluding the merchant navy) in both world wars amounted to less than 2,000 dead.
  • the complacent neoconservative claim that war and conflict are things Americans understand—in contrast to naive Europeans with their pacifistic fantasies—seems to me exactly wrong: it is Europeans (along with Asians and Africans) who understand war all too well. Most Americans have been fortunate enough to live in blissful ignorance of its true significance.
  • That same contrast may account for the distinctive quality of much American writing on the cold war and its outcome. In European accounts of the fall of communism, from both sides of the former Iron Curtain, the dominant sentiment is one of relief at the closing of a long, unhappy chapter. Here in the US, however, the story is typically recorded in a triumphalist key.5
  • For many American commentators and policymakers the message of the twentieth century is that war works. Hence the widespread enthusiasm for our war on Iraq in 2003 (despite strong opposition to it in most other countries). For Washington, war remains an option—on that occasion the first option. For the rest of the developed world it has become a last resort.6
  • Ignorance of twentieth-century history does not just contribute to a regrettable enthusiasm for armed conflict. It also leads to a misidentification of the enemy.
  • This abstracting of foes and threats from their context—this ease with which we have talked ourselves into believing that we are at war with “Islamofascists,” “extremists” from a strange culture, who dwell in some distant “Islamistan,” who hate us for who we are and seek to destroy “our way of life”—is a sure sign that we have forgotten the lesson of the twentieth century: the ease with which war and fear and dogma can bring us to demonize others, deny them a common humanity or the protection of our laws, and do unspeakable things to them.
  • How else are we to explain our present indulgence for the practice of torture? For indulge it we assuredly do.
  • “But what would I have achieved by proclaiming my opposition to torture?” he replied. “I have never met anyone who is in favor of torture.”8 Well, times have changed. In the US today there are many respectable, thinking people who favor torture—under the appropriate circumstances and when applied to those who merit it.
  • As a consequence, the United States today is the only advanced democracy where public figures glorify and exalt the military, a sentiment familiar in Europe before 1945 but quite unknown today
  • We are slipping down a slope. The sophistic distinctions we draw today in our war on terror—between the rule of law and “exceptional” circumstances, between citizens (who have rights and legal protections) and noncitizens to whom anything can be done, between normal people and “terrorists,” between “us” and “them”—are not new. The twentieth century saw them all invoked. They are the selfsame distinctions that licensed the worst horrors of the recent past: internment camps, deportation, torture, and murder—those very crimes that prompt us to murmur “never again.” So what exactly is it that we think we have learned from the past? Of what possible use is our self-righteous cult of memory and memorials if the United States can build its very own internment camp and torture people there?
  • We need to learn again—or perhaps for the first time—how war brutalizes and degrades winners and losers alike and what happens to us when, having heedlessly waged war for no good reason, we are encouraged to inflate and demonize our enemies in order to justify that war’s indefinite continuance.
Javier E

untitled - 0 views

  • Scientists at Stanford University and the J. Craig Venter Institute have developed the first software simulation of an entire organism, a humble single-cell bacterium that lives in the human genital and respiratory tracts.
  • the work was a giant step toward developing computerized laboratories that could carry out many thousands of experiments much faster than is possible now, helping scientists penetrate the mysteries of diseases like cancer and Alzheimer’s.
  • cancer is not a one-gene problem; it’s a many-thousands-of-factors problem.”
  • ...7 more annotations...
  • This kind of modeling is already in use to study individual cellular processes like metabolism. But Dr. Covert said: “Where I think our work is different is that we explicitly include all of the genes and every known gene function. There’s no one else out there who has been able to include more than a handful of functions or more than, say, one-third of the genes.”
  • The simulation, which runs on a cluster of 128 computers, models the complete life span of the cell at the molecular level, charting the interactions of 28 categories of molecules — including DNA, RNA, proteins and small molecules known as metabolites, which are generated by cell processes.
  • They called the simulation an important advance in the new field of computational biology, which has recently yielded such achievements as the creation of a synthetic life form — an entire bacterial genome created by a team led by the genome pioneer J. Craig Venter. The scientists used it to take over an existing cell.
  • A decade ago, scientists developed simulations of metabolism that are now being used to study a wide array of cells, including bacteria, yeast and photosynthetic organisms. Other models exist for processes like protein synthesis.
  • “Right now, running a simulation for a single cell to divide only one time takes around 10 hours and generates half a gigabyte of data,” Dr. Covert wrote. “I find this fact completely fascinating, because I don’t know that anyone has ever asked how much data a living thing truly holds. We often think of the DNA as the storage medium, but clearly there is more to it than that.”
  • scientists chose an approach called object-oriented programming, which parallels the design of modern software systems. Software designers organize their programs in modules, which communicate with one another by passing data and instructions back and forth.
  • “The major modeling insight we had a few years ago was to break up the functionality of the cell into subgroups, which we could model individually, each with its own mathematics, and then to integrate these submodels together into a whole,”
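    A toy Python sketch of the modular, object-oriented decomposition the last two annotations describe. The process names, rates, and update rules below are invented for illustration; the real model's 28 molecule categories and the mathematics of each submodel are far richer. The point is the structure: each cellular process is an object with its own update step, and a driver loop integrates the submodels by passing a shared state back and forth.

        class Submodel:
            """One cellular process, modeled with its own (toy) mathematics."""
            def step(self, state, dt):
                raise NotImplementedError

        class Metabolism(Submodel):
            def step(self, state, dt):
                # Toy rule: metabolites accumulate at a constant rate.
                state["metabolites"] += 5.0 * dt

        class ProteinSynthesis(Submodel):
            def step(self, state, dt):
                # Toy rule: proteins are made by consuming available metabolites.
                made = min(state["metabolites"], 2.0 * dt)
                state["metabolites"] -= made
                state["proteins"] += made

        def simulate(submodels, state, dt=1.0, steps=10):
            """Integrate the submodels by letting each update the shared state in turn."""
            for _ in range(steps):
                for sub in submodels:
                    sub.step(state, dt)
            return state

        print(simulate([Metabolism(), ProteinSynthesis()],
                       {"metabolites": 0.0, "proteins": 0.0}))
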
Javier E

The Positive Power of Negative Thinking - NYTimes.com - 0 views

  • visualizing a successful outcome, under certain conditions, can make people less likely to achieve it. She rendered her experimental participants dehydrated, then asked some of them to picture a refreshing glass of water. The water-visualizers experienced a marked decline in energy levels, compared with those participants who engaged in negative or neutral fantasies. Imagining their goal seemed to deprive the water-visualizers of their get-up-and-go, as if they’d already achieved their objective.
  • take affirmations, those cheery slogans intended to lift the user’s mood by repeating them: “I am a lovable person!” “My life is filled with joy!” Psychologists at the University of Waterloo concluded that such statements make people with low self-esteem feel worse
  • Ancient philosophers and spiritual teachers understood the need to balance the positive with the negative, optimism with pessimism, a striving for success and security with an openness to failure and uncertainty
  • ...3 more annotations...
  • Buddhist meditation, too, is arguably all about learning to resist the urge to think positively — to let emotions and sensations arise and pass, regardless of their content
  • Very brief training in meditation, according to a 2009 article in The Journal of Pain, brought significant reductions in pain
  • the relentless cheer of positive thinking begins to seem less like an expression of joy and more like a stressful effort to stamp out any trace of negativity.
Javier E

The Poverty of an Idea - NYTimes.com - 1 views

  • THE libertarian writer Charles Murray has probably done more than any other contemporary thinker to keep alive the idea of a “culture of poverty,” the theory that poor people are trapped by distorted norms and aspirations and not merely material deprivation.
  • Harrington had picked up the idea of a “culture of poverty” from the anthropologist Oscar Lewis, whose 1959 study of Mexican slum dwellers identified a “subculture” of lowered aspirations and short-term gratification. Echoing Lewis, Harrington argued that American poverty constituted “a separate culture, another nation, with its own way of life.” It would not be solved merely by economic expansion or moral exhortation, he contended, but by a “comprehensive assault on poverty.”
  • In his view, these problems were not a judgment on the poor as individuals, but on a society indifferent to their plight. His popularization of the phrase “culture of poverty” had unintended consequences. There was nothing in the “vicious circle” of pathology he sketched that was culturally determined, but in the hands of others, the idea came to signify an ingrained system of norms passed from generation to generation.
  • ...3 more annotations...
  • Conservatives took the attitudes and behaviors Harrington saw as symptoms of poverty and portrayed them as its direct causes.
  • In his 1984 book, “Losing Ground,” Mr. Murray argued that welfare programs abet rather than ameliorate poverty. The book dismissed Harrington’s prescription for ending poverty, and Harrington returned the favor. In “The New American Poverty,” published the same year, he called Mr. Murray the right-wing equivalent of a “vulgar Marxist,” a social theorist who believed in a “one-to-one relationship between the economic and the political or the psychological.”
  • Harrington’s culture-of-poverty thesis was at best an ambiguous impediment to understanding — in later books, he made no use of the term. But in its moral clarity, “The Other America” was ultimately optimistic; it was less an indictment and more an appeal to Americans to live up to their better instincts.
Javier E

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • ...23 more annotations...
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.
Maria Delzi

How Life Began: New Clues | TIME.com - 0 views

  • Astronomers recently announced that there could be an astonishing 20 billion Earthlike planets in the Milky Way
  • How abundant life actually is, however, hinges on one crucial factor: given the right conditions and the right raw materials,
  • what is the mathematical likelihood that life will actually arise? (A rough sketch of how that single probability scales the answer appears after these annotations.)
  • biology would have to be popping up all over the place.
  • Andrew Ellington, of the Center for Systems and Synthetic Biology at the University of Texas, Austin, says, “I can’t tell you what the probability is. It’s a chapter of the story that’s pretty much blank.”
  • Given that rather bleak-sounding assessment, it may be surprising to learn that Ellington is actually pretty upbeat. But that’s how he and two colleagues come across in a paper in the latest Science. The crucial step from nonliving stuff to a live cell is still a mystery, they acknowledge, but the number of pathways a mix of inanimate chemicals could have taken to reach the threshold of the living turns out to be many and varied. “It’s difficult to say exactly how things did occur,” says Ellington. “But there are many ways it could have occurred.”
  • The first stab at answering the question came all the way back in the 1950s, when chemists Stanley Miller and Harold Urey passed an electrical spark through a beaker containing methane, ammonia, water vapor and hydrogen, thought at the time to represent Earth’s primordial atmosphere.
  • Scientists have learned so much, in fact, that the number of places life might have begun has grown to include such disparate locations as the hydrothermal vents at the bottom of the ocean; beds of clay; the billowing clouds of gas emerging from volcanoes; and the spaces in between ice crystals.
  • The number of ideas about how the key step from organic chemicals to living organisms might have been taken has multiplied as well: there’s the “RNA world hypothesis” and the “lipid world hypothesis” and the “iron-sulfur world hypothesis” and more, all of them dependent on a particular set of chemical circumstances and a particular set of dynamics and all highly speculative.
  • “Maybe when they do,” says Ellington, “we’ll all do a face-plant because it turns out to be so obvious in retrospect.” But even if they succeed, it will only prove that a manufactured cell could represent the earliest life forms, not that it actually does. “It will be a story about what we think might have happened, but it will still be a story.”
  • The story Ellington and his colleagues have been able to tell already, however, is a reason for optimism. We still don’t know the odds that life will arise under the right conditions. But the underlying biochemistry is abundantly, ubiquitously available—and it would take an awfully perverse universe to take things so far only to shut them down at the last moment.
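A minimal back-of-the-envelope sketch of why that one factor dominates, using the article’s figure of roughly 20 billion Earthlike planets and treating the per-planet probability of abiogenesis as an unknown p (the symbol and the sample values below are illustrative assumptions, not numbers from the article):

    E[N_{\mathrm{life}}] = N_{\mathrm{planets}} \cdot p \approx (2 \times 10^{10}) \, p

If p were 10^{-3}, the Milky Way should hold tens of millions of living worlds; if p were 10^{-11}, the expected number drops below one. Nothing in current biochemistry pins p down anywhere on that range, which is the blank chapter Ellington describes.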
Javier E

Breathing In vs. Spacing Out - NYTimes.com - 0 views

  • Although pioneers like Jon Kabat-Zinn, now emeritus professor at the University of Massachusetts Medical Center, began teaching mindfulness meditation as a means of reducing stress as far back as the 1970s, all but a dozen or so of the nearly 100 randomized clinical trials have been published since 2005.
  • Michael Posner, of the University of Oregon, and Yi-Yuan Tang, of Texas Tech University, used functional M.R.I.’s before and after participants spent a combined 11 hours over two weeks practicing a form of mindfulness meditation developed by Tang. They found that it enhanced the integrity and efficiency of the brain’s white matter, the tissue that connects and protects neurons emanating from the anterior cingulate cortex, a region of particular importance for rational decision-making and effortful problem-solving.
  • Perhaps that is why mindfulness has proved beneficial to prospective graduate students. In May, the journal Psychological Science published the results of a randomized trial showing that undergraduates instructed to spend a mere 10 minutes a day for two weeks practicing mindfulness made significant improvement on the verbal portion of the Graduate Record Exam — a gain of 16 percentile points. They also significantly increased their working memory capacity, the ability to maintain and manipulate multiple items of attention.
  • By emphasizing a focus on the here and now, it trains the mind to stay on task and avoid distraction.
  • “Your ability to recognize what your mind is engaging with, and control that, is really a core strength,” said Peter Malinowski, a psychologist and neuroscientist at Liverpool John Moores University in England. “For some people who begin mindfulness training, it’s the first time in their life where they realize that a thought or emotion is not their only reality, that they have the ability to stay focused on something else, for instance their breathing, and let that emotion or thought just pass by.”
  • the higher adults scored on a measurement of mindfulness, the worse they performed on tests of implicit learning — the kind that underlies all sorts of acquired skills and habits but that occurs without conscious awareness.
  • he found that having participants spend a brief period of time on an undemanding task that maximizes mind wandering improved their subsequent performance on a test of creativity. In a follow-up study, he reported that physicists and writers alike came up with their most insightful ideas while spacing out.
  • The trick is knowing when mindfulness is called for and when it’s not.
  • one of the most surprising findings of recent mindfulness studies is that it could have unwanted side effects. Raising roadblocks to the mind’s peregrinations could, after all, prevent the very sort of mental vacations that lead to epiphanies.
  • “There’s so much our brain is doing when we’re not aware of it,” said the study’s leader, Chelsea Stillman, a doctoral candidate. “We know that being mindful is really good for a lot of explicit cognitive functions. But it might not be so useful when you want to form new habits.” Learning to ride a bicycle, speak grammatically or interpret the meaning of people’s facial expressions are three examples of knowledge we acquire through implicit learning
Javier E

The Dangers of Certainty: A Lesson From Auschwitz - NYTimes.com - 0 views

  • in 1973, the BBC aired an extraordinary documentary series called “The Ascent of Man,” hosted by one Dr. Jacob Bronowski
  • It was not an account of human biological evolution, but cultural evolution — from the origins of human life in the Rift Valley to the shifts from hunter/gatherer societies to nomadism and then settlement and civilization, from agriculture and metallurgy to the rise and fall of empires: Assyria, Egypt, Rome.
  • The tone of the programs was rigorous yet permissive, playful yet precise, and always urgent, open and exploratory. I remember in particular the programs on the trial of Galileo, Darwin’s hesitancy about publishing his theory of evolution and the dizzying consequences of Einstein’s theory of relativity.
  • For Bronowski, science and art were two neighboring mighty rivers that flowed from a common source: the human imagination.
  • For Dr. Bronowski, there was no absolute knowledge and anyone who claims it — whether a scientist, a politician or a religious believer — opens the door to tragedy. All scientific information is imperfect and we have to treat it with humility. Such, for him, was the human condition.
  • This is the condition for what we can know, but it is also, crucially, a moral lesson. It is the lesson of 20th-century painting from Cubism onwards, but also that of quantum physics. All we can do is to push deeper and deeper into better approximations of an ever-evasive reality
  • Errors are inextricably bound up with pursuit of human knowledge, which requires not just mathematical calculation but insight, interpretation and a personal act of judgment for which we are responsible.
  • Dr. Bronowski insisted that the principle of uncertainty was a misnomer, because it gives the impression that in science (and outside of it) we are always uncertain. But this is wrong. Knowledge is precise, but that precision is confined within a certain toleration of uncertainty.
  • The emphasis on the moral responsibility of knowledge was essential for all of Dr. Bronowski’s work. The acquisition of knowledge entails a responsibility for the integrity of what we are as ethical creatures.
  • Pursuing knowledge means accepting uncertainty. Heisenberg’s principle has the consequence that no physical events can ultimately be described with absolute certainty or with “zero tolerance,” as it were. The more we know, the less certain we are. (The relation itself is written out after these annotations.)
  • Our relations with others also require a principle of tolerance. We encounter other people across a gray area of negotiation and approximation. Such is the business of listening and the back and forth of conversation and social interaction.
  • For Dr. Bronowski, the moral consequence of knowledge is that we must never judge others on the basis of some absolute, God-like conception of certainty. All knowledge, all information that passes between human beings, can be exchanged only within what we might call “a play of tolerance,” whether in science, literature, politics or religion.
  • The play of tolerance opposes the principle of monstrous certainty that is endemic to fascism and, sadly, not just fascism but all the various faces of fundamentalism. When we think we have certainty, when we aspire to the knowledge of the gods, then Auschwitz can happen and can repeat itself.
  • The pursuit of scientific knowledge is as personal an act as lifting a paintbrush or writing a poem, and they are both profoundly human. If the human condition is defined by limitedness, then this is a glorious fact because it is a moral limitedness rooted in a faith in the power of the imagination, our sense of responsibility and our acceptance of our fallibility. We always have to acknowledge that we might be mistaken.
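For reference, the relation Bronowski is drawing on is the standard position–momentum uncertainty inequality, stated here in textbook notation rather than in his own words:

    \Delta x \, \Delta p \geq \frac{\hbar}{2}

However tightly position x is pinned down, the spread in momentum p widens in compensation, so the product of the two tolerances never falls below ħ/2. That fixed, nonzero floor is the quantitative content of what Bronowski preferred to call a principle of tolerance rather than a principle of uncertainty.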
Javier E

NSF Report Flawed; Americans Do Not Believe Astrology is Scientific | NeoAcademic - 0 views

  • The problem with human subjects data – as any psychologist like myself will tell you – is that simply asking someone a question rarely gives you the information that you think it does. When you ask someone to respond to a question, it must pass through a variety of mental filters, and these filters often cause people’s answers to differ from reality. Some of these processes are conscious and others are not
  • Learning, and by extension knowledge, are no different. People don’t always know what they know. And this NSF report is a fantastic example of this in action. The goal of the NSF researchers was to assess, “Do US citizens believe astrology is scientific?” People were troubled that young people now apparently believe astrology is more scientific than in the past. But this interpretation unwisely assumes that people accurately interpret the word astrology. It assumes that they know what astrology is and recognize that they know it in order to respond authentically
  • When I saw the NSF report, I was reminded of my own poor understanding of these terms. “Surely,” I said to myself, “it’s not that Americans believe astrology is scientific. Instead, they must be confusing astronomy with astrology, like I did those many years ago.” Fortunately, I had a very quick way to answer this question: Amazon Mechanical Turk (MTurk).
  • MTurk is a fantastic tool available to quickly collect human subjects data. It pulls from a massive group of people looking to complete small tasks for small amounts of money. So for 5 cents per survey, I collected 100 responses to a short survey from American MTurk Workers. It asked only 3 questions: (1) Please define astrology in 25 words or less. (2) Do you believe astrology to be scientific? (using the same scale as the NSF study) (3) What is your highest level of education completed? (using the same scale as the NSF study)
  • Among those that correctly identified astrology as astrology, only 13.5% found it “pretty scientific” or “very scientific”. Only 1 person said it was “very scientific.” Among those that identified astrology as astronomy, the field was overwhelmingly seen as scientific, exactly as I expected. This is the true driver of the NSF report findings. (A sketch of this kind of tally appears below.)
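A minimal sketch of the tally described above, assuming the survey responses have been exported to a CSV and the free-text definitions hand-coded; the file name, column names, and coding labels are illustrative assumptions, not the author’s actual materials:

import csv
from collections import Counter

# Assumed columns: "definition_code" is a hand-applied label for each
# free-text definition ("astrology", "astronomy", or "other"), and
# "scientific_rating" holds the NSF-style answer ("not at all",
# "pretty", or "very" scientific).
with open("mturk_astrology_responses.csv", newline="") as f:
    rows = list(csv.DictReader(f))

def pct_scientific(group):
    """Share of a group rating astrology 'pretty' or 'very' scientific."""
    if not group:
        return float("nan")
    hits = sum(r["scientific_rating"] in ("pretty", "very") for r in group)
    return 100.0 * hits / len(group)

by_code = Counter(r["definition_code"] for r in rows)
correct = [r for r in rows if r["definition_code"] == "astrology"]
confused = [r for r in rows if r["definition_code"] == "astronomy"]

print("definition codes:", dict(by_code))
print(f"defined astrology correctly: {pct_scientific(correct):.1f}% rate it scientific")
print(f"defined it as astronomy:     {pct_scientific(confused):.1f}% rate it scientific")

The point of the split is the same as in the post: the apparent belief that astrology is scientific is carried almost entirely by respondents who were actually describing astronomy.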