
Home/ TOK Friends/ Group items matching "obsession" in title, tags, annotations or url


What Michelle Obama Wore and Why It Mattered - The New York Times

  • it had just been revealed that the campaign clothes budget for Sarah Palin, the Republican vice-presidential candidate, was $150,000
  • And thus was an eight-year obsession born. Not to mention a new approach to the story of dress and power.
  • it set in motion a strategic rethink about the use of clothes that not only helped define her tenure as first lady, but also started a conversation that went far beyond the label or look that she wore and that is only now, maybe, reaching its end.
  • ...11 more annotations...
  • If you know everyone is going to see what you wear and judge it, then what you wear becomes fraught with meaning.
  • She realized very early on that everything she did had ramifications
  • Just because something appears trivial does not mean it is any less powerful as a means of persuasion and outreach. In some ways its very triviality — the fact that everyone could talk about it, dissect it, imitate it — makes fashion the most potentially viral item in the subliminal political toolbox.
  • as she said to Vogue in her third cover story, the most of any first lady, one of the factors in choosing a garment always has to be, “Is it cute?”
  • she saw it as a way to frame her own independence and points of difference, add to her portfolio and amplify her husband’s agenda.
  • Mrs. Obama seemed to work with them all.
  • We all tend to gravitate toward certain designers in part because of sheer laziness: We know what suits us, what we like, and so we go there first. To have been so, well, evenhanded in her choices could have happened only with careful calculation.
  • Especially because Mrs. Obama not only wore their clothes, she also took their business seriously, framing fashion as a credible, covetable job choice during her education initiatives.
  • If you think that was an accident, there’s a bridge I can sell you — just as the fact she wore Jason Wu to her husband’s farewell address in Chicago, a designer she also wore at both inaugural balls, was no coincidence. It was closure.
  • But above all, her wardrobe was representative of the country her husband wanted to lead.
  • It may be because the point of what Mrs. Obama wore was never simply that it was good to mix up your wardrobe among a group of designers, but rather that clothes were most resonant when they were an expression of commitment to an idea, or an ideal, that had resonance.
  •  
    This article takes a close look at Mrs. Obama's wardrobe, which I found very interesting. Even clothes can represent what a person is thinking about. There is a quote in this article that I really like: "Just because something appears trivial does not mean it is any less powerful as a means of persuasion and outreach." This is especially true in the era of the internet, when you can find literally every detail online. People always like to assign meanings to things they see, even when no meaning was intended. People are also easily influenced by social media: for example, after reading this article, the clothes Mrs. Obama chose suddenly became meaningful to me. The twenty-first century is an era of information, and even the smallest thing, such as clothing, can carry a message.

What Gamergate should have taught us about the 'alt-right' | Technology | The Guardian

  • Gamergate
  • The 2014 hashtag campaign, ostensibly founded to protest about perceived ethical failures in games journalism, clearly thrived on hate – even though many of those who aligned themselves with the movement either denied there was a problem with harassment, or wrote it off as an unfortunate side effect
  • Sure, women, minorities and progressive voices within the industry were suddenly living in fear. Sure, those who spoke out in their defence were quickly silenced through exhausting bursts of online abuse. But that wasn’t why people supported it, right? They were disenfranchised, felt ignored, and wanted to see a systematic change.
  • ...23 more annotations...
  • Is this all sounding rather familiar now? Does it remind you of something?
  • it quickly became clear that the Gamergate movement was a mess – an undefined mission to Make Video Games Great Again via undecided means.
  • After all, the culture war that began in games now has a senior representative in the White House. As a founder member and former executive chair of Breitbart News, Steve Bannon had a hand in creating media monster Milo Yiannopoulos, who built his fame and Twitter following by supporting and cheerleading Gamergate. This hashtag was the canary in the coalmine, and we ignored it.
  • Gamergate was an online movement that effectively began because a man wanted to punish his ex-girlfriend. Its most notable achievement was harassing a large number of progressive figures – mostly women – to the point where they felt unsafe or considered leaving the industry
  • The similarities between Gamergate and the far-right online movement, the “alt-right”, are huge, startling and in no way a coincidence
  • These figures gave Gamergate a new sense of direction – generalising the rhetoric: this was now a wider war between “Social Justice Warriors” (SJWs) and everyday, normal, decent people. Games were simply the tip of the iceberg – progressive values, went the argument, were destroying everything
  • In 2016, new wave conservative media outlets like Breitbart have gained trust with their audience by painting traditional news sources as snooty and aloof. In 2014, video game YouTube stars, seeking to appear in touch with online gaming communities, unscrupulously proclaimed that traditional old-media sources were corrupt. Everything we’re seeing now, had its precedent two years ago.
  • With 2014’s Gamergate, Breitbart seized the opportunity to harness the pre-existing ignorance and anger among disaffected young white dudes. With Trump’s movement in 2016, the outlet was effectively running his campaign: Steve Bannon took leave of his role at the company in August 2016 when he was hired as chief executive of Trump’s presidential campaign
  • young men converted via 2014’s Gamergate, are being more widely courted now. By leveraging distrust and resentment towards women, minorities and progressives, many of Gamergate’s most prominent voices – characters like Mike Cernovich, Adam Baldwin, and Milo Yiannopoulos – drew power and influence from its chaos
  • no one in the movement was willing to be associated with the abuse being carried out in its name. Prominent supporters on Twitter, in subreddits and on forums like 8Chan, developed a range of pernicious rhetorical devices and defences to distance themselves from threats to women and minorities in the industry: the targets were lying or exaggerating, they were too precious; a language of dismissal and belittlement was formed against them. Safe spaces, snowflakes, unicorns, cry bullies. Even when abuse was proven, the usual response was that people on their side were being abused too. These techniques, forged in Gamergate, have become the standard toolset of far-right voices online
  • The majority of people who voted for Trump will never take responsibility for his racist, totalitarian policies, but they’ll provide useful cover and legitimacy for those who demand the very worst from the President Elect. Trump himself may have disavowed the “alt-right”, but his rhetoric has led to them feeling legitimised. As with Gamergate, the press risks being manipulated into a position where it has to tread a respectful middle ground that doesn’t really exist.
  • Using 4chan (and then the more sympathetic offshoot 8Chan) to plan their subversions and attacks made Gamergate a terribly sloppy operation, leaving a trail of evidence that made it quite clear the whole thing was purposefully, plainly nasty. But the video game industry didn’t have the spine to react, and allowed the movement to coagulate – forming a mass of spiteful disappointment that Breitbart was only too happy to coddle
  • Historically, that seems to be Breitbart’s trick - strongly represent a single issue in order to earn trust, and then gradually indoctrinate to suit wider purposes. With Gamergate, they purposefully went fishing for anti-feminists. 2016’s batch of fresh converts – the white extremists – came from enticing conspiracy theories about the global neoliberal elite secretly controlling the world.
  • The greatest strength of Gamergate, though, was that it actually appeared to represent many left-leaning ideals: stamping out corruption in the press, pushing for better ethical practices, battling for openness.
  • There are similarities here with many who support Trump because of his promises to put an end to broken neo-liberalism, to “drain the swamp” of establishment corruption. Many left-leaning supporters of Gamergate sought to intellectualise their alignment with the hashtag, adopting familiar and acceptable labels of dissent – identifying as libertarian, egalitarian, humanist.
  • At best they unknowingly facilitated abuse, defending their own freedom of expression while those who actually needed support were threatened and attacked.
  • Genuine discussions over criticism, identity and censorship were paralysed and waylaid by Twitter voices obsessed with rhetorical fallacies and pedantic debating practices. While the core of these movements make people’s lives hell, the outer shell – knowingly or otherwise – protect abusers by insisting that the real problem is that you don’t want to talk, or won’t provide the ever-shifting evidence they politely require.
  • In 2017, the tactics used to discredit progressive game critics and developers will be used to discredit Trump and Bannon’s critics. There will be gaslighting, there will be attempts to make victims look as though they are losing their grip on reality, to the point that they gradually even start to believe it. The “post-truth” reality is not simply an accident – it is a concerted assault on the rational psyche.
  • The strangest aspect of Gamergate is that it consistently didn’t make any sense: people chose to align with it, and yet refused responsibility. It was constantly demanded that we debate the issues, but explanations and facts were treated with scorn. Attempts to find common ground saw the specifics of the demands being shifted: we want you to listen to us; we want you to change your ways; we want you to close your publication down. This movement that ostensibly wanted to protect free speech from cry bully SJWs simultaneously did what it could to endanger sites it disagreed with, encouraging advertisers to abandon support for media outlets that published stories critical of the hashtag. The petulance of that movement is disturbingly echoed in Trump’s own Twitter feed.
  • Looking back, Gamergate really only made sense in one way: as an exemplar of what Umberto Eco called “eternal fascism”, a form of extremism he believed could flourish at any point, in any place – a fascism that would extol traditional values, rally against diversity and cultural critics, believe in the value of action above thought and encourage a distrust of intellectuals or experts – a fascism built on frustration and machismo. The requirement of this formless fascism would – above all else – be to remain in an endless state of conflict, a fight against a foe who must always be portrayed as impossibly strong and laughably weak
  • 2016 has presented us with a world in which our reality is being wilfully manipulated. Fake news, divisive algorithms, misleading social media campaigns.
  • The same voices moved into other geek communities, especially comics, where Marvel and DC were criticised for progressive storylines and decisions. They moved into science fiction with the controversy over the Hugo awards. They moved into cinema with the revolting kickback against the all-female Ghostbusters reboot.
  • Perhaps the true lesson of Gamergate was that the media is culturally unequipped to deal with the forces actively driving these online movements. The situation was horrifying enough two years ago; it is many times more dangerous now.

Covering politics in a "post-truth" America | Brookings Institution

  • The media scandal of 2016 isn’t so much about what reporters failed to tell the American public; it’s about what they did report on, and the fact that it didn’t seem to matter.
  • Facebook and Snapchat and the other social media sites should rightfully be doing a lot of soul-searching about their role as the most efficient distribution network for conspiracy theories, hatred, and outright falsehoods ever invented.
  • I’ve been obsessively looking back over our coverage, too, trying to figure out what we missed along the way to the upset of the century
  • ...28 more annotations...
  • (An early conclusion: while we were late to understand how angry white voters were, a perhaps even more serious lapse was in failing to recognize how many disaffected Democrats there were who would stay home rather than support their party’s flawed candidate.)
  • Stories that would have killed any other politician—truly worrisome revelations about everything from the federal taxes Trump dodged to the charitable donations he lied about, the women he insulted and allegedly assaulted, and the mob ties that have long dogged him—did not stop Trump from thriving in this election year
  • the Oxford Dictionaries announced that “post-truth” had been chosen as the 2016 word of the year, defining it as a condition “in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.”
  • Meantime, Trump personally blacklisted news organizations like Politico and The Washington Post when they published articles he didn’t like during the campaign, has openly mused about rolling back press freedoms enshrined by the U.S. Supreme Court, and has now named Stephen Bannon, until recently the executive chairman of Breitbart—a right-wing fringe website with a penchant for conspiracy theories and anti-Semitic tropes—to serve as one of his top White House advisers.
  • none of this has any modern precedent. And what makes it unique has nothing to do with the outcome of the election. This time, the victor was a right-wing demagogue; next time, it may be a left-wing populist who learns the lessons of Trump’s win.
  • This is no mere academic argument. The election of 2016 showed us that Americans are increasingly choosing to live in a cloud of like-minded spin, surrounded by the partisan political hackery and fake news that poisons their Facebook feeds.
  • To help us understand it all, there were choices, but not that many: three TV networks that mattered, ABC, CBS, and NBC; two papers for serious journalism, The New York Times and The Washington Post; and two giant-circulation weekly newsmagazines, Time and Newsweek. That, plus whatever was your local daily newspaper, pretty much constituted the news.
  • Fake news is thriving. In the final three months of the presidential campaign, the 20 top-performing fake election news stories generated more engagement on Facebook than the top stories from major news outlets such as The New York Times.
  • Eventually, I came to think of the major media outlets of that era as something very similar to the big suburban shopping malls we flocked to in the age of shoulder pads and supply-side economics: We could choose among Kmart and Macy’s and Saks Fifth Avenue as our budgets and tastes allowed, but in the end the media were all essentially department stores, selling us sports and stock tables and foreign news alongside our politics, whether we wanted them or not. It may not have been a monopoly, but it was something pretty close.
  • This was still journalism in the scarcity era, and it affected everything from what stories we wrote to how fast we could produce them. Presidents could launch global thermonuclear war with the Russians in a matter of minutes, but news from the American hinterlands often took weeks to reach their sleepy capital. Even information within that capital was virtually unobtainable without a major investment of time and effort. Want to know how much a campaign was raising and spending from the new special-interest PACs that had proliferated? Prepare to spend a day holed up at the Federal Election Commission’s headquarters down on E Street across from the hulking concrete FBI building, and be sure to bring a bunch of quarters for the copy machine.
  • I am writing this in the immediate, shocking aftermath of a 2016 presidential election in which the Pew Research Center found that a higher percentage of Americans got their information about the campaign from late-night TV comedy shows than from a national newspaper. Don Graham sold the Post three years ago and though its online audience has been skyrocketing with new investments from Amazon.com founder Jeff Bezos, it will never be what it was in the ‘80s. That same Pew survey reported that a mere 2 percent of Americans today turned to such newspapers as the “most helpful” guides to the presidential campaign.
  • In 2013, Mark Leibovich wrote a bestselling book called This Town about the party-hopping, lobbyist-enabling nexus between Washington journalists and the political world they cover. A key character was Politico’s Mike Allen, whose morning email newsletter “Playbook” had become a Washington ritual, offering all the news and tidbits a power player might want to read before breakfast—and Politico’s most successful ad franchise to boot. In many ways, even that world of just a few years ago now seems quaint: the notion that anyone could be a single, once-a-day town crier in This Town (or any other) has been utterly exploded by the move to Twitter, Facebook, and all the rest. We are living, as Mark put it to me recently, “in a 24-hour scrolling version of what ‘Playbook’ was.”
  • Whether it was Walter Cronkite or The New York Times, they preached journalistic “objectivity” and spoke with authority when they pronounced on the day’s developments—but not always with the depth and expertise that real competition or deep specialization might have provided. They were great—but they were generalists.
  • I remained convinced that reporting would hold its value, especially as our other advantages—like access to information and the expensive means to distribute it—dwindled. It was all well and good to root for your political team, but when it mattered to your business (or the country, for that matter), I reasoned, you wouldn’t want cheerleading but real reporting about real facts. Besides, the new tools might be coming at us with dizzying speed—remember when that radical new video app Meerkat was going to change absolutely everything about how we cover elections?—but we would still need reporters to find a way inside Washington’s closed doors and back rooms, to figure out what was happening when the cameras weren’t rolling.
  • And if the world was suffering from information overload—well, so much the better for us editors; we would be all the more needed to figure out what to listen to amid the noise.
  • Trump turned out to be more correct than we editors were: the more relevant point of the Access Hollywood tape was not about the censure Trump would now face but the political reality that he, like Bill Clinton, could survive this—or perhaps any scandal. Yes, we were wrong about the Access Hollywood tape, and so much else.
  • These days, Politico has a newsroom of 200-odd journalists, a glossy award-winning magazine, dozens of daily email newsletters, and 16 subscription policy verticals. It’s a major player in coverage not only of Capitol Hill but many other key parts of the capital, and some months during this election year we had well over 30 million unique visitors to our website, a far cry from the controlled congressional circulation of 35,000 that I remember Roll Call touting in our long-ago sales materials.
  • we journalists were still able to cover the public theater of politics while spending more of our time, resources, and mental energy on really original reporting, on digging up stories you couldn’t read anywhere else. Between Trump’s long and checkered business past, his habit of serial lying, his voluminous and contradictory tweets, and his revision of even his own biography, there was lots to work with. No one can say that Trump was elected without the press telling us all about his checkered past.
  • politics was NEVER more choose-your-own-adventure than in 2016, when entire news ecosystems for partisans existed wholly outside the reach of those who at least aim for truth
  • Pew found that nearly 50 percent of self-described conservatives now rely on a single news source, Fox, for political information they trust.
  • As for the liberals, they trust only that they should never watch Fox, and have MSNBC and Media Matters and the remnants of the big boys to confirm their biases.
  • And then there are the conspiracy-peddling Breitbarts and the overtly fake-news outlets of this overwhelming new world; untethered from even the pretense of fact-based reporting, their version of the campaign got more traffic on Facebook in the race’s final weeks than all the traditional news outlets combined.
  • When we assigned a team of reporters at Politico during the primary season to listen to every single word of Trump’s speeches, we found that he offered a lie, half-truth, or outright exaggeration approximately once every five minutes—for an entire week. And it didn’t hinder him in the least from winning the Republican presidential nomination.
  • when we repeated the exercise this fall, in the midst of the general election campaign, Trump had progressed to fibs of various magnitudes just about once every three minutes!
  • By the time Trump in September issued his half-hearted disavowal of the Obama “birther” whopper he had done so much to create and perpetuate, one national survey found that only 1 in 4 Republicans was sure that Obama was born in the U.S., and various polls found that somewhere between a quarter and a half of Republicans believed he’s Muslim. So not only did Trump think he was entitled to his own facts, so did his supporters. It didn’t stop them at all from voting for him.
  • in part, it’s not just because they disagree with the facts as reporters have presented them but because there’s so damn many reporters, and from such a wide array of outlets, that it’s often impossible to evaluate their standards and practices, biases and preconceptions. Even we journalists are increasingly overwhelmed.
  • So much terrific reporting and writing and digging over the years and … Trump? What happened to consequences? Reporting that matters? Sunlight, they used to tell us, was the best disinfectant for what ails our politics.
  • 2016 suggests a different outcome: We’ve achieved a lot more transparency in today’s Washington—without the accountability that was supposed to come with it.

The decline effect and the scientific method : The New Yorker

  • The test of replicability, as it’s known, is the foundation of modern research. Replicability is how the community enforces itself. It’s a safeguard for the creep of subjectivity. Most of the time, scientists know what results they want, and that can influence the results they get. The premise of replicability is that the scientific community can correct for these flaws.
  • But now all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable.
  • This phenomenon doesn’t yet have an official name, but it’s occurring across a wide range of fields, from psychology to ecology.
  • ...39 more annotations...
  • If replication is what separates the rigor of science from the squishiness of pseudoscience, where do we put all these rigorously validated findings that can no longer be proved? Which results should we believe?
  • Schooler demonstrated that subjects shown a face and asked to describe it were much less likely to recognize the face when shown it later than those who had simply looked at it. Schooler called the phenomenon “verbal overshadowing.”
  • The most likely explanation for the decline is an obvious one: regression to the mean. As the experiment is repeated, that is, an early statistical fluke gets cancelled out. The extrasensory powers of Schooler’s subjects didn’t decline—they were simply an illusion that vanished over time.
  • yet Schooler has noticed that many of the data sets that end up declining seem statistically solid—that is, they contain enough data that any regression to the mean shouldn’t be dramatic. “These are the results that pass all the tests,” he says. “The odds of them being random are typically quite remote, like one in a million. This means that the decline effect should almost never happen. But it happens all the time!
  • this is why Schooler believes that the decline effect deserves more attention: its ubiquity seems to violate the laws of statistics
  • In 2001, Michael Jennions, a biologist at the Australian National University, set out to analyze “temporal trends” across a wide range of subjects in ecology and evolutionary biology. He looked at hundreds of papers and forty-four meta-analyses (that is, statistical syntheses of related studies), and discovered a consistent decline effect over time, as many of the theories seemed to fade into irrelevance.
  • Jennions admits that his findings are troubling, but expresses a reluctance to talk about them
  • publicly. “This is a very sensitive issue for scientists,” he says. “You know, we’re supposed to be dealing with hard facts, the stuff that’s supposed to stand the test of time. But when you see these trends you become a little more skeptical of things.”
  • While publication bias almost certainly plays a role in the decline effect, it remains an incomplete explanation. For one thing, it fails to account for the initial prevalence of positive results among studies that never even get submitted to journals. It also fails to explain the experience of people like Schooler, who have been unable to replicate their initial data despite their best efforts.
  • Jennions, similarly, argues that the decline effect is largely a product of publication bias, or the tendency of scientists and scientific journals to prefer positive data over null results, which is what happens when no effect is found. The bias was first identified by the statistician Theodore Sterling, in 1959, after he noticed that ninety-seven per cent of all published psychological studies with statistically significant data found the effect they were looking for
  • Sterling saw that if ninety-seven per cent of psychology studies were proving their hypotheses, either psychologists were extraordinarily lucky or they published only the outcomes of successful experiments.
  • One of his most cited papers has a deliberately provocative title: “Why Most Published Research Findings Are False.”
  • Palmer suspects that an equally significant issue is the selective reporting of results—the data that scientists choose to document in the first place. Palmer’s most convincing evidence relies on a statistical tool known as a funnel graph. When a large number of studies have been done on a single subject, the data should follow a pattern: studies with a large sample size should all cluster around a common value—the true result—whereas those with a smaller sample size should exhibit a random scattering, since they’re subject to greater sampling error. This pattern gives the graph its name, since the distribution resembles a funnel.
  • after Palmer plotted every study of fluctuating asymmetry, he noticed that the distribution of results with smaller sample sizes wasn’t random at all but instead skewed heavily toward positive results. Palmer has since documented a similar problem in several other contested subject areas. “Once I realized that selective reporting is everywhere in science, I got quite depressed,” Palmer told me. “As a researcher, you’re always aware that there might be some nonrandom patterns, but I had no idea how widespread it is.”
  • Palmer summarized the impact of selective reporting on his field: “We cannot escape the troubling conclusion that some—perhaps many—cherished generalities are at best exaggerated in their biological significance and at worst a collective illusion nurtured by strong a-priori beliefs often repeated.”
  • Palmer emphasizes that selective reporting is not the same as scientific fraud. Rather, the problem seems to be one of subtle omissions and unconscious misperceptions, as researchers struggle to make sense of their results. Stephen Jay Gould referred to this as the “shoehorning” process.
  • “A lot of scientific measurement is really hard,” Simmons told me. “If you’re talking about fluctuating asymmetry, then it’s a matter of minuscule differences between the right and left sides of an animal. It’s millimetres of a tail feather. And so maybe a researcher knows that he’s measuring a good male”—an animal that has successfully mated—“and he knows that it’s supposed to be symmetrical. Well, that act of measurement is going to be vulnerable to all sorts of perception biases. That’s not a cynical statement. That’s just the way human beings work.”
  • For Simmons, the steep rise and slow fall of fluctuating asymmetry is a clear example of a scientific paradigm, one of those intellectual fads that both guide and constrain research: after a new paradigm is proposed, the peer-review process is tilted toward positive results. But then, after a few years, the academic incentives shift—the paradigm has become entrenched—so that the most notable results are now those that disprove the theory.
  • John Ioannidis, an epidemiologist at Stanford University, argues that such distortions are a serious issue in biomedical research. “These exaggerations are why the decline has become so common,” he says. “It’d be really great if the initial studies gave us an accurate summary of things. But they don’t. And so what happens is we waste a lot of money treating millions of patients and doing lots of follow-up studies on other themes based on results that are misleading.”
  • In 2005, Ioannidis published an article in the Journal of the American Medical Association that looked at the forty-nine most cited clinical-research studies in three major medical journals.
  • the data Ioannidis found were disturbing: of the thirty-four claims that had been subject to replication, forty-one per cent had either been directly contradicted or had their effect sizes significantly downgraded.
  • the most troubling fact emerged when he looked at the test of replication: out of four hundred and thirty-two claims, only a single one was consistently replicable. “This doesn’t mean that none of these claims will turn out to be true,” he says. “But, given that most of them were done badly, I wouldn’t hold my breath.”
  • According to Ioannidis, the main problem is that too many researchers engage in what he calls “significance chasing,” or finding ways to interpret the data so that it passes the statistical test of significance—the ninety-five-per-cent boundary invented by Ronald Fisher.
  • One of the classic examples of selective reporting concerns the testing of acupuncture in different countries. While acupuncture is widely accepted as a medical treatment in various Asian countries, its use is much more contested in the West. These cultural differences have profoundly influenced the results of clinical trials.
  • The problem of selective reporting is rooted in a fundamental cognitive flaw, which is that we like proving ourselves right and hate being wrong.
  • “It feels good to validate a hypothesis,” Ioannidis said. “It feels even better when you’ve got a financial interest in the idea or your career depends upon it. And that’s why, even after a claim has been systematically disproven”—he cites, for instance, the early work on hormone replacement therapy, or claims involving various vitamins—“you still see some stubborn researchers citing the first few studies
  • That’s why Schooler argues that scientists need to become more rigorous about data collection before they publish. “We’re wasting too much time chasing after bad studies and underpowered experiments,”
  • The current “obsession” with replicability distracts from the real problem, which is faulty design.
  • “Every researcher should have to spell out, in advance, how many subjects they’re going to use, and what exactly they’re testing, and what constitutes a sufficient level of proof. We have the tools to be much more transparent about our experiments.”
  • Schooler recommends the establishment of an open-source database, in which researchers are required to outline their planned investigations and document all their results. “I think this would provide a huge increase in access to scientific work and give us a much better way to judge the quality of an experiment,”
  • scientific research will always be shadowed by a force that can’t be curbed, only contained: sheer randomness. Although little research has been done on the experimental dangers of chance and happenstance, the research that exists isn’t encouraging.
  • The disturbing implication of the Crabbe study is that a lot of extraordinary scientific data are nothing but noise. The hyperactivity of those coked-up Edmonton mice wasn’t an interesting new fact—it was a meaningless outlier, a by-product of invisible variables we don’t understand.
  • The problem, of course, is that such dramatic findings are also the most likely to get published in prestigious journals, since the data are both statistically significant and entirely unexpected
  • This suggests that the decline effect is actually a decline of illusion. While Karl Popper imagined falsification occurring with a single, definitive experiment—Galileo refuted Aristotelian mechanics in an afternoon—the process turns out to be much messier than that.
  • Many scientific theories continue to be considered true even after failing numerous experimental tests.
  • Even the law of gravity hasn’t always been perfect at predicting real-world phenomena. (In one test, physicists measuring gravity by means of deep boreholes in the Nevada desert found a two-and-a-half-per-cent discrepancy between the theoretical predictions and the actual data.)
  • Such anomalies demonstrate the slipperiness of empiricism. Although many scientific ideas generate conflicting results and suffer from falling effect sizes, they continue to get cited in the textbooks and drive standard medical practice. Why? Because these ideas seem true. Because they make sense. Because we can’t bear to let them go. And this is why the decline effect is so troubling. Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren’t surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.)
  • The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe. ♦
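The mechanism these annotations describe, chance-driven results crossing Fisher's ninety-five-per-cent boundary and then shrinking on replication, can be sketched in a toy simulation. This is a hypothetical illustration, not from the article; the true effect size, sample size, and number of studies are invented:

```python
import random
import statistics

random.seed(0)

TRUE_EFFECT = 0.1   # small real effect, in standard-deviation units
N = 30              # subjects per study
THRESHOLD = 1.96    # Fisher-style ninety-five-per-cent boundary, z-scale

def run_study():
    """Simulate one study: the observed effect is the true effect
    plus sampling noise (standard error ~ 1/sqrt(N))."""
    se = 1 / N ** 0.5
    observed = random.gauss(TRUE_EFFECT, se)
    significant = observed / se > THRESHOLD
    return observed, significant

# "Publication": only the significant studies make it into print.
published = [obs for obs, sig in (run_study() for _ in range(100_000)) if sig]

# Every published estimate had to clear the significance bar, so the
# published literature systematically overstates the effect; honest
# replications average back toward TRUE_EFFECT -- a built-in "decline."
print(round(statistics.mean(published), 3))  # well above the true 0.1
```

With these numbers only a small fraction of studies clear the bar, yet those are exactly the ones a journal would print, which is the selective-reporting problem Ioannidis describes.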

The Man Who Would Teach Machines to Think - James Somers - The Atlantic - 1 views

  • Douglas Hofstadter, the Pulitzer Prize–winning author of Gödel, Escher, Bach, thinks we've lost sight of what artificial intelligence really means. His stubborn quest to replicate the human mind.
  • “If somebody meant by artificial intelligence the attempt to understand the mind, or to create something human-like, they might say—maybe they wouldn’t go this far—but they might say this is some of the only good work that’s ever been done
  • Their operating premise is simple: the mind is a very unusual piece of software, and the best way to understand how a piece of software works is to write it yourself.
  • “It depends on what you mean by artificial intelligence.”
  • Computers are flexible enough to model the strange evolved convolutions of our thought, and yet responsive only to precise instructions. So if the endeavor succeeds, it will be a double victory: we will finally come to know the exact mechanics of our selves—and we’ll have made intelligent machines.
  • Ever since he was about 14, when he found out that his youngest sister, Molly, couldn’t understand language, because she “had something deeply wrong with her brain” (her neurological condition probably dated from birth, and was never diagnosed), he had been quietly obsessed by the relation of mind to matter.
  • How could consciousness be physical? How could a few pounds of gray gelatin give rise to our very thoughts and selves?
  • Consciousness, Hofstadter wanted to say, emerged via just the same kind of “level-crossing feedback loop.”
  • In 1931, the Austrian-born logician Kurt Gödel had famously shown how a mathematical system could make statements not just about numbers but about the system itself.
  • But then AI changed, and Hofstadter didn’t change with it, and for that he all but disappeared.
  • By the early 1980s, the pressure was great enough that AI, which had begun as an endeavor to answer yes to Alan Turing’s famous question, “Can machines think?,” started to mature—or mutate, depending on your point of view—into a subfield of software engineering, driven by applications.
  • Take Deep Blue, the IBM supercomputer that bested the chess grandmaster Garry Kasparov. Deep Blue won by brute force.
  • Hofstadter wanted to ask: Why conquer a task if there’s no insight to be had from the victory? “Okay,” he says, “Deep Blue plays very good chess—so what? Does that tell you something about how we play chess? No. Does it tell you about how Kasparov envisions, understands a chessboard?”
  • AI started working when it ditched humans as a model, because it ditched them. That’s the thrust of the analogy: Airplanes don’t flap their wings; why should computers think?
  • It’s a compelling point. But it loses some bite when you consider what we want: a Google that knows, in the way a human would know, what you really mean when you search for something
  • “Cognition is recognition,” he likes to say. He describes “seeing as” as the essential cognitive act: you see some lines as “an A,” you see a hunk of wood as “a table,” you see a meeting as “an emperor-has-no-clothes situation” and a friend’s pouting as “sour grapes”
  • That’s what it means to understand. But how does understanding work?
  • How do you make a search engine that understands if you don’t know how you understand?
  • analogy is “the fuel and fire of thinking,” the bread and butter of our daily mental lives.
  • there’s an analogy, a mental leap so stunningly complex that it’s a computational miracle: somehow your brain is able to strip any remark of the irrelevant surface details and extract its gist, its “skeletal essence,” and retrieve, from your own repertoire of ideas and experiences, the story or remark that best relates.
  • in Hofstadter’s telling, the story goes like this: when everybody else in AI started building products, he and his team, as his friend, the philosopher Daniel Dennett, wrote, “patiently, systematically, brilliantly,” way out of the light of day, chipped away at the real problem. “Very few people are interested in how human intelligence works,”
  • For more than 30 years, Hofstadter has worked as a professor at Indiana University at Bloomington
  • The quick unconscious chaos of a mind can be slowed down on the computer, or rewound, paused, even edited
  • Consider a project out of IBM called Candide. The idea behind Candide, a machine-translation system, was to start by admitting that the rules-based approach requires too deep an understanding of how language is produced; how semantics, syntax, and morphology work; and how words commingle in sentences and combine into paragraphs—to say nothing of understanding the ideas for which those words are merely conduits.
  • Hofstadter directs the Fluid Analogies Research Group, affectionately known as FARG.
  • Parts of a program can be selectively isolated to see how it functions without them; parameters can be changed to see how performance improves or degrades. When the computer surprises you—whether by being especially creative or especially dim-witted—you can see exactly why.
  • When you read Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, which describes in detail this architecture and the logic and mechanics of the programs that use it, you wonder whether maybe Hofstadter got famous for the wrong book.
  • But very few people, even admirers of GEB, know about the book or the programs it describes. And maybe that’s because FARG’s programs are almost ostentatiously impractical. Because they operate in tiny, seemingly childish “microdomains.” Because there is no task they perform better than a human.
  • “The entire effort of artificial intelligence is essentially a fight against computers’ rigidity.”
  • “Nobody is a very reliable guide concerning activities in their mind that are, by definition, subconscious,” he once wrote. “This is what makes vast collections of errors so important. In an isolated error, the mechanisms involved yield only slight traces of themselves; however, in a large collection, vast numbers of such slight traces exist, collectively adding up to strong evidence for (and against) particular mechanisms.
  • So IBM threw that approach out the window. What the developers did instead was brilliant, but so straightforward,
  • The technique is called “machine learning.” The goal is to make a device that takes an English sentence as input and spits out a French sentence
  • What you do is feed the machine English sentences whose French translations you already know. (Candide, for example, used 2.2 million pairs of sentences, mostly from the bilingual proceedings of Canadian parliamentary debates.)
  • By repeating this process with millions of pairs of sentences, you will gradually calibrate your machine, to the point where you’ll be able to enter a sentence whose translation you don’t know and get a reasonable result
  • The Google Translate team can be made up of people who don’t speak most of the languages their application translates. “It’s a bang-for-your-buck argument,” Estelle says. “You probably want to hire more engineers instead” of native speakers.
  • But the need to serve 1 billion customers has a way of forcing the company to trade understanding for expediency. You don’t have to push Google Translate very far to see the compromises its developers have made for coverage, and speed, and ease of engineering. Although Google Translate captures, in its way, the products of human intelligence, it isn’t intelligent itself.
  • “Did we sit down when we built Watson and try to model human cognition?” Dave Ferrucci, who led the Watson team at IBM, pauses for emphasis. “Absolutely not. We just tried to create a machine that could win at Jeopardy.”
  • For Ferrucci, the definition of intelligence is simple: it’s what a program can do. Deep Blue was intelligent because it could beat Garry Kasparov at chess. Watson was intelligent because it could beat Ken Jennings at Jeopardy.
  • “There’s a limited number of things you can do as an individual, and I think when you dedicate your life to something, you’ve got to ask yourself the question: To what end? And I think at some point I asked myself that question, and what it came out to was, I’m fascinated by how the human mind works, it would be fantastic to understand cognition, I love to read books on it, I love to get a grip on it”—he called Hofstadter’s work inspiring—“but where am I going to go with it? Really what I want to do is build computer systems that do something.
  • Peter Norvig, one of Google’s directors of research, echoes Ferrucci almost exactly. “I thought he was tackling a really hard problem,” he told me about Hofstadter’s work. “And I guess I wanted to do an easier problem.”
  • Of course, the folly of being above the fray is that you’re also not a part of it
  • As our machines get faster and ingest more data, we allow ourselves to be dumber. Instead of wrestling with our hardest problems in earnest, we can just plug in billions of examples of them.
  • Hofstadter hasn’t been to an artificial-intelligence conference in 30 years. “There’s no communication between me and these people,” he says of his AI peers. “None. Zero. I don’t want to talk to colleagues that I find very, very intransigent and hard to convince of anything
  • Everything from plate tectonics to evolution—all those ideas, someone had to fight for them, because people didn’t agree with those ideas.
  • Academia is not an environment where you just sit in your bath and have ideas and expect everyone to run around getting excited. It’s possible that in 50 years’ time we’ll say, ‘We really should have listened more to Doug Hofstadter.’ But it’s incumbent on every scientist to at least think about what is needed to get people to understand the ideas.”
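The Candide-style approach excerpted above, learning translations purely from aligned sentence pairs, can be shown in miniature. This is a hypothetical sketch: the three-sentence corpus and the Dice-association scoring are invented for illustration, whereas Candide used millions of real sentence pairs and far richer statistical models:

```python
from collections import Counter, defaultdict

# A toy parallel corpus (invented; Candide used ~2.2 million pairs
# from Canadian parliamentary debates).
pairs = [
    ("the house", "la maison"),
    ("the blue house", "la maison bleue"),
    ("the flower", "la fleur"),
]

cooc = defaultdict(Counter)         # English word -> co-occurring French words
en_total, fr_total = Counter(), Counter()
for en, fr in pairs:
    en_words, fr_words = set(en.split()), set(fr.split())
    en_total.update(en_words)
    fr_total.update(fr_words)
    for e in en_words:
        for f in fr_words:
            cooc[e][f] += 1

def translate_word(e):
    """Guess e's translation as the French word with the highest Dice
    association: 2 * cooc / (freq_en + freq_fr)."""
    return max(cooc[e],
               key=lambda f: 2 * cooc[e][f] / (en_total[e] + fr_total[f]))

print(translate_word("house"))   # maison
print(translate_word("flower"))  # fleur
print(translate_word("blue"))    # bleue
```

Nothing here "understands" French; the alignments fall out of co-occurrence statistics alone, which is precisely the trade of understanding for expediency that the article describes.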

'Naked Statistics' by Charles Wheelan - Review - NYTimes.com - 2 views

  • Whether you are healthy, moribund or traversing the stages of decrepitude in between, every morsel of medical advice you receive is pure conjecture — educated guesswork perhaps, but guesswork nonetheless. Your health care provider and your favorite columnist are both mere croupiers, enablers for your health gambling habit.
  • Staying well is all about probability and risk. So is the interpretation of medical tests, and so are all treatments for all illnesses, dire and trivial alike. Health has nothing in common with the laws of physics and everything in common with lottery cards, mutual funds and tomorrow’s weather forecast.
  • Are you impressed with studies showing that people who take Vitamin X or perform Exercise Y live longer? Remember, correlation does not imply causation. Do you obsess over studies claiming to show that various dietary patterns cause cancer? In fact, Mr. Wheelan points out, this kind of research examines not so much how diet affects the likelihood of cancer as how getting cancer affects people’s memory of what they used to eat.
  • the rest comes from his multiple real world examples illustrating exactly why even the most reluctant mathophobe is well advised to achieve a personal understanding of the statistical underpinnings of life, whether that individual is watching football on the couch, picking a school for the children or jiggling anxiously in a hospital admitting office.
  • And while we’re talking about bias, let’s not forget publication bias: studies that show a drug works get published, but those showing a drug does nothing tend to disappear.
  • The same trade-off applies to the interpretation of medical tests. Unproven disease screens are likely to do little but feed lots of costly, anxiety-producing garbage into your medical record.
  •  
    An interesting article/review of a book that compares statistics and human health. Interestingly enough, it shows that statistics and studies about health are often taken to be true and misinterpreted because we want them to be true, and we want to believe that some minor change in our lifestyles may somehow prevent us from getting cancer, for example. More info about the book from the publisher: http://books.wwnorton.com/books/detail.aspx?ID=24713
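The review's warning that correlation does not imply causation (the Vitamin X studies) can be made concrete with a toy simulation. This is a hypothetical sketch; the confounder and all numbers are invented:

```python
import random

random.seed(1)

# Hypothetical confounder: "health-consciousness" drives both
# vitamin-taking and longevity; the vitamin itself does nothing.
people = []
for _ in range(10_000):
    health_conscious = random.random()
    takes_vitamin = health_conscious > 0.5                       # caused by confounder
    lifespan = 70 + 10 * health_conscious + random.gauss(0, 2)   # also caused by it
    people.append((takes_vitamin, lifespan))

takers = [life for takes, life in people if takes]
others = [life for takes, life in people if not takes]

avg = lambda xs: sum(xs) / len(xs)
# Vitamin-takers live several years longer on average -- a strong
# correlation produced entirely by the confounder, with zero causal effect.
print(round(avg(takers) - avg(others), 1))  # roughly 5 years
```

An observational study of this population would find a large, statistically solid "benefit" of the vitamin, even though, by construction, it has none.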

Think Less, Think Better - The New York Times - 1 views

  • the capacity for original and creative thinking is markedly stymied by stray thoughts, obsessive ruminations and other forms of “mental load.”
  • Many psychologists assume that the mind, left to its own devices, is inclined to follow a well-worn path of familiar associations. But our findings suggest that innovative thinking, not routine ideation, is our default cognitive mode when our minds are clear.
  • We found that a high mental load consistently diminished the originality and creativity of the response: Participants with seven digits to recall resorted to the most statistically common responses (e.g., white/black), whereas participants with two digits gave less typical, more varied pairings (e.g., white/cloud).
  • In another experiment, we found that longer response times were correlated with less diverse responses, ruling out the possibility that participants with low mental loads simply took more time to generate an interesting response.
  • it seems that with a high mental load, you need more time to generate even a conventional thought. These experiments suggest that the mind’s natural tendency is to explore and to favor novelty, but when occupied it looks for the most familiar and inevitably least interesting solution.
  • In general, there is a tension in our brains between exploration and exploitation. When we are exploratory, we attend to things with a wide scope, curious and desiring to learn. Other times, we rely on, or “exploit,” what we already know, leaning on our expectations, trusting the comfort of a predictable environment
  • Much of our lives are spent somewhere between those extremes. There are functional benefits to both modes: If we were not exploratory, we would never have ventured out of the caves; if we did not exploit the certainty of the familiar, we would have taken too many risks and gone extinct. But there needs to be a healthy balance
  • All these loads can consume mental capacity, leading to dull thought and anhedonia — a flattened ability to experience pleasure.
  • ancient meditative practice helps free the mind to have richer experiences of the present
  • your life leaves too much room for your mind to wander. As a result, only a small fraction of your mental capacity remains engaged in what is before it, and mind-wandering and ruminations become a tax on the quality of your life
  • Honing an ability to unburden the load on your mind, be it through meditation or some other practice, can bring with it a wonderfully magnified experience of the world — and, as our study suggests, of your own mind.

China: A Modern Babel - WSJ - 0 views

  • The oft-repeated claim that we must all learn Mandarin Chinese, the better to trade with our future masters, is one that readers of David Moser’s “A Billion Voices” will rapidly end up re-evaluating.
  • In fact, many Chinese don’t speak it: Even Chinese authorities quietly admit that only about 70% of the population speaks Mandarin, and merely one in 10 of those speak it fluently.
  • Mr. Moser presents a history of what is more properly called Putonghua, or “common speech,” along with a clear, concise and often amusing introduction to the limits of its spoken and written forms.
  • what Chinese schoolchildren are encouraged to think of as the longstanding natural speech of the common people is in fact an artificial hybrid, only a few decades old, although it shares a name—Mandarin—with the language of administration from imperial times. It’s a designed-by-committee camel of a language that has largely lost track of its past.
  • The idea of a national Chinese language began with the realization by the accidentally successful revolutionaries of 1911 that retaining control over a country speaking multiple languages and myriad dialects would necessitate reform. Long-term unification and the introduction of mass education would require a common language.
  • Whatever the province they originated from, the administrators of the now-toppled Great Qing Empire had all learned to communicate with one another in a second common language—Guanhua, China’s equivalent, in practical terms, of medieval Latin
  • To understand this highly compressed idiom required a considerable knowledge of the Chinese classics. Early Jesuit missionaries had labeled it Mandarin,
  • The committee decided that the four-tone dialect of the capital would be the base for a new national language but added a fifth tone whose use had lapsed in the north but not in southern dialects. The result was a language that no one actually spoke.
  • After the Communist victory of 1949, the process began all over again with fresh conferences, leading finally to the decision to use Beijing sounds, northern dialects and modern literature in the vernacular (of which there was very little) as a source of grammar.
  • This new spoken form is what is now loosely labeled Mandarin, still as alien to most Chinese as all the other Chinese languages.
  • A Latin alphabet system called Pinyin was introduced to help children learn to pronounce Chinese characters, but today it is usually abandoned after the first few years of elementary school.
  • The view that Mandarin is too difficult for mere foreigners to learn is essential to Chinese amour propre. But it is belied by the number of foreign high-school students who now learn the language by using Pinyin as a key to pronunciation—and who bask in the admiration they receive as a result.
  • Since 1949, the Chinese government, obsessed with promoting the image of a nation completely united in its love of the Communist Party, has decided that the Chinese people speak not several different languages but the same one in a variety of dialects. To say otherwise is to suggest, dangerously, that China is not one nation
  • Yet on Oct. 1, 1949, Mao Zedong announced the founding of the People’s Republic in a Hunan accent so thick that members of his audience subsequently differed about what he had said. He never mastered the Beijing sounds on which Putonghua is based, nor did Sichuanese-speaking Deng Xiaoping or most of his successors.
  • When Xi Jinping took power in 2012, many online commentators rejoiced. “At last! A Chinese leader who can speak Putonghua!” One leader down, only 400 million more common people to go.

The Right Way to Say 'I'm Sorry' - The New York Times - 1 views

  • Most people say “I’m sorry” many times a day for a host of trivial affronts – accidentally bumping into someone or failing to hold open a door. These apologies are easy and usually readily accepted, often with a response like, “No problem.”
  • But when “I’m sorry” are the words needed to right truly hurtful words, acts or inaction, they can be the hardest ones to utter.
  • apology can be powerful medicine with surprising value for the giver as well as the recipient.
  • Expecting nothing in return, I was greatly relieved when my doorbell rang and the neighbor thanked me warmly for what I had said and done.
  • as an excuse for hurtful behavior.
  • She disputes popular thinking that failing to forgive is bad for one’s health and can lead to a life mired in bitterness and hate.
  • Offering an apology is an admission of guilt that admittedly leaves people vulnerable. There’s no guarantee as to how it will be received.
  • apologies followed by rationalizations are “never satisfying” and can even be harmful.
  • ‘I’m sorry’ are the two most healing words in the English language,” she said. “The courage to apologize wisely and well is not just a gift to the injured person, who can then feel soothed and released from obsessive recriminations, bitterness and corrosive anger.
  •  
    We say “sorry” freely when it is unnecessary, performing politeness because we know others will respond with a “No problem.” However, when a “sorry” is truly essential, we become reluctant to say it, since the recipient may well reject the apology. Apologizing for our wrong behavior can release us from guilt and bitterness. An apology should not be a plea for forgiveness; it is a communication between two people and a form of self-reflection. --Sissi (1/31/2017)

What Machines Can't Do - NYTimes.com - 0 views

  • certain mental skills will become less valuable because computers will take over. Having a great memory will probably be less valuable. Being able to be a straight-A student will be less valuable — gathering masses of information and regurgitating it back on tests. So will being able to do any mental activity that involves following a set of rules.
  • what human skills will be more valuable?
  • In the news business, some of those skills are already evident.
  • Technology has rewarded sprinters (people who can recognize and alertly post a message on Twitter about some interesting immediate event) and marathoners (people who can write large conceptual stories), but it has hurt middle-distance runners (people who write 800-word summaries of yesterday’s news conference).
  • Technology has rewarded graphic artists who can visualize data, but it has punished those who can’t turn written reporting into video presentations.
  • More generally, the age of brilliant machines seems to reward a few traits.
  • First, it rewards enthusiasm. The amount of information in front of us is practically infinite; so is that amount of data that can be collected with new tools. The people who seem to do best possess a voracious explanatory drive, an almost obsessive need to follow their curiosity.
  • Second, the era seems to reward people with extended time horizons and strategic discipline.
  • a human can provide an overall sense of direction and a conceptual frame. In a world of online distractions, the person who can maintain a long obedience toward a single goal, and who can filter out what is irrelevant to that goal, will obviously have enormous worth.
  • Third, the age seems to reward procedural architects. The giant Internet celebrities didn’t so much come up with ideas, they came up with systems in which other people could express ideas: Facebook, Twitter, Wikipedia, etc.
  • One of the oddities of collaboration is that tightly knit teams are not the most creative. Loosely bonded teams are, teams without a few domineering presences, teams that allow people to think alone before they share results with the group. So a manager who can organize a decentralized network around a clear question, without letting it dissipate or clump, will have enormous value.
  • Fifth, essentialists will probably be rewarded.
  • creativity can be described as the ability to grasp the essence of one thing, and then the essence of some very different thing, and smash them together to create some entirely new thing.
  • In the 1950s, the bureaucracy was the computer. People were organized into technocratic systems in order to perform routinized information processing.
  • now the computer is the computer. The role of the human is not to be dispassionate, depersonalized or neutral. It is precisely the emotive traits that are rewarded: the voracious lust for understanding, the enthusiasm for work, the ability to grasp the gist, the empathetic sensitivity to what will attract attention and linger in the mind.
  • Unable to compete when it comes to calculation, the best workers will come with heart in hand.

Life as Instant Replay, Over and Over Again - NYTimes.com - 0 views

  • although I think about the Web as a real-time organism, most of the time the organism is obsessing about what happened earlier that day, or week.
  • The replay Web coexists with the real-time Web, the phrase often used to describe sites and services, like Twitter, that let people consume information as soon as it is published.
  • The rise of the replay Web is more than just a coping mechanism that helps us deal with information overload. It is shaping the way we consume, process and share information, and could potentially influence the businesses that are built on it.
  • “Our consumption of content isn’t synchronous,” he said. “Things are interesting to a small group, a cluster of friends or on Reddit — then they die.” He continued: “Then they pick up and go massive. There are these little ripple effects all around the Web, little waves that converge in a pool and make big waves.”
  • THERE are other signs to indicate a replay aesthetic is coming to the Web. For example, the latest version of the iPhone’s mobile software will include the ability to capture video in slow motion.
  • The way we use technology could be reshaping our sense of time and urgency. Douglas Rushkoff, the author of “Present Shock: When Everything Happens Now,” describes present shock as the stupefaction that overtakes people as they try to keep up with the never-ending onslaught of status updates, photo feeds and looping videos constantly refreshing before their eyes.
  • present shock keeps us suspended in a state of constant disarray, and causes us to prioritize the recent over the relevant and the new instead of the most important.
  • the medium of the loop — either in a GIF, a short clip posted to Vine, or Instagram photo- and video-sharing applications — has become a crucial framework for transmitting information. It mimics the way we remember — by repetition — and future tools could make use of that to understand not only how we transmit information but also how we retain it.
  • Perhaps the replay Web, by allowing us to constantly revisit and reconsider the recent past, can help us find new meaning in it.

The Art of Focus - NYTimes.com - 1 views

  • in order to pursue their intellectual adventures, children need a secure social base:
  • The way to discover a terrifying longing is to liberate yourself from the self-censoring labels you began to tell yourself over the course of your mis-education. These formulas are stultifying
  • The lesson from childhood, then, is that if you want to win the war for attention, don’t try to say “no” to the trivial distractions you find on the information smorgasbord; try to say “yes” to the subject that arouses a terrifying longing, and let the terrifying longing crowd out everything else.
  • Don’t structure your encounters with them the way people do today, through brainstorming sessions (those don’t work) or through conferences with projection screens
  • Focus on the external objects of fascination, not on who you think you are. Find people with overlapping obsessions.
  • this creates a space internally into which one can be absorbed. In order to be absorbed one has to feel sufficiently safe, as though there is some shield, or somebody guarding
  • Instead look at the way children learn in groups. They make discoveries alone, but bring their treasures to the group. Then the group crowds around and hashes it out. In conversation, conflict, confusion and uncertainty can be metabolized and digested through somebody else.
  • 66 percent of workers aren’t able to focus on one thing at a time. Seventy percent of employees don’t have regular time for creative or strategic thinking while at work.
  • Many of us lead lives of distraction, unable to focus on what we know we should focus on.
  • I wonder if we might be able to copy some of the techniques used by the creatures who are phenomenally good at learning things: children.

Why Do Some Brains Enjoy Fear? - Allegra Ringo - The Atlantic - 1 views

  • One of the most interesting things about studying fear is looking at the social constructions of fear, and learned fears versus those fears that appear to be more innate, or even genetic
  • Through fear conditioning (connecting a neutral stimulus with a negative consequence) we can link pretty much anything to a fear response.
  • So we know that we can learn to fear, and this means our socialization and the society in which we are raised is going to have a lot to do with what we find scary.
  • ...7 more annotations...
  • This speaks to the fact that things that violate the laws of nature are terrifying. And really anything that doesn’t make sense or causes us some sort of dissonance, whether it is cognitive or aesthetic, is going to be scary (axe-wielding animals, masked faces, contorted bodies).
  • Humans are obsessed with death; we simply have a hard time wrapping our mind around what happens when we die.
  • Humans have been scaring themselves and each other since the birth of the species, through all kinds of methods like storytelling, jumping off cliffs, and popping out to startle each other from the recesses of some dark cave.
  • to build group unity, to prepare kids for life in the scary world, and, of course, to control behavior.
  • These scary stories provided, and continue to deliver, intrigue, exhilaration, and a jolt of excitement to our lives.
  • One of the reasons people love Halloween is because it produces strong emotional responses, and those responses work to build stronger relationships and memories. When we’re happy, or afraid, we’re releasing powerful hormones, like oxytocin, that are working to make these moments stick in our brain. So we’re going to remember the people we’re with. If it was a good experience, then we’ll remember them fondly and feel close to them, more so than if we were to meet them during some neutral unexciting event.
  • We’re social and emotional beings. We need each other in times of stress, so the fact that our bodies have evolved to make sure we feel close to those we are with when afraid makes sense.

Bridgegate scandal coverage puts media 'bias' on 'full display,' Christie says | NJ.com - 0 views

  • Gov. Chris Christie insisted during his latest trip to New Hampshire that the fallout from the George Washington Bridge scandal wouldn't have been nearly as intense if he were a Democrat.
  • argued to early-primary voters that Hillary Clinton escaped scrutiny for clearing the private server housing emails from her tenure as secretary of state because she's a Democrat, and declared "bias is on full display" when that's compared to his own controversy.
  • "Could you imagine if my response the day after
  • ...8 more annotations...
  • "As you all know, I went through a really fun time the last 15 months with lots of different people investigating me too, right?"
  • all of that happened last January was, 'Oh, by the way, ... I have a private email server and all my emails were on this private server and I deleted a bunch of them, but they were only personal, and you're going to have to take my word for it 'cause the server's gone.'"
  • "There is a bias," Christie insisted
  • not the first time the governor suggested media bias was to blame for the fallout of the George Washington Bridge lane closure controversy.
  • December 2013, a month before the now infamous "time for some traffic problems in Fort Lee" email from a top Christie staffer was revealed, Christie brushed off questions about the lane closures during a Statehouse news conference
  • "I know you guys are obsessed with this, I'm not. I'm really not. It's not that big a deal," Christie insisted. "Just because press runs around and writes about it, both here and nationally, I know why that is and so do you, let's not pretend it's because of the gravity of the issue. It's because I am a national figure and anything like this will be written a lot about now, so let's not pretend this is some grave thing."
  • Christie signaled in New Hampshire he's intent on pressing Clinton, the Democratic frontrunner in the 2016 presidential race, on the attacks in Benghazi, Libya, that have been a lightning rod for Clinton critics, just as the governor declared during a recent trip here that he's done "apologizing" for the Bridgegate scandal.
  • "I don't think there's been nearly enough questions asked about this," Christie said. "We need to ask a lot more questions about Benghazi. We need to get to the bottom of what happened because it does matter, madam secretary."

Optogenetics and OCD | The Scientist Magazine® - 0 views

  • abnormal brain activity seen in patients with obsessive-compulsive disorder (OCD) is a likely cause of the condition, according to two papers published today (June 6) in Science. Both studies used optogenetic techniques, which allow specific brain cells to be turned on and off at the flick of a light switch, to link the abnormal brain activity to OCD behavior in mice.
  • relationship between [neural] circuit abnormalities and [OCD-like] behavior,
  • “so I’m very aware of how severe the illness is and how important it is for us to develop new and more effective treatments.” The problem is, she added, “we don’t have a very good understanding of the pathologic changes that lead to OCD.”
  • ...3 more annotations...
  • Ahmari turned to mice and to optogenetics—whereby specific types of brain cells can be switched on and off in response to light. The cells of interest were engineered to express light-responsive ion channels, such that when the cells are exposed to light—by means of a fiber optic cable pointing a laser at the relevant brain region—the channels open, ions flood in, and nerve impulses fire.
  • But crucially, the teams had very slightly different targets in the striatum—the general ventral striatum in the case of Ahmari’s study and the specific fast-spiking inhibitory interneurons of the centromedial striatum in Graybiel’s study, suggesting that different neural circuits send inhibitory and stimulatory signals for certain OCD-like behaviors.
  • “they’ve been able to really delineate a pathway for compulsive behaviors in a more refined way than we’ve ever been able to do in humans, and that’s exciting because it points us in the direction of pathogenesis and pathophysiology of the disorder, but also . . . to new targets for potential intervention.”

Big Tech Has Become Way Too Powerful - The New York Times - 1 views

  • CONSERVATIVES and liberals interminably debate the merits of “the free market” versus “the government.”
  • The important question, too rarely discussed, is who has the most influence over these decisions and in that way wins the game.
  • Now information and ideas are the most valuable forms of property. Most of the cost of producing it goes into discovering it or making the first copy. After that, the additional production cost is often zero. Such “intellectual property” is the key building block of the new economy
  • ...14 more annotations...
  • as has happened before with other forms of property, the most politically influential owners of the new property are doing their utmost to increase their profits by creating monopolies
  • The most valuable intellectual properties are platforms so widely used that everyone else has to use them, too. Think of standard operating systems like Microsoft’s Windows or Google’s Android; Google’s search engine; Amazon’s shopping system; and Facebook’s communication network
  • Despite an explosion in the number of websites over the last decade, page views are becoming more concentrated. While in 2001, the top 10 websites accounted for 31 percent of all page views in America, by 2010 the top 10 accounted for 75 percent
  • Amazon is now the first stop for almost a third of all American consumers seeking to buy anything
  • Google and Facebook are now the first stops for many Americans seeking news — while Internet traffic to much of the nation’s newspapers, network television and other news-gathering agencies has fallen well below 50 percent of all traffic.
  • almost all of the profits go to the platforms’ owners, who have all of the bargaining power
  • The rate at which new businesses have formed in the United States has slowed markedly since the late 1970s. Big Tech’s sweeping patents, standard platforms, fleets of lawyers to litigate against potential rivals and armies of lobbyists have created formidable barriers to new entrants
  • The law gives 20 years of patent protection to inventions that are “new and useful,” as decided by the Patent and Trademark Office. But the winners are big enough to game the system. They make small improvements warranting new patents, effectively making their intellectual property semipermanent.
  • They also lay claim to whole terrains of potential innovation including ideas barely on drawing boards and flood the system with so many applications that lone inventors have to wait years.
  • Big Tech has been almost immune to serious antitrust scrutiny, even though the largest tech companies have more market power than ever. Maybe that’s because they’ve accumulated so much political power.
  • Economic and political power can’t be separated because dominant corporations gain political influence over how markets are maintained and enforced, which enlarges their economic power further. One of the original goals of antitrust law was to prevent this.
  • We are now in a new gilded age similar to the first Gilded Age, when the nation’s antitrust laws were enacted. As then, those with great power and resources are making the “free market” function on their behalf. Big Tech — along with the drug, insurance, agriculture and financial giants — dominates both our economy and our politics.
  • The real question is how government organizes the market, and who has the most influence over its decisions
  • Yet as long as we remain obsessed by the debate over the relative merits of the “free market” and “government,” we have little hope of seeing what’s occurring and taking the action that’s needed to make our economy work for the many, not the few.

Atheist In A Foxhole | Issue 105 | Philosophy Now - 0 views

  • David Rönnegard asks how a committed atheist confronted with death might find consolation.
  • Faith is not a virtue I hold. In particular, I disbelieve claims to knowledge about God’s existence or will. As an atheist and a Humanist, my approach to life has been grounded on rational thought and empirical evidence. I consider death to be the end of our conscious existence, and that any meaning that life may have resides with man.
  • Public reflecting on life is often done in fear of, but seldom in the face of, death. I am in the privileged but unenviable position of doing the latter.
  • ...2 more annotations...
  • Having never had an inclination towards the supernatural, religion has never appeared to me as either credible or a source of comfort. News of looming death has not encouraged me to grasp for false consolation, though consolation is sorely needed. Rather, my obsession with death has hitherto been soothed by Socrates’ description of philosophy as the process by which one comes to accept one’s own death.
  • Existentialism emphasizes the subjective nature of being; that is, the essence of what it is like to be us. It gets us closer to considering what it is we value, which is central to shaping a meaningful life for ourselves as we pursue those values.

Kung Fu for Philosophers - NYTimes.com - 0 views

  • any ability resulting from practice and cultivation could accurately be said to embody kung fu.
  • the predominant orientation of traditional Chinese philosophy is the concern about how to live one’s life, rather than finding out the truth about reality.
  • Confucius’s call for “rectification of names” — one must use words appropriately — is more a kung fu method for securing sociopolitical order than for capturing the essence of things, as “names,” or words, are placeholders for expectations of how the bearer of the names should behave and be treated. This points to a realization of what J. L. Austin calls the “performative” function of language.
  • ...12 more annotations...
  • Instead of leading to a search for certainty, as Descartes’s dream did, Zhuangzi came to the realization that he had perceived “the transformation of things,” indicating that one should go along with this transformation rather than trying in vain to search for what is real.
  • the views of Mencius and his later opponent Xunzi about human nature are more recommendations of how one should view oneself in order to become a better person than metaphysical assertions about whether humans are by nature good or bad. Though the two men’s assertions about human nature are incompatible, they may still function inside the Confucian tradition as alternative ways of cultivation.
  • The Buddhist doctrine of no-self surely looks metaphysical, but its real aim is to free one from suffering, since according to Buddhism suffering comes ultimately from attachment to the self. Buddhist meditations are kung fu practices to shake off one’s attachment, and not just intellectual inquiries for getting propositional truth.
  • The essence of kung fu — various arts and instructions about how to cultivate the person and conduct one’s life — is often hard to digest for those who are used to the flavor and texture of mainstream Western philosophy. It is understandable that, even after sincere willingness to try, one is often still turned away by the lack of clear definitions of key terms and the absence of linear arguments in classic Chinese texts. This, however, is not a weakness, but rather a requirement of the kung fu orientation — not unlike the way that learning how to swim requires one to focus on practice and not on conceptual understanding.
  • It even expands epistemology into the non-conceptual realm in which the accessibility of knowledge is dependent on the cultivation of cognitive abilities, and not simply on whatever is “publicly observable” to everyone. It also shows that cultivation of the person is not confined to “knowing how.” An exemplary person may well have the great charisma to affect others but does not necessarily know how to affect others.
  • Western philosophy at its origin is similar to classic Chinese philosophy. The significance of this point is not merely in revealing historical facts. It calls our attention to a dimension that has been eclipsed by the obsession with the search for eternal, universal truth and the way it is practiced, namely through rational arguments.
  • One might well consider the Chinese kung fu perspective a form of pragmatism.  The proximity between the two is probably why the latter was well received in China early last century when John Dewey toured the country. What the kung fu perspective adds to the pragmatic approach, however, is its clear emphasis on the cultivation and transformation of the person, a dimension that is already in Dewey and William James but that often gets neglected
  • A kung fu master does not simply make good choices and use effective instruments to satisfy whatever preferences a person happens to have. In fact the subject is never simply accepted as a given. While an efficacious action may be the result of a sound rational decision, a good action that demonstrates kung fu has to be rooted in the entire person, including one’s bodily dispositions and sentiments, and its goodness is displayed not only through its consequences but also in the artistic style one does it. It also brings forward what Charles Taylor calls the “background” — elements such as tradition and community — in our understanding of the formation of a person’s beliefs and attitudes. Through the kung fu approach, classic Chinese philosophy displays a holistic vision that brings together these marginalized dimensions and thereby forces one to pay close attention to the ways they affect each other.
  • This kung fu approach shares a lot of insights with the Aristotelian virtue ethics, which focuses on the cultivation of the agent instead of on the formulation of rules of conduct. Yet unlike Aristotelian ethics, the kung fu approach to ethics does not rely on any metaphysics for justification.
  • This approach opens up the possibility of allowing multiple competing visions of excellence, including the metaphysics or religious beliefs by which they are understood and guided, and justification of these beliefs is then left to the concrete human experiences.
  • it is more appropriate to consider kung fu as a form of art. Art is not ultimately measured by its dominance of the market. In addition, the function of art is not accurate reflection of the real world; its expression is not constrained to the form of universal principles and logical reasoning, and it requires cultivation of the artist, embodiment of virtues/virtuosities, and imagination and creativity.
  • If philosophy is “a way of life,” as Pierre Hadot puts it, the kung fu approach suggests that we take philosophy as the pursuit of the art of living well, and not just as a narrowly defined rational way of life.

How the Internet Gets Inside Us : The New Yorker - 0 views

  • It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with
  • The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare.
  • Robert K. Logan’s “The Sixth Language” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness.
  • ...14 more annotations...
  • In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.
  • “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”
  • when people struggle to describe the state that the Internet puts them in they arrive at a remarkably familiar picture of disassociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix.
  • The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965.
  • If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.
  • Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began.
  • Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.
  • The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing, the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points.
  • at any given moment, our most complicated machine will be taken as a model of human intelligence, and whatever media kids favor will be identified as the cause of our stupidity. When there were automatic looms, the mind was like an automatic loom; and, since young people in the loom period liked novels, it was the cheap novel that was degrading our minds. When there were telephone exchanges, the mind was like a telephone exchange, and, in the same period, since the nickelodeon reigned, moving pictures were making us dumb. When mainframe computers arrived and television was what kids liked, the mind was like a mainframe and television was the engine of our idiocy. Some machine is always showing us Mind; some entertainment derived from the machine is always showing us Non-Mind.
  • What we live in is not the age of the extended mind but the age of the inverted self. The things that have usually lived in the darker recesses or mad corners of our mind—sexual obsessions and conspiracy theories, paranoid fixations and fetishes—are now out there: you click once and you can read about the Kennedy autopsy or the Nazi salute or hog-tied Swedish flight attendants. But things that were once external and subject to the social rules of caution and embarrassment—above all, our interactions with other people—are now easily internalized, made to feel like mere workings of the id left on its own.
  • A social network is crucially different from a social circle, since the function of a social circle is to curb our appetites and of a network to extend them.
  • And so the peacefulness, the serenity that we feel away from the Internet, and which all the Better-Nevers rightly testify to, has less to do with being no longer harried by others than with being less oppressed by the force of your own inner life. Shut off your computer, and your self stops raging quite as much or quite as loud.
  • Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user.
  • the Internet screen has always been like the palantír in Tolkien’s “Lord of the Rings”—the “seeing stone” that lets the wizards see the entire world. Its gift is great; the wizard can see it all. Its risk is real: evil things will register more vividly than the great mass of dull good. The peril isn’t that users lose their knowledge of the world. It’s that they can lose all sense of proportion. You can come to think that the armies of Mordor are not just vast and scary, which they are, but limitless and undefeatable, which they aren’t.

In a Data-Heavy Society, Being Defined by the Numbers - NYTimes.com - 0 views

  • Numbers and rankings are everywhere.
  • “Numbers make intangibles tangible,” said Jonah Lehrer, a journalist and author of “How We Decide,” (Houghton Mifflin Harcourt, 2009). “They give the illusion of control.”
  • “We want to quantify everything,” he went on, “to ground a decision in fact, instead of asking whether that variable matters.”
  • ...9 more annotations...
  • Numbers become not just part of the way we judge and assess, but the only way.
  • when students are researching a paper, how do they decide where to turn for the greatest expertise? Often, he said, by looking at what articles or papers online have the most hits.
  • “Just because we have the skills and ability to put metrics on everything doesn’t mean we should. People are ever-changing, fascinating and incredibly frustrating.”
  • black-and-white statistics, while arguably irrefutable in one way, really tell us almost nothing. Amazon’s rankings of book sales, for instance — which anyone can view — can vary wildly based on the sale of very few books.
  • “For almost anybody in the United States under the age of 25, the only models are quantifiable rankings,”
  • “Should it be that whatever has the most hits or the most editors makes it better than someone who spent his life studying Kant?”
  • The obsession with numbers, he said, means we don’t trust or even look for the intangibles that can’t be measured, like wisdom, judgment and expertise.
  • “What I’m most troubled by is the desire of individuals (especially myself) to constantly check up on these numbers, and to accept these measurements as a measure of something meaningful.”
  • “I have to stop worrying about numbers. I have to reclaim the ambiguous part of my own intelligence.”