TOK Friends: Group items matching "winning" in title, tags, annotations or url
nataliedepaulo1

Fact Check: Trump Blasts 'Fake News' and Repeats Inaccurate Claims at CPAC - The New York Times - 0 views

  • President Trump’s speech on Friday at the Conservative Political Action Conference followed a familiar pattern: Blast the news media as “dishonest,” repeat a string of falsehoods and wrap up by promising to change the status quo.
  • Mr. Trump mocked election polls for being wrong.
  • This is misleading. Mr. Trump, who trailed Hillary Clinton in major national polls leading up to the election, ended up winning the Electoral College. But most of the polls Mr. Trump referred to actually reflected the popular vote total within the margin of error.
sissij

Scientists Figure Out When Different Cognitive Abilities Peak Throughout Life | Big Think - 0 views

  • Such skills come from accumulated knowledge which benefits from a lifetime of experience. 
  • Vocabulary, in fact, peaked even later, in the late 60s to early 70s. So now you know why grandpa is so good at crosswords.
  • And here’s a win for the 40+ folks - the below representation of a test of 10,000 visitors to TestMyBrain.org shows that older subjects did better than the young on the vocabulary test.
  • The under-30 group did much better on memory-related tasks, however.
  • Is there one age when all of your mental powers are at their maximum? The researchers don’t think so.  
  • In general, the researchers found 24 to be a key age, after which player abilities slowly declined, losing about 15% of the speed every 15 years. 
  • Older players did perform better in some aspects, making up for the slower brain processing by using simpler strategies and being more efficient. They were, in other words, wiser.
  •  
    It is really surprising to me that cognitive abilities are so directly related to age, but it is understandable, since there also seems to be a gulf between seniors and teenagers. There is always something we are especially good at at a certain age. I think this aligns with the logic of evolution: society consists of people of different ages, so they can cooperate and reach the maximum benefit by working together. Society is really diverse, and having people of different ages on the same team lets them cover for one another's cognitive disadvantages. --Sissi (4/4/2017)
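The chess-study figure quoted above ("losing about 15% of the speed every 15 years" after a peak at 24) compounds over time. A quick sketch of what that implies (the peak age of 24 and the 15%/15-year rate come from the article; treating the decline as smooth compounding is my assumption):

```python
def relative_speed(age, peak_age=24, decay=0.15, period=15):
    """Relative processing speed vs. the age-24 peak, compounding the
    reported ~15% loss per 15-year period past the peak."""
    years_past_peak = max(0, age - peak_age)
    return (1 - decay) ** (years_past_peak / period)

for age in (24, 39, 54, 69):
    print(age, round(relative_speed(age), 3))
```

By this reading, a 69-year-old would retain roughly 60% of peak speed, which is consistent with the article's point that older players compensate with simpler, more efficient strategies rather than raw speed.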
Javier E

Science and Truth - We're All in It Together - NYTimes.com - 1 views

  • Almost any article worth reading these days generates some version of this long tail of commentary. Depending on whether they are moderated, these comments can range from blistering flameouts to smart factual corrections to full-on challenges to the very heart of an article’s argument.
  • These days, the comments section of any engaging article is almost as necessary a read as the piece itself — if you want to know how insider experts received the article and how those outsiders processed the new
  • By now, readers understand that the definitive “copy” of any article is no longer the one on paper but the online copy, precisely because it’s the version that’s been read and mauled and annotated by readers.
  • The print edition of any article is little more than a trophy version, the equivalent of a diploma or certificate of merit — suitable for framing, not much else.
  • We call the fallout to any article the “comments,” but since they are often filled with solid arguments, smart corrections and new facts, the thing needs a nobler name. Maybe “gloss.” In the Middle Ages, students often wrote notes in the margins of well-regarded manuscripts. These glosses, along with other forms of marginalia, took on a life of their own, becoming their own form of knowledge, as important as, say, midrash is to Jewish scriptures. The best glosses were compiled into, of course, glossaries and later published
  • The truth is that every decent article now aspires to become the wiki of its own headline.
  • Any good article that has provoked a real discussion typically comes with a small box of post-publication notes. And, since many magazines are naming the editor of the article as well as the author, the outing of the editor can come with a new duty: writing the bottom note that reviews the emendations to the article and perhaps, most importantly, summarizes the thrust of the discussion. If the writer gains the glory of the writing, the editor can win the credit for chaperoning the best and most provocative pieces.
  • Some may fear that recognizing the commentary of every article will turn every subject into an endless postmodern discussion. But actually, the opposite is true. Recognizing the gloss allows us to pause in the seemingly unending back and forth of contemporary free speech and free inquiry to say, well, for now, this much is true — the ivory-bill still hasn’t been definitively seen since World War II, climate change is happening and caused by mankind, natural selection is the best description of nature’s creative force. Et cetera.
Duncan H

What to Do About 'Coming Apart' - NYTimes.com - 0 views

  • Murray has produced a book-length argument placing responsibility for rising inequality and declining mobility on widespread decay in the moral fiber of white, lower-status, less well-educated Americans, putting relatively less emphasis on a similar social breakdown among low-status, less-educated Americans of all races
  • Murray’s strength lies in his ability to raise issues that center-left policy makers and academics prefer, for the most part, to shy away from. His research methods, his statistical analyses and the conclusions he draws are subject to passionate debate. But by forcing taboo issues into the public arena, Murray has opened up for discussion politically salient issues that lurk at a subterranean level in the back-brains of many voters, issues that are rarely examined with the rigor necessary to affirm or deny their legitimacy.
  • The National Review and the Conservative Monitor cited “Losing Ground” as one of the ten books that most changed America. Murray’s book seemed like a bolt of lightning in the middle of the night, revealing what should have been plain as the light of day: the welfare state so carefully built up in the 1960s and 1970s created a system of disincentives for people to better their own lives. By paying welfare mothers to have children out of wedlock into poor homes, it encouraged more of these births. By doling out dollars at a rate that could not be matched by the economy, the system encouraged the poor to stay home.
  • He contends in “Coming Apart” that there was far greater social cohesion across class lines 50 years ago because “the powerful norms of social and economic behavior in 1960 swept virtually everyone into their embrace,” adding in a Jan. 21 op-ed in the Wall Street Journal that “Over the past 50 years, that common civic culture has unraveled. We have developed a new upper class with advanced educations, often obtained at elite schools, sharing tastes and preferences that set them apart from mainstream America. At the same time, we have developed a new lower class, characterized not by poverty but by withdrawal from America’s core cultural institutions.” According to Murray, higher education has now become a proxy for higher IQ, as elite colleges become sorting mechanisms for finding, training and introducing to each other the most intellectually gifted young people. Fifty years into the education revolution, members of this elite are likely to be themselves the offspring of cognitively gifted parents, and to ultimately bear cognitively gifted children.
  • “Industriousness: The norms for work and women were revolutionized after 1960, but the norm for men putatively has remained the same: Healthy men are supposed to work. In practice, though, that norm has eroded everywhere.”
  • Murray makes the case that cognitive ability is worth ever more in modern advanced, technologically complex hypercompetitive market economies. As an example, Murray quotes Bill Gates: “Software is an IQ business. Microsoft must win the IQ war or we won’t have a future.”
  • Murray alleges that those with higher IQs now exhibit personal and social behavioral choices in areas like marriage, industriousness, honesty and religiosity that allow them to enjoy secure and privileged lives. Whites in the lower social-economic strata are less cognitively able – in Murray’s view – and thus less well-equipped to resist the lure of the sexual revolution and doctrines of self-actualization so they succumb to higher rates of family dissolution, non-marital births, worklessness and criminality. This interaction between IQ and behavioral choice, in Murray’s framework, is what has led to the widening income and cultural gap.
  • Despised by the left, Murray has arguably done liberals a service by requiring them to deal with those whose values may seem alien, to examine the unintended consequences of their policies and to grapple with the political impact of assertions made by the right. He has also amassed substantial evidence to bolster his claims and at the same time elicited a formidable academic counter-attack.
  • To Murray, the overarching problem is that liberal elites, while themselves living lives of probity, have refused to proselytize for the bourgeois virtues to which they subscribe, thus leaving their less discerning fellow-citizens to flounder in the anti-bourgeois legacy of the counter-cultural 1960s.
  • “Great Civic Awakening” among the new upper class – an awakening that will lead to the kind of “moral rearmament” and paternalism characteristic of anti-poverty drives in the 19th century. To achieve this, Murray believes, the “new upper class must once again fall in love with what makes America different.”
  • The cognitive elites Murray cites are deeply committed to liberal norms of cultural tolerance and permissiveness. The antipathy to the moralism of the religious right has, in fact, been a major force driving these upscale, secular voters into the Democratic party.
  • changes in the world economy may be destructive in terms of the old social model, but they are profoundly liberating and benign in and of themselves. The family farm wasn’t dying because capitalism had failed or a Malthusian crisis was driving the world to starvation. The family farm died of abundance; it died of the rapidly rising productivity that meant that fewer and fewer people had to work to produce the food on which humanity depended. Mead continues: “Revolutions in manufacturing and, above all, in communications and information technology create the potential for unprecedented abundance and a further liberation of humanity from meaningless and repetitive work. Our problem isn’t that the sources of prosperity have dried up in a long drought; our problem is that we don’t know how to swim. It is raining soup, and we are stuck holding a fork.” The 21st century, Mead adds, “must reinvent the American Dream. It must recast our economic, social, familial, educational and political systems for new challenges and new opportunities. Some hallowed practices and institutions will have to go under the bus. But in the end, the changes will make us richer, more free and more secure than we are now.” Mead’s predictions may or may not prove prescient, but it is his thinking, more than Murray’s, that reflects the underlying optimism that has sustained the United States for more than two centuries — a refusal to believe that anything about human nature is essentially “intractable.” Mead’s way of looking at things is not only more inviting than Murray’s, it is also more on target.
Javier E

Eric Kandel's Visions - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • Judith, "barely clothed and fresh from the seduction and slaying of Holofernes, glows in her voluptuousness. Her hair is a dark sky between the golden branches of Assyrian trees, fertility symbols that represent her eroticism. This young, ecstatic, extravagantly made-up woman confronts the viewer through half-closed eyes in what appears to be a reverie of orgasmic rapture," writes Eric Kandel in his new book, The Age of Insight. Wait a minute. Writes who? Eric Kandel, the Nobel-winning neuroscientist who's spent most of his career fixated on the generously sized neurons of sea snails
  • Kandel goes on to speculate, in a bravura paragraph a few hundred pages later, on the exact neurochemical cognitive circuitry of the painting's viewer:
  • "At a base level, the aesthetics of the image's luminous gold surface, the soft rendering of the body, and the overall harmonious combination of colors could activate the pleasure circuits, triggering the release of dopamine. If Judith's smooth skin and exposed breast trigger the release of endorphins, oxytocin, and vasopressin, one might feel sexual excitement. The latent violence of Holofernes's decapitated head, as well as Judith's own sadistic gaze and upturned lip, could cause the release of norepinephrine, resulting in increased heart rate and blood pressure and triggering the fight-or-flight response. In contrast, the soft brushwork and repetitive, almost meditative, patterning may stimulate the release of serotonin. As the beholder takes in the image and its multifaceted emotional content, the release of acetylcholine to the hippocampus contributes to the storing of the image in the viewer's memory. What ultimately makes an image like Klimt's 'Judith' so irresistible and dynamic is its complexity, the way it activates a number of distinct and often conflicting emotional signals in the brain and combines them to produce a staggeringly complex and fascinating swirl of emotions."
  • His key findings on the snail, for which he shared the 2000 Nobel Prize in Physiology or Medicine, showed that learning and memory change not the neuron's basic structure but rather the nature, strength, and number of its synaptic connections. Further, through focus on the molecular biology involved in a learned reflex like Aplysia's gill retraction, Kandel demonstrated that experience alters nerve cells' synapses by changing their pattern of gene expression. In other words, learning doesn't change what neurons are, but rather what they do.
  • In Search of Memory (Norton), Kandel offered what sounded at the time like a vague research agenda for future generations in the budding field of neuroaesthetics, saying that the science of memory storage lay "at the foothills of a great mountain range." Experts grasp the "cellular and molecular mechanisms," he wrote, but need to move to the level of neural circuits to answer the question, "How are internal representations of a face, a scene, a melody, or an experience encoded in the brain?
  • Since giving a talk on the matter in 2001, he has been piecing together his own thoughts in relation to his favorite European artists
  • The field of neuroaesthetics, says one of its founders, Semir Zeki, of University College London, is just 10 to 15 years old. Through brain imaging and other studies, scholars like Zeki have explored the cognitive responses to, say, color contrasts or ambiguities of line or perspective in works by Titian, Michelangelo, Cubists, and Abstract Expressionists. Researchers have also examined the brain's pleasure centers in response to appealing landscapes.
  • it is fundamental to an understanding of human cognition and motivation. Art isn't, as Kandel paraphrases a concept from the late philosopher of art Denis Dutton, "a byproduct of evolution, but rather an evolutionary adaptation—an instinctual trait—that helps us survive because it is crucial to our well-being." The arts encode information, stories, and perspectives that allow us to appraise courses of action and the feelings and motives of others in a palatable, low-risk way.
  • "as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources—musical and visual—and probably by other sources as well." Specifically, in this "brain-based theory of beauty," the paper says, that faculty is associated with activity in the medial orbitofrontal cortex.
  • It also enables Kandel—building on the work of Gombrich and the psychoanalyst and art historian Ernst Kris, among others—to compare the painters' rendering of emotion, the unconscious, and the libido with contemporaneous psychological insights from Freud about latent aggression, pleasure and death instincts, and other primal drives.
  • Kandel views the Expressionists' art through the powerful multiple lenses of turn-of-the-century Vienna's cultural mores and psychological insights. But then he refracts them further, through later discoveries in cognitive science. He seeks to reassure those who fear that the empirical and chemical will diminish the paintings' poetic power. "In art, as in science," he writes, "reductionism does not trivialize our perception—of color, light, and perspective—but allows us to see each of these components in a new way. Indeed, artists, particularly modern artists, have intentionally limited the scope and vocabulary of their expression to convey, as Mark Rothko and Ad Reinhardt do, the most essential, even spiritual ideas of their art."
  • The author of a classic textbook on neuroscience, he seems here to have written a layman's cognition textbook wrapped within a work of art history.
  • "our initial response to the most salient features of the paintings of the Austrian Modernists, like our response to a dangerous animal, is automatic. ... The answer to James's question of how an object simply perceived turns into an object emotionally felt, then, is that the portraits are never objects simply perceived. They are more like the dangerous animal at a distance—both perceived and felt."
  • If imaging is key to gauging therapeutic practices, it will be key to neuroaesthetics as well, Kandel predicts—a broad, intense array of "imaging experiments to see what happens with exaggeration, distorted faces, in the human brain and the monkey brain," viewers' responses to "mixed eroticism and aggression," and the like.
  • while the visual-perception literature might be richer at the moment, there's no reason that neuroaesthetics should restrict its emphasis to the purely visual arts at the expense of music, dance, film, and theater.
  • although Kandel considers The Age of Insight to be more a work of intellectual history than of science, the book summarizes centuries of research on perception. And so you'll find, in those hundreds of pages between Kandel's introduction to Klimt's "Judith" and the neurochemical cadenza about the viewer's response to it, dossiers on vision as information processing; the brain's three-dimensional-space mapping and its interpretations of two-dimensional renderings; face recognition; the mirror neurons that enable us to empathize and physically reflect the affect and intentions we see in others; and many related topics. Kandel elsewhere describes the scientific evidence that creativity is nurtured by spells of relaxation, which foster a connection between conscious and unconscious cognition.
  • Zeki's message to art historians, aesthetic philosophers, and others who chafe at that idea is twofold. The more diplomatic pitch is that neuroaesthetics is different, complementary, and not oppositional to other forms of arts scholarship. But "the stick," as he puts it, is that if arts scholars "want to be taken seriously" by neurobiologists, they need to take advantage of the discoveries of the past half-century. If they don't, he says, "it's a bit like the guys who said to Galileo that we'd rather not look through your telescope."
  • Matthews, a co-author of The Bard on the Brain: Understanding the Mind Through the Art of Shakespeare and the Science of Brain Imaging (Dana Press, 2003), seems open to the elucidations that science and the humanities can cast on each other. The neural pathways of our aesthetic responses are "good explanations," he says. But "does one [type of] explanation supersede all the others? I would argue that they don't, because there's a fundamental disconnection still between ... explanations of neural correlates of conscious experience and conscious experience" itself.
  • There are, Matthews says, "certain kinds of problems that are fundamentally interesting to us as a species: What is love? What motivates us to anger?" Writers put their observations on such matters into idiosyncratic stories, psychologists conceive their observations in a more formalized framework, and neuroscientists like Zeki monitor them at the level of functional changes in the brain. All of those approaches to human experience "intersect," Matthews says, "but no one of them is the explanation."
  • "Conscious experience," he says, "is something we cannot even interrogate in ourselves adequately. What we're always trying to do in effect is capture the conscious experience of the last moment. ... As we think about it, we have no way of capturing more than one part of it."
  • Kandel sees art and art history as "parent disciplines" and psychology and brain science as "antidisciplines," to be drawn together in an E.O. Wilson-like synthesis toward "consilience as an attempt to open a discussion between restricted areas of knowledge." Kandel approvingly cites Stephen Jay Gould's wish for "the sciences and humanities to become the greatest of pals ... but to keep their ineluctably different aims and logics separate as they ply their joint projects and learn from each other."
Javier E

Chris Hayes Has Arrived With 'Up' - NYTimes.com - 0 views

  • In less than a year on television (and with a chirpy voice, a weakness for gesticulation and a tendency to drop honors-thesis words like “signifier” into casual conversation), Mr. Hayes has established himself as Generation Y’s wonk prince of the morning political talk-show circuit.
  • “He is never doctrinaire,” Mr. Leo said in an interview. Both punk fans and “Up” fans are “suspicious of any authority,” he said, and appreciate that Mr. Hayes “is always willing to challenge his own assumptions, and the received wisdom on both sides of the aisle.”
  • Social media, in fact, have played an unusually important role in driving traffic to the program, an MSNBC spokeswoman said. About 45 percent of the visitors to the program’s Web site, which contains complete episodes, linked through sites like Facebook and Twitter. In April, those users spent an average of 51 minutes on the site each visit.
  • “Up” comes off as a rebuke to traditional cable shout-fests like CNN’s late “Crossfire.” Thanks to its early weekend time slot, the program has the freedom to unwind over two hours each Saturday and Sunday. Guests are encouraged to go deep into the issues of the week, and not try to score cheap-shot points to win the debate.
  • “The first and foremost important rule of the show: we’re not on television — no talking points, no sound bites,” he said, his hair still a bed-head tangle and his suit collar askew. “We have a lot of time for actual conversation. So actually listen, actually respond.”
  • An hour later, as the cameras rolled, Mr. Hayes and his guests waded thigh-deep into an analysis of private equity and whether it is bad for the economy. At a table of wonks, Mr. Hayes, who studied the philosophy of mathematics at Brown, came off as the wonkiest as he deconstructed the budgetary implications of tax arbitrage. Opinions were varied and passionate, but there was no sniping, no partisan grandstanding.
  • “I like the fact that it’s dialogic, small-d ‘democratic,’ ” Mr. Hayes said of his show. “We’re all sitting at t
  • Since Dec. 26, it has been No. 1 on average in its Sunday time slot on cable news channels among viewers ages 18 to 34, according to Nielsen figures provided by the network.
  • Ms. Maddow said on her program that “Up” was “the best news show on TV, including this one.” “Chris is the antidote to the anti-intellectual posing that has characterized the last decade in cable news,”
  • “No one else in cable is even trying long-form, off-the-news-cycle dives like him — let alone succeeding at them as he is. He’s giving the network Sunday shows a run for their money.”
  • As a student at Hunter College High School in Manhattan, he aspired to write. “My dream when I was 14,” he said, “was someday I could have a David Levine caricature of me in The New York Review of Books.”
markfrankel18

Why Waiting in Line Is Torture - NYTimes.com - 1 views

  • the experience of waiting, whether for luggage or groceries, is defined only partly by the objective length of the wait. “Often the psychology of queuing is more important than the statistics of the wait itself,” notes the M.I.T. operations researcher Richard Larson, widely considered to be the world’s foremost expert on lines.
  • This is also why one finds mirrors next to elevators. The idea was born during the post-World War II boom, when the spread of high-rises led to complaints about elevator delays. The rationale behind the mirrors was similar to the one used at the Houston airport: give people something to occupy their time, and the wait will feel shorter.
  • Professors Carmon and Kahneman have also found that we are more concerned with how long a line is than how fast it’s moving. Given a choice between a slow-moving short line and a fast-moving long one, we will often opt for the former, even if the waits are identical. (This is why Disney hides the lengths of its lines by wrapping them around buildings and using serpentine queues.)
  • Surveys show that many people will wait twice as long for fast food, provided the establishment uses a first-come-first-served, single-queue ordering system as opposed to a multi-queue setup. Anyone who’s ever had to choose a line at a grocery store knows how unfair multiple queues can seem; invariably, you wind up kicking yourself for not choosing the line next to you moving twice as fast. But there’s a curious cognitive asymmetry at work here. While losing to the line at our left drives us to despair, winning the race against the one to our right does little to lift our spirits. Indeed, in a system of multiple queues, customers almost always fixate on the line they’re losing to and rarely the one they’re beating.
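The single-queue advantage the survey describes can be illustrated with a toy simulation (a minimal sketch: exponential arrival and service times and roughly 90% utilization are my assumptions, not figures from the article):

```python
import random
import statistics

def simulate(single_queue, num_customers=20000, num_servers=4, seed=1):
    """Toy comparison of one shared queue vs. picking a checkout line at
    random. Returns each customer's waiting time (time spent in line)."""
    rng = random.Random(seed)
    free_at = [0.0] * num_servers   # when each server/line next frees up
    t, waits = 0.0, []
    for _ in range(num_customers):
        t += rng.expovariate(1.0)   # arrivals at overall rate 1
        # mean service 0.9 * num_servers keeps per-server utilization ~0.9
        service = rng.expovariate(1.0 / (0.9 * num_servers))
        if single_queue:
            # next customer goes to whichever server frees up first
            i = min(range(num_servers), key=free_at.__getitem__)
        else:
            # blind choice among separate lines, as in a grocery store
            i = rng.randrange(num_servers)
        start = max(t, free_at[i])
        waits.append(start - t)
        free_at[i] = start + service
    return waits

shared = statistics.mean(simulate(single_queue=True))
blind = statistics.mean(simulate(single_queue=False))
print(f"mean wait, shared queue:   {shared:.2f}")
print(f"mean wait, separate lines: {blind:.2f}")
```

With one shared line, no server ever idles while anyone waits; with blind line-picking, one line can back up while another sits empty, which is what drives both the longer average waits and the "losing to the line next to you" frustration the article describes.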
Javier E

How The Internet Enables Lies - The Dish | By Andrew Sullivan - The Daily Beast - 1 views

  • the ease of self-publishing and search afforded by the Internet along with a growing skeptical stance towards scientific expertise—has given the anti-vaccination movement a significant boost
  • She regularly shares her "knowledge" about vaccination with her nearly half-million Twitter followers. This is the kind of online influence that Nobel Prize-winning scientists can only dream of; Richard Dawkins, perhaps the most famous working scientist, has only 300,000 Twitter followers.
Duncan H

Mitt Romney's Problem Speaking About Money - NYTimes.com - 0 views

  • Why is someone who is so good at making money so bad at talking about it? Mitt Romney is not the first presidential candidate who’s had trouble communicating with working-class voters: John Kerry famously enjoyed wind-surfing, and George Bush blamed a poor showing in a straw poll on the fact that many of his supporters were “at their daughter’s coming out party.” Veritable battalions of Kennedys and Roosevelts have dealt with the economic and cultural gaps between themselves and the voters over the years without much difficulty. Not so Barack Obama, whose attempt to commiserate with Iowa farmers in 2007 about crop prices by mentioning the cost of arugula at Whole Foods fell flat.
  • Romney’s reference last week to the fact that his wife “drives a couple of Cadillacs, actually” is not grounds in itself for a voter to oppose his candidacy. Neither was the $10,000 bet he offered to Rick Perry during a debate in December or the time he told a group of the unemployed in Florida that he was “also unemployed.” But his penchant for awkward references to his own wealth has underscored the suspicion that many voters have about his ability to understand their economic problems. His opponents in both parties are gleefully highlighting these moments as a way to drive a wedge between Romney and the working class voters who have become an increasingly important part of the Republican Party base.
  • The current economic circumstances have undoubtedly exacerbated the problem for Romney. Had Obama initially sought the presidency during a primary season dominated by concerns about the domestic economy rather than war in Iraq, his explanation that small town voters “get bitter, they cling to guns or religion or antipathy to people who aren’t like them” might have created an opportunity for Hillary Clinton or even the populist message of John Edwards.
  • But Obama’s early opposition to the Iraq war gave him a political firewall that protected him throughout that primary campaign, while Romney has no such policy safe harbor to safeguard him from an intramural backlash.
  • Romney and Obama share a lack of natural affinity for this key group of swing voters, but it is Romney who needs to figure out some way of addressing this shortcoming if he wants to make it to the White House. It’s Romney’s misfortune that the voters’ prioritization of economic issues, his own privileged upbringing and his lack of connection with his party’s base on other core issues put him in a much more precarious position than candidate Obama ever reached.
  • By the time the 2008 general election rolled around, Obama had bolstered his outreach to these voters by recruiting the blue-collar avatar Joe Biden as his running mate. Should Romney win the Republican nomination this year, his advisers will almost certainly be tempted by the working-class credentials that a proletarian like New Jersey Governor Chris Christie or Florida Senator Marco Rubio would bring to the ticket.
  • Of more immediate concern to Team Romney should be how their candidate can overcome his habit of economic tone-deafness before Rick Santorum steals away enough working-class and culturally conservative voters to throw the Republican primary into complete and utter turmoil.
  • The curious thing about Romney’s verbal missteps is how limited they are to this very specific area of public policy. He is usually quite articulate when talking about foreign affairs and national security. Despite his complicated history on social and cultural matters like health care and abortion, his explanations are usually both coherent and comprehensible, even to those who oppose his positions. It’s only when he begins talking about economic issues – his biographical strength – that he seems to get clumsy.
  • The second possibility would be for him to outline a series of proposals specifically targeted at the needs of working-class and poor Americans, not only to control the damage from his gaffes but also to underscore the conservative premise that a right-leaning agenda will create opportunities for those on the lower rungs of the economic ladder. But while that approach might help Romney in a broader philosophical conversation, it’s unlikely to offer him much protection from the attacks and ridicule that his unforced errors will continue to bring him.
  • The question is why Romney hasn’t embraced a third alternative – admitting the obvious and then explaining why he gets so tongue-tied when the conversation turns to money. Romney’s upbringing and religious faith suggest a sense of obligation to the less fortunate and an unspoken understanding that it isn’t appropriate to call attention to one’s financial success. It wouldn’t be that hard for him to say something like: “I was taught not to brag and boast and think I’m better than other people because of the successes I’ve had, so occasionally I’m going to say things that sound awkward. It’s because I’d rather talk about what it takes to get America back to work.”
  • Do you think the solution Douthat proposes would work?
Javier E

Shirley Chisholm, Cont. - Ta-Nehisi Coates - Personal - The Atlantic - 0 views

  • I'm always suspicious of any thinker who leads with overly broad analogizing as opposed to specific issues
  • What issue are you actually trying to clarify when you make these comparisons? What aspect of history or social life are you really trying to illuminate? Are you really trying to talk to other people, or are you trying to shut them down?
  • There is always a way to make yourself right, and win the argument. There's also a difference between curiosity and one-upmanship. I've yet to see a legitimately curious person begin their conversation with an overly broad analogy. But I've seen a lot of dogmatic ideologues wield them like clubs. 
Javier E

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.
Javier E

Welcome, Robot Overlords. Please Don't Fire Us? | Mother Jones - 0 views

  • This is the happy version. It's the one where computers keep getting smarter and smarter, and clever engineers keep building better and better robots. By 2040, computers the size of a softball are as smart as human beings. Smarter, in fact. Plus they're computers: They never get tired, they're never ill-tempered, they never make mistakes, and they have instant access to all of human knowledge.
  • Just as it took us until 2025 to fill up Lake Michigan, the simple exponential curve of Moore's Law suggests it's going to take us until 2025 to build a computer with the processing power of the human brain. And it's going to happen the same way: For the first 70 years, it will seem as if nothing is happening, even though we're doubling our progress every 18 months. Then, in the final 15 years, seemingly out of nowhere, we'll finish the job.
  • And that's exactly where we are. We've moved from computers with a trillionth of the power of a human brain to computers with a billionth of the power. Then a millionth. And now a thousandth. Along the way, computers progressed from ballistics to accounting to word processing to speech recognition, and none of that really seemed like progress toward artificial intelligence. That's because even a thousandth of the power of a human brain is—let's be honest—a bit of a joke.
  • But there's another reason as well: Every time computers break some new barrier, we decide—or maybe just finally get it through our thick skulls—that we set the bar too low.
  • the best estimates of the human brain suggest that our own processing power is about equivalent to 10 petaflops. ("Peta" comes after giga and tera.) That's a lot of flops, but last year an IBM Blue Gene/Q supercomputer at Lawrence Livermore National Laboratory was clocked at 16.3 petaflops.
  • in Lake Michigan terms, we finally have a few inches of water in the lake bed, and we can see it rising. All those milestones along the way—playing chess, translating web pages, winning at Jeopardy!, driving a car—aren't just stunts. They're precisely the kinds of things you'd expect as we struggle along with platforms that aren't quite powerful enough—yet. True artificial intelligence will very likely be here within a couple of decades. Making it small, cheap, and ubiquitous might take a decade more.
  • In other words, by about 2040 our robot paradise awaits.
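The doubling arithmetic in the excerpts above can be checked directly. A minimal back-of-the-envelope sketch, assuming the article's figures: a human brain at roughly 10 petaflops, a starting point of one trillionth of that capacity, and Moore's-Law doubling every 18 months (all three numbers come from the excerpts; nothing here is an official estimate):

```python
import math

# Figures taken from the article's excerpts (assumptions, not measurements):
BRAIN_FLOPS = 10e15        # ~10 petaflops, the article's human-brain estimate
START_FRACTION = 1e-12     # "a trillionth of the power of a human brain"
DOUBLING_MONTHS = 18       # Moore's-Law cadence assumed by the author

start_flops = BRAIN_FLOPS * START_FRACTION

# Number of doublings needed to close the gap, then the time that takes.
doublings = math.log2(BRAIN_FLOPS / start_flops)
years = doublings * DOUBLING_MONTHS / 12

print(f"doublings needed: {doublings:.1f}")        # ≈ 39.9
print(f"years at 18-month doubling: {years:.1f}")  # ≈ 59.8
```

About 40 doublings, or roughly 60 years, which is consistent with the article's "first 70 years ... final 15 years" framing: almost all of the visible progress arrives in the last few doublings.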
Javier E

The Refiner's Fire - NYTimes.com - 0 views

  • In 2005, Michael Ignatieff left a teaching job at Harvard to enter politics in his native Canada
  • In Parliament, he became a total partisan, putting, as one must, loyalty to the group above loyalty to truth. He had no friends who were not in his own party. He loathed the other side. “We never wasted a single breath trying to convince each other of anything,” he recalls.
  • He quickly came to understand how politics is different from academia. In academia, you use words to persuade or discover; in politics, you use words to establish a connection. Academia is a cerebral enterprise, but politics is a physical enterprise, a charismatic form of athletics in which you touch people to show you care. In academia, the goal is to come up with a timeless truth. In politics, timing is everything, knowing when the time is ripe for a certain proposal. In academia, the idea is to take a stand based on what you believe; in politics, the idea is to position yourself along a left-right axis in a way that will differentiate you from your opponents and help you win a majority. In academia, a certain false modesty is encouraged; in politics, you have to self-dramatize a fable about yourself — concoct a story to show how your life connects to certain policies. In academia, you are rewarded for candor, intellectual rigor and a willingness to follow an idea to its logical conclusion. In politics, all of these traits are ruinous.
  • He learned that when you are attacking your opponent, you have to hit his strengths because his weaknesses will take care of themselves. Political discourse, he came to see, is not really a debate about issues; it is a verbal contest to deny your opponents standing, or as we would say, legitimacy. “Of the three elections that I fought, none was a debate on the country’s future. All were vicious battles over standing.”
julia rhodes

Fict or Faction - How Much Do We Care About the Truth? | Psychology Today - 0 views

  • Many books and hundreds of articles have been written about how drug companies have “gamed every system” to push their products. Negative clinical studies are suppressed; claims are made for larger usefulness that have no real basis in fact; side effects are ignored or deliberately underreported; and companies pay fines in the billions that still represent small fractions of total sales.
  • Spending enormous amounts of cash looking at cancer, cardiovascular and “women’s health” research, the Bayer scientists could corroborate less than a quarter of the studies they tested. In other words, 75-80% of these major research findings could not be confirmed.
  • Science lives on replication. Yet these clinically critical attempts to corroborate research findings could not confirm them. Why? Ironically, the reasons resemble many that are used to describe the malfeasance of drug companies – the need for money, grant support, major findings to achieve tenure – and a desire for others not to have the “secret sauce” of methodology needed to create the research.
  • The author retorts “I really have an issue with the word hoax.” He regards himself as a performance artist. His response – “It’s the people who reported it who are deceiving their audience.”
  • Why is fake news so popular on news sites? Here are two reasons: first, it provides emotional “buzz.” Second, it can make a lot of money. As the Washington bureau chief of the Pulitzer-winning Huffington Post lamented, “If you throw something up without fact checking it, and you’re the first one to put it up, and you get millions and millions of views, and later it’s proved false, you still got those views. That’s a problem. The incentives are all wrong.” Especially when, as at places like Bloomberg, remuneration is based on the number of hits an article receives. But incentives are wrong not just for news-gathering organizations.
  • Americans continue to believe important historical “facts” that are untrue. After 9/11, Americans were incensed to hear that many in the Middle East thought Osama bin Laden’s horrifying attack was the product of a CIA-Mossad plot. To this day, large majorities in countries like Pakistan think the massacre of 9/11 was created in Washington or Tel Aviv.
  • Yet close to a majority of Americans believe that Saddam Hussein, tyrant of Iraq, was in cahoots with Al-Qaeda, especially before the 9/11 attack. The Bush administration told them so. Which people in the Middle East rightly regard as preposterous. Saddam Hussein was the leader of a boldly secular, Arabist tyranny. Sunni fanatics like Al-Qaeda were his regime’s blood enemies. That they would work together rather than murder each other was just insane. Welcome to the world of fict and faction.
  • What can we learn from this? Plausibility is not truth; when something is “too good to be true” it generally isn’t; institutions increasingly do not back up what they proclaim and sell. And the “free informational marketplace” of the Internet is a wonderful site for fraud, scams, lies, plausible lies, and pleasant, beautiful untruths. So we all need our own truth detectors.
Javier E

Man shot in heated debate -- about philosopher Immanuel Kant - latimes.com - 0 views

  • A little philosophy can be a dangerous thing. A heated conversation between two men about the seminal 18th-century philosopher Immanuel Kant first came to blows; then one man shot the other.
  • The Kant shooting incident took place in southern Russia in a beer line.
  • The 26-year-old alleged shooter has been apprehended by the police and charged with “intentional infliction of serious harm.” He could serve up to 15 years in prison for not living in accordance with the first, or indeed second, formulation of Kant's categorical imperative: using a gun to win an argument would not work as a universal strategy, and there is no rational end to getting into a fistfight about “The Critique of Pure Reason” or any of Kant's other works.
  • the men had “decided to find out which of them is a bigger fan of this philosopher, and a tempestuous argument escalated into a fistfight.” 
  • If they had stuck with Kant's philosophy of relying on reason over emotion, Kant's biggest fans might never have gotten so wound up in the first place.
Javier E

How to Make Your Own Luck | Brain Pickings - 0 views

  • editor Jocelyn Glei and her team at Behance’s 99U pull together another package of practical wisdom from 21 celebrated creative entrepreneurs. Despite the somewhat self-helpy, SEO-skewing title, this compendium of advice is anything but contrived. Rather, it’s a no-nonsense, experience-tested, life-approved cookbook for creative intelligence, exploring everything from harnessing the power of habit to cultivating meaningful relationships that enrich your work to overcoming the fear of failure.
  • If the twentieth-century career was a ladder that we climbed from one predictable rung to the next, the twenty-first-century career is more like a broad rock face that we are all free-climbing. There’s no defined route, and we must use our own ingenuity, training, and strength to rise to the top. We must make our own luck.
  • Lucky people take advantage of chance occurrences that come their way. Instead of going through life on cruise control, they pay attention to what’s happening around them and, therefore, are able to extract greater value from each situation… Lucky people are also open to novel opportunities and willing to try things outside of their usual experiences. They’re more inclined to pick up a book on an unfamiliar subject, to travel to less familiar destinations, and to interact with people who are different than themselves.
  • the primary benefit of a diary as a purely pragmatic record of your workday productivity and progress — while most dedicated diarists would counter that the core benefits are spiritual and psychoemotional — it does offer some valuable insight into the psychology of how journaling elevates our experience of everyday life:
  • We can’t, however, simply will ourselves into better habits. Since willpower is a limited resource, whenever we’ve overexerted our self-discipline in one domain, a concept known as “ego depletion” kicks in and renders us mindless automata in another
  • the key to changing a habit is to invest heavily in the early stages of habit-formation so that the behavior becomes automated and we later default into it rather than exhausting our willpower wrestling with it. Young also cautions that it’s a self-defeating strategy to try changing several habits at once. Rather, he advises, spend one month on each habit alone before moving on to the next
  • a diary boosts your creativity
  • This is one of the most important reasons to keep a diary: it can make you more aware of your own progress, thus becoming a wellspring of joy in your workday.
  • The second reason is focalism. When we contemplate failure from afar, according to Gilbert and Wilson, we tend to overemphasize the focal event (i.e., failure) and overlook all the other episodic details of daily life that help us move on and feel better. The threat of failure is so vivid that it consumes our attention
  • the authors point to a pattern that reveals the single most important motivator: palpable progress on meaningful work: On the days when these professionals saw themselves moving forward on something they cared about — even if the progress was a seemingly incremental “small win” — they were more likely to be happy and deeply engaged in their work. And, being happier and more deeply engaged, they were more likely to come up with new ideas and solve problems creatively.
  • Although the act of reflecting and writing, in itself, can be beneficial, you’ll multiply the power of your diary if you review it regularly — if you listen to what your life has been telling you. Periodically, maybe once a month, set aside time to get comfortable and read back through your entries. And, on New Year’s Day, make an annual ritual of reading through the previous year.
  • This, they suggest, can yield profound insights into the inner workings of your own mind — especially if you look for specific clues and patterns, trying to identify the richest sources of meaning in your work and the types of projects that truly make your heart sing. Once you understand what motivates you most powerfully, you’ll be able to prioritize this type of work in going forward. Just as important, however, is cultivating a gratitude practice and acknowledging your own accomplishments in the diary:
  • Fields argues that if we move along the Uncertainty Curve either too fast or too slowly, we risk robbing the project of its creative potential and ending up in mediocrity. Instead, becoming mindful of the psychology of that process allows us to pace ourselves better and master that vital osmosis between freedom and constraint.
  • Schwalbe reminds us of the “impact bias” — our tendency to greatly overestimate the intensity and extent of our emotional reactions, which causes us to expect failures to be more painful than they actually are and thus to fear them more than we should.
  • When we think about taking a risk, we rarely consider how good we will be at reframing a disappointing outcome. In short, we underestimate our resilience.
  • what you do every day is best seen as an iceberg, with a small fraction of conscious decision sitting atop a much larger foundation of habits and behaviors.
  • don’t let yourself forget that the good life, the meaningful life, the truly fulfilling life, is the life of presence, not of productivity.
Javier E

Checking Privilege Checking - Phoebe Maltz Bovy - The Atlantic - 0 views

  • While privilege is nothing new, the term “privilege” is everywhere in post-recession America, not just on college campuses.
  • The ideas of Peggy “Invisible Knapsack” McIntosh and Pierre “Cultural Capital” Bourdieu have trickled into the culture.
  • “Privilege” isn't merely unearned advantage—it implies entitlement. To say that someone “comes across as privileged” is to call that person clueless and insensitive.
  • to stand accused of privilege, one need not actually possess the form of privilege in question.
  • To call someone “privileged” is to say that his or her successes are undeserved. It’s a personal insult posing as social critique.
  • Use of the term “privilege” has, I’d argue, actually set back the cultural conversation about privilege. It's not just that “privilege,” when used as an accusation, silences. It’s also that it’s made cluelessness a greater crime than inequality. These ubiquitous expressions—“check your privilege” or “your privilege is showing”—ask the accused to own up to privilege, not to do anything about it.
  • There may be a vague, implied hope that privilege checking will lead to efforts to remedy some injustice, but the more direct concern is not coming across as entitled, not offending anyone underprivileged
  • more typically, it’s a way for someone privileged to play self-appointed spokesperson for the marginalized, so as to win a sensitivity competition with others similarly aloof.
  • Having the privilege conversation is itself an expression of privilege.
  • It’s not just that commenting online about privilege—or any other topic—suggests leisure time. It’s also that the vocabulary of “privilege” is learned at liberal-arts colleges or in highbrow publications.
  • A certain sort of self-deprecating privilege awareness has become, in effect, upper- or upper-middle-class good manners, maybe even a new form of noblesse oblige, reinforcing class divides.
  • because “privileged” is essentially an epithet, it ends up encouraging privilege denial. “Your privilege is showing” inspires—and Fortgang's essay is just the latest example of this—all these selective histories of people's upbringings (or, if they grew up truly "blessed," that of their parents or grandparents) to create the illusion of scrappiness and an upward trajectory.
  • The self-deprecatory, class-signaling approach might (but rarely does) serve as a first step towards genuine self-examination and, in turn, some broader social-justice commitment. But the main result of privilege talk is scrappiness one-upmanship among the privileged.
Javier E

Florida Finds Itself in the Eye of the Storm on Climate Change - NYTimes.com - 0 views

  • In acknowledging the problem, politicians must endorse a solution, but the only major policy solutions to climate change — taxing or regulating the oil, gas and coal industries — are anathema to the base of the Republican Party. Thus, many Republicans, especially in Florida, appear to be dealing with the issue by keeping silent.
  • on this, Republicans are dead set against taking action on climate change on the national level. If you have political aspirations, this is not something you should talk about if you want to win a Republican primary.”
  • “Sea level rise is our reality in Miami Beach,” said the city’s mayor, Philip Levine. “We are past the point of debating the existence of climate change and are now focusing on adapting to current and future threats.” In the face of encroaching saltwater and sunny-day flooding like that on Alton Road, Mr. Levine has supported a $400 million spending project to make the city’s drainage system more resilient in the face of rising tides.
  • Sea levels have risen eight inches since 1870, according to the new report, which projects a further rise of one to four feet by the end of the century. Waters around southeast Florida could surge up to two feet by 2060, according to a report by the Southeast Florida Regional Climate Compact. A study by the Florida Department of Transportation concluded that over the next 35 years, rising sea levels will increasingly flood and damage smaller local roads in the Miami area.
Javier E

But What Would the End of Humanity Mean for Me? - James Hamblin - The Atlantic - 0 views

  • Tegmark is more worried about much more immediate threats, which he calls existential risks. That’s a term borrowed from physicist Nick Bostrom, director of Oxford University’s Future of Humanity Institute, a research collective modeling the potential range of human expansion into the cosmos
  • "I am finding it increasingly plausible that existential risk is the biggest moral issue in the world, even if it hasn’t gone mainstream yet,"
  • Existential risks, as Tegmark describes them, are things that are “not just a little bit bad, like a parking ticket, but really bad. Things that could really mess up or wipe out human civilization.”
  • The single existential risk that Tegmark worries about most is unfriendly artificial intelligence. That is, when computers are able to start improving themselves, there will be a rapid increase in their capacities, and then, Tegmark says, it’s very difficult to predict what will happen.
  • Tegmark told Lex Berko at Motherboard earlier this year, "I would guess there’s about a 60 percent chance that I’m not going to die of old age, but from some kind of human-caused calamity. Which would suggest that I should spend a significant portion of my time actually worrying about this. We should in society, too."
  • "Longer term—and this might mean 10 years, it might mean 50 or 100 years, depending on who you ask—when computers can do everything we can do," Tegmark said, “after that they will probably very rapidly get vastly better than us at everything, and we’ll face this question we talked about in the Huffington Post article: whether there’s really a place for us after that, or not.”
  • "This is very near-term stuff. Anyone who’s thinking about what their kids should study in high school or college should care a lot about this.”
  • Tegmark and his op-ed co-author Frank Wilczek, the Nobel laureate, draw examples of cold-war automated systems that assessed threats and resulted in false alarms and near misses. “In those instances some human intervened at the last moment and saved us from horrible consequences,” Wilczek told me earlier that day. “That might not happen in the future.”
  • there are still enough nuclear weapons in existence to incinerate all of Earth’s dense population centers, but that wouldn't kill everyone immediately. The smoldering cities would send sun-blocking soot into the stratosphere that would trigger a crop-killing climate shift, and that’s what would kill us all
  • “We are very reckless with this planet, with civilization,” Tegmark said. “We basically play Russian roulette.” The key is to think more long term, “not just about the next election cycle or the next Justin Bieber album.”
  • “There are several issues that arise, ranging from climate change to artificial intelligence to biological warfare to asteroids that might collide with the earth,” Wilczek said of the group’s launch. “They are very serious risks that don’t get much attention.
  • a widely perceived issue is when intelligent entities start to take on a life of their own. They revolutionized the way we understand chess, for instance. That’s pretty harmless. But one can imagine if they revolutionized the way we think about warfare or finance, either those entities themselves or the people that control them. It could pose some disquieting perturbations on the rest of our lives.”
  • Wilczek’s particularly concerned about a subset of artificial intelligence: drone warriors. “Not necessarily robots,” Wilczek told me, “although robot warriors could be a big issue, too. It could just be superintelligence that’s in a cloud. It doesn’t have to be embodied in the usual sense.”
  • it’s important not to anthropomorphize artificial intelligence. It’s best to think of it as a primordial force of nature—strong and indifferent. In the case of chess, an A.I. models chess moves, predicts outcomes, and moves accordingly. If winning at chess meant destroying humanity, it might do that.
  • Even if programmers tried to program an A.I. to be benevolent, it could destroy us inadvertently. Andersen’s example in Aeon is that an A.I. designed to maximize human happiness might conclude that flooding your bloodstream with heroin is the best way to do that.
  • “It’s not clear how big the storm will be, or how long it’s going to take to get here. I don’t know. It might be 10 years before there’s a real problem. It might be 20, it might be 30. It might be five. But it’s certainly not too early to think about it, because the issues to address are only going to get more complex as the systems get more self-willed.”
  • Even within A.I. research, Tegmark admits, “There is absolutely not a consensus that we should be concerned about this.” But there is a lot of concern, and a sense of powerlessness. Because, concretely, what can you do? “The thing we should worry about is that we’re not worried.”
  • Tegmark brings it down to Earth with an example about buying a stroller: You could spend more for a good one, or less for one that “sometimes collapses and crushes the baby, but nobody’s been able to prove that it is caused by any design flaw. But it’s 10 percent off! So which one are you going to buy?”
  • “There are seven billion of us on this little spinning ball in space. And we have so much opportunity," Tegmark said. "We have all the resources in this enormous cosmos. At the same time, we have the technology to wipe ourselves out.”
  • Ninety-nine percent of the species that have lived on Earth have gone extinct; why should we not? Seeing the biggest picture of humanity and the planet is the heart of this. It’s not meant to be about inspiring terror or doom. Sometimes that is what it takes to draw us out of the little things, where in the day-to-day we lose sight of enormous potentials.
Javier E

The Art of Focus - NYTimes.com - 1 views

  • in order to pursue their intellectual adventures, children need a secure social base:
  • The way to discover a terrifying longing is to liberate yourself from the self-censoring labels you began to tell yourself over the course of your mis-education. These formulas are stultifying
  • The lesson from childhood, then, is that if you want to win the war for attention, don’t try to say “no” to the trivial distractions you find on the information smorgasbord; try to say “yes” to the subject that arouses a terrifying longing, and let the terrifying longing crowd out everything else.
  • Don’t structure your encounters with them the way people do today, through brainstorming sessions (those don’t work) or through conferences with projection screens
  • Focus on the external objects of fascination, not on who you think you are. Find people with overlapping obsessions.
  • this creates a space internally into which one can be absorbed. In order to be absorbed one has to feel sufficiently safe, as though there is some shield, or somebody guarding
  • Instead look at the way children learn in groups. They make discoveries alone, but bring their treasures to the group. Then the group crowds around and hashes it out. In conversation, conflict, confusion and uncertainty can be metabolized and digested through somebody else.
  • Sixty-six percent of workers aren’t able to focus on one thing at a time. Seventy percent of employees don’t have regular time for creative or strategic thinking while at work.
  • Many of us lead lives of distraction, unable to focus on what we know we should focus on.
  • I wonder if we might be able to copy some of the techniques used by the creatures who are phenomenally good at learning things: children.