TOK Friends: Group items matching "Knowing" in title, tags, annotations or URL

What Americans Don't Know About Science - Eleanor Barkhorn - The Atlantic - 0 views

  • The report shows us that for at least some of the questions, Americans may be answering not based on knowledge, but on belief.
  • only 48 percent of Americans responded "true" to the statement "Human beings, as we know them today, developed from earlier species of animals." But if the question was reframed slightly, far more people responded with "true." Given the statement "according to the theory of evolution, human beings, as we know them today, developed from earlier species of animals," 72 percent answered "true."
  • A similar pattern happens with the Big Bang question. When the statement is simply “The universe began with a huge explosion,” 39 percent responded “true.” When it is “according to astronomers, the universe began with a huge explosion,” 60 percent said “true.”
  • ...1 more annotation...
  • This seems to indicate that many Americans are familiar with the theories of evolution and the Big Bang; they simply don't believe they're true.

NSF Report Flawed; Americans Do Not Believe Astrology is Scientific | NeoAcademic - 0 views

  • The problem with human subjects data – as any psychologist like myself will tell you – is that simply asking someone a question rarely gives you the information that you think it does. When you ask someone to respond to a question, it must pass through a variety of mental filters, and these filters often cause people’s answers to differ from reality. Some of these processes are conscious and others are not.
  • Learning, and by extension knowledge, are no different. People don’t always know what they know. And this NSF report is a fantastic example of this in action. The goal of the NSF researchers was to assess, “Do US citizens believe astrology is scientific?” People were troubled that young people now apparently believe astrology is more scientific than in the past. But this interpretation unwisely assumes that people accurately interpret the word astrology. It assumes that they know what astrology is and recognize that they know it in order to respond authentically.
  • When I saw the NSF report, I was reminded of my own poor understanding of these terms. “Surely,” I said to myself, “it’s not that Americans believe astrology is scientific. Instead, they must be confusing astronomy with astrology, like I did those many years ago.” Fortunately, I had a very quick way to answer this question: Amazon Mechanical Turk (MTurk).
  • ...2 more annotations...
  • MTurk is a fantastic tool available to quickly collect human subjects data. It pulls from a massive group of people looking to complete small tasks for small amounts of money. So for 5 cents per survey, I collected 100 responses to a short survey from American MTurk Workers. It asked only 3 questions: (1) Please define astrology in 25 words or less. (2) Do you believe astrology to be scientific? (using the same scale as the NSF study) (3) What is your highest level of education completed? (using the same scale as the NSF study)
  • Among those that correctly identified astrology as astrology, only 13.5% found it “pretty scientific” or “very scientific.” Only 1 person said it was “very scientific.” Among those that identified astrology as astronomy, the field was overwhelmingly seen as scientific, exactly as I expected. This is the true driver of the NSF report findings.
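A minimal sketch, in Python, of the kind of tabulation the excerpt above describes. The response data, coding labels, and rating scale here are hypothetical stand-ins, not the study's actual responses:

```python
# Tally how respondents rated astrology, split by what their free-text
# definition actually described. Data below is illustrative only.
from collections import Counter

# (what the respondent's definition described, their rating on the NSF scale)
responses = [
    ("astrology", "not at all scientific"),
    ("astrology", "pretty scientific"),
    ("astronomy", "very scientific"),
    ("astronomy", "pretty scientific"),
    # ... one tuple per respondent
]

for group in ("astrology", "astronomy"):
    ratings = [r for defined_as, r in responses if defined_as == group]
    counts = Counter(ratings)
    scientific = sum(counts[k] for k in ("pretty scientific", "very scientific"))
    pct = 100 * scientific / len(ratings)
    print(f"defined it as {group}: {pct:.1f}% call astrology scientific (n={len(ratings)})")
```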

The New York Times > Magazine > In the Magazine: Faith, Certainty and the Presidency of... - 0 views

  • The Delaware senator was, in fact, hearing what Bush's top deputies -- from cabinet members like Paul O'Neill, Christine Todd Whitman and Colin Powell to generals fighting in Iraq -- have been told for years when they requested explanations for many of the president's decisions, policies that often seemed to collide with accepted facts. The president would say that he relied on his ''gut'' or his ''instinct'' to guide the ship of state, and then he ''prayed over it.''
  • What underlies Bush's certainty? And can it be assessed in the temporal realm of informed consent?
  • Top officials, from cabinet members on down, were often told when they would speak in Bush's presence, for how long and on what topic. The president would listen without betraying any reaction. Sometimes there would be cross-discussions -- Powell and Rumsfeld, for instance, briefly parrying on an issue -- but the president would rarely prod anyone with direct, informed questions.
  • ...13 more annotations...
  • This is one key feature of the faith-based presidency: open dialogue, based on facts, is not seen as something of inherent value. It may, in fact, create doubt, which undercuts faith. It could result in a loss of confidence in the decision-maker and, just as important, by the decision-maker.
  • has spent a lot of time trying to size up the president. ''Most successful people are good at identifying, very early, their strengths and weaknesses, at knowing themselves,'' he told me not long ago. ''For most of us average Joes, that meant we've relied on strengths but had to work on our weakness -- to lift them to adequacy -- otherwise they might bring us down. I don't think the president really had to do that, because he always had someone there -- his family or friends -- to bail him out. I don't think, on balance, that has served him well for the moment he's in now as president. He never seems to have worked on his weaknesses.''
  • Details vary, but here's the gist of what I understand took place. George W., drunk at a party, crudely insulted a friend of his mother's. George senior and Barbara blew up. Words were exchanged along the lines of something having to be done. George senior, then the vice president, dialed up his friend, Billy Graham, who came to the compound and spent several days with George W. in probing exchanges and walks on the beach. George W. was soon born again. He stopped drinking, attended Bible study and wrestled with issues of fervent faith. A man who was lost was saved.
  • Rubenstein described that time to a convention of pension managers in Los Angeles last year, recalling that Malek approached him and said: ''There is a guy who would like to be on the board. He's kind of down on his luck a bit. Needs a job. . . . Needs some board positions.'' Though Rubenstein didn't think George W. Bush, then in his mid-40's, ''added much value,'' he put him on the Caterair board. ''Came to all the meetings,'' Rubenstein told the conventioneers. ''Told a lot of jokes. Not that many clean ones. And after a while I kind of said to him, after about three years: 'You know, I'm not sure this is really for you. Maybe you should do something else. Because I don't think you're adding that much value to the board. You don't know that much about the company.' He said: 'Well, I think I'm getting out of this business anyway. And I don't really like it that much. So I'm probably going to resign from the board.' And I said thanks. Didn't think I'd ever see him again.''
  • challenges -- from either Powell or his opposite number as the top official in domestic policy, Paul O'Neill -- were trials that Bush had less and less patience for as the months passed. He made that clear to his top lieutenants. Gradually, Bush lost what Richard Perle, who would later head a largely private-sector group under Bush called the Defense Policy Board Advisory Committee, had described as his open posture during foreign-policy tutorials prior to the 2000 campaign. (''He had the confidence to ask questions that revealed he didn't know very much,'' Perle said.) By midyear 2001, a stand-and-deliver rhythm was established. Meetings, large and small, started to take on a scripted quality.
  • That a deep Christian faith illuminated the personal journey of George W. Bush is common knowledge. But faith has also shaped his presidency in profound, nonreligious ways. The president has demanded unquestioning faith from his followers, his staff, his senior aides and his kindred in the Republican Party. Once he makes a decision -- often swiftly, based on a creed or moral position -- he expects complete faith in its rightness.
  • A cluster of particularly vivid qualities was shaping George W. Bush's White House through the summer of 2001: a disdain for contemplation or deliberation, an embrace of decisiveness, a retreat from empiricism, a sometimes bullying impatience with doubters and even friendly questioners.
  • By summer's end that first year, Vice President Dick Cheney had stopped talking in meetings he attended with Bush. They would talk privately, or at their weekly lunch. The president was spending a lot of time outside the White House, often at the ranch, in the presence of only the most trustworthy confidants.
  • ''When I was first with Bush in Austin, what I saw was a self-help Methodist, very open, seeking,'' Wallis says now. ''What I started to see at this point was the man that would emerge over the next year -- a messianic American Calvinist. He doesn't want to hear from anyone who doubts him.''
  • I had a meeting with a senior adviser to Bush. He expressed the White House's displeasure, and then he told me something that at the time I didn't fully comprehend -- but which I now believe gets to the very heart of the Bush presidency.
  • The aide said that guys like me were ''in what we call the reality-based community,'' which he defined as people who ''believe that solutions emerge from your judicious study of discernible reality.'' I nodded and murmured something about enlightenment principles and empiricism. He cut me off. ''That's not the way the world really works anymore,'' he continued. ''We're an empire now, and when we act, we create our own reality. And while you're studying that reality -- judiciously, as you will -- we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors . . . and you, all of you, will be left to just study what we do.''
  • ''If you operate in a certain way -- by saying this is how I want to justify what I've already decided to do, and I don't care how you pull it off -- you guarantee that you'll get faulty, one-sided information,'' Paul O'Neill, who was asked to resign his post of treasury secretary in December 2002, said when we had dinner a few weeks ago. ''You don't have to issue an edict, or twist arms, or be overt.''
  • George W. Bush and his team have constructed a high-performance electoral engine. The soul of this new machine is the support of millions of likely voters, who judge his worth based on intangibles -- character, certainty, fortitude and godliness -- rather than on what he says or does.

Spann proves media bias includes weather: 'They never let facts get in the way of a goo... - 0 views

  • Meteorologist James Spann’s no-nonsense, yet enthusiastic approach to making sure Alabamians know the latest weather information in our severe-weather-prone state has made him quite the pop culture favorite, especially on social media.
  • Spann is also not afraid to call people out when they spread misinformation.
  • The suspendered Spann, who boasts nearly 200,000 followers on Twitter, did exactly that in a recent article titled “The Age of Disinformation” for the national website Medium.com.
  • ...13 more annotations...
  • “Since my debut on television in 1979, I have been an eyewitness to the many changes in technology, society, and how we communicate. I am one who embraces change, and celebrates the higher quality of life we enjoy now thanks to this progress.
  • I realize the instant communication platforms we enjoy now do have some negatives that are troubling. Just a few examples in recent days…”
  • “This is a lenticular cloud. They have always been around, and quite frankly aren’t that unusual (although it is an anomaly to see one away from a mountain range). The one thing that is different today is that almost everyone has a camera phone, and almost everyone shares pictures of weather events. You didn’t see these often in earlier decades because technology didn’t allow it. Lenticular clouds are nothing new. But, yes, they are cool to see.”
  • This age of misinformation can lead to dangerous consequences, and promote an agenda, he warns.
  • “The Houston flooding is a great example. We are being told this is ‘unprecedented’… Houston is ‘under water’… and it is due to manmade global warming.” “Yes, the flooding in Houston yesterday was severe, and a serious threat to life and property. A genuine weather disaster that has brought on suffering.
  • this was not ‘unprecedented.’ Flooding from Tropical Storm Allison in 2001 was more widespread, and flood waters were deeper. There is no comparison.”
  • “Those on the right, and those on the left hang out in ‘echo chambers,’ listening to those with similar world views, refusing to believe anything else could be true.”
  • “Everyone knows the climate is changing; it always has, and always will. I do not know of a single ‘climate denier.’ I am still waiting to meet one.
  • “The debate involves the anthropogenic impact, and this is not why I am writing this piece. Let’s just say the Houston flood this week is weather, and not climate, and leave it at that.”
  • Spann lays much of the blame on the mainstream media and social media “hype and misinformation.”
  • “They will be sure to let you know that weather events they are reporting on are ‘unprecedented,’ there are ‘millions and millions in the path,’ it is caused by a ‘monster storm,’ and ‘the worst is yet to come’ since these events are becoming more ‘frequent.’
  • “You will never hear about the low tornado count in recent years, the lack of major hurricane landfalls on U.S. coasts over the past 10 years, or the low number of wildfires this year. It doesn’t fit their story.
  • never let facts get in the way of a good story…. there will ALWAYS be a heat wave, flood, wildfire, tornado, typhoon, cold wave, and snow storm somewhere. And, trust me, they will find them, and it will probably lead their newscasts.

Rise in Scientific Journal Retractions Prompts Calls for Reform - NYTimes.com - 1 views

  • before long they reached a troubling conclusion: not only that retractions were rising at an alarming rate, but that retractions were just a manifestation of a much more profound problem — “a symptom of a dysfunctional scientific climate,” as Dr. Fang put it.
  • he feared that science had turned into a winner-take-all game with perverse incentives that lead scientists to cut corners and, in some cases, commit acts of misconduct.
  • Members of the committee agreed with their assessment. “I think this is really coming to a head,” said Dr. Roberta B. Ness, dean of the University of Texas School of Public Health. And Dr. David Korn of Harvard Medical School agreed that “there are problems all through the system.”
  • ...20 more annotations...
  • science has changed in some worrying ways in recent decades — especially biomedical research, which consumes a larger and larger share of government science spending.
  • the journal Nature reported that published retractions had increased tenfold over the past decade, while the number of published papers had increased by just 44 percent.
  • because journals are now online, bad papers are simply reaching a wider audience, making it more likely that errors will be spotted.
  • The National Institutes of Health accepts a much lower percentage of grant applications today than in earlier decades. At the same time, many universities expect scientists to draw an increasing part of their salaries from grants, and these pressures have influenced how scientists are promoted.
  • Dr. Fang and Dr. Casadevall looked at the rate of retractions in 17 journals from 2001 to 2010 and compared it with the journals’ “impact factor,” a score based on how often their papers are cited by scientists. The higher a journal’s impact factor, the two editors found, the higher its retraction rate.
  • Each year, every laboratory produces a new crop of Ph.D.’s, who must compete for a small number of jobs, and the competition is getting fiercer. In 1973, more than half of biologists had a tenure-track job within six years of getting a Ph.D. By 2006 the figure was down to 15 percent.
  • Yet labs continue to have an incentive to take on lots of graduate students to produce more research. “I refer to it as a pyramid scheme.”
  • In such an environment, a high-profile paper can mean the difference between a career in science or leaving the field. “It’s becoming the price of admission,”
  • To survive professionally, scientists feel the need to publish as many papers as possible, and to get them into high-profile journals. And sometimes they cut corners or even commit misconduct to get there.
  • “What people do is they count papers, and they look at the prestige of the journal in which the research is published, and they see how many grant dollars scientists have, and if they don’t have funding, they don’t get promoted,” Dr. Fang said. “It’s not about the quality of the research.”
  • Dr. Ness likens scientists today to small-business owners, rather than people trying to satisfy their curiosity about how the world works. “You’re marketing and selling to other scientists,” she said. “To the degree you can market and sell your products better, you’re creating the revenue stream to fund your enterprise.”
  • Universities want to attract successful scientists, and so they have erected a glut of science buildings, Dr. Stephan said. Some universities have gone into debt, betting that the flow of grant money will eventually pay off the loans.
  • “You can’t afford to fail, to have your hypothesis disproven,” Dr. Fang said. “It’s a small minority of scientists who engage in frank misconduct. It’s a much more insidious thing that you feel compelled to put the best face on everything.”
  • Dr. Stephan points out that a number of countries — including China, South Korea and Turkey — now offer cash rewards to scientists who get papers into high-profile journals.
  • To change the system, Dr. Fang and Dr. Casadevall say, start by giving graduate students a better understanding of science’s ground rules — what Dr. Casadevall calls “the science of how you know what you know.”
  • They would also move away from the winner-take-all system, in which grants are concentrated among a small fraction of scientists. One way to do that may be to put a cap on the grants any one lab can receive.
  • Such a shift would require scientists to surrender some of their most cherished practices — the priority rule, for example, which gives all the credit for a scientific discovery to whoever publishes results first.
  • To ease such cutthroat competition, the two editors would also change the rules for scientific prizes and would have universities take collaboration into account when they decide on promotions.
  • Even scientists who are sympathetic to the idea of fundamental change are skeptical that it will happen any time soon. “I don’t think they have much chance of changing what they’re talking about,” said Dr. Korn, of Harvard.
  • “When our generation goes away, where is the new generation going to be?” he asked. “All the scientists I know are so anxious about their funding that they don’t make inspiring role models. I heard it from my own kids, who went into art and music respectively. They said, ‘You know, we see you, and you don’t look very happy.’ ”

All Can Be Lost: The Risk of Putting Our Knowledge in the Hands of Machines - Nicholas ... - 0 views

  • We rely on computers to fly our planes, find our cancers, design our buildings, audit our businesses. That's all well and good. But what happens when the computer fails?
  • On the evening of February 12, 2009, a Continental Connection commuter flight made its way through blustery weather between Newark, New Jersey, and Buffalo, New York.
  • The Q400 was well into its approach to the Buffalo airport, its landing gear down, its wing flaps out, when the pilot’s control yoke began to shudder noisily, a signal that the plane was losing lift and risked going into an aerodynamic stall. The autopilot disconnected, and the captain took over the controls. He reacted quickly, but he did precisely the wrong thing: he jerked back on the yoke, lifting the plane’s nose and reducing its airspeed, instead of pushing the yoke forward to gain velocity.
  • ...43 more annotations...
  • The crash, which killed all 49 people on board as well as one person on the ground, should never have happened.
  • The captain’s response to the stall warning, the investigators reported, “should have been automatic, but his improper flight control inputs were inconsistent with his training” and instead revealed “startle and confusion.”
  • Automation has become so sophisticated that on a typical passenger flight, a human pilot holds the controls for a grand total of just three minutes.
  • We humans have been handing off chores, both physical and mental, to tools since the invention of the lever, the wheel, and the counting bead.
  • And that, many aviation and automation experts have concluded, is a problem. Overuse of automation erodes pilots’ expertise and dulls their reflexes,
  • No one doubts that autopilot has contributed to improvements in flight safety over the years. It reduces pilot fatigue and provides advance warnings of problems, and it can keep a plane airborne should the crew become disabled. But the steady overall decline in plane crashes masks the recent arrival of “a spectacularly new type of accident,”
  • “We’re forgetting how to fly.”
  • The experience of airlines should give us pause. It reveals that automation, for all its benefits, can take a toll on the performance and talents of those who rely on it. The implications go well beyond safety. Because automation alters how we act, how we learn, and what we know, it has an ethical dimension. The choices we make, or fail to make, about which tasks we hand off to machines shape our lives and the place we make for ourselves in the world.
  • What pilots spend a lot of time doing is monitoring screens and keying in data. They’ve become, it’s not much of an exaggeration to say, computer operators.
  • Examples of complacency and bias have been well documented in high-risk situations—on flight decks and battlefields, in factory control rooms—but recent studies suggest that the problems can bedevil anyone working with a computer
  • That may leave the person operating the computer to play the role of a high-tech clerk—entering data, monitoring outputs, and watching for failures. Rather than opening new frontiers of thought and action, software ends up narrowing our focus.
  • A labor-saving device doesn’t just provide a substitute for some isolated component of a job or other activity. It alters the character of the entire task, including the roles, attitudes, and skills of the people taking part.
  • when we work with computers, we often fall victim to two cognitive ailments—complacency and bias—that can undercut our performance and lead to mistakes. Automation complacency occurs when a computer lulls us into a false sense of security. Confident that the machine will work flawlessly and handle any problem that crops up, we allow our attention to drift.
  • Automation bias occurs when we place too much faith in the accuracy of the information coming through our monitors. Our trust in the software becomes so strong that we ignore or discount other information sources, including our own eyes and ears
  • Automation is different now. Computers can be programmed to perform complex activities in which a succession of tightly coordinated tasks is carried out through an evaluation of many variables. Many software programs take on intellectual work—observing and sensing, analyzing and judging, even making decisions—that until recently was considered the preserve of humans.
  • Automation turns us from actors into observers. Instead of manipulating the yoke, we watch the screen. That shift may make our lives easier, but it can also inhibit the development of expertise.
  • Since the late 1970s, psychologists have been documenting a phenomenon called the “generation effect.” It was first observed in studies of vocabulary, which revealed that people remember words much better when they actively call them to mind—when they generate them—than when they simply read them.
  • When you engage actively in a task, you set off intricate mental processes that allow you to retain more knowledge. You learn more and remember more. When you repeat the same task over a long period, your brain constructs specialized neural circuits dedicated to the activity.
  • What looks like instinct is hard-won skill, skill that requires exactly the kind of struggle that modern software seeks to alleviate.
  • In many businesses, managers and other professionals have come to depend on decision-support systems to analyze information and suggest courses of action. Accountants, for example, use the systems in corporate audits. The applications speed the work, but some signs suggest that as the software becomes more capable, the accountants become less so.
  • You can put limits on the scope of automation, making sure that people working with computers perform challenging tasks rather than merely observing.
  • Experts used to assume that there were limits to the ability of programmers to automate complicated tasks, particularly those involving sensory perception, pattern recognition, and conceptual knowledge
  • Who needs humans, anyway? That question, in one rhetorical form or another, comes up frequently in discussions of automation. If computers’ abilities are expanding so quickly and if people, by comparison, seem slow, clumsy, and error-prone, why not build immaculately self-contained systems that perform flawlessly without any human oversight or intervention? Why not take the human factor out of the equation?
  • The cure for imperfect automation is total automation.
  • That idea is seductive, but no machine is infallible. Sooner or later, even the most advanced technology will break down, misfire, or, in the case of a computerized system, encounter circumstances that its designers never anticipated. As automation technologies become more complex, relying on interdependencies among algorithms, databases, sensors, and mechanical parts, the potential sources of failure multiply. They also become harder to detect.
  • conundrum of computer automation.
  • Because many system designers assume that human operators are “unreliable and inefficient,” at least when compared with a computer, they strive to give the operators as small a role as possible.
  • People end up functioning as mere monitors, passive watchers of screens. That’s a job that humans, with our notoriously wandering minds, are especially bad at
  • people have trouble maintaining their attention on a stable display of information for more than half an hour. “This means,” Bainbridge observed, “that it is humanly impossible to carry out the basic function of monitoring for unlikely abnormalities.”
  • a person’s skills “deteriorate when they are not used,” even an experienced operator will eventually begin to act like an inexperienced one if restricted to just watching.
  • You can program software to shift control back to human operators at frequent but irregular intervals; knowing that they may need to take command at any moment keeps people engaged, promoting situational awareness and learning. (A toy version of this handoff scheme is sketched after this list.)
  • What’s most astonishing, and unsettling, about computer automation is that it’s still in its early stages.
  • most software applications don’t foster learning and engagement. In fact, they have the opposite effect. That’s because taking the steps necessary to promote the development and maintenance of expertise almost always entails a sacrifice of speed and productivity.
  • Learning requires inefficiency. Businesses, which seek to maximize productivity and profit, would rarely accept such a trade-off. Individuals, too, almost always seek efficiency and convenience.
  • Abstract concerns about the fate of human talent can’t compete with the allure of saving time and money.
  • The small island of Igloolik, off the coast of the Melville Peninsula in the Nunavut territory of northern Canada, is a bewildering place in the winter.
  • Inuit hunters have for some 4,000 years ventured out from their homes on the island and traveled across miles of ice and tundra to search for game. The hunters’ ability to navigate vast stretches of the barren Arctic terrain, where landmarks are few, snow formations are in constant flux, and trails disappear overnight, has amazed explorers and scientists for centuries. The Inuit’s extraordinary way-finding skills are born not of technological prowess—they long eschewed maps and compasses—but of a profound understanding of winds, snowdrift patterns, animal behavior, stars, and tides.
  • The Igloolik hunters have begun to rely on computer-generated maps to get around. Adoption of GPS technology has been particularly strong among younger Inuit, and it’s not hard to understand why.
  • But as GPS devices have proliferated on Igloolik, reports of serious accidents during hunts have spread. A hunter who hasn’t developed way-finding skills can easily become lost, particularly if his GPS receiver fails.
  • The routes so meticulously plotted on satellite maps can also give hunters tunnel vision, leading them onto thin ice or into other hazards a skilled navigator would avoid.
  • An Inuit on a GPS-equipped snowmobile is not so different from a suburban commuter in a GPS-equipped SUV: as he devotes his attention to the instructions coming from the computer, he loses sight of his surroundings. He travels “blindfolded,” as Aporta puts it.
  • A unique talent that has distinguished a people for centuries may evaporate in a generation.
  • Computer automation severs the ends from the means. It makes getting what we want easier, but it distances us from the work of knowing. As we transform ourselves into creatures of the screen, we face an existential question: Does our essence still lie in what we know, or are we now content to be defined by what we want?
  •  
    Automation increases efficiency and speed of tasks, but decreases the individual's knowledge of a task and decreases a human's ability to learn.
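The handoff idea quoted above (returning control to the operator at frequent but irregular intervals) can be made concrete with a small simulation. This is a toy sketch under assumed parameters; the tick length and interval bounds are illustrative, not drawn from any real cockpit system:

```python
# Simulate a monitoring session in which the automation unpredictably
# hands control back to the human, keeping the operator engaged.
import random

next_handoff = random.uniform(5, 15)  # irregular by design

for tick in range(60):  # one tick = one second of simulated time
    if tick >= next_handoff:
        print(f"t={tick:2d}s  HUMAN: take the controls")  # operator must act
        next_handoff = tick + random.uniform(5, 15)       # schedule the next surprise
    else:
        print(f"t={tick:2d}s  autopilot: holding course")
```

Because the operator cannot predict when the next handoff will come, passive watching stops being a viable strategy; that unpredictability is the point of the design.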

WHICH IS THE BEST LANGUAGE TO LEARN? | More Intelligent Life - 2 views

  • For language lovers, the facts are grim: Anglophones simply aren’t learning them any more. In Britain, despite four decades in the European Union, the number of A-levels taken in French and German has fallen by half in the past 20 years, while what was a growing trend of Spanish-learning has stalled. In America, the numbers are equally sorry.
  • compelling reasons remain for learning other languages.
  • First of all, learning any foreign language helps you understand all language better—many Anglophones first encounter the words “past participle” not in an English class, but in French. Second, there is the cultural broadening. Literature is always best read in the original. Poetry and lyrics suffer particularly badly in translation. And learning another tongue helps the student grasp another way of thinking.
  • ...11 more annotations...
  • is Chinese the language of the future?
  • So which one should you, or your children, learn? If you take a glance at advertisements in New York or A-level options in Britain, an answer seems to leap out: Mandarin.
  • The practical reasons are just as compelling. In business, if the team on the other side of the table knows your language but you don’t know theirs, they almost certainly know more about you and your company than you do about them and theirs—a bad position to negotiate from.
  • This factor is the Chinese writing system (which Japan borrowed and adapted centuries ago). The learner needs to know at least 3,000-4,000 characters to make sense of written Chinese, and thousands more to have a real feel for it. Chinese, with all its tones, is hard enough to speak. But  the mammoth feat of memory required to be literate in Mandarin is harder still. It deters most foreigners from ever mastering the system—and increasingly trips up Chinese natives.
  • If you were to learn ten languages ranked by general usefulness, Japanese would probably not make the list. And the key reason for Japanese’s limited spread will also put the brakes on Chinese.
  • A recent survey reported in the People’s Daily found 84% of respondents agreeing that skill in Chinese is declining.
  • Fewer and fewer native speakers learn to produce characters in traditional calligraphy. Instead, they write their language the same way we do—with a computer. And not only that, but they use the Roman alphabet to produce Chinese characters: type in wo and Chinese language-support software will offer a menu of characters pronounced wo; the user selects the one desired. (Or if the user types in wo shi zhongguo ren, “I am Chinese”, the software detects the meaning and picks the right characters.) With less and less need to recall the characters cold, the Chinese are forgetting them. (A toy version of this pinyin lookup is sketched after this list.)
  • As long as China keeps the character-based system—which will probably be a long time, thanks to cultural attachment and practical concerns alike—Chinese is very unlikely to become a true world language, an auxiliary language like English, the language a Brazilian chemist will publish papers in, hoping that they will be read in Finland and Canada. By all means, if China is your main interest, for business or pleasure, learn Chinese. It is fascinating, and learnable—though Moser’s online essay, “Why Chinese is so damn hard,” might discourage the faint of heart and the short of time.
  • But if I were asked what foreign language is the most useful, and given no more parameters (where? for what purpose?), my answer would be French. Whatever you think of France, the language is much less limited than many people realise.
  • French ranks only 16th on the list of languages ranked by native speakers. But ranked above it are languages like Telugu and Javanese that no one would call world languages. Hindi does not even unite India. Also in the top 15 are Arabic, Spanish and Portuguese, major languages to be sure, but regionally concentrated. If your interest is the Middle East or Islam, by all means learn Arabic. If your interest is Latin America, Spanish or Portuguese is the way to go. Or both; learning one makes the second quite easy.
  • if you want another truly global language, there are surprisingly few candidates, and for me French is unquestionably top of the list. It can enhance your enjoyment of art, history, literature and food, while giving you an important tool in business and a useful one in diplomacy. It has native speakers in every region on earth. And lest we forget its heartland itself, France attracts more tourists than any other country—76.8m in 2010, according to the World Tourism Organisation, leaving America a distant second with 59.7m.
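The pinyin input described in the annotations above amounts to a lookup from romanized syllables to menus of candidate characters. A toy sketch; the four-entry dictionary is illustrative, and real input-method software uses vast lexicons plus context models to rank candidates:

```python
# Map typed pinyin syllables to the menu of characters a user picks from.
PINYIN_TO_CHARS = {
    "wo": ["我", "窝", "卧"],        # all pronounced "wo"
    "shi": ["是", "十", "时"],
    "zhongguo": ["中国"],
    "ren": ["人", "仁"],
}

def candidates(syllable: str) -> list[str]:
    """Return the menu of characters offered for one typed syllable."""
    return PINYIN_TO_CHARS.get(syllable, [])

# Typing "wo shi zhongguo ren" produces one menu per syllable:
for syllable in "wo shi zhongguo ren".split():
    print(syllable, "->", candidates(syllable))
```

The user never has to recall a character's strokes, only to recognize it in a menu, which is why the excerpt says cold recall is atrophying.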

The Book Bench: Is Self-Knowledge Overrated? : The New Yorker - 1 views

  • It’s impossible to overstate the influence of Kahneman and Tversky. Like Darwin, they helped to dismantle a longstanding myth of human exceptionalism. Although we’d always seen ourselves as rational creatures—this was our Promethean gift—it turns out that human reason is rather feeble, easily overwhelmed by ancient instincts and lazy biases. The mind is a deeply flawed machine.
  • there is a subtle optimism lurking in all of Kahneman’s work: it is the hope that self-awareness is a form of salvation, that if we know about our mental mistakes, we can avoid them. One day, we will learn to equally weigh losses and gains; science can help us escape from the cycle of human error. As Kahneman and Tversky noted in the final sentence of their classic 1974 paper, “A better understanding of these heuristics and of the biases to which they lead could improve judgments and decisions in situations of uncertainty.” Unfortunately, such hopes appear to be unfounded. Self-knowledge isn’t a cure for irrationality; even when we know why we stumble, we still find a way to fall.
  • self-knowledge is surprisingly useless. Teaching people about the hazards of multitasking doesn’t lead to less texting in the car; learning about the weakness of the will doesn’t increase the success of diets; knowing that most people are overconfident about the future doesn’t make us more realistic. The problem isn’t that we’re stupid—it’s that we’re so damn stubborn.
  • ...1 more annotation...
  • Kahneman has given us a new set of labels for our shortcomings. But his greatest legacy, perhaps, is also his bleakest: By categorizing our cognitive flaws, documenting not just our errors but also their embarrassing predictability, he has revealed the hollowness of a very ancient aspiration. Knowing thyself is not enough. Not even close.

The Mental Virtues - NYTimes.com - 0 views

  • Even if you are alone in your office, you are thinking. Thinking well under a barrage of information may be a different sort of moral challenge than fighting well under a hail of bullets, but it’s a character challenge nonetheless.
  • some of the cerebral virtues. We can all grade ourselves on how good we are at each of them.
  • love of learning. Some people are just more ardently curious than others, either by cultivation or by nature.
  • ...12 more annotations...
  • courage. The obvious form of intellectual courage is the willingness to hold unpopular views. But the subtler form is knowing how much risk to take in jumping to conclusions.
  • Intellectual courage is self-regulation, Roberts and Wood argue, knowing when to be daring and when to be cautious. The philosopher Thomas Kuhn pointed out that scientists often simply ignore facts that don’t fit with their existing paradigms, but an intellectually courageous person is willing to look at things that are surprisingly hard to look at.
  • The median point between flaccidity and rigidity is the virtue of firmness. The firm believer can build a steady worldview on solid timbers but still delight in new information. She can gracefully adjust the strength of her conviction to the strength of the evidence. Firmness is a quality of mental agility.
  • humility, which is not letting your own desire for status get in the way of accuracy. The humble person fights against vanity and self-importance.
  • The humble researcher doesn’t become arrogant toward his subject, assuming he has mastered it. Such a person is open to learning from anyone at any stage in life.
  • autonomy
  • Autonomy is the median of knowing when to bow to authority and when not to, when to follow a role model and when not to, when to adhere to tradition and when not to.
  • generosity. This virtue starts with the willingness to share knowledge and give others credit. But it also means hearing others as they would like to be heard, looking for what each person has to teach and not looking to triumphantly pounce upon their errors.
  • thinking well means pushing against the grain of our nature — against vanity, against laziness, against the desire for certainty, against the desire to avoid painful truths. Good thinking isn’t just adopting the right technique. It’s a moral enterprise and requires good character, the ability to go against our lesser impulses for the sake of our higher ones.
  • wisdom isn’t a body of information. It’s the moral quality of knowing how to handle your own limitations.
  • Warren Buffett made a similar point in his own sphere: “Investing is not a game where the guy with the 160 I.Q. beats the guy with the 130 I.Q. Once you have ordinary intelligence, what you need is the temperament to control the urges that get other people into trouble.”
  • Good piece. I only wish David had written more about all the forces that work _against_ the virtues he describes. The innumerable examples of corporate suppression/spin of "inconvenient" truths (e.g., GM, Toyota, et al.); the virtual acceptance that lying is a legitimate tactic in political campaigns; our preoccupation with celebrity, appearances, and "looking good" in every imaginable transaction -- all these make the quiet virtues that DB describes even more heroic than he suggests.

What Makes a Positive College Experience? - 0 views

  • sociologist at Hamilton College
  • believes he knows what most determines how students feel about their time at college
  • dorm design, friends and extracurricular involvement more than what happens in the classroom
  • ...14 more annotations...
  • What’s the most important element in shaping the college experience?
  • who meets whom, and when
  • So what should students do to get more out of college?
  • As a freshman, live in one of the old-fashioned dorms with the long hallways, multiple roommates and communal bathroom, where you’ll have to bump into a lot of different people every day.
  • In choosing classes, pick the teacher over the topic.
  • Try to get to know a lot of people your first year, when everyone is looking for friends.
  • It helps to join a large high-contact activity, like a sports team or choir
  • it only takes two or three close friends and one or two great professors to have a fulfilling college experience
  • should students take as many small classes as they can get into?
  • Small classes are great. But most colleges also have some wonderful very large classes
  • What should colleges do to make students’ experiences better?
  • things that give the biggest payoff for the least effort
  • What do colleges do that doesn’t improve the experience?
  • Strategic plans.
  •  
    A sociologist believes he knows what determines how students feel about their time at college. Sociology is the study of society and social behavior.

Facebook Has All the Power - Julie Posetti - The Atlantic - 0 views

  • scholars covet thy neighbor's data. They're attracted to the very large and often fascinating data sets that private companies have developed.
  • It's the companies that own and manage this data. The only standards we know they have to follow are in the terms-of-service that users accept to create an account, and the law as it stands in different countries.
  • the "sexiness" of the Facebook data that led Cornell University and the Proceedings of the National Academy of Sciences (PNAS) into an ethically dubious arrangement, where, for example, Facebook's unreadable 9,000-word terms-of-service are said to be good enough to meet the standard for "informed consent."
  • ...9 more annotations...
  • When the study drew attention and controversy, there was a moment when they both could have said: "We didn't look carefully enough at this the first time. Now we can see that it doesn't meet our standards." Instead they allowed Facebook and the PR people to take the lead in responding to the controversy.
  • What should this reality signal to Facebook users? Is it time to pull back? You have (almost) no rights. You have (almost) no control. You have no idea what they're doing to you or with you. You don't even know who's getting the stuff you are posting, and you're not allowed to know. Trade secret!
  • Are there any particular warnings here for journalists and editors in terms of their exposure on Facebook? Yeah. Facebook has all the power. You have almost none. Just keep that in mind in all your dealings with it, as an individual with family and friends, as a journalist with a story to file, and as a news organization that is "on" Facebook.
  • I am not in a commercial situation where I have to maximize my traffic, so I can opt out. Right now my choice is to keep my account, but use it cynically. 
  • does this level of experimentation indicate the prospect of a further undermining of audience-driven news priorities and traditional news values? The right way to think about it is a loss of power—for news producers and their priorities. As I said, Facebook thinks it knows better than I do what "my" 180,000 subscribers should get from me.
  • Facebook has "where else are they going to go?" logic now. And they have good reason for this confidence. (It's called network effects.) But "where else are they going to go?" is a long way from trust and loyalty. It is less a durable business model than a statement of power. 
  • I distinguished between the "thin" legitimacy that Facebook operates under and the "thick" legitimacy that the university requires to be the institution it was always supposed to be. (Both are distinct from il-legitimacy.) News organizations should learn to make this distinction more often. Normal PR exists to muddle it. Which is why you don't hand a research crisis over to university PR people.
  • some commentators have questioned the practice of A/B headline testing in the aftermath of this scandal—is there a clear connection? The connection to me is that both are forms of behaviourism. Behaviourism is a view of human beings in which, as Hannah Arendt said, they are reduced to the level of a conditioned and "behaving" animal—an animal that responds to these stimuli but not those. This is why a popular shorthand for Facebook's study was that users were being treated as lab rats.
  • Journalism is supposed to be about informing people so they can understand the world and take action when necessary. Action and behaviour are not the same thing at all. One is a conscious choice, the other a human tendency. There's a tension, then, between commercial behaviourism, which may be deeply functional in some ways for the news industry, and informing people as citizens capable of understanding their world well enough to improve it, which is the deepest purpose of journalism. A/B testing merely highlights this tension.

What makes us human? Doing pointless things for fun - 2 views

  • Playfulness is what makes us human. Doing pointless, purposeless things, just for fun. Doing things for the sheer devilment of it. Being silly for the sake of being silly. Larking around. Taking pleasure in activities that do not advantage us and have nothing to do with our survival. These are the highest signs of intelligence. It is when a creature, having met and surmounted all the practical needs that face him, decides to dance that we know we are in the presence of a human. It is when a creature, having successfully performed all necessary functions, starts to play the fool, just for the hell of it, that we know he is not a robot.
  • All at once, it was clear. The bush people, lounging about after dark in their family shelter, perhaps around a fire – basically just hanging out – had been amusing themselves doing a bit of rock art. And perhaps with some leftover red paste, a few of the younger ones had had a competition to see who could jump highest and make their fingermarks highest up the overhang. This was not even art. It called for no particular skill. It was just mucking about. And yet, for all the careful beauty of their pictures, for all the recognition of their lives from the vantage point of my life that was sparked in me by the appreciation of their artwork, it was not what was skilful that brought me closest to them. It was what was playful. It was their jumping and daubing finger-blobs competition that brought them to me, suddenly, as fellow humans across all those thousands of years. It tingled my spine.
  • An age is coming when machines will be able to do everything. “Ah,” you say, “but they will not be conscious.” But how will we know a machine is not conscious – how do we know another human being is conscious? There is only one way. When it starts to play. In playfulness lies the highest expression of the human spirit.

Pulling Teeth to Treat Mental Illness - The Atlantic - 0 views

  • Cotton's experiments were unethical and awful, but they weren't that illogical if you consider the knowledge that was available at the time. This was before surgeons operated with gloves on, before doctors knew that people shouldn't stand in front of X-ray machines for 45 minutes, and before people knew about blood types or heroin addiction or that eugenics is not a thing.
  • "Modern medicine had to start somewhere."
  • it's also a reminder of how little we still know about the brain. Certainly, science has progressed to the point where patients aren't subjected to painful and permanent procedures without their consent, and we obviously now know the basic mechanisms behind mental illness. But we still don't know, say, the very best way to prevent schizophrenia or to treat addiction.
  • ...1 more annotation...
  • To some extent, the brain remains a bit of a black box, as puzzling to modern-day psychiatrists as it was to turn-of-the-century charlatans. The difference is, most doctors today have the humility to admit what they don't know.

BBC - Future - The countries that don't exist - 2 views

  • In the deep future, every territory we know could eventually become a country that doesn’t exist.
    • silveiragu
       
      Contrary to the human expectation that situations remain constant. 
  • There really is a secret world of hidden independent nations
  • Middleton, however, is here to talk about countries missing from the vast majority of books and maps for sale here. He calls them the “countries that don’t exist”
    • silveiragu
       
      Reminds us of our strange relationship with nationalism-that we forget how artificial countries' boundaries are. 
  • ...21 more annotations...
  • The problem, he says, is that we don’t have a watertight definition of what a country is. “Which as a geographer, is kind of shocking
  • The globe, it turns out, is full of small (and not so small) regions that have all the trappings of a real country
  • and are ignored on most world maps.
  • Middleton, a geographer at the University of Oxford, has now charted these hidden lands in his new book, An Atlas of Countries that Don’t Exist
  • Middleton’s quest began, appropriately enough, with Narnia
    • silveiragu
       
      Interesting connection to imagination as a way of knowing.
  • a defined territory, a permanent population, a government, and “the capacity to enter into relations with other states”.
  • In Australia, meanwhile, the Republic of Murrawarri was founded in 2013, after the indigenous tribe wrote a letter to Queen Elizabeth II asking her to prove her legitimacy to govern their land.
  • Yet many countries that meet these criteria aren‘t members of the United Nations (commonly accepted as the final seal of a country’s statehood).
  • many of them are instead members of the “Unrepresented United Nations” – an alternative body to champion their rights.
  • A handful of the names will be familiar to anyone who has read a newspaper: territories such as Taiwan, Tibet, Greenland, and Northern Cyprus.
  • The others are less famous, but they are by no means less serious
    • silveiragu
       
      By what criterion, "serious"?
  • One of the most troubling histories, he says, concerns the Republic of Lakotah (with a population of 100,000). Bang in the centre of the United States of America (just east of the Rocky Mountains), the republic is an attempt to reclaim the sacred Black Hills for the Lakota Sioux tribe.
  • Their plight began in the 18th Century, and by 1868 they had finally signed a deal with the US government that promised the right to live on the Black Hills. Unfortunately, they hadn’t accounted for a gold rush.
  • Similar battles are being fought across every continent.
  • In fact, you have almost certainly, unknowingly, visited one.
  • Christiania, an enclave in the heart of Copenhagen.
  • On 26 September that year, they declared it independent, with its own “direct democracy”, in which each of the inhabitants (now numbering 850) could vote on any important matter.
    • silveiragu
       
      Interesting reminder that the label "country" does not only have to arise from military or economic struggles, as is tempting to think in our study of history. Also, interesting reminder that the label of "country"-by itself-means nothing. 
  • a blind eye to the activities
    • silveiragu
       
      That is really why any interest is demonstrated towards this topic. Not that some country named Christiania exists in the heart of Denmark, but that they can legitimately call themselves a nation. We have grown up, and our parents have grown up, with a rigid definition of nationalism, and the strange notion that the lines in an atlas were always there. One interpretation of the Danish government's response to Christiania is simply that they do not know what to think. Although probably not geopolitically significant, such enclave states represent a challenge to our perception of countries, one which fascinates Middleton's readers because it disconcerts them.
  • perhaps we need to rethink the concept of the nation-state altogether? He points to Antarctica, a continent shared peacefully among the international community.
    • silveiragu
       
      A sign of progress, perhaps, from the industrialism-spurred cycle of divide land, industrialize, and repeat-even if the chief reason is the region's climate. 
  • The last pages of Middleton’s Atlas contain two radical examples that question everything we think we mean by the word ‘country’.
    • silveiragu
       
      These "nonexistent countries"-and our collective disregard for them-are reminiscent of the 17th and 18th centuries: then, the notion of identifying by national lines was almost as strange and artificial as these countries' borders seem to us today.
  • “They all raise the possibility that countries as we know them are not the only legitimate basis for ordering the planet.”

Here's what the government's dietary guidelines should really say - The Washington Post - 0 views

  • If I were writing the dietary guidelines, I would give them a radical overhaul. I’d go so far as to radically overhaul the way we evaluate diet. Here’s why and how.
  • Lately, as scientists try, and fail, to reproduce results, all of science is taking a hard look at funding biases, statistical shenanigans and groupthink. All that criticism, and then some, applies to nutrition.
  • Prominent in the charge to change the way we do science is John Ioannidis, professor of health research and policy at Stanford University. In 2005, he published “Why Most Published Research Findings Are False” in the journal PLOS Medicine.
  • ...15 more annotations...
  • He came down hard on nutrition in a pull-no-punches 2013 British Medical Journal editorial titled, “Implausible results in human nutrition research,” in which he noted, “Almost every single nutrient imaginable has peer reviewed publications associating it with almost any outcome.”
  • Ioannidis told me that sussing out the connection between diet and health — nutritional epidemiology — is enormously challenging, and “the tools that we’re throwing at the problem are not commensurate with the complexity and difficulty of the problem.” The biggest of those tools is observational research, in which we collect data on what people eat, and track what happens to them.
  • He lists plant-based foods — fruit, veg, whole grains, legumes — but acknowledges that we don’t understand enough to prescribe specific combinations or numbers of servings.
  • funding bias isn’t the only kind. “Fanatical opinions abound in nutrition,” Ioannidis wrote in 2013, and those have bias power too.
  • “Definitive solutions won’t come from another million observational papers or small randomized trials,” reads the subtitle of Ioannidis’s paper. His is a burn-down-the-house ethos.
  • When it comes to actual dietary recommendations, the disagreement is stark. “Ioannidis and others say we have no clue, the science is so bad that we don’t know anything,” Hu told me. “I think that’s completely bogus. We know a lot about the basic elements of a healthy diet.”
  • Give tens of thousands of people that FFQ [food-frequency questionnaire], and you end up with a ginormous repository of possible correlations. You can zero in on a vitamin, macronutrient or food, and go to town. But not only are you starting with flawed data, you’ve got a zillion possible confounding variables — dietary, demographic, socioeconomic. I’ve heard statisticians call it “noise mining,” and Ioannidis is equally skeptical. “With this type of data, you can get any result you want,” he said. “You can align it to your beliefs.” (A toy simulation of this noise mining appears after this list.)
  • Big differences in what people eat track with other differences. Heavy plant-eaters are different from, say, heavy meat-eaters in all kinds of ways (income, education, physical activity, BMI). Red meat consumption correlates with increased risk of dying in an accident as much as dying from heart disease. The amount of faith we put in observational studies is a judgment call.
  • I find myself in Ioannidis’s camp. What have we learned, unequivocally enough to build a consensus in the nutrition community, about how diet affects health? Well, trans-fats are bad.
  • Over and over, large population studies get sliced and diced, and it’s all but impossible to figure out what’s signal and what’s noise. Researchers try to do that with controlled trials to test the connections, but those have issues too. They’re expensive, so they’re usually small and short-term. People have trouble sticking to the diet being studied. And scientists are generally looking for what they call “surrogate endpoints,” like increased cholesterol rather than death from heart disease, since it’s impractical to keep a trial going until people die.
  • So, what do we do? Hu and Ioannidis actually have similar suggestions. For starters, they both think we should be looking at dietary patterns rather than single foods or nutrients. They also both want to look across the data sets. Ioannidis emphasizes transparency. He wants to open data to the world and analyze all the data sets in the same way to see if “any signals survive.” Hu is more cautious (partly to safeguard confidentiality).
  • I have a suggestion. Let’s give up on evidence-based eating. It’s given us nothing but trouble and strife. Our tools can’t find any but the most obvious links between food and health, and we’ve found those already.
  • Instead, let’s acknowledge the uncertainty and eat to hedge against what we don’t know
  • We’ve got two excellent hedges: variety and foods with nutrients intact (which describes such diets as the Mediterranean, touted by researchers). If you severely limit your foods (vegan, keto), you might miss out on something. Ditto if you eat foods with little nutritional value (sugar, refined grains). Oh, and pay attention to the two things we can say with certainty: Keep your weight down, and exercise.
  • I used to say I could tell you everything important about diet in 60 seconds. Over the years, my spiel got shorter and shorter as truisms fell by the wayside, and my confidence waned in a field where we know less, rather than more, over time. I’m down to five seconds now: Eat a wide variety of foods with their nutrients intact, keep your weight down and get some exercise.
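
A toy illustration of the “noise mining” problem flagged in the FFQ annotation above, with everything invented for demonstration: a simulated cohort answers a 200-item questionnaire, the health outcome is random by construction, and a naive scan across all food-outcome correlations still turns up “significant” findings.

```python
# Illustrative sketch only: invented cohort, invented FFQ items, pure noise.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_people, n_foods = 10_000, 200  # hypothetical survey size and item count

intake = rng.normal(size=(n_people, n_foods))  # simulated FFQ responses
outcome = rng.normal(size=n_people)            # outcome unrelated to intake by construction

# Naive analysis: test every food item against the outcome.
p_values = [stats.pearsonr(intake[:, j], outcome)[1] for j in range(n_foods)]

hits = sum(p < 0.05 for p in p_values)
print(f"'Significant' food-outcome associations found in pure noise: {hits} of {n_foods}")
```

By chance alone, roughly 5 percent of the 200 tests clear p < 0.05, so about ten spurious food-health “associations” emerge from data that contains none; that is the sense in which, with enough variables, “you can get any result you want.”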

Why Study Philosophy? 'To Challenge Your Own Point of View' - The Atlantic - 1 views

  • Goldstein’s forthcoming book, Plato at the Googleplex: Why Philosophy Won’t Go Away, offers insight into the significant—and often invisible—progress that philosophy has made. I spoke with Goldstein about her take on the science vs. philosophy debates, how we can measure philosophy’s advances, and why an understanding of philosophy is critical to our lives today.
  • One of the things about philosophy is that you don’t have to give up on any other field. Whatever field there is, there’s a corresponding field of philosophy. Philosophy of language, philosophy of politics, philosophy of math. All the things I wanted to know about I could still study within a philosophical framework.
  • There’s a peer pressure that sets in at a certain age. They so much want to be like everybody else. But what I’ve found is that if you instill this joy of thinking, the sheer intellectual fun, it will survive even the adolescent years and come back in fighting form. It’s empowering.
  • ...18 more annotations...
  • One thing that’s changed tremendously is the presence of women and the change in focus because of that. There’s a lot of interest in literature and philosophy, and using literature as a philosophical examination. It makes me so happy! Because I was seen as a hard-core analytic philosopher, and when I first began to write novels people thought, Oh, and we thought she was serious! But that’s changed entirely. People take literature seriously, especially in moral philosophy, as thought experiments. A lot of the most developed and effective thought experiments come from novels. Also, novels contribute to making moral progress, changing people’s emotions.
  • The other thing that’s changed is that there’s more applied philosophy. Let’s apply philosophical theory to real-life problems, like medical ethics, environmental ethics, gender issues. This is a real change from when I was in school and it was only theory.
  • There’s a lot of philosophical progress, it’s just a progress that’s very hard to see. It’s very hard to see because we see with it. We incorporate philosophical progress into our own way of viewing the world.
  • Plato would be constantly surprised by what we know. And not only what we know scientifically, or by our technology, but what we know ethically. We take a lot for granted. It’s obvious to us, for example, that individuals’ ethical truths are equally important.
  • it’s usually philosophical arguments that first introduce the very outlandish idea that we need to extend rights. And it takes more, it takes a movement, and activism, and emotions, to effect real social change. It starts with an argument, but then it becomes obvious. The tracks of philosophy’s work are erased because it becomes intuitively obvious.
  • The arguments against slavery, against cruel and unusual punishment, against unjust wars, against treating children cruelly—these all took arguments.
  • About 30 years ago, the philosopher Peter Singer started to argue about the way animals are treated in our factory farms. Everybody thought he was nuts. But I’ve watched this movement grow; I’ve watched it become emotional. It has to become emotional. You have to draw empathy into it. But here it is, right in our time—a philosopher making the argument, everyone dismissing it, but then people start discussing it. Even criticizing it, or saying it’s not valid, is taking it seriously
  • This is what we have to teach our children. Even things that go against their intuition they need to take seriously. What was intuition two generations ago is no longer intuition; and it’s arguments that change it.
  • We are very inertial creatures. We do not like to change our thinking, especially if it’s inconvenient for us. And certainly the people in power never want to wonder whether they should hold power.
  • I’m really trying to draw the students out, make them think for themselves. The more they challenge me, the more successful I feel as a teacher. It has to be very active
  • Plato used the metaphor that in teaching philosophy, there needs to be a fire in the teacher, and the sheer heat will help the fire grow in the student. It’s something that’s kindled because of the proximity to the heat.
  • how can you make the case that they should study philosophy?
  • It enriches your inner life. You have lots of frameworks to apply to problems, and so many ways to interpret things. It makes life so much more interesting. It’s us at our most human. And it helps us increase our humanity. No matter what you do, that’s an asset.
  • What do you think are the biggest philosophical issues of our time? The growth in scientific knowledge presents new philosophical issues.
  • The idea of the multiverse. Where are we in the universe? Physics is blowing our minds about this.
  • The question of whether some of these scientific theories are really even scientific. Can we get predictions out of them?
  • And with the growth in cognitive science and neuroscience. We’re going into the brain and getting these images of the brain. Are we discovering what we really are? Are we solving the problem of free will? Are we learning that there isn’t any free will? How much do the advances in neuroscience tell us about the deep philosophical issues?
  • With the decline of religion is there a sense of the meaninglessness of life and the easy consumerist answer that’s filling the space religion used to occupy? This is something that philosophers ought to be addressing.

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • ...39 more annotations...
  • he [Dr. Adrian Ward, the psychologist quoted below] has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.

Can Social Networks Do Better? We Don't Know Because They Haven't Tried - Talking Point... - 0 views

  • it’s not fair to say it’s Facebook or a Facebook problem. Facebook is just the latest media and communications medium. We hardly blame the technology of the book for spreading anti-Semitism via the notorious Protocols of the Elders of Zion
  • But of course, it’s not that simple. Social media platforms have distinct features that earlier communications media did not. The interactive nature of the media, combined with the collection of data that is then run through algorithms and artificial intelligence, creates something different.
  • All social media platforms are engineered with two basic goals: maximize the time you spend on the platform and make advertising as effective and thus as lucrative as possible. This means that social media can never be simply a common carrier, a distribution technology that has no substantial influence over the nature of the communication that travels over it.
  • ...5 more annotations...
  • it’s a substantial difference which deprives social media platforms of the kind of hands-off logic that would make it ridiculous to say phones are bad or the phone company is responsible if planning for a mass murder was carried out over the phone.
  • the Internet doesn’t ‘do’ anything more than make the distribution of information more efficient and radically lower the formal, informal and financial barriers to entry that used to stand in the way of various marginalized ideas.
  • Social media can never plead innocence like this because the platforms are designed to addict you and convince you of things.
  • If the question is: what can social media platforms do to protect against government-backed subversion campaigns like the one we saw in the 2016 campaign the best answer is, we don’t know. And we don’t know for a simple reason: they haven’t tried.
  • The point is straightforward: the mass collection of data, harnessed to modern computing power and the chance to amass unimaginable wealth has spurred vast technological innovation.

History News Network | Just How Stupid Are We? Facing the Truth About Donald Trump's Am... - 1 views

  •  Just How Stupid Are We? Facing the Truth About the American Voter. The book is filled with statistics like these: ● A majority of Americans don’t know which party is in control of Congress. ● A majority can’t name the chief justice of the Supreme Court. ● A majority don’t know we have three branches of government.
  • suddenly mainstream media pundits have discovered how ignorant millions of voters are. More importantly, the concern with low-information voters has become widespread. Many are now wondering what country they’re living in.
  • The answer science gives us (the title of my last book and this essay notwithstanding) is not that people fall for slick charlatans like Trump because they’re stupid.
  • ...19 more annotations...
  •  The problem is that we humans didn’t evolve to live in the world in which we find ourselves.  As the social scientists Leda Cosmides and John Tooby put it, the human mind was “designed to solve the day-to-day problems of our hunter-gatherer ancestors. These stone age priorities produced a brain far better at solving some problems than others.” 
  • there are four failings common to human beings as a result of our Stone-Age brain that hinder us in politics.
  • why are we this way?  Science suggests that one reason is that we evolved to win in social settings and in such situations the truth doesn't matter as much as sheer doggedness
  • Second, we find it hard to size up politicians correctly.  The reason for this is that we rely on instant impressions. 
  • This stops voters from worrying that they need to bolster their impressions by consulting experts and reading news stories from a broad array of ideological viewpoints.  Why study when you can rely on your gut instinct?
  • Third, we aren’t inclined to reward politicians who tell us hard truths.
  • First, most people find it easy to ignore politics because it usually involves people they don’t know.  As human beings we evolved to care about people in our immediate vicinity.  Our nervous system kicks into action usually only when we meet people face-to-face
  •  This has left millions of voters on their own.  Lacking information, millions do what you would expect.  They go with their gut
  • We don't want the truth to prevail, as Harvard's Steven Pinker informs us, we want our version of the truth to prevail, for in the end what we're really concerned with is maintaining our status or enhancing it.
  • Fourth, we frequently fail to show empathy in circumstances that clearly cry out for it.
  • We evolved to show empathy for people we know.  It takes special effort to empathize with people who don’t dress like us or look like us.
  • long-term we need to teach voters not to trust their instincts in politics because our instincts often don’t work.
  • Doing politics in a modern mass democracy, in other words, is an unnatural act.
  • Teaching this lesson doesn’t sound like a job for historians, but in one way it is.  Studying history is all about putting events into context. And as it turns out, voters need to learn the importance of context.
  • Given the mismatch between our Stone-Age brain and the problems we face in the 21st century, we should only trust our political instincts when those instincts are serviceable in a modern context.  If they aren’t (and most of the time they aren't), then higher order cognitive thinking is required.
  • Just why mass ignorance seems to be afflicting our politics at this moment is a complicated question.  But here again history can be helpful.  The answer seems to be that the institutions voters formerly could turn to for help have withered.
  • most of the time we return to a state of well-being by simply ignoring the evidence we find discomforting.  This is known as Disconfirmation Bias and it afflicts all of us
  • But cultural norms can be established that help us overcome our natural inclinations.
  • I don’t have much confidence that people in general will be willing on their own to undertake the effort.

Next Stop: 100,000 Dead? - 0 views

  • A model is not a report sent back from the future. It's an exercise in taking what we know, what we think we know, and what we have no idea about, making some educated guesses about how those three pieces will interact, and coming up with a probabilistic set of possible future outcomes.
  • Models change as new data comes in (adding to the "stuff we know" inputs) and as the universe of the other two inputs ("stuff we think we know" and "stuff we have no idea about") changes. The toy sketch below makes this concrete.
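
A minimal sketch of that idea, with every number invented: a toy Monte Carlo “model” that combines a firm input (what we know), an estimate with spread (what we think we know), and a wide guess (what we have no idea about), then reports a range of outcomes rather than a single prediction.

```python
# Toy Monte Carlo "model": every number below is invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_runs = 100_000

known_deaths = 60_000                                # what we know: a firm input
weekly_growth = rng.normal(1.05, 0.03, size=n_runs)  # what we think we know: estimate with spread
unknown_factor = rng.uniform(0.7, 1.3, size=n_runs)  # what we have no idea about: a wide guess

# Project eight weeks ahead under each sampled scenario.
projected = known_deaths * weekly_growth**8 * unknown_factor

lo, med, hi = np.percentile(projected, [5, 50, 95])
print(f"median {med:,.0f}; 90% of runs fall between {lo:,.0f} and {hi:,.0f}")
```

As real data arrives, the firm input updates and the spreads on the other two inputs narrow, which is why the projected range shifts from one model run to the next.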