
TOK Friends: Group items matching "british" in title, tags, annotations or url


New Statesman - The limits of science: Martin Rees - 1 views

  • Einstein averred that “the most incomprehensible thing about the universe is that it is comprehensible”. He was right to be astonished. It seems surprising that our minds, which evolved to cope with life on the African savannah and haven’t changed much in 10,000 years, can make sense of phenomena far from our everyday intuitions: the microworld of atoms and the vastness of the cosmos. But our comprehension could one day “hit the buffers”. A monkey is unaware that atoms exist. Likewise, our brainpower may not stretch to the deepest aspects of reality.
  • Everything, however complicated – breaking waves, migrating birds, or tropical forests – is made up of atoms and obeys the equations of quantum physics. That, at least, is what most scientists believe, and there is no reason to doubt it. Yet there are inherent limits to science’s predictive power. Some things, like the orbits of the planets, can be calculated far into the future. But that’s atypical. In most contexts, there is a limit. Even the most fine-grained computation can only forecast British weather a few days ahead.
  • even if we could build a computer with hugely superhuman processing power, which could offer an accurate simulation, that doesn’t mean that we will have the insight to understand it. Some of the “aha” insights that scientists strive for may have to await the emergence of post-human intellects.

From Riddle to Twittersphere: David Crystal tells the story of English in 100 words - T... - 0 views

  • If you can tell the history of the world in 100 objects, as the British Museum’s Neil MacGregor did last year, then it ought to be possible to tell the history of a language in a similar number. But, as with objects, it isn’t enough for each word to be interesting in its own right. It has to represent a whole class of words. It has to tell a story. And each of these individual stories should add up to the history of the English language as a whole.

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.

Accused "Fraudster" Heads Two Journals | The Scientist Magazine® - 0 views

  • Dmitry Kuznetsov, a Russian biochemist whose published work has been repeatedly alleged to be fraudulent, is now the chief editor of two science journals. The appointments are raising questions about the scientific integrity of the publications.
  • “one of the worst fraud records in the history of science,” said Dan Larhammar, a professor at Uppsala University in Sweden who has written about problems in Kuznetsov's work. “That should be a major concern to” the publisher that recruited Kuznetsov as editor-in-chief, he said.
  • “As a result of these claims [by Kuznetsov and colleagues] a couple of students have spent several years of their life on a wild goose chase,” Coey told The Scientist.
  • researchers in various fields of study have also voiced their concerns about the quality of Kuznetsov’s research
  • Despite the questions concerning Kuznetsov’s work, researchers who have published in one of the journals he edits, the British Journal of Medicine and Medical Research, report having no abnormal interactions with the editors during the review process. “I found my experience with that journal to be no different than with any other,”
  • Swanson said he has no way to judge the validity of the accusations against Kuznetsov, and that it would be unfair to jump to conclusions. If the allegations are true, however, “it certainly hurts the reputation of their journal, and I suspect they would rectify that problem (again, if there is truth here) fairly quickly. Most reputable researchers would not want to submit to such a journal.”

The Benefits of 'Binocularity' - NYTimes.com - 0 views

  • Will advances in neuroscience move reasonable people to abandon the idea that criminals deserve to be punished?
  • if the idea of deserving punishment depends upon the idea that criminals freely choose their actions, and if neuroscience reveals that free choice is an illusion, then we can see that the idea of deserving punishment is nonsense
  • “new neuroscience will undermine people’s common sense, libertarian conception of free will and the retributivist thinking that depends on it, both of which have heretofore been shielded by the inaccessibility of sophisticated thinking about the mind and its neural basis.”
  • when university students learn about “the neural basis of behavior” — quite simply, the brain activity underlying human actions — they become less supportive of the idea that criminals deserve to be punished.
  • To see what is right — and wrong — with the notion that neuroscience will transform our idea of just deserts, and, more generally, our idea of what it means to be human, it can help to step back and consider
  • British philosopher Jonathan Glover. He said that if we want to understand what sorts of beings we are in depth, we need to achieve a sort of intellectual “binocularity.”
  • Glover was saying that, just as we need two eyes that integrate slightly different information about one scene to achieve visual depth perception, being able to see ourselves through two fundamentally different lenses, and integrate those two sources of information, can give us a greater depth of understanding of ourselves.
  • Through one lens we see that we are “subjects” (we act) who have minds and can have the experience of making free choices. Through the other we see that we are “objects” or bodies (we are acted upon), and that our experiences or movements are determined by an infinitely long chain of natural and social forces.
  • intellectual binocularity itself is not easy to achieve. While visual binocularity comes naturally, intellectual binocularity requires effort. In fact — and this is one source of the trouble we so often have when we try to talk about the sorts of beings we are — we can’t actually achieve perfect binocular understanding.
  • We can’t actually see ourselves as subjects and as objects at the same time any more than we can see Wittgenstein’s famous duck-rabbit figure as a duck and as a rabbit at once. Rather, we have to accept the necessity of oscillating between the lenses or ways of seeing, fully aware that, not only are we unable to use both at once, but that there is no algorithm for knowing when to use which.
  • When I said in the beginning that there’s something right about the reasoning of those researchers who reject the idea that our choices are “spontaneous” and not determined by prior events, I was referring to their rejection of the idea that our choices are rooted in some God-given, extra-natural, bodyless stuff.
  • My complaint is that they slip from making the reasonable claim that such extra-natural stuff is an illusion to speaking in ways that suggest that free will is an illusion, full stop. To suggest that our experience of choosing is wholly an illusion is as unhelpful as to suggest that, to explain the emergence of that experience, we need to appeal to extra-natural phenomena.
  • Using either lens alone can lead to pernicious mistakes. When we use only the subject lens, we are prone to a sort of inhumanity where we ignore the reality of the natural and social forces that bear down on all of us to make our choices.
  • When we use only the object lens, however, we are prone to a different, but equally noxious sort of inhumanity, where we fail to appreciate the reality of the experience of making choices freely and of knowing that we can deserve punishment — or praise.

In New Textbook, the Story of Singapore Begins 500 Years Earlier - NYTimes.com - 0 views

  • Why did it take 30 years to change the story? “It takes overwhelming evidence to shift the mind-set of a people from one image of its past to another,”
  • Professor Miksic gives credit for the new history lesson to former students who have reached positions of authority in academia and in the Ministry of Education.
  • Professor Heng surmised that one reason it had taken so long to change the narrative may have been the government’s fears of communal conflict in the 1960s and ’70s.
  • “If Singapore before 1800 was a sleepy backwater, the Chinese majority could say, ‘We built Singapore; before it was a blank slate,’” he said.
  • Other factors also may help explain the timing of the rewrite. “Now is a good time,” Professor Heng said. “There’s a need to develop a collective social memory. It’s become a political issue.”
  • “Every generation has to rewrite its history,” he said. While it used to suit Singapore to see itself as a city-state with a British heritage, modern Singapore needs a different interpretation of history to reinforce a more global perspective, he suggested.
  • Professor Miksic goes a step further. “A short history puts a nation on shaky ground; a shallowly rooted place could be overturned quickly,” he said. “If you can show a long cohabitation between the Malays and the Chinese, it proves you have a pretty stable arrangement.”

Triumph of the Unthinking - NYTimes.com - 0 views

  • “Words,” wrote John Maynard Keynes, “ought to be a little wild, for they are the assault of thoughts on the unthinking.”
  • It’s true that in practice Mr. Obama pushed through a stimulus that, while too small and short-lived, helped diminish the depth and duration of the slump. But when Republicans began talking nonsense, declaring that the government should match the belt-tightening of ordinary families — a recipe for full-on depression — Mr. Obama didn’t challenge their position. Instead, within a few months the very same nonsense became a standard line in his speeches, even though his economists knew better, and so did he.
  • Like Mr. Obama and company, Labour’s leaders probably know better, but have decided that it’s too hard to overcome the easy appeal of bad economics, especially when most of the British news media report this bad economics as truth.
  • What nonsense am I talking about? Simon Wren-Lewis of the University of Oxford, who has been a tireless but lonely crusader for economic sense, calls it “mediamacro.” It’s a story about Britain that runs like this: First, the Labour government that ruled Britain until 2010 was wildly irresponsible, spending far beyond its means. Second, this fiscal profligacy caused the economic crisis of 2008-2009. Third, this in turn left the coalition that took power in 2010 with no choice except to impose austerity policies despite the depressed state of the economy. Finally, Britain’s return to economic growth in 2013 vindicated austerity and proved its critics wrong.
  • every piece of this story is demonstrably, ludicrously wrong
  • Yet this nonsense narrative completely dominates news reporting, where it is treated as a fact rather than a hypothesis. And Labour hasn’t tried to push back, probably because they considered this a political fight they couldn’t win. But why?
  • Mr. Wren-Lewis suggests that it has a lot to do with the power of misleading analogies between governments and households, and also with the malign influence of economists working for the financial industry, who in Britain as in America constantly peddle scare stories about deficits and pay no price for being consistently wrong. If U.S. experience is any guide, my guess is that Britain also suffers from the desire of public figures to sound serious, a pose which they associate with stern talk about the need to make hard choices (at other people’s expense, of course.)
  • The fact is that Britain and America didn’t need to make hard choices in the aftermath of crisis. What they needed, instead, was hard thinking — a willingness to understand that this was a special environment, that the usual rules don’t apply in a persistently depressed economy, one in which government borrowing doesn’t compete with private investment and costs next to nothing.

Sarah Palin dives in poll ratings as Tina Fey impersonates her on Saturday Night Live -... - 0 views

  • Palin's poll ratings are telling a more devastating story.
  • engage with the process much earlier on – not least with their Sunday morning political talk shows
  • It currently commands 10 million viewers – a creditable figure for a primetime drama, let alone a late-night sketch show.
  • Other satirical shows, such as The Daily Show with Jon Stewart and The Colbert Report, are also enjoying record ratings, as well as influence far beyond their own viewers.
  • Even bigger than Saturday Night Live have been the presidential and vice-presidential debates. Sarah Palin's set-to with Joe Biden on October 2 attracted nearly 70 million viewers – a record for a vice-presidential debate and the highest-rated election debate since 1992
  • It is impossible to imagine a similar level of engagement with political television in this country. Gordon Brown and David Cameron would not only have to debate each other on TV – an unlikely scenario in itself – but pull in an audience bigger than the finals of Britain's Got Talent and Strictly Come Dancing put together
  • American networks do have some advantages over the BBC and ITV in planning and executing their political coverage
  • four-year timetable, avoiding the unholy scramble when a British general election is called at a month's notice.
  • In a Newsweek poll in September, voters were asked whether Palin was qualified or unqualified to be president. The result was a near dead-heat. In the same poll this month, those saying she was "unqualified" outnumbered those saying she was "qualified" by a massive 16 points
  • "I think we're learning what it means to have opinion journalism in this country on such a grand scale," says Stelter. "It's only in the last six to 12 months that those lines have hardened between Fox and MSNBC. I think the [ratings] numbers for cable have surprised people.
  • I think that shows that people are looking for different stripes of political news."
  • American political TV certainly is polarised. When Governor Palin attacked the media in her speech at the Republican convention last month, the crowd chanted "NBC"
  • Gwen Ifill, a respected anchor on the non-commercial channel PBS, who moderated the vice-presidential debate, saw her impartiality attacked because she is writing a book about African-American politics that mentions Obama in its title
  • America's networks comprehensively outstrip this country in both volume and quality of political coverage.
  • All three major US networks – ABC, CBS and NBC – offer a large amount of serious (and unbiased) political coverage, both in their evening network newscasts and in their morning equivalents of GMTV
  • Impartiality and the public service ethos hardly characterise Tina Fey's performances. Tonight's presidential debate forms part of a series driven largely by commercial networks, not publicly funded channels. Neither Fox News nor MSNBC was set up as a sop to a regulator

'Defending the Faith' in the Middle East - NYTimes.com - 0 views

  • Under the umbrella of Shiite solidarity, Iran provides military aid and funds industrial projects, madrasas, mosques and hospitals. And its leaders have become more vocal about their aims, with President Hassan Rouhani proclaiming himself protector of Iraq’s holy cities.
  • The most extensive patronage efforts, however, were made by the Ottomans. From the reign of Abdul Hamid II in the 19th century, the Ottomans used their self-professed status as the defenders of global Islam to advance their influence into rival empires, from French North Africa to British India.
  • The politics of religion undermined the Westphalian order, based on the principles of state sovereignty and territorial integrity. At the same time, these policies subverted states, fueled divisions within them — and often ended in violence.

New Alternatives to Statins Add to a Quandary on Cholesterol - The New York Times - 0 views

  • “We’ve reached a point where patients are increasingly facing five- and six-figure price tags for medications that they will take over the course of their lifetimes,” said Matthew Eyles, an executive vice president for America’s Health Insurance Plans, the national trade association for the insurance industry. “If this is the new normal to treat common and chronic conditions, how can any health system sustain that cost?”
  • Doctors with patients who maintain they are intolerant to statins say they are confronted with a clash between the art and the science of medicine.
  • Dr. Peter Libby, a doctor and researcher at Brigham and Women’s Hospital in Boston, said that in his role as a physician, “the patient is always right.” But, he added, “as a scientist, I find randomized, large-scale, double-blind studies more persuasive than anecdote.”
  • The statin trials, which involved tens of thousands of people, found no more muscle aches, the most common complaint, in patients who took statins than in those who took placebos.
  • The widely held belief that statins affect memory also has not been borne out in clinical trials, said Dr. Jane Armitage of the University of Oxford. She and her colleagues studied memory problems in 20,000 patients randomly assigned to take a statin or a placebo. “There was absolutely no difference,” she said.
  • In a separate study, they looked at mood and sleep patterns and again found statins had no effect. Another study, in Scotland, detailed cognitive testing of older people taking statins or a placebo, and also found no effect.

Joshua Foer: John Quijada and Ithkuil, the Language He Invented : The New Yorker - 2 views

  • Languages are something of a mess. They evolve over centuries through an unplanned, democratic process that leaves them teeming with irregularities, quirks, and words like “knight.” No one who set out to design a form of communication would ever end up with anything like English, Mandarin, or any of the more than six thousand languages spoken today. “Natural languages are adequate, but that doesn’t mean they’re optimal,” John Quijada, a fifty-four-year-old former employee of the California State Department of Motor Vehicles, told me. In 2004, he published a monograph on the Internet that was titled “Ithkuil: A Philosophical Design for a Hypothetical Language.” Written like a linguistics textbook, the fourteen-page Web site ran to almost a hundred and sixty thousand words. It documented the grammar, syntax, and lexicon of a language that Quijada had spent three decades inventing in his spare time. Ithkuil had never been spoken by anyone other than Quijada, and he assumed that it never would be.
  • his “greater goal” was “to attempt the creation of what human beings, left to their own devices, would never create naturally, but rather only by conscious intellectual effort: an idealized language whose aim is the highest possible degree of logic, efficiency, detail, and accuracy in cognitive expression via spoken human language, while minimizing the ambiguity, vagueness, illogic, redundancy, polysemy (multiple meanings) and overall arbitrariness that is seemingly ubiquitous in natural human language.”
  • Ithkuil, one Web site declared, “is a monument to human ingenuity and design.” It may be the most complete realization of a quixotic dream that has entranced philosophers for centuries: the creation of a more perfect language.
  • Since at least the Middle Ages, philosophers and philologists have dreamed of curing natural languages of their flaws by constructing entirely new idioms according to orderly, logical principles.
  • Inventing new forms of speech is an almost cosmic urge that stems from what the linguist Marina Yaguello, the author of “Lunatic Lovers of Language,” calls “an ambivalent love-hate relationship.” Language creation is pursued by people who are so in love with what language can do that they hate what it doesn’t. “I don’t believe any other fantasy has ever been pursued with so much ardor by the human spirit, apart perhaps from the philosopher’s stone or the proof of the existence of God; or that any other utopia has caused so much ink to flow, apart perhaps from socialism,”
  • What if, they wondered, you could create a universal written language that could be understood by anyone, a set of “real characters,” just as the creation of Arabic numerals had done for counting? “This writing will be a kind of general algebra and calculus of reason, so that, instead of disputing, we can say that ‘we calculate,’ ” Leibniz wrote, in 1679.
  • In his “Essay Towards a Real Character, and a Philosophical Language,” from 1668, Wilkins laid out a sprawling taxonomic tree that was intended to represent a rational classification of every concept, thing, and action in the universe. Each branch along the tree corresponded to a letter or a syllable, so that assembling a word was simply a matter of tracing a set of forking limbs
  • Solresol, the creation of a French musician named Jean-François Sudre, was among the first of these universal languages to gain popular attention. It had only seven syllables: Do, Re, Mi, Fa, So, La, and Si. Words could be sung, or performed on a violin. Or, since the language could also be translated into the seven colors of the rainbow, sentences could be woven into a textile as a stream of colors.
  • “I had this realization that every individual language does at least one thing better than every other language,” he said. For example, the Australian Aboriginal language Guugu Yimithirr doesn’t use egocentric coördinates like “left,” “right,” “in front of,” or “behind.” Instead, speakers use only the cardinal directions. They don’t have left and right legs but north and south legs, which become east and west legs upon turning ninety degrees
  • Among the Wakashan Indians of the Pacific Northwest, a grammatically correct sentence can’t be formed without providing what linguists refer to as “evidentiality,” inflecting the verb to indicate whether you are speaking from direct experience, inference, conjecture, or hearsay.
  • Quijada began wondering, “What if there were one single language that combined the coolest features from all the world’s languages?”
  • he started scribbling notes on an entirely new grammar that would eventually incorporate not only Wakashan evidentiality and Guugu Yimithirr coördinates but also Niger-Kordofanian aspectual systems, the nominal cases of Basque, the fourth-person referent found in several nearly extinct Native American languages, and a dozen other wild ways of forming sentences.
  • he discovered “Metaphors We Live By,” a seminal book, published in 1980, by the cognitive linguists George Lakoff and Mark Johnson, which argues that the way we think is structured by conceptual systems that are largely metaphorical in nature. Life is a journey. Time is money. Argument is war. For better or worse, these figures of speech are profoundly embedded in how we think.
  • I asked him if he could come up with an entirely new concept on the spot, one for which there was no word in any existing language. He thought about it for a moment. “Well, no language, as far as I know, has a single word for that chin-stroking moment you get, often accompanied by a frown on your face, when someone expresses an idea that you’ve never thought of and you have a moment of suddenly seeing possibilities you never saw before.” He paused, as if leafing through a mental dictionary. “In Ithkuil, it’s ašţal.”
  • Many conlanging projects begin with a simple premise that violates the inherited conventions of linguistics in some new way. Aeo uses only vowels. Kēlen has no verbs. Toki Pona, a language inspired by Taoist ideals, was designed to test how simple a language could be. It has just a hundred and twenty-three words and fourteen basic sound units. Brithenig is an answer to the question of what English might have sounded like as a Romance language, if vulgar Latin had taken root on the British Isles. Láadan, a feminist language developed in the early nineteen-eighties, includes words like radíidin, defined as a “non-holiday, a time allegedly a holiday but actually so much a burden because of work and preparations that it is a dreaded occasion; especially when there are too many guests and none of them help.”
  • most conlangers come to their craft by way of fantasy and science fiction. J. R. R. Tolkien, who called conlanging his “secret vice,” maintained that he created the “Lord of the Rings” trilogy for the primary purpose of giving his invented languages, Quenya, Sindarin, and Khuzdul, a universe in which they could be spoken. And arguably the most commercially successful invented language of all time is Klingon, which has its own translation of “Hamlet” and a dictionary that has sold more than three hundred thousand copies.
  • He imagined that Ithkuil might be able to do what Lakoff and Johnson said natural languages could not: force its speakers to precisely identify what they mean to say. No hemming, no hawing, no hiding true meaning behind jargon and metaphor. By requiring speakers to carefully consider the meaning of their words, he hoped that his analytical language would force many of the subterranean quirks of human cognition to the surface, and free people from the bugs that infect their thinking.
  • Brown based the grammar for his ten-thousand-word language, called Loglan, on the rules of formal predicate logic used by analytical philosophers. He hoped that, by training research subjects to speak Loglan, he might turn them into more logical thinkers. If we could change how we think by changing how we speak, then the radical possibility existed of creating a new human condition.
  • today the stronger versions of the Sapir-Whorf hypothesis have “sunk into . . . disrepute among respectable linguists,” as Guy Deutscher writes, in “Through the Looking Glass: Why the World Looks Different in Other Languages.” But, as Deutscher points out, there is evidence to support the less radical assertion that the particular language we speak influences how we perceive the world. For example, speakers of gendered languages, like Spanish, in which all nouns are either masculine or feminine, actually seem to think about objects differently depending on whether the language treats them as masculine or feminine
  • The final version of Ithkuil, which Quijada published in 2011, has twenty-two grammatical categories for verbs, compared with the six—tense, aspect, person, number, mood, and voice—that exist in English. Eighteen hundred distinct suffixes further refine a speaker’s intent. Through a process of laborious conjugation that would befuddle even the most competent Latin grammarian, Ithkuil requires a speaker to home in on the exact idea he means to express, and attempts to remove any possibility for vagueness.
  • Every language has its own phonemic inventory, or library of sounds, from which a speaker can string together words. Consonant-poor Hawaiian has just thirteen phonemes. English has around forty-two, depending on dialect. In order to pack as much meaning as possible into each word, Ithkuil has fifty-eight phonemes. The original version of the language included a repertoire of grunts, wheezes, and hacks that are borrowed from some of the world’s most obscure tongues. One particular hard-to-make clicklike sound, a voiceless uvular ejective affricate, has been found in only a few other languages, including the Caucasian language Ubykh, whose last native speaker died in 1992.
  • Human interactions are governed by a set of implicit codes that can sometimes seem frustratingly opaque, and whose misreading can quickly put you on the outside looking in. Irony, metaphor, ambiguity: these are the ingenious instruments that allow us to mean more than we say. But in Ithkuil ambiguity is quashed in the interest of making all that is implicit explicit. An ironic statement is tagged with the verbal affix ’kçç. Hyperbolic statements are inflected by the letter ’m.
  • “I wanted to use Ithkuil to show how you would discuss philosophy and emotional states transparently,” Quijada said. To attempt to translate a thought into Ithkuil requires investigating a spectrum of subtle variations in meaning that are not recorded in any natural language. You cannot express a thought without first considering all the neighboring thoughts that it is not. Though words in Ithkuil may sound like a hacking cough, they have an inherent and unavoidable depth. “It’s the ideal language for political and philosophical debate—any forum where people hide their intent or obfuscate behind language,” Quijada said.
  • In Ithkuil, the difference between glimpsing, glancing, and gawking is the mere flick of a vowel. Each of these distinctions is expressed simply as a conjugation of the root word for vision. Hunched over the dining-room table, Quijada showed me how he would translate “gawk” into Ithkuil. First, though, since words in Ithkuil are assembled from individual atoms of meaning, he had to engage in some introspection about what exactly he meant to say. For fifteen minutes, he flipped backward and forward through his thick spiral-bound manuscript, scratching his head, pondering each of the word’s aspects, as he packed the verb with all of gawking’s many connotations. As he assembled the evolving word from its constituent meanings, he scribbled its pieces on a notepad. He added the “second degree of the affix for expectation of outcome” to suggest an element of surprise that is more than mere unpreparedness but less than outright shock, and the “third degree of the affix for contextual appropriateness” to suggest an element of impropriety that is less than scandalous but more than simply eyebrow-raising. As he rapped his pen against the notepad, he paged through his manuscript in search of the third pattern of the first stem of the root for “shock” to suggest a “non-volitional physiological response,” and then, after several moments of contemplation, he decided that gawking required the use of the “resultative format” to suggest “an event which occurs in conjunction with the conflated sense but is also caused by it.” He eventually emerged with a tiny word that hardly rolled off the tongue: apq’uxasiu. He spoke the first clacking syllable aloud a couple of times before deciding that he had the pronunciation right, and then wrote it down in the script he had invented for printed Ithkuil.
  • “You can make up words by the millions to describe concepts that have never existed in any language before,” he said.
  • Neither Sapir nor Whorf formulated a definitive version of the hypothesis that bears their names, but in general the theory argues that the language we speak actually shapes our experience of reality. Speakers of different languages think differently. Stronger versions of the hypothesis go even further than this, to suggest that language constrains the set of possible thoughts that we can have. In 1955, a sociologist and science-fiction writer named James Cooke Brown decided he would test the Sapir-Whorf hypothesis by creating a “culturally neutral” “model language” that might recondition its speakers’ brains.
  • “We think that when a person learns Ithkuil his brain works faster,” Vishneva told him, in Russian. She spoke through a translator, as neither she nor Quijada was yet fluent in their shared language. “With Ithkuil, you always have to be reflecting on yourself. Using Ithkuil, we can see things that exist but don’t have names, in the same way that Mendeleyev’s periodic table showed gaps where we knew elements should be that had yet to be discovered.”
  • Lakoff, who is seventy-one, bearded, and, like Quijada, broadly built, seemed to have read a fair portion of the Ithkuil manuscript and familiarized himself with the language’s nuances. “There are a whole lot of questions I have about this,” he told Quijada, and then explained how he felt Quijada had misread his work on metaphor. “Metaphors don’t just show up in language,” he said. “The metaphor isn’t in the word, it’s in the idea,” and it can’t be wished away with grammar. “For me, as a linguist looking at this, I have to say, ‘O.K., this isn’t going to be used.’ It has an assumption of efficiency that really isn’t efficient, given how the brain works. It misses the metaphor stuff. But the parts that are successful are really nontrivial. This may be an impossible language,” he said. “But if you think of it as a conceptual-art project I think it’s fascinating.”

The Unrealized Horrors of Population Explosion - NYTimes.com - 0 views

  • No one was more influential — or more terrifying, some would say — than Paul R. Ehrlich, a Stanford University biologist. His 1968 book, “The Population Bomb,” sold in the millions with a jeremiad that humankind stood on the brink of apocalypse because there were simply too many of us. Dr. Ehrlich’s opening statement was the verbal equivalent of a punch to the gut: “The battle to feed all of humanity is over.” He later went on to forecast that hundreds of millions would starve to death in the 1970s, that 65 million of them would be Americans, that crowded India was essentially doomed, that odds were fair “England will not exist in the year 2000.” Dr. Ehrlich was so sure of himself that he warned in 1970 that “sometime in the next 15 years, the end will come.” By “the end,” he meant “an utter breakdown of the capacity of the planet to support humanity.”
  • After the passage of 47 years, Dr. Ehrlich offers little in the way of a mea culpa. Quite the contrary. Timetables for disaster like those he once offered have no significance, he told Retro Report, because to someone in his field they mean something “very, very different” from what they do to the average person. The end is still nigh, he asserted, and he stood unflinchingly by his 1960s insistence that population control was required, preferably through voluntary methods. But if need be, he said, he would endorse “various forms of coercion” like eliminating “tax benefits for having additional children.”
  • Stewart Brand, founding editor of the Whole Earth Catalog. On this topic, Mr. Brand may be deemed a Keynesian, in the sense of an observation often attributed to John Maynard Keynes: “When the facts change, I change my mind, sir. What do you do?” Mr. Brand’s formulation for Retro Report was to ask, “How many years do you have to not have the world end” to reach a conclusion that “maybe it didn’t end because that reason was wrong?”
  • One thing that happened on the road to doom was that the world figured out how to feed itself despite its rising numbers. No small measure of thanks belonged to Norman E. Borlaug, an American plant scientist whose breeding of high-yielding, disease-resistant crops led to the agricultural savior known as the Green Revolution.
  • Some preternaturally optimistic analysts concluded that humans would always find their way out of tough spots. Among them was Julian L. Simon, an economist who established himself as the anti-Ehrlich, arguing that “humanity’s condition will improve in just about every material way.”
  • In fact, birthrates are now below long-term replacement levels, or nearly so, across much of Earth, not just in the industrialized West and Japan but also in India, China, much of Southeast Asia, Latin America — just about everywhere except Africa, although even there the continentwide rates are declining. “Girls that are never born cannot have babies,”
  • Because of improved health standards, birthing many children is not the survival imperative for families that it once was. In cramped cities, large families are not the blessing they were in the agricultural past. And women in many societies are ever more independent, socially and economically; they no longer accept that their fate is to be endlessly pregnant. If anything, the worry in many countries is that their populations are aging and that national vitality is ebbing.
  • Still, enough people are already around to ensure that the world’s population will keep rising. But for how long? That is a devilishly difficult question. One frequently cited demographic model by the United Nations envisions a peak of about nine billion around 2050. Other forecasts are for continued growth into the next century. Still others say the population will begin to drop before the middle of this century.
  • In Mr. Pearce’s view, the villain is not overpopulation but, rather, overconsumption. “We can survive massive demographic change,” he said in 2011. But he is less sanguine about the overuse of available resources and its effects on climate change
  • “Rising consumption today far outstrips the rising head count as a threat to the planet,” Mr. Pearce wrote in Prospect, a British magazine, in 2010. “And most of the extra consumption has been in rich countries that have long since given up adding substantial numbers to their population,
  • “Let’s look at carbon dioxide emissions, the biggest current concern because of climate change,” he continued. “The world’s richest half billion people — that’s about 7 percent of the global population — are responsible for half of the world’s carbon dioxide emissions. Meanwhile, the poorest 50 percent of the population are responsible for just 7 percent of emissions.”

Does Everything Happen for a Reason? - NYTimes.com - 1 views

  • we asked people to reflect on significant events from their own lives, such as graduations, the births of children, falling in love, the deaths of loved ones and serious illnesses. Unsurprisingly, a majority of religious believers said they thought that these events happened for a reason and that they had been purposefully designed (presumably by God). But many atheists did so as well, and a majority of atheists in a related study also said that they believed in fate — defined as the view that life events happen for a reason and that there is an underlying order to life
  • British atheists were just as likely as American atheists to believe that their life events had underlying purposes, even though Britain is far less religious than America.
  • even young children show a bias to believe that life events happen for a reason — to “send a sign” or “to teach a lesson.” This belief exists regardless of how much exposure the children have had to religion at home, and even if they’ve had none at all.
  • This tendency to see meaning in life events seems to reflect a more general aspect of human nature: our powerful drive to reason in psychological terms, to make sense of events and situations by appealing to goals, desires and intentions
  • we found that highly paranoid people (who tend to obsess over other people’s hidden motives and intentions) and highly empathetic people (who think deeply about other people’s goals and emotions) are particularly likely to believe in fate and to believe that there are hidden messages and signs embedded in their own life events. In other words, the more likely people are to think about other people’s purposes and intentions, the more likely they are to also infer purpose and intention in human life itself.
  • But it can lead us into error when we overextend it, causing us to infer psychological states even when none exist. This fosters the illusion that the world itself is full of purpose and design.
  • This drive serves us well when we think about the actions of other people, who actually possess these psychological states, because it helps us figure out why people behave as they do and to respond appropriately.
  • the belief also has some ugly consequences. It tilts us toward the view that the world is a fundamentally fair place, where goodness is rewarded and badness punished. It can lead us to blame those who suffer from disease and who are victims of crimes, and it can motivate a reflexive bias in favor of the status quo — seeing poverty, inequality and oppression as reflecting the workings of a deep and meaningful plan.
  • even those who are devout should agree that, at least here on Earth, things just don’t naturally work out so that people get what they deserve. If there is such a thing as divine justice or karmic retribution, the world we live in is not the place to find it. Instead, the events of human life unfold in a fair and just manner only when individuals and society work hard to make this happen. We should resist our natural urge to think otherwise.

Is Everyone a Little Bit Racist? - NYTimes.com - 0 views

  • Research in the last couple of decades suggests that the problem is not so much overt racists. Rather, the larger problem is a broad swath of people who consider themselves enlightened, who intellectually believe in racial equality, who deplore discrimination, yet who harbor unconscious attitudes that result in discriminatory policies and behavior.
  • The player takes on the role of a police officer who is confronted with a series of images of white or black men variously holding guns or innocent objects such as wallets or cellphones. The aim is to shoot anyone with a gun while holstering your weapon in other cases. Ordinary players (often university undergraduates) routinely shoot more quickly at black men than at white men, and are more likely to mistakenly shoot an unarmed black man than an unarmed white man.
  • Correll has found no statistically significant difference between the play of blacks and that of whites in the shooting game.
  • an uncomfortable starting point is to understand that racial stereotyping remains ubiquitous, and that the challenge is not a small number of twisted white supremacists but something infinitely more subtle and complex: People who believe in equality but who act in ways that perpetuate bias and inequality.
  • One finding is that we unconsciously associate “American” with “white.” Thus, in 2008, some California college students — many of whom were supporting Barack Obama for president — unconsciously treated Obama as more foreign than Tony Blair, the former British prime minister.
  • “There’s a whole culture that promotes this idea of aggressive young black men,” Correll notes. “In our minds, young black men are associated with danger.”
  • Joshua Correll of the University of Colorado at Boulder has used an online shooter video game to try to measure these unconscious attitudes (you can play the game yourself).

BBC News - Science 'squeezed out of primary schools' - 0 views

  • Science is being squeezed out of English primary schools, with a third not providing the recommended two hours of teaching a week, research suggests. The Confederation of British Industry study also suggests science has become less of a priority in many schools. A third of 260 teachers surveyed said they lacked confidence teaching science.

Journalists debunk vaccine science denial - 0 views

  • extra difficulties imposed irrationally by antiscience.
  • Large outbreaks in the U.S. of the highly infectious disease have become more common in the past two years, even though measles hasn’t been indigenous since 2000, according to the Centers for Disease Control and Prevention.
  • difficult because concerns about a possible link between vaccines and autism—now debunked by science—have expanded to more general, and equally groundless, worries about the effects of multiple shots on a child’s immune system, vaccine experts and doctors say.
  • It summarized and condemned the scientific and medical fraud that the British researcher Andrew Wakefield perpetrated. Years earlier, he had falsely linked the measles, mumps, and rubella (MMR) vaccine to autism. The editorial lamented that “the damage to public health continues, fuelled by unbalanced media reporting and an ineffective response from government, researchers, journals, and the medical profession.”
  • Reporters also seek to ensure that viewers, listeners, or readers understand that measles can afflict a victim more powerfully than does a mere passing ailment.
  • Measles doesn’t spread in most U.S. communities because people are protected by “herd immunity,” meaning that 92% to 94% of the population is vaccinated or immune. That level of protection makes it hard for one case of measles to spread even from one unvaccinated person to another without direct contact.
  • a study that “found that only 51 percent of Americans were confident that vaccines are safe and effective, which is similar to the proportion who believe that houses can be haunted by ghosts.”
  • In some parts of California, resistance to vaccinations including the MMR shot is stronger than ever, despite cases of measles hitting five US states.
  • “Vaccines are a great idea, but they are poisoning us, adding things that kick in later in life so they can sell us more drugs.”
  • Health professionals say those claims are unfounded or vastly overstated.
  • “the anti-vaccination movement is fueled by an over-privileged group of rich people grouped together who swear they won’t put any chemicals in their kids (food or vaccines or whatever else), either because it’s trendy to be all-natural or they don’t understand or accept the science of vaccinations. Their science denying has been propelled further by celebrities
  • the outbreak “should worry and enrage the public.” It indicted the anti-vaxxers’ “ignorant and self-absorbed rejection of science” and declared, “Getting vaccinated is good for the health of the inoculated person and also part of one’s public responsibility to help protect the health of others.”
  • “It’s wrong,” the editors emphasized, “to allow public health to be threatened while everyone else waits for these science-denying parents to open their eyes.”
  • “It’s because these people are highly educated and they get on the Internet and read things and think they can figure things out better than their physician.”
  • linked vaccination opposition to the “political left, which has long been suspicious of the lobbying power of the pharmaceutical industry and its influence on government regulators, and also the fringe political right, which has at different times seen vaccination, fluoridisation and other public-health initiatives as attempts by big government to impose tyrannical limits on personal freedom.”
  • Attempts to increase concerns about communicable diseases or correct false claims about vaccines may be especially likely to be counterproductive.
  • “attempting balance by giving vaccine skeptics and pro-vaccine advocates equal weight in news stories leads people to believe the evidence for and against vaccination is equally strong.”
  • A recent edition of the Washington Post carried a letter defending anti-vaxxers as “people who generally are pro-science and highly educated, who have high incomes and who have studied this issue carefully before coming to the conclusion that the risk to their children is greater than the slim possibility of contracting a childhood disease that [in many cases leaves] little or no residual consequences.”
  • anecdotal evidence suggests that some journalists, rather than omitting anti-vaxxers’ views, prefer to expose them and then oppose them.
  • “unwarranted fear . . . an assault on one of the greatest public-health inventions in world history.”
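The 92% to 94% figure quoted above is consistent with the standard herd-immunity threshold from basic epidemiology. The formula is textbook, but the basic reproduction number used below (R_0 of roughly 12 to 18 for measles) is an assumption supplied here for illustration, not a number taken from the excerpt:

    p_c = 1 - 1/R_0,    R_0 ≈ 12 → p_c ≈ 1 - 1/12 ≈ 0.92,    R_0 ≈ 18 → p_c ≈ 1 - 1/18 ≈ 0.94

In words: the more contagious the disease, the larger the share of the population that must be immune before an introduced case fails to find enough susceptible contacts to keep spreading.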
6More

BBC - Capital - The best new way to learn a language? - 0 views

  • As an Uber driver, Choudhary has to use an English-language app day in, day out, and he has found it has significantly improved his language skills.
  • now working for so-called shared economy platforms – business models that allow individuals to borrow or make use of assets or services offered by somebody else – such as Uber, Airbnb, freelance marketplace Fiverr and clothing hire platform Rent the Runway. Since these app-based businesses work primarily in English, Indians who work with them every day are improving their language skills as a side-effect.
  • some people deliberately choose to travel with Airbnb because it means their children can interact with other children and have exposure to other languages.” 
  • ...3 more annotations...
  • Although he already speaks English well, he says meeting native speakers has been helpful for picking up new phrases and mastering the art of both British and American slang.
  • Language teachers aren’t surprised by the trend, and see it as a natural progression given improved access to technology in countries like India. “In a developing country many people don’t have the disposable income to invest in self-improvement with things like language lessons. But access to the internet creates opportunities for self-directed study and to learn from the wealth of English language [content] available,”
  • “Smart learning is all about learning the English you need to deal with day-to-day situations that you may encounter. The instant gratification of learning something and being able to apply it in a meaningful way is a huge motivator,”
11More

Scientists Seek Ban on Method of Editing the Human Genome - NYTimes.com - 1 views

  • A group of leading biologists on Thursday called for a worldwide moratorium on use of a new genome-editing technique that would alter human DNA in a way that can be inherited.
  • The biologists fear that the new technique is so effective and easy to use that some physicians may push ahead before its safety can be assessed. They also want the public to understand the ethical issues surrounding the technique, which could be used to cure genetic diseases, but also to enhance qualities like beauty or intelligence. The latter is a path that many ethicists believe should never be taken.
  • a technique invented in 2012 makes it possible to edit the genome precisely and with much greater ease. The technique has already been used to edit the genomes of mice, rats and monkeys, and few doubt that it would work the same way in people.
  • ...8 more annotations...
  • The technique holds the power to repair or enhance any human gene. “It raises the most fundamental of issues about how we are going to view our humanity in the future and whether we are going to take the dramatic step of modifying our own germline and in a sense take control of our genetic destiny, which raises enormous peril for humanity,”
  • The paper’s authors, however, are concerned about countries that have less regulation in science. They urge that “scientists should avoid even attempting, in lax jurisdictions, germline genome modification for clinical application in humans” until the full implications “are discussed among scientific and governmental organizations.”
  • Though such a moratorium would not be legally enforceable and might seem unlikely to exert global influence, there is a precedent. In 1975, scientists worldwide were asked to refrain from using a method for manipulating genes, the recombinant DNA technique, until rules had been established.
  • Though highly efficient, the technique occasionally cuts the genome at unintended sites. The issue of how much mistargeting could be tolerated in a clinical setting is one that Dr. Doudna’s group wants to see thoroughly explored before any human genome is edited.
  • “We worry about people making changes without the knowledge of what those changes mean in terms of the overall genome,” Dr. Baltimore said. “I personally think we are just not smart enough — and won’t be for a very long time — to feel comfortable about the consequences of changing heredity, even in a single individual.”
  • Many ethicists have accepted the idea of gene therapy, changes that die with the patient, but draw a clear line at altering the germline, since these will extend to future generations. The British Parliament in February approved the transfer of mitochondria, small DNA-containing organelles, to human eggs whose own mitochondria are defective. But that technique is less far-reaching because no genes are edited.
  • There are two broad schools of thought on modifying the human germline, said R. Alta Charo, a bioethicist at the University of Wisconsin and a member of the Doudna group. One is pragmatic and seeks to balance benefit and risk. The other “sets up inherent limits on how much humankind should alter nature,” she said. Some Christian doctrines oppose the idea of playing God, whereas in Judaism and Islam there is the notion “that humankind is supposed to improve the world.” She described herself as more of a pragmatist, saying, “I would try to regulate such things rather than shut a new technology down at its beginning.”
  • The Doudna group calls for public discussion, but is also working to develop some more formal process, such as an international meeting convened by the National Academy of Sciences, to establish guidelines for human use of the genome-editing technique. “We need some principled agreement that we want to enhance humans in this way or we don’t,” Dr. Jaenisch said. “You have to have this discussion because people are gearing up to do this.”
18More

Here's what the government's dietary guidelines should really say - The Washington Post - 0 views

  • If I were writing the dietary guidelines, I would give them a radical overhaul. I’d go so far as to radically overhaul the way we evaluate diet. Here’s why and how.
  • Lately, as scientists try, and fail, to reproduce results, all of science is taking a hard look at funding biases, statistical shenanigans and groupthink. All that criticism, and then some, applies to nutrition.
  • Prominent in the charge to change the way we do science is John Ioannidis, professor of health research and policy at Stanford University. In 2005, he published “Why Most Published Research Findings Are False” in the journal PLOS Medicine.
  • ...15 more annotations...
  • He came down hard on nutrition in a pull-no-punches 2013 British Medical Journal editorial titled, “Implausible results in human nutrition research,” in which he noted, “Almost every single nutrient imaginable has peer reviewed publications associating it with almost any outcome.”
  • Ioannidis told me that sussing out the connection between diet and health — nutritional epidemiology — is enormously challenging, and “the tools that we’re throwing at the problem are not commensurate with the complexity and difficulty of the problem.” The biggest of those tools is observational research, in which we collect data on what people eat, and track what happens to them.
  • He lists plant-based foods — fruit, veg, whole grains, legumes — but acknowledges that we don’t understand enough to prescribe specific combinations or numbers of servings.
  • funding bias isn’t the only kind. “Fanatical opinions abound in nutrition,” Ioannidis wrote in 2013, and those have bias power too.
  • “Definitive solutions won’t come from another million observational papers or small randomized trials,” reads the subtitle of Ioannidis’s paper. His is a burn-down-the-house ethos.
  • When it comes to actual dietary recommendations, the disagreement is stark. “Ioannidis and others say we have no clue, the science is so bad that we don’t know anything,” Hu told me. “I think that’s completely bogus. We know a lot about the basic elements of a healthy diet.”
  • Give tens of thousands of people that FFQ (food-frequency questionnaire), and you end up with a ginormous repository of possible correlations. You can zero in on a vitamin, macronutrient or food, and go to town. But not only are you starting with flawed data, you’ve got a zillion possible confounding variables — dietary, demographic, socioeconomic. I’ve heard statisticians call it “noise mining,” and Ioannidis is equally skeptical. “With this type of data, you can get any result you want,” he said. “You can align it to your beliefs.” (A toy simulation of this noise-mining effect appears after this list.)
  • Big differences in what people eat track with other differences. Heavy plant-eaters are different from, say, heavy meat-eaters in all kinds of ways (income, education, physical activity, BMI). Red meat consumption correlates with increased risk of dying in an accident as much as dying from heart disease. The amount of faith we put in observational studies is a judgment call.
  • I find myself in Ioannidis’s camp. What have we learned, unequivocally enough to build a consensus in the nutrition community, about how diet affects health? Well, trans-fats are bad.
  • Over and over, large population studies get sliced and diced, and it’s all but impossible to figure out what’s signal and what’s noise. Researchers try to do that with controlled trials to test the connections, but those have issues too. They’re expensive, so they’re usually small and short-term. People have trouble sticking to the diet being studied. And scientists are generally looking for what they call “surrogate endpoints,” like increased cholesterol rather than death from heart disease, since it’s impractical to keep a trial going until people die.
  • So, what do we do? Hu and Ioannidis actually have similar suggestions. For starters, they both think we should be looking at dietary patterns rather than single foods or nutrients. They also both want to look across the data sets. Ioannidis emphasizes transparency. He wants to open data to the world and analyze all the data sets in the same way to see if “any signals survive.” Hu is more cautious (partly to safeguard confidentiality
  • I have a suggestion. Let’s give up on evidence-based eating. It’s given us nothing but trouble and strife. Our tools can’t find any but the most obvious links between food and health, and we’ve found those already.
  • Instead, let’s acknowledge the uncertainty and eat to hedge against what we don’t know
  • We’ve got two excellent hedges: variety and foods with nutrients intact (which describes such diets as the Mediterranean, touted by researchers). If you severely limit your foods (vegan, keto), you might miss out on something. Ditto if you eat foods with little nutritional value (sugar, refined grains). Oh, and pay attention to the two things we can say with certainty: Keep your weight down, and exercise.
  • I used to say I could tell you everything important about diet in 60 seconds. Over the years, my spiel got shorter and shorter as truisms fell by the wayside, and my confidence waned in a field where we know less, rather than more, over time. I’m down to five seconds now: Eat a wide variety of foods with their nutrients intact, keep your weight down and get some exercise.
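The “noise mining” problem described above is easy to demonstrate. The sketch below is a minimal, self-contained simulation rather than an analysis of any real questionnaire data: every “nutrient” variable and the “outcome” are pure random noise, and the sample size and variable count are assumptions chosen only for illustration, yet a predictable fraction of the variables still clear the conventional p < 0.05 bar.

    # A toy illustration of "noise mining": every variable here is random,
    # so any "significant" diet-outcome correlation is a false positive.
    # The sample size and variable count are illustrative assumptions.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(seed=0)
    n_people = 10_000      # simulated questionnaire respondents
    n_nutrients = 500      # candidate dietary variables, all pure noise

    outcome = rng.normal(size=n_people)                 # e.g. a health score
    nutrients = rng.normal(size=(n_nutrients, n_people))

    false_hits = sum(
        1 for row in nutrients if pearsonr(row, outcome)[1] < 0.05
    )
    print(f"{false_hits} of {n_nutrients} noise variables pass p < 0.05")
    # Expect roughly 5% of them (about 25) to look "significant" by chance.

With hundreds of candidate variables and no correction for multiple comparisons, dozens of spurious associations emerge from pure noise, which is the sense in which a large food-frequency data set can be made to support almost any belief.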
10More

Many Academics Are Eager to Publish in Worthless Journals - The New York Times - 0 views

  • it’s increasingly clear that many academics know exactly what they’re getting into, which explains why these journals have proliferated despite wide criticism. The relationship is less predator and prey, some experts say, than a new and ugly symbiosis.
  • “When hundreds of thousands of publications appear in predatory journals, it stretches credulity to believe all the authors and universities they work for are victims,” Derek Pyne, an economics professor at Thompson Rivers University in British Columbia, wrote in an op-ed published in the Ottawa Citizen, a Canadian newspaper.
  • The journals are giving rise to a wider ecosystem of pseudoscience. For the academic who wants to add credentials to a resume, for instance, publishers also hold meetings where, for a hefty fee, you can be listed as a presenter — whether you actually attend the meeting or not.
  • ...7 more annotations...
  • Many of these journals have names that closely resemble those of established publications, making them easily mistakable. There is the Journal of Economics and Finance, published by Springer, but now also the Journal of Finance and Economics. There is the Journal of Engineering Technology, put out by the American Society for Engineering Education, but now another called the GSTF Journal of Engineering Technology.
  • Predatory journals have few expenses, since they do not seriously review papers that are submitted and they publish only online. They blast emails to academics, inviting them to publish. And the journals often advertise on their websites that they are indexed by Google Scholar. Often that is correct — but Google Scholar does not vet the journals it indexes.
  • The number of such journals has exploded to more than 10,000 in recent years, with nearly as many predatory as legitimate ones. “Predatory publishing is becoming an organized industry,” wrote one group of critics in a paper in Nature
  • Participating in such dubious enterprises carries few risks. Dr. Pyne, who did a study of his colleagues’ publications, reports that faculty members at his school who got promoted last year had at least four papers in questionable journals. All but one academic in 10 who won a School of Business and Economics award had published papers in these journals. One had 10 such articles.
  • Academics get rewarded with promotions when they stuff their resumes with articles like these, Dr. Pyne concluded. There are few or no adverse consequences — in fact, the rewards for publishing in predatory journals were greater than for publishing in legitimate ones.
  • Some say the academic system bears much of the blame for the rise of predatory journals, demanding publications even from teachers at places without real resources for research and where they may have little time apart from teaching. At Queensborough, faculty members typically teach nine courses per year. At four-year colleges, faculty may teach four to six courses a year.
  • Recently a group of researchers invented a fake academic: Anna O. Szust. The name in Polish means fraudster. Dr. Szust applied to legitimate and predatory journals asking to be an editor. She supplied a résumé in which her publications and degrees were total fabrications, as were the names of the publishers of the books she said she had contributed to. The legitimate journals rejected her application immediately. But 48 out of 360 questionable journals made her an editor. Four made her editor in chief. One journal sent her an email saying, “It’s our pleasure to add your name as our editor in chief for the journal with no responsibilities.”