TOK Friends - Group items matching "Trust" in title, tags, annotations or url

The French Do Buy Books. Real Books. - NYTimes.com

  • For a few bucks off and the pleasure of shopping from bed, have we handed over a precious natural resource — our nation’s books — to an ambitious billionaire with an engineering degree?
  • France, meanwhile, has just unanimously passed a so-called anti-Amazon law, which says online sellers can’t offer free shipping on discounted books. (“It will be either cheese or dessert, not both at once,” a French commentator explained.)
  • Amazon has a 10 or 12 percent share of new book sales in France. Amazon reportedly handles 70 percent of the country’s online book sales, but just 18 percent of books are sold online.
  • no seller can offer more than 5 percent off the cover price of new books. That means a book costs more or less the same wherever you buy it in France, even online. The Lang law was designed to make sure France continues to have lots of different books, publishers and booksellers.
  • Readers say they trust books far more than any other medium, including newspapers and TV.
  • Six of the world’s 10 biggest book-selling countries — Germany, Japan, France, Italy, Spain and South Korea — have versions of fixed book prices.
  • What underlies France’s book laws isn’t just an economic position — it’s also a worldview. Quite simply, the French treat books as special. Some 70 percent of French people said they read at least one book last year; the average among French readers was 15 books.
  • In Britain, which abandoned its own fixed-price system in the 1990s, there are fewer than 1,000 independent bookstores left. A third closed in the past nine years, as supermarkets and Amazon discounted some books by more than 50 percent.
  • The French government classifies books as an “essential good,” along with electricity, bread and water.
  • None of this is taken for granted. People here have thought for centuries about what makes a book industry vibrant, and are watching developments in Britain and America as cautionary tales. “We don’t sell potatoes,” says Mr. Moni. “There are also ideas in books. That’s what’s dangerous. Because the day that you have a large seller that sells 80 percent of books, he’s the one who will decide what’s published, or what won’t be published.”
  • “When your computer dies, you throw it away,” says Mr. Montagne of the publishers’ association. “But you’ll remember a book 20 years later. You’ve deeply entered into a story that’s not your own. It’s forged who you are. You’ll only see later how much it has affected you. You don’t keep all books, but it’s not a market like others. The contents of a bookcase can define who you are.”

Facebook Has All the Power - Julie Posetti - The Atlantic

  • scholars covet thy neighbor's data. They're attracted to the very large and often fascinating data sets that private companies have developed.
  • It's the companies that own and manage this data. The only standards we know they have to follow are in the terms-of-service that users accept to create an account, and the law as it stands in different countries.
  • the "sexiness" of the Facebook data that led Cornell University and the Proceedings of the National Academy of Sciences (PNAS) into an ethically dubious arrangement, where, for example, Facebook's unreadable 9,000-word terms-of-service are said to be good enough to meet the standard for "informed consent."
  • When the study drew attention and controversy, there was a moment when they both could have said: "We didn't look carefully enough at this the first time. Now we can see that it doesn't meet our standards." Instead they allowed Facebook and the PR people to take the lead in responding to the controversy.
  • What should this reality signal to Facebook users? Is it time to pull back? You have (almost) no rights. You have (almost) no control. You have no idea what they're doing to you or with you. You don't even know who's getting the stuff you are posting, and you're not allowed to know. Trade secret!
  • Are there any particular warnings here for journalists and editors in terms of their exposure on Facebook? Yeah. Facebook has all the power. You have almost none. Just keep that in mind in all your dealings with it, as an individual with family and friends, as a journalist with a story to file, and as a news organization that is "on" Facebook.
  • I am not in a commercial situation where I have to maximize my traffic, so I can opt out. Right now my choice is to keep my account, but use it cynically. 
  • does this level of experimentation indicate the prospect of a further undermining of audience-driven news priorities and traditional news values? The right way to think about it is a loss of power—for news producers and their priorities. As I said, Facebook thinks it knows better than I do what "my" 180,000 subscribers should get from me.
  • Facebook has "where else are they going to go?" logic now. And they have good reason for this confidence. (It's called network effects.) But "where else are they going to go?" is a long way from trust and loyalty. It is less a durable business model than a statement of power. 
  • I distinguished between the "thin" legitimacy that Facebook operates under and the "thick" legitimacy that the university requires to be the institution it was always supposed to be. (Both are distinct from il-legitimacy.) News organizations should learn to make this distinction more often. Normal PR exists to muddle it. Which is why you don't hand a research crisis over to university PR people.
  • some commentators have questioned the practice of A/B headline testing in the aftermath of this scandal—is there a clear connection? The connection to me is that both are forms of behaviourism. Behaviourism is a view of human beings in which, as Hannah Arendt said, they are reduced to the level of a conditioned and "behaving" animal—an animal that responds to these stimuli but not those. This is why a popular shorthand for Facebook's study was that users were being treated as lab rats.
  • Journalism is supposed to be about informing people so they can understand the world and take action when necessary. Action and behaviour are not the same thing at all. One is a conscious choice, the other a human tendency. There's a tension, then, between commercial behaviourism, which may be deeply functional in some ways for the news industry, and informing people as citizens capable of understanding their world well enough to improve it, which is the deepest purpose of journalism. A/B testing merely highlights this tension.

How a Simple Spambot Became the Second Most Powerful Member of an Italian Social Network

  • Luca Maria Aiello and a few pals from the University of Turin in Italy began studying a social network called aNobii.com in which people exchange information and opinions about the books they love. Each person has a site that anybody can visit. Users can then choose to set up social links with others
  • To map out the structure of the network, Aiello and co created an automated crawler that starts by visiting one person’s profile on the network and then all of the people that connect to this node in turn. It then visits each of the people that link to these nodes and so on. In this way, the bot builds up a map of the network
  • people began to respond to the crawler’s visits. That gave the team an idea. “The unexpected reactions the bot caused by its visits motivated us to set up a social experiment in two parts to answer the question: can an individual with no trust gain popularity and influence?”
  • Aiello and co were careful to ensure that the crawler did not engage with anybody on the network in any way other than to visit his or her node. Their idea was to isolate a single, minimal social activity and test how effective it was in gaining popularity.
  • They began to record the reactions to lajello’s visits including the number of messages it received, their content, the links it received and how they varied over time and so on.
  • By December 2011, lajello’s profile had become one of the most popular on the entire social network. It had received more than 66,000 visits as well as 2435 messages from more than 1200 different people.  In terms of the number of different messages received, a well-known writer was the most popular on this network but lajello was second.
  • “Our experiment gives strong support to the thesis that popularity can be gained just with continuous ‘social probing,’” conclude Aiello and co. “We have shown that a very simple spambot can attract great interest even without emulating any aspects of typical human behavior.”
  • Having created all this popularity, Aiello and co wanted to find out how influential the spam bot could be. So they started using the bot to send recommendations to users on who else to connect to. The spam bot could either make a recommendation chosen at random or one that was carefully selected by a recommendation engine. It then made its recommendations to users that had already linked to lajello and to other users chosen at random.
  • “Among the 361 users who created at least one social connection in the 36 hours after the recommendation, 52 per cent followed suggestion given by the bot,” they say.
  • shows just how easy it is for an automated bot to play a significant role in a social network. Popularity appears easy to buy using nothing more than page visits, at least in this experiment. What is more, this popularity can be easily translated into influence
  • It is not hard to see the significance of this work. Social bots are a fact of life on almost every social network and many have become so sophisticated they are hard to distinguish from humans. If the simplest of bots created by Aiello and co can have this kind of impact, it is anybody’s guess how more advanced bots could influence everything from movie reviews and Wikipedia entries to stock prices and presidential elections.
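The crawling procedure the team describes — visit one profile, then everyone linked from it, and so on — is a plain breadth-first traversal. A minimal sketch, where `get_neighbors` is a hypothetical stand-in for fetching the accounts linked from a profile page (the paper does not publish its crawler code):

```python
from collections import deque

def crawl(start, get_neighbors):
    """Breadth-first crawl of a social graph from one profile.

    Returns the set of visited profiles and the discovered links.
    Each dequeue corresponds to one profile visit -- the only
    social action the lajello bot ever performed.
    """
    visited = {start}
    edges = []
    queue = deque([start])
    while queue:
        user = queue.popleft()
        for neighbor in get_neighbors(user):
            edges.append((user, neighbor))   # record the link either way
            if neighbor not in visited:      # but enqueue each profile once
                visited.add(neighbor)
                queue.append(neighbor)
    return visited, edges

# Toy graph standing in for aNobii profiles (invented for illustration).
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["a"]}
nodes, links = crawl("a", lambda u: graph.get(u, []))
```

Note that "d" is never reached: the crawl only discovers profiles reachable from the starting node, which is why the team's map covers the connected portion of the network.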

BBC - Future - Will religion ever disappear?

  • A growing number of people, millions worldwide, say they believe that life definitively ends at death
  • “Very few societies are more religious today than they were 40 or 50 years ago,”
  • Decline, however, does not mean disappearance
  • For some reason, religion seems to give meaning to suffering – much more so than any secular ideal or belief that we know of.”
  • This is because a god-shaped hole seems to exist in our species’ neuropsychology, thanks to a quirk of our evolution
  • System 1, on the other hand, is intuitive, instinctual and automatic
  • Our minds crave purpose and explanation. “With education, exposure to science and critical thinking, people might stop trusting their intuitions,” Norenzayan says. “But the intuitions are there.”
  • experts guess that religion will probably never go away
  • article questioning if science could ever replace religion

There's nothing wrong with grade inflation - The Washington Post

  • By the early ’90s, so long as one had the good sense to major in the humanities — all bets were off in the STEM fields — it was nearly impossible to get a final grade below a B-minus at an elite college. According to a 2012 study, the average college GPA, which in the 1930s was a C-plus, had risen to a B at public universities and a B-plus at private schools. At Duke, Pomona and Harvard, D’s and F’s combine for just 2 percent of all grades. A Yale report found that 62 percent of all Yale grades are A or A-minus. According to a 2013 article in the Harvard Crimson, the median grade at Harvard was an A-minus, while the most common grade was an A.
  • The result is widespread panic about grade inflation at elite schools. (The phenomenon is not as prevalent at community colleges and less-selective universities.) Some blame students’ consumer mentality, a few see a correlation with small class sizes (departments with falling enrollments want to keep students happy), and many cite a general loss of rigor in a touchy-feely age.
  • Yet whenever elite schools have tried to fight grade inflation, it’s been a mess. Princeton instituted strict caps on the number of high grades awarded, then abandoned the plan, saying the caps dissuaded applicants and made students miserable. At Wellesley, grade-inflated humanities departments mandated that the average result in their introductory and intermediate classes not exceed a B-plus. According to one study, enrollment fell by one-fifth, and students were 30 percent less likely to major in one of these subjects.
  • I liked the joy my students found when they actually earned a grade they’d been reaching for. But whereas I once thought we needed to contain grades, I now see that we may as well let them float skyward. If grade inflation is bad, fighting it is worse. Our goal should be ending the centrality of grades altogether. For years, I feared that a world of only A’s would mean the end of meaningful grades; today, I’m certain of it. But what’s so bad about that?
  • It’s easy to see why schools want to fight grade inflation. Grades should motivate certain students: those afraid of the stigma of a bad grade or those ambitious, by temperament or conditioning, to succeed in measurable ways. Periodic grading during a term, on quizzes, tests or papers, provides feedback to students, which should enable them to do better. And grades theoretically signal to others, such as potential employers or graduate schools, how well the student did. (Grade-point averages are also used for prizes and class rankings, though that doesn’t strike me as an important feature.)
  • But it’s not clear that grades work well as motivators. Although recent research on the effects of grades is limited, several studies in the 1970s, 1980s and 1990s measured how students related to a task or a class when it was graded compared to when it was ungraded. Overall, graded students are less interested in the topic at hand and, for obvious, common-sense reasons, more inclined to pick the easiest possible task when given the chance. In the words of progressive-education theorist Alfie Kohn, author of “The Homework Myth,” “the quality of learning declines” when grades are introduced, becoming “shallower and more superficial when the point is to get a grade.”
  • Even where grades can be useful, as in describing what material a student has mastered, they are remarkably crude instruments. Yes, the student who gets a 100 on a calculus exam probably grasps the material better than the student with a 60 — but only if she retains the knowledge, which grades can’t show.
  • I still can’t say very well what separates a B from an A. What’s more, I never see the kind of incompetence or impudence that would merit a D or an F. And now, in our grade-inflated world, it’s even harder to use grades to motivate, or give feedback, or send a signal to future employers or graduate schools.
  • According to a 2012 study by the Chronicle of Higher Education, GPA was seventh out of eight factors employers considered in hiring, behind internships, extracurricular activities and previous employment. Last year, Stanford’s registrar told the Chronicle about “a clamor” from employers “for something more meaningful” than the traditional transcript. The Lumina Foundation gave a $1.27 million grant to two organizations for college administrators working to develop better student records, with grades only one part of a student’s final profile.
  • Some graduate schools, too, have basically ditched grades. “As long as you don’t bomb and flunk out, grades don’t matter very much in M.F.A. programs,” the director of one creative-writing program told the New York Times. To top humanities PhD programs, letters of reference and writing samples matter more than overall GPA (although students are surely expected to have received good grades in their intended areas of study). In fact, it’s impossible to get into good graduate or professional schools without multiple letters of reference, which have come to function as the kind of rich, descriptive comments that could go on transcripts in place of grades.
  • suggests that GPAs serve not to validate students from elite schools but to keep out those from less-prestigious schools and large public universities, where grades are less inflated. Grades at community colleges “have actually dropped” over the years, according to Stuart Rojstaczer, a co-author of the 2012 grade-inflation study. That means we have two systems: one for students at elite schools, who get jobs based on references, prestige and connections, and another for students everywhere else, who had better maintain a 3.0. Grades are a tool increasingly deployed against students without prestige.
  • The trouble is that, while it’s relatively easy for smaller colleges to go grade-free, with their low student-to-teacher ratios, it’s tough for professors at larger schools, who must evaluate more students, more quickly, with fewer resources. And adjuncts teaching five classes for poverty wages can’t write substantial term-end comments, so grades are a necessity if they want to give any feedback at all.
  • It would mean hiring more teachers and paying them better (which schools should do anyway). And if transcripts become more textured, graduate-school admission offices and employers will have to devote more resources to reading them, and to getting to know applicants through interviews and letters of reference — a salutary trend that is underway already.
  • When I think about getting rid of grades, I think of happier students, with whom I have more open, democratic relationships. I think about being forced to pay more attention to the quiet ones, since I’ll have to write something truthful about them, too. I’ve begun to wonder if a world without grades may be one of those states of affairs (like open marriages, bicycle lanes and single-payer health care) that Americans resist precisely because they seem too good, suspiciously good. Nothing worth doing is supposed to come easy.
  • Alfie Kohn, too, sees ideology at work in the grade-inflation panic. “Most of what powers the arguments against grade inflation is a very right-wing idea that excellence consists in beating everyone else around you,” he says. “Even when you have sorted them — even when they get to Harvard! — we have to sort them again.” In other words, we can trust only a system in which there are clear winners and losers.

Diversity Makes You Brighter - The New York Times

  • Diversity improves the way people think. By disrupting conformity, racial and ethnic diversity prompts people to scrutinize facts, think more deeply and develop their own opinions. Our findings show that such diversity actually benefits everyone, minorities and majority alike.
  • When trading, participants could observe the behavior of their counterparts and decide what to make of it. Think of yourself in similar situations: Interacting with others can bring new ideas into view, but it can also cause you to adopt popular but wrong ones.
  • It depends how deeply you contemplate what you observe. So if you think that something is worth $100, but others are bidding $120 for it, you may defer to their judgment and up the ante (perhaps contributing to a price bubble) or you might dismiss them and stand your ground.
  • When participants were in diverse company, their answers were 58 percent more accurate. The prices they chose were much closer to the true values of the stocks. As they spent time interacting in diverse groups, their performance improved. In homogeneous groups, whether in the United States or in Asia, the opposite happened. When surrounded by others of the same ethnicity or race, participants were more likely to copy others, in the wrong direction. Mistakes spread as participants seemingly put undue trust in others’ answers, mindlessly imitating them. In the diverse groups, across ethnicities and locales, participants were more likely to distinguish between wrong and accurate answers. Diversity brought cognitive friction that enhanced deliberation.
  • For our study, we intentionally chose a situation that required analytical thinking, seemingly unaffected by ethnicity or race. We wanted to understand whether the benefits of diversity stem, as the common thinking has it, from some special perspectives or skills of minorities.
  • What we actually found is that these benefits can arise merely from the very presence of minorities.
  • before participants interacted, there were no statistically significant differences between participants in the homogeneous or diverse groups. Minority members did not bring some special knowledge.
  • When surrounded by people “like ourselves,” we are easily influenced, more likely to fall for wrong ideas. Diversity prompts better, critical thinking. It contributes to error detection. It keeps us from drifting toward miscalculation.
  • Our findings suggest that racial and ethnic diversity matter for learning, the core purpose of a university. Increasing diversity is not only a way to let the historically disadvantaged into college, but also to promote sharper thinking for everyone.

Who's to blame when fake science gets published?

  • The now-discredited study got headlines because it offered hope. It seemed to prove that our sense of empathy, our basic humanity, could overcome prejudice and bridge seemingly irreconcilable differences. It was heartwarming, and it was utter bunkum. The good news is that this particular case of scientific fraud isn't going to do much damage to anyone but the people who concocted and published the study. The bad news is that the alleged deception is a symptom of a weakness at the heart of the scientific establishment.
  • When it was published in Science magazine last December, the research attracted academic as well as media attention; it seemed to provide solid evidence that increasing contact between minority and majority groups could reduce prejudice.
  • But in May, other researchers tried to reproduce the study using the same methods, and failed. Upon closer examination, they uncovered a number of devastating "irregularities" - statistical quirks and troubling patterns - that strongly implied that the whole LaCour/Green study was based upon made-up data.
  • The data hit the fan, at which point Green distanced himself from the survey and called for the Science article to be retracted. The professor even told Retraction Watch, the website that broke the story, that all he'd really done was help LaCour write up the findings.
  • Science magazine didn't shoulder any blame, either. In a statement, editor in chief Marcia McNutt said the magazine was essentially helpless against the depredations of a clever hoaxer: "No peer review process is perfect, and in fact it is very difficult for peer reviewers to detect artful fraud."
  • This is, unfortunately, accurate. In a scientific collaboration, a smart grad student can pull the wool over his adviser's eyes - or vice versa. And if close collaborators aren't going to catch the problem, it's no surprise that outside reviewers dragooned into critiquing the research for a journal won't catch it either. A modern science article rests on a foundation of trust.
  • If the process can't catch such obvious fraud - a hoax the perpetrators probably thought wouldn't work - it's no wonder that so many scientists feel emboldened to sneak a plagiarised passage or two past the gatekeepers.
  • Major peer-review journals tend to accept big, surprising, headline-grabbing results when those are precisely the ones that are most likely to be wrong.
  • Despite the artful passing of the buck by LaCour's senior colleague and the editors of Science magazine, affairs like this are seldom truly the product of a single dishonest grad student. Scientific publishers and veteran scientists - even when they don't take an active part in deception - must recognise that they are ultimately responsible for the culture producing the steady drip-drip-drip of falsification, exaggeration and outright fabrication eroding the discipline they serve.

The Rise of Hate Search - The New York Times

  • after the media first reported that at least one of the shooters had a Muslim-sounding name, a disturbing number of Californians had decided what they wanted to do with Muslims: kill them.
  • the rest of America searched for the phrase “kill Muslims” with about the same frequency that they searched for “martini recipe,” “migraine symptoms” and “Cowboys roster.”
  • People often have vicious thoughts. Sometimes they share them on Google. Do these thoughts matter? Yes. Using weekly data from 2004 to 2013, we found a direct correlation between anti-Muslim searches and anti-Muslim hate crimes.
  • In 2014, according to the F.B.I., anti-Muslim hate crimes represented 16.3 percent of the total of 1,092 reported offenses. Anti-Semitism still led the way as a motive for hate crimes, at 58.2 percent.
  • Hate crimes may seem chaotic and unpredictable, a consequence of random neurons that happen to fire in the brains of a few angry young men. But we can explain some of the rise and fall of anti-Muslim hate crimes just based on what people are Googling about Muslims.
  • If our model is right, Islamophobia and thus anti-Muslim hate crimes are currently higher than at any time since the immediate aftermath of the Sept. 11 attacks.
  • How can these Google searches track Islamophobia so well? Who searches for “I hate Muslims” anyway? We often think of Google as a source from which we seek information directly, on topics like the weather, who won last night’s game or how to make apple pie. But sometimes we type our uncensored thoughts into Google, without much hope that Google will be able to help us. The search window can serve as a kind of confessional.
  • It is not just that hatred against Muslims is extremely high today. It’s that it’s exceptional compared with prejudice against every other group in the United States.
  • “If someone is willing to say ‘I hate them’ or ‘they disgust me,’ we know that those emotions are as good a predictor of behavior as actual intent,” said Susan Fiske, a social psychologist at Princeton
  • Google searches seem to suffer from selection bias: Instead of asking a random sample of Americans how they feel, you just get information from those who are motivated to search. But this restriction may actually help search data predict hate crimes.
  • “Google searches answer a different question: What do people excited enough by an issue to comment on it think and believe about it? The answer to this question, just because it is unrepresentative of the public as a whole, may be a better bet to predict hate crimes.”
  • While the vast majority of Muslim Americans won’t be victims of hate crimes, few escape the “constant sense of fear and paranoia” that they or their loved ones might be next, said Rana Ibrahem
  • What about the other side of the coin — compassion and understanding? Do they stand a chance against hate? Searches for information about Islam and Muslims did rise after the attacks in Paris and San Bernardino. Yet they rose far less than searches for hate did. “Who is Muhammad?” “what do Muslims believe?” and “what does the Quran say?” for instance, were no match for intolerance.
  • Google searches expressing moods, rather than looking for information, represent a tiny sample of everyone who is actually thinking those thoughts.
  • The search data also tells us that changes in Americans’ policy concerns have been dramatic. They happened, quite literally, within minutes of the terror attacks. Before the Paris attacks, 60 percent of Americans’ searches that took an obvious view of Syrian refugees saw them positively, asking how they could “help,” “volunteer” or “aid.” The other 40 percent were negative and mostly expressed skepticism about security. After Paris, however, the share of people opposed to refugees rose to 80 percent.
  • One idea might be to increase cultural integration. This is based on the “contact hypothesis”: If more Americans have Muslim neighbors, they will learn not to harbor irrational hate. We did not find support for this in the data — in fact, we found evidence for the opposite.
  • That’s evidence for the dominance of the “racial threat” hypothesis, which predicts that proximity breeds tension, not trust.
  • Another solution might be for leaders to talk about the importance of tolerance and the irrationality of hatred, as President Obama did in his Oval Office speech last Sunday night. He asked Americans to reject discrimination and religious tests for immigration. The reactions to his speech offer an excellent opportunity to see what works and what doesn’t work.
  • There was one line, however, that did trigger the type of response Mr. Obama might have wanted. He said, “Muslim Americans are our friends and our neighbors, our co-workers, our sports heroes and yes, they are our men and women in uniform, who are willing to die in defense of our country.” After this line, for the first time in more than a year, the top Googled noun after “Muslim” was not “terrorists,” “extremists” or “refugees.” It was “athletes,” followed by “soldiers.” And, in fact, “athletes” kept the top spot for a full day afterward.
  • On the whole, though, the response to the president’s speech shows that appealing to the better angels of an angry mob will most likely just backfire.
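The weekly-correlation analysis described above pairs two time series — search volume and reported hate crimes, week by week — and measures how tightly they move together. A toy Pearson-correlation sketch; the numbers below are invented for illustration and are not the study's data:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series.

    Assumes neither series is constant (nonzero variance).
    """
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

# Hypothetical weekly counts, made up to show the mechanics only.
searches = [10, 12, 9, 30, 45, 20, 15]   # weekly search volume
crimes   = [1, 1, 0, 3, 5, 2, 1]         # weekly reported offenses
r = pearson(searches, crimes)            # close to 1.0 when the series co-move
```

A coefficient near 1 means the weeks with spikes in one series are the weeks with spikes in the other — which is the pattern the authors report, not proof that one causes the other.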

Is Science Kind of a Scam? - The New Yorker

  • No well-tested scientific concept is more astonishing than the one that gives its name to a new book by the Scientific American contributing editor George Musser, “Spooky Action at a Distance”
  • The ostensible subject is the mechanics of quantum entanglement; the actual subject is the entanglement of its observers.
  • his question isn’t so much how this weird thing can be true as why, given that this weird thing had been known about for so long, so many scientists were so reluctant to confront it. What keeps a scientific truth from spreading?
  • it is as if two magic coins, flipped at different corners of the cosmos, always came up heads or tails together. (The spooky action takes place only in the context of simultaneous measurement. The particles share states, but they don’t send signals.)
  • fashion, temperament, zeitgeist, and sheer tenacity affected the debate, along with evidence and argument.
  • The certainty that spooky action at a distance takes place, Musser says, challenges the very notion of “locality,” our intuitive sense that some stuff happens only here, and some stuff over there. What’s happening isn’t really spooky action at a distance; it’s spooky distance, revealed through an action.
  • Why, then, did Einstein’s question get excluded for so long from reputable theoretical physics? The reasons, unfolding through generations of physicists, have several notable social aspects,
  • What started out as a reductio ad absurdum became proof that the cosmos is in certain ways absurd. What began as a bug became a feature and is now a fact.
  • “If poetry is emotion recollected in tranquility, then science is tranquility recollected in emotion.” The seemingly neutral order of the natural world becomes the sounding board for every passionate feeling the physicist possesses.
  • Musser explains that the big issue was settled mainly by being pushed aside. Generational imperatives trumped evidentiary ones. The things that made Einstein the lovable genius of popular imagination were also the things that made him an easy object of condescension. The hot younger theorists patronized him,
  • There was never a decisive debate, never a hallowed crucial experiment, never even a winning argument to settle the case, with one physicist admitting, “Most physicists (including me) accept that Bohr won the debate, although like most physicists I am hard pressed to put into words just how it was done.”
  • Arguing about non-locality went out of fashion, in this account, almost the way “Rock Around the Clock” displaced Sinatra from the top of the charts.
  • The same pattern of avoidance and talking-past and taking on the temper of the times turns up in the contemporary science that has returned to the possibility of non-locality.
  • the revival of “non-locality” as a topic in physics may be due to our finding the metaphor of non-locality ever more palatable: “Modern communications technology may not technically be non-local but it sure feels that it is.”
  • Living among distant connections, where what happens in Bangalore happens in Boston, we are more receptive to the idea of such a strange order in the universe.
  • The “indeterminacy” of the atom was, for younger European physicists, “a lesson of modernity, an antidote to a misplaced Enlightenment trust in reason, which German intellectuals in the 1920’s widely held responsible for their country’s defeat in the First World War.” The tonal and temperamental difference between the scientists was as great as the evidence they called on.
  • Science isn’t a slot machine, where you drop in facts and get out truths. But it is a special kind of social activity, one where lots of different human traits—obstinacy, curiosity, resentment of authority, sheer cussedness, and a grudging readiness to submit pet notions to popular scrutiny—end by producing reliable knowledge
  • What was magic became mathematical and then mundane. “Magical” explanations, like spooky action, are constantly being revived and rebuffed, until, at last, they are reinterpreted and accepted. Instead of a neat line between science and magic, then, we see a jumpy, shifting boundary that keeps getting redrawn
  • Real-world demarcations between science and magic, Musser’s story suggests, are like Bugs’s: made on the move and as much a trap as a teaching aid.
  • In the past several decades, certainly, the old lines between the history of astrology and astronomy, and between alchemy and chemistry, have been blurred; historians of the scientific revolution no longer insist on a clean break between science and earlier forms of magic.
  • Where once logical criteria between science and non-science (or pseudo-science) were sought and taken seriously—Karl Popper’s criterion of “falsifiability” was perhaps the most famous, insisting that a sound theory could, in principle, be proved wrong by one test or another—many historians and philosophers of science have come to think that this is a naïve view of how the scientific enterprise actually works.
  • They see a muddle of coercion, old magical ideas, occasional experiment, hushed-up failures—all coming together in a social practice that gets results but rarely follows a definable logic.
  • Yet the old notion of a scientific revolution that was really a revolution is regaining some credibility.
  • David Wootton, in his new, encyclopedic history, “The Invention of Science” (Harper), recognizes the blurred lines between magic and science but insists that the revolution lay in the public nature of the new approach.
  • What killed alchemy was the insistence that experiments must be openly reported in publications which presented a clear account of what had happened, and they must then be replicated, preferably before independent witnesses.
  • Wootton, while making little of Popper’s criterion of falsifiability, makes it up to him by borrowing a criterion from his political philosophy. Scientific societies are open societies. One day the lunar tides are occult, the next day they are science, and what changes is the way in which we choose to talk about them.
  • Wootton also insists, against the grain of contemporary academia, that single observed facts, what he calls “killer facts,” really did polish off antique authorities
  • once we agree that the facts are facts, they can do amazing work. Traditional Ptolemaic astronomy, in place for more than a millennium, was destroyed by what Galileo discovered about the phases of Venus. That killer fact “serves as a single, solid, and strong argument to establish its revolution around the Sun, such that no room whatsoever remains for doubt,” Galileo wrote, and Wootton adds, “No one was so foolish as to dispute these claims.
  • Several things flow from Wootton’s view. One is that “group think” in the sciences is often true think. Science has always been made in a cloud of social networks.
  • There has been much talk in the pop-sci world of “memes”—ideas that somehow manage to replicate themselves in our heads. But perhaps the real memes are not ideas or tunes or artifacts but ways of making them—habits of mind rather than products of mind
  • Is science, then, a club like any other, with fetishes and fashions, with schemers, dreamers, and blackballed applicants? Is there a real demarcation to be made between science and every other kind of social activity?
  • The claim that basic research is valuable because it leads to applied technology may be true but perhaps is not at the heart of the social use of the enterprise. The way scientists do think makes us aware of how we can think

The Joy of Psyching Myself Out­ - The New York Times - 0 views

  • IS it possible to think scientifically and creatively at once? Can you be both a psychologist and a writer?
  • “A writer must be as objective as a chemist,” Anton Chekhov wrote in 1887. “He must abandon the subjective line; he must know that dung heaps play a very reasonable part in a landscape.” Chekhov’s chemist is a naturalist — someone who sees reality for what it is, rather than what it should be. In that sense, the starting point of the psychologist and the writer is the same: a curiosity that leads you to observe life in all its dimensions.
  • Without verification, we can’t always trust what we see — or rather, what we think we see. Whether we’re psychologists or writers (or anything else), our eyes are never the impartial eyes of Chekhov’s chemist. Our expectations, our wants and shoulds, get in the way. Take, once again, lying. Why do we think we know how liars behave? Liars should divert their eyes. They should feel ashamed and guilty and show the signs of discomfort that such feelings engender. And because they should, we think they do.
  • ...6 more annotations...
  • The desire for the world to be what it ought to be and not what it is permeates experimental psychology as much as writing, though. There’s experimental bias and the problem known in the field as “demand characteristics” — when researchers end up finding what they want to find by cuing participants to act a certain way. It’s also visible when psychologists choose to study one thing rather than another, dismiss evidence that doesn’t mesh with their worldview while embracing that which does. The subjectivity we tend to associate with the writerly way of looking may simply be more visible in that realm rather than exclusive to it.
  • “There is no other source of knowledge of the universe but the intellectual manipulation of carefully verified observations,” he said.
  • Intuition and inspiration, he went on, “can safely be counted as illusions, as fulfillments of wishes.” They are not to be relied on as evidence of any sort. “Science takes account of the fact that the mind of man creates such demands and is ready to trace their source, but it has not the slightest ground for thinking them justified.”
  • That is what both the psychologist and the writer should strive for: a self-knowledge that allows you to look in order to discover, without agenda, without preconception, without knowing or caring if what you’re seeing is wrong or right in your scheme of the world. It’s harder than it sounds. For one thing, you have to possess the self-knowledge that will allow you to admit when you’re wrong.
  • most new inquiries never happened — in a sense, it meant that objectivity was more an ideal than a reality. Each study was selected for a reason other than intrinsic interest.
  • Isolation precludes objectivity. It’s in the merging not simply of ways of seeing but also of modes of thought that a truly whole perception of reality may eventually emerge. Or at least that way we can realize its ultimate impossibility — and that’s not nothing, either.

The Joy of Psyching Myself Out­ - The New York Times - 0 views

  • that neat separation is not just unwarranted; it’s destructive
  • Although it’s often presented as a dichotomy (the apparent subjectivity of the writer versus the seeming objectivity of the psychologist), it need not be.
  • IS it possible to think scientifically and creatively at once? Can you be both a psychologist and a writer?
  • ...10 more annotations...
  • “A writer must be as objective as a chemist,” Anton Chekhov wrote in 1887. “He must abandon the subjective line; he must know that dung heaps play a very reasonable part in a landscape.”
  • At the turn of the century, psychology was a field quite unlike what it is now. The theoretical musings of William James were the norm (a wry commenter once noted that William James was the writer, and his brother Henry, the psychologist)
  • Freud was a breed of psychologist that hardly exists anymore: someone who saw the world as both writer and psychologist, and for whom there was no conflict between the two. That boundary melding allowed him to posit the existence of cognitive mechanisms that wouldn’t be empirically proved for decades,
  • Freud got it brilliantly right and brilliantly wrong. The rightness is as good a justification as any of the benefits, the necessity even, of knowing how to look through the eyes of a writer. The wrongness is part of the reason that the distinction between writing and experimental psychology has grown far more rigid than it was a century ago.
  • the signs people associate with liars often have little empirical evidence to support them. Therein lies the psychologist’s distinct role and her necessity. As a writer, you look in order to describe, but you remain free to use that description however you see fit. As a psychologist, you look to describe, yes, but also to verify.
  • Without verification, we can’t always trust what we see — or rather, what we think we see.
  • The desire for the world to be what it ought to be and not what it is permeates experimental psychology as much as writing, though. There’s experimental bias and the problem known in the field as “demand characteristics” — when researchers end up finding what they want to find by cuing participants to act a certain way.
  • IN 1932, when he was in his 70s, Freud gave a series of lectures on psychoanalysis. In his final talk, “A Philosophy of Life,” he focused on clarifying an important caveat to his research: His followers should not be confused by the seemingly internal, and thus possibly subjective, nature of his work. “There is no other source of knowledge of the universe but the intellectual manipulation of carefully verified observations,” he said.
  • That is what both the psychologist and the writer should strive for: a self-knowledge that allows you to look in order to discover, without agenda, without preconception, without knowing or caring if what you’re seeing is wrong or right in your scheme of the world. It’s harder than it sounds. For one thing, you have to possess the self-knowledge that will allow you to admit when you’re wrong.
  • Even with the best intentions, objectivity can prove a difficult companion. I left psychology behind because I found its structural demands overly hampering. I couldn’t just pursue interesting lines of inquiry; I had to devise a set of experiments, see how feasible they were, both technically and financially, consider how they would reflect on my career. That meant that most new inquiries never happened — in a sense, it meant that objectivity was more an ideal than a reality. Each study was selected for a reason other than intrinsic interest.

'Nothing on this page is real': How lies become truth in online America - The Washingto... - 0 views

  • “Share if you’re outraged!” his posts often read, and thousands of people on Facebook had clicked “like” and then “share,” most of whom did not recognize his posts as satire. Instead, Blair’s page had become one of the most popular on Facebook among Trump-supporting conservatives over 55.
  • “Nothing on this page is real,” read one of the 14 disclaimers on Blair’s site, and yet in the America of 2018 his stories had become real, reinforcing people’s biases, spreading onto Macedonian and Russian fake news sites, amassing an audience of as many as 6 million visitors each month who thought his posts were factual.
  • “No matter how racist, how bigoted, how offensive, how obviously fake we get, people keep coming back,” Blair once wrote, on his own personal Facebook page. “Where is the edge? Is there ever a point where people realize they’re being fed garbage and decide to return to reality?”
  • ...2 more annotations...
  • Chapian didn’t believe everything she read online, but she was also distrustful of mainstream fact-checkers and reported news. It sometimes felt to her like real facts had become indiscernible — that the truth was often somewhere in between. What she trusted most was her own ability to think critically and discern the truth, and increasingly her instincts aligned with the online community where she spent most of her time.
  • Her number of likes and shares on Facebook increased each year until she was sometimes awakening to check her news feed in the middle of the night, liking and commenting on dozens of posts each day. She felt as if she was being let in on a series of dark revelations about the United States, and it was her responsibility to see and to share them.

Can truth survive this president? An honest investigation. - The Washington Post - 0 views

  • in the summer of 2002, long before “fake news” or “post-truth” infected the vernacular, one of President George W. Bush’s top advisers mocked a journalist for being part of the “reality-based community.” Seeking answers in reality was for suckers, the unnamed adviser explained. “We’re an empire now, and when we act, we create our own reality.”
  • This was the hubris and idealism of a post-Cold War, pre-Iraq War superpower: If you exert enough pressure, events will bend to your will.
  • the deceit emanating from the White House today is lazier, more cynical. It is not born of grand strategy or ideology; it is impulsive and self-serving. It is not arrogant, but shameless.
  • ...26 more annotations...
  • Bush wanted to remake the world. President Trump, by contrast, just wants to make it up as he goes along
  • Through all their debates over who is to blame for imperiling truth (whether Trump, postmodernism, social media or Fox News), as well as the consequences (invariably dire) and the solutions (usually vague), a few conclusions materialize, should you choose to believe them.
  • There is a pattern and logic behind the dishonesty of Trump and his surrogates; however, it’s less multidimensional chess than the simple subordination of reality to political and personal ambition
  • Trump’s untruth sells best precisely when feelings and instincts overpower facts, when America becomes a safe space for fabrication.
  • Rand Corp. scholars Jennifer Kavanagh and Michael D. Rich point to the Gilded Age, the Roaring Twenties and the rise of television in the mid-20th century as recent periods of what they call “Truth Decay” — marked by growing disagreement over facts and interpretation of data; a blurring of lines between opinion, fact and personal experience; and diminishing trust in once-respected sources of information.
  • In eras of truth decay, “competing narratives emerge, tribalism within the U.S. electorate increases, and political paralysis and dysfunction grow,”
  • Once you add the silos of social media as well as deeply polarized politics and deteriorating civic education, it becomes “nearly impossible to have the types of meaningful policy debates that form the foundation of democracy.”
  • To interpret our era’s debasement of language, Kakutani reflects perceptively on the World War II-era works of Victor Klemperer, who showed how the Nazis used “words as ‘tiny doses of arsenic’ to poison and subvert the German culture,” and of Stefan Zweig, whose memoir “The World of Yesterday” highlights how ordinary Germans failed to grasp the sudden erosion of their freedoms.
  • Kakutani calls out lefty academics who for decades preached postmodernism and social constructivism, which argued that truth is not universal but a reflection of relative power, structural forces and personal vantage points.
  • postmodernists rejected Enlightenment ideals as “vestiges of old patriarchal and imperialist thinking,” Kakutani writes, paving the way for today’s violence against fact in politics and science.
  • “dumbed-down corollaries” of postmodernist thought have been hijacked by Trump’s defenders, who use them to explain away his lies, inconsistencies and broken promises.
  • intelligent-design proponents and later climate deniers drew from postmodernism to undermine public perceptions of evolution and climate change. “Even if right-wing politicians and other science deniers were not reading Derrida and Foucault, the germ of the idea made its way to them: science does not have a monopoly on the truth,
  • McIntyre quotes at length from mea culpas by postmodernist and social constructivist writers agonizing over what their theories have wrought, shocked that conservatives would use them for nefarious purposes
  • pro-Trump troll and conspiracy theorist Mike Cernovich, who helped popularize the “Pizzagate” lie, has forthrightly cited his unlikely influences. “Look, I read postmodernist theory in college,” Cernovich told the New Yorker in 2016. “If everything is a narrative, then we need alternatives to the dominant narrative. I don’t seem like a guy who reads [Jacques] Lacan, do I?”
  • When truth becomes malleable and contestable regardless of evidence, a mere tussle of manufactured narratives, it becomes less about conveying facts than about picking sides, particularly in politics.
  • In “On Truth,” Cambridge University philosopher Simon Blackburn writes that truth is attainable, if at all, “only at the vanishing end points of enquiry,” adding that, “instead of ‘facts first’ we may do better if we think of ‘enquiry first,’ with the notion of fact modestly waiting to be invited to the feast afterward.
  • He is concerned, but not overwhelmingly so, about the survival of truth under Trump. “Outside the fevered world of politics, truth has a secure enough foothold,” Blackburn writes. “Perjury is still a serious crime, and we still hope that our pilots and surgeons know their way about.
  • Kavanagh and Rich offer similar consolation: “Facts and data have become more important in most other fields, with political and civil discourse being striking exceptions. Thus, it is hard to argue that the world is truly ‘post-fact.’ ”
  • McIntyre argues persuasively that our methods of ascertaining truth — not just the facts themselves — are under attack, too, and that this assault is especially dangerous.
  • Ideologues don’t just disregard facts they disagree with, he explains, but willingly embrace any information, however dubious, that fits their agenda. “This is not the abandonment of facts, but a corruption of the process by which facts are credibly gathered and reliably used to shape one’s beliefs about reality. Indeed, the rejection of this undermines the idea that some things are true irrespective of how we feel about them.”
  • “It is hardly a depressing new phenomenon that people’s beliefs are capable of being moved by their hopes, grievances and fears,” Blackburn writes. “In order to move people, objective facts must become personal beliefs.” But it can’t work — or shouldn’t work — in reverse.
  • More than fearing a post-truth world, Blackburn is concerned by a “post-shame environment,” in which politicians easily brush off their open disregard for truth.
  • it is human nature to rationalize away the dissonance. “Why get upset by his lies, when all politicians lie?” Kakutani asks, distilling the mind-set. “Why get upset by his venality, when the law of the jungle rules?”
  • So any opposition is deemed a witch hunt, or fake news, rigged or just so unfair. Trump is not killing the truth. But he is vandalizing it, constantly and indiscriminately, diminishing its prestige and appeal, coaxing us to look away from it.
  • the collateral damage includes the American experiment.
  • “One of the most important ways to fight back against post-truth is to fight it within ourselves,” he writes, whatever our particular politics may be. “It is easy to identify a truth that someone else does not want to see. But how many of us are prepared to do this with our own beliefs? To doubt something that we want to believe, even though a little piece of us whispers that we do not have all the facts?”

Building a Nation of Know-Nothings - NYTimes.com - 1 views

  • It’s not just that 47 percent of Republicans believe the lie that Obama is a Muslim, or that 27 percent in the party doubt that the president of the United States is a citizen. But fully half of them believe falsely that the big bailout of banks and insurance companies under TARP was enacted by Obama, and not by President Bush.
  • Take a look at Tuesday night’s box score in the baseball game between New York and Toronto. The Yankees won, 11-5. Now look at the weather summary, showing a high of 71 for New York. The score and temperature are not subject to debate. Yet a president’s birthday or whether he was even in the White House on the day TARP was passed are apparently open questions. A growing segment of the party poised to take control of Congress has bought into denial of the basic truths of Barack Obama’s life. What’s more, this astonishing level of willful ignorance has come about largely by design, and has been aided by a press afraid to call out the primary architects of the lies.
  • It would be nice to dismiss the stupid things that Americans believe as harmless, the price of having such a large, messy democracy.
  • ...1 more annotation...
  • So what if one in five believe the sun revolves around the earth, or aren’t sure from which country the United States gained its independence? But false belief in weapons of mass destruction led the United States to a trillion-dollar war. And trust in rising home value as a truism as reliable as a sunrise was a major contributor to the catastrophic collapse of the economy. At its worst extreme, a culture of misinformation can produce something like Iran, which is run by a Holocaust denier.
  •  
    A major part of the US population now denies basic facts, influenced by a deliberate partisan misinformation campaign tolerated by the press.

Books are getting shorter; here's why - 0 views

  • "A leading brain scientist in England points out that texting actually decreases the ability to think in complex ways because it eliminates complexity in sentence structure. Put it all together and it seems that no one has patience to sit quietly and read a book, as we might have a generation or even ten years ago."
  • "People are publishing books that are radically shorter than in the past," he says.
  • But books aren't just getting shorter, says Levin. What the reader wants from the author is changing too.
  • ...4 more annotations...
  • "But it's also a little bit of intellectual laziness," he adds. "That's what happens in an era when people are famous for being famous instead of famous for having accomplished something distinctive. If you and the media say you're special, you probably are."
  • "It's a paradox," he says. "We distrust authority if it's in the form of a major institution, like government, business or Wall Street. But if an individual claims authority in a given field, we assume the person must be telling the truth about his or her credentials. It's the natural trust we extend others -- we typically assume that people are who they say they are."
  • He says readers no longer want an author to prove his or her assertions. They just want to know the author is giving legitimate answers to their questions.
  • With a printed book people feel more committed to reading the entire thing, but with a digital book not so much, which is another reason a lot of today's books are shorter.
  •  
    Interesting analysis on how social media affect human behavior. Intellectual laziness and our desire for simplicity lead us down the path of logical fallacies.

What's behind the confidence of the incompetent? This suddenly popular psychological ph... - 0 views

  • Someone who has very little knowledge in a subject claims to know a lot. That person might even boast about being an expert.
  • This phenomenon has a name: the Dunning-Kruger effect. It’s not a disease, syndrome or mental illness; it is present in everybody to some extent, and it’s been around as long as human cognition, though only recently has it been studied and documented in social psychology.
  • Charles Darwin followed that up in 1871 with “ignorance more frequently begets confidence than does knowledge.”
  • ...10 more annotations...
  • Put simply, incompetent people think they know more than they really do, and they tend to be more boastful about it.
  • To test Darwin’s theory, the researchers quizzed people on several topics, such as grammar, logical reasoning and humor. After each test, they asked the participants how they thought they did. Specifically, participants were asked how many of the other quiz-takers they beat.
  • Time after time, no matter the subject, the people who did poorly on the tests ranked their competence much higher
  • On average, test takers who scored as low as the 10th percentile ranked themselves near the 70th percentile. Those least likely to know what they were talking about believed they knew as much as the experts.
  • Dunning and Kruger’s results have been replicated in at least a dozen different domains: math skills, wine tasting, chess, medical knowledge among surgeons and firearm safety among hunters.
  • Even though President Trump’s statements are rife with errors, falsehoods or inaccuracies, he expresses great confidence in his aptitude. He says he does not read extensively because he solves problems “with very little knowledge other than the knowledge I [already] had.” He has said in interviews he doesn’t read lengthy reports because “I already know exactly what it is.”
  • He has “the best words” and cites his “high levels of intelligence” in rejecting the scientific consensus on climate change. Decades ago, he said he could end the Cold War: “It would take an hour and a half to learn everything there is to learn about missiles,” Trump told The Washington Post’s Lois Romano over dinner in 1984. “I think I know most of it anyway.”
  • Whether people want to understand “the other side” or they’re just looking for an epithet, the Dunning-Kruger effect works as both, Dunning said, which he believes explains the rise of interest.
  • Dunning says the effect is particularly dangerous when someone with influence or the means to do harm doesn’t have anyone who can speak honestly about their mistakes.
  • Not surprisingly (though no less concerning), Dunning’s follow-up research shows the poorest performers are also the least likely to accept criticism or show interest in self improvement.
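The self-assessment pattern described in the annotations above (test takers near the 10th percentile ranking themselves near the 70th) can be reproduced by a toy model in which each person’s self-estimate blends a little genuine insight about their performance with a flat, optimistic anchor. This is only an illustrative sketch: the `INSIGHT` and `ANCHOR` parameters and the score distribution are assumptions chosen to match the reported numbers, not Dunning and Kruger’s actual model.

```python
import bisect
import random
import statistics

random.seed(0)

N = 10_000
ANCHOR = 75.0   # assumed flat, optimistic self-view (illustrative)
INSIGHT = 0.1   # assumed weight given to actual performance (illustrative)

scores = [random.gauss(50, 15) for _ in range(N)]
ranked = sorted(scores)

def percentile(x):
    # fraction of the sample scoring below x, on a 0-100 scale
    return 100.0 * bisect.bisect_left(ranked, x) / len(ranked)

true_pct = [percentile(s) for s in scores]
# self-estimate = mostly anchor, slightly informed by true standing
self_pct = [INSIGHT * p + (1 - INSIGHT) * ANCHOR for p in true_pct]

bottom = [sp for tp, sp in zip(true_pct, self_pct) if tp <= 10]
top = [sp for tp, sp in zip(true_pct, self_pct) if tp >= 90]

print(f"bottom decile self-estimate: ~{statistics.mean(bottom):.0f}th percentile")
print(f"top decile self-estimate:    ~{statistics.mean(top):.0f}th percentile")
```

Under these assumptions the bottom decile rates itself near the 70th percentile while the top decile underrates itself, matching the shape (though not the mechanism) of the findings quoted above.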

Opinion | How Genetics Is Changing Our Understanding of 'Race' - The New York Times - 0 views

  • In 1942, the anthropologist Ashley Montagu published “Man’s Most Dangerous Myth: The Fallacy of Race,” an influential book that argued that race is a social concept with no genetic basis.
  • Beginning in 1972, genetic findings began to be incorporated into this argument. That year, the geneticist Richard Lewontin published an important study of variation in protein types in blood. He grouped the human populations he analyzed into seven “races” — West Eurasians, Africans, East Asians, South Asians, Native Americans, Oceanians and Australians — and found that around 85 percent of variation in the protein types could be accounted for by variation within populations and “races,” and only 15 percent by variation across them. To the extent that there was variation among humans, he concluded, most of it was because of “differences between individuals.”
  • In this way, a consensus was established that among human populations there are no differences large enough to support the concept of “biological race.” Instead, it was argued, race is a “social construct,” a way of categorizing people that changes over time and across countries.
  • ...29 more annotations...
  • It is true that race is a social construct. It is also true, as Dr. Lewontin wrote, that human populations “are remarkably similar to each other” from a genetic point of view.
  • this consensus has morphed, seemingly without questioning, into an orthodoxy. The orthodoxy maintains that the average genetic differences among people grouped according to today’s racial terms are so trivial when it comes to any meaningful biological traits that those differences can be ignored.
  • With the help of these tools, we are learning that while race may be a social construct, differences in genetic ancestry that happen to correlate to many of today’s racial constructs are real.
  • I have deep sympathy for the concern that genetic discoveries could be misused to justify racism. But as a geneticist I also know that it is simply no longer possible to ignore average genetic differences among “races.”
  • Groundbreaking advances in DNA sequencing technology have been made over the last two decades
  • The orthodoxy goes further, holding that we should be anxious about any research into genetic differences among populations
  • You will sometimes hear that any biological differences among populations are likely to be small, because humans have diverged too recently from common ancestors for substantial differences to have arisen under the pressure of natural selection. This is not true. The ancestors of East Asians, Europeans, West Africans and Australians were, until recently, almost completely isolated from one another for 40,000 years or longer, which is more than sufficient time for the forces of evolution to work
  • I am worried that well-meaning people who deny the possibility of substantial biological differences among human populations are digging themselves into an indefensible position, one that will not survive the onslaught of science.
  • I am also worried that whatever discoveries are made — and we truly have no idea yet what they will be — will be cited as “scientific proof” that racist prejudices and agendas have been correct all along, and that those well-meaning people will not understand the science well enough to push back against these claims.
  • This is why it is important, even urgent, that we develop a candid and scientifically up-to-date way of discussing any such differences.
  • While most people will agree that finding a genetic explanation for an elevated rate of disease is important, they often draw the line there. Finding genetic influences on a propensity for disease is one thing, they argue, but looking for such influences on behavior and cognition is another
  • Is performance on an intelligence test or the number of years of school a person attends shaped by the way a person is brought up? Of course. But does it measure something having to do with some aspect of behavior or cognition? Almost certainly.
  • Recent genetic studies have demonstrated differences across populations not just in the genetic determinants of simple traits such as skin color, but also in more complex traits like bodily dimensions and susceptibility to diseases.
  • in Iceland, there has been measurable genetic selection against the genetic variations that predict more years of education in that population just within the last century.
  • consider what kinds of voices are filling the void that our silence is creating
  • Nicholas Wade, a longtime science journalist for The New York Times, rightly notes in his 2014 book, “A Troublesome Inheritance: Genes, Race and Human History,” that modern research is challenging our thinking about the nature of human population differences. But he goes on to make the unfounded and irresponsible claim that this research is suggesting that genetic factors explain traditional stereotypes.
  • 139 geneticists (including myself) pointed out in a letter to The New York Times about Mr. Wade’s book, there is no genetic evidence to back up any of the racist stereotypes he promotes.
  • Another high-profile example is James Watson, the scientist who in 1953 co-discovered the structure of DNA, and who was forced to retire as head of the Cold Spring Harbor Laboratories in 2007 after he stated in an interview — without any scientific evidence — that research has suggested that genetic factors contribute to lower intelligence in Africans than in Europeans.
  • What makes Dr. Watson’s and Mr. Wade’s statements so insidious is that they start with the accurate observation that many academics are implausibly denying the possibility of average genetic differences among human populations, and then end with a claim — backed by no evidence — that they know what those differences are and that they correspond to racist stereotypes
  • They use the reluctance of the academic community to openly discuss these fraught issues to provide rhetorical cover for hateful ideas and old racist canards.
  • This is why knowledgeable scientists must speak out. If we abstain from laying out a rational framework for discussing differences among populations, we risk losing the trust of the public and we actively contribute to the distrust of expertise that is now so prevalent.
  • If scientists can be confident of anything, it is that whatever we currently believe about the genetic nature of differences among populations is most likely wrong.
  • For example, my laboratory discovered in 2016, based on our sequencing of ancient human genomes, that “whites” are not derived from a population that existed from time immemorial, as some people believe. Instead, “whites” represent a mixture of four ancient populations that lived 10,000 years ago and were each as different from one another as Europeans and East Asians are today.
  • For me, a natural response to the challenge is to learn from the example of the biological differences that exist between males and females
  • The differences between the sexes are far more profound than those that exist among human populations, reflecting more than 100 million years of evolution and adaptation. Males and females differ by huge tracts of genetic material
  • How do we accommodate the biological differences between men and women? I think the answer is obvious: We should both recognize that genetic differences between males and females exist and we should accord each sex the same freedoms and opportunities regardless of those differences
  • fulfilling these aspirations in practice is a challenge. Yet conceptually it is straightforward.
  • Compared with the enormous differences that exist among individuals, differences among populations are on average many times smaller, so it should be only a modest challenge to accommodate a reality in which the average genetic contributions to human traits differ.

The Facebook Fallacy: Privacy Is Up to You - The New York Times

  • As Facebook’s co-founder and chief executive parried questions from members of Congress about how the social network would protect its users’ privacy, he returned time and again to what probably sounded like an unimpeachable proposition.
  • By providing its users with greater and more transparent controls over the personal data they share and how it is used for targeted advertising, he insisted, Facebook could empower them to make their own call and decide how much privacy they were willing to put on the block.
  • providing a greater sense of control over their personal data won’t make Facebook users more cautious. It will instead encourage them to share more.
  • “Disingenuous is the adjective I had in my mind,”
  • “Fifteen years ago it would have been legitimate to propose this argument,” he added. “But it is no longer legitimate to ignore the behavioral problems and propose simply more transparency and controls.”
  • Professor Acquisti and two colleagues, Laura Brandimarte and the behavioral economist George Loewenstein, published research on this behavior nearly six years ago. “Providing users of modern information-sharing technologies with more granular privacy controls may lead them to share more sensitive information with larger, and possibly riskier, audiences,” they concluded.
  • the critical question is whether, given the tools, we can be trusted to manage the experience. The increasing body of research into how we behave online suggests not.
  • “Privacy control settings give people more rope to hang themselves,” Professor Loewenstein told me. “Facebook has figured this out, so they give you incredibly granular controls.”
  • This paradox is hardly the only psychological quirk for the social network to exploit. Consider default settings. Tons of research in behavioral economics has found that people tend to stick to the default setting of whatever is offered to them, even when they could change it easily.
  • “Facebook is acutely aware of this,” Professor Loewenstein told me. In 2005, its default settings shared most profile fields with, at most, friends of friends. Nothing was shared by default with the full internet.
  • By 2010, however, likes, name, gender, picture and a lot of other things were shared with everybody online. “Facebook changed the defaults because it appreciated their power,” Professor Loewenstein added.
  • The phenomenon even has a name: the “control paradox.”
  • people who profess concern about privacy will provide the emails of their friends in exchange for some pizza.
  • They also found that providing consumers reassuring though irrelevant information about their ability to protect their privacy will make them less likely to avoid surveillance.
  • Another experiment revealed that people are more willing to come clean about their engagement in illicit or questionable behavior when they believe others have done so, too
  • Those in the industry often argue that people don’t really care about their privacy — that they may seem concerned when they answer surveys, but still routinely accept cookies and consent to have their data harvested in exchange for cool online experiences
  • Professor Acquisti thinks this is a fallacy. The cognitive hurdles to manage our privacy online are simply too steep.
  • While we are good at handling our privacy in the offline world, lowering our voices or closing the curtains as the occasion may warrant, there are no cues online to alert us to a potential privacy invasion
  • Even if we were to know precisely what information companies like Facebook have about us and how it will be used, which we don’t, it would be hard for us to assess potential harms
  • Members of Congress have mostly let market forces prevail online, unfettered by government meddling. Privacy protection in the internet economy has relied on the belief that consumers will make rational choices
  • Europe’s stringent new privacy protection law, which Facebook has promised to apply in the United States, may do better than the American system of disclosure and consent.
  • the European system also relies mostly on faith that consumers will make rational choices.
  • The more that psychologists and behavioral economists study psychological biases and quirks, the clearer it seems that rational choices alone won’t work. “I don’t think any kind of disclosure or opt in or opt out is going to protect us from our worst instincts,”
  • What to do? Professor Acquisti suggests flipping the burden of proof. The case for privacy regulation rests on consumers’ proving that data collection is harmful. Why not ask the big online platforms like Facebook to prove they can’t work without it? If reducing data collection imposes a cost, we could figure out who bears it — whether consumers, advertisers or Facebook’s bottom line.

This Is Not a Market | Dissent Magazine

  • Given how ordinary people use the term, it’s not surprising that academic economists are a little vague about it—but you’ll be glad to hear that they know they’re being vague. A generation of economists have criticized their colleagues’ inability to specify what a “market” actually is. George Stigler, back in 1967, thought it “a source of embarrassment that so little attention has been paid to the theory of markets.” Sociologists agree: according to Harrison White, there is no “neoclassical theory of the market—[only] a pure theory of exchange.” And Wayne Baker found that the idea of the market is “typically assumed—not studied” by most economists, who “implicitly characterize ‘market’ as a ‘featureless plane.’”
  • When we say “market” now, we mean nothing particularly specific, and, at the same time, everything—the entire economy, of course, but also our lives in general. If you can name it, there’s a market in it: housing, education, the law, dating. Maybe even love is “just an economy based on resource scarcity.”
  • The use of markets to describe everything is odd, because talking about “markets” doesn’t even help us understand how the economy works—let alone the rest of our lives. Even though nobody seems to know what it means, we use the metaphor freely, even unthinkingly. Let the market decide. The markets are volatile. The markets responded poorly. Obvious facts—that the economy hasn’t rebounded after the recession—are hidden or ignored, because “the market” is booming, and what is the economy other than “the market”? Well, it’s lots of other things. We might see that if we talked about it a bit differently.
  • For instance, we might choose a different metaphor—like, say, the traffic system. Sounds ridiculous? No more so than the market metaphor. After all, we already talk about one important aspect of economic life in terms of traffic: online activity. We could describe it in market terms (the market demands Trump memes!), but we use a different metaphor, because it’s just intuitively more suitable. That last Trump meme is generating a lot of traffic. Redirect your attention as required.
  • We don’t know much about markets, because we don’t deal with them very often. But most of us know plenty about traffic systems: drivers will know the frustration of trying to turn left onto a major road, of ceaseless, pointless lane-switching on a stalled rush-hour freeway, but also the joys of clear highways.
  • We know the traffic system because, whether we like it or not, we are always involved in it, from birth
  • As of birth, Jean is in the economy—even if s/he rarely goes to a market. You can’t not be an economic actor; you can’t not be part of the transport system.
  • Consider also the composition of the traffic system and the economy. A market, whatever else it is, is always essentially the same thing: a place where people can come together to buy and sell things. We could set up a market right now, with a few fences and a sign announcing that people could buy and sell. We don’t even really need the fences. A traffic system, however, is far more complex. To begin with, the system includes publicly and privately run elements: most cars are privately owned, as are most airlines
  • If we don’t evaluate traffic systems based on their size, or their growth, how do we evaluate them? Mostly, by how well they help people get where they want to go. The market metaphor encourages us to think that all economic activity is motivated by the search for profit, and pursued in the same fashion everywhere. In a market, everyone’s desires are perfectly interchangeable. But, while everybody engages in the transport system, we have no difficulty remembering that we all want to go to different places, in different ways, at different times, at different speeds, for different reasons
  • Deciding how to improve the traffic system, how to expand people’s opportunities, is obviously a question of resource allocation and prioritization on a scale that private individuals—even traders—cannot influence on their own. That’s why governments have not historically trusted the “magic of the markets” to produce better opportunities for transport. We intuitively understand that these decisions are made at the level of mass society and public policy. And, whether you like it or not, this is true for decisions about the economy as well.
  • Thinking of the economy in terms of the market—a featureless plane, with no entry or exit costs, little need for regulation, and equal opportunity for all—obscures this basic insight. And this underlying misconception creates a lot of problems: we’ve fetishized economic growth, we’ve come to distrust government regulation, and we imagine that the inequalities in our country, and our world, are natural or justified. If we imagine the economy otherwise—as a traffic system, for example—we see more clearly how the economy actually works.
  • We see that our economic life looks a lot less like going to “market” for fun and profit than it does sitting in traffic on our morning commute, hoping against hope that we’ll get where we want to go, and on time.