
Javier E

A scholar asks, 'Can democracy survive the Internet?' - The Washington Post - 0 views

  • Nathaniel Persily, a law professor at Stanford University, has written about this in a forthcoming issue of the Journal of Democracy in an article with a title that sums up his concerns: “Can Democracy Survive the Internet?”
  • Persily argues that the 2016 campaign broke down previously established rules and distinctions “between insiders and outsiders, earned media and advertising, media and non-media, legacy media and new media, news and entertainment and even foreign and domestic sources of campaign communication.”
  • Clinton played by old rules; Trump did not. He recognized the potential rewards of exploiting what the Internet offered, and he conducted his campaign through unconventional means.
  • “That’s what Donald Trump realized that a lot of us didn’t,” Persily said. “That it was more important to swamp the communication environment than it was to advocate for a particular belief or fight for the truth of a particular story,”
  • Persily notes that the Internet reacted to the Trump campaign “like an ecosystem welcoming a new and foreign species. His candidacy triggered new strategies and promoted established Internet forces. Some of these (such as the ‘alt-right’) were moved by ideological affinity, while others sought to profit financially or to further a geopolitical agenda.”
  • The rise and power of the Internet has accelerated the decline of institutions that once provided a mediating force in campaigns. Neither the legacy media nor the established political parties exercise the power they once had as referees, particularly in helping to sort out the integrity of information.
  • legacy media that once helped set the agenda for political conversation now often take their cues from new media.
  • The Internet, however, involves characteristics that heighten the disruptive and damaging influences on political campaigns. One, Persily said, is the velocity of information, the speed with which news, including fake news, moves and expands and is absorbed. Viral communication can create dysfunction in campaigns and within democracies.
  • Another factor is the pervasiveness of anonymous communication, clearly greater and more odious today. Anonymity facilitates a coarsening of speech on the Internet. It has become more and more difficult to determine the sources of such information, including whether these communications are produced by real people or by automated programs known as “bots.”
  • “the prevalence of bots in spreading propaganda and fake news appears to have reached new heights. One study found that between 16 September and 21 October 2016, bots produced about a fifth of all tweets related to the upcoming election. Across all three presidential debates, pro-Trump twitter bots generated about four times as many tweets as pro-Clinton bots. During the final debate in particular, that figure rose to seven times as many.”
  • the fear of dark money and “shady outsiders” running television commercials “seems quaint when compared to networks of thousands of bots of uncertain geographic origin creating automated messages designed to malign candidates and misinform voters.”
  • When asked how worrisome all this is, Persily said, “I’m extremely concerned.” He was quick to say he did not believe government should or even could regulate this new environment. But, he said, “We need to come to grips with how the new communication environment affects people’s political beliefs, the information they receive and then the choices that they make.”
Javier E

The Science of Snobbery: How We're Duped Into Thinking Fancy Things Are Better - The Atlantic - 0 views

  • Expert judges and amateurs alike claim to judge classical musicians based on sound. But Tsay’s research suggests that the original judges, despite their experience and expertise, judged the competition (which they heard and watched live) based on visual information, just as amateurs do.
  • just like with classical music, we do not appraise wine in the way that we expect. 
  • Priceonomics revisited this seemingly damning research: the lack of correlation between wine enjoyment and price in blind tastings, the oenology students tricked by red food dye into describing a white wine like a red, a distribution of medals at tastings equivalent to what one would expect from pure chance, the grand crus described like cheap wines and vice-versa when the bottles are switched.
  • Taste does not simply equal your taste buds. It draws on information from all our senses as well as context. As a result, food is susceptible to the same trickery as wine. Adding yellow food dye to vanilla pudding leads people to experience a lemony taste. Diners eating in the dark at a chic concept restaurant confuse veal for tuna. Branding, packaging, and price tags are equally important to enjoyment. Cheap fish is routinely passed off as its pricier cousins at seafood and sushi restaurants. 
  • Just like with wine and classical music, we often judge food based on very different criteria than what we claim. The result is that our perceptions are easily skewed in ways we don’t anticipate. 
  • What does it mean for wine that presentation so easily trumps the quality imbued by being grown on premium Napa land or years of fruitful aging? Is it comforting that the same phenomenon is found in food and classical music, or is it a strike against the authenticity of our enjoyment of them as well? How common must these manipulations be until we concede that the influence of the price tag of a bottle of wine or the visual appearance of a pianist is not a trick but actually part of the quality?
  • To answer these questions, we need to investigate the underlying mechanism that leads us to judge wine, food, and music by criteria other than what we claim to value. And that mechanism seems to be the quick, intuitive judgments our minds unconsciously make
  • this unknowability also makes it easy to be led astray when our intuition makes a mistake. We may often be able to count on the price tag or packaging of food and wine for accurate information about quality. But as we believe that we’re judging based on just the product, we fail to recognize when presentation manipulates our snap judgments.
  • Participants were just as effective when watching 6 second video clips and when comparing their ratings to ratings of teacher effectiveness as measured by actual student test performance. 
  • The power of intuitive first impressions has been demonstrated in a variety of other contexts. One experiment found that people predicted the outcome of political elections remarkably well based on silent 10 second video clips of debates - significantly outperforming political pundits and predictions made based on economic indicators.
  • In a real world case, a number of art experts successfully identified a 6th century Greek statue as a fraud. Although the statue had survived a 14 month investigation by a respected museum that included the probings of a geologist, they instantly recognized something was off. They just couldn’t explain how they knew.
  • Cases like this represent the canon behind the idea of the “adaptive unconscious,” a concept made famous by journalist Malcolm Gladwell in his book Blink. The basic idea is that we constantly, quickly, and unconsciously do the equivalent of judging a book by its cover. After all, a cover provides a lot of relevant information in a world in which we don’t have time to read every page.
  • Gladwell describes the adaptive unconscious as “a kind of giant computer that quickly and quietly processes a lot of the data we need in order to keep functioning as human beings.”
  • In a famous experiment, psychologist Nalini Ambady provided participants in an academic study with 30 second silent video clips of a college professor teaching a class and asked them to rate the effectiveness of the professor.
  • In follow up experiments, Chia-Jung Tsay found that those judging musicians’ auditions based on visual cues were not giving preference to attractive performers. Rather, they seemed to look for visual signs of relevant characteristics like passion, creativity, and uniqueness. Seeing signs of passion is valuable information. But in differentiating between elite performers, it gives an edge to someone who looks passionate over someone whose play is passionate
  • Outside of these more eccentric examples, it’s our reliance on quick judgments, and ignorance of their workings, that cause people to act on ugly, unconscious biases
  • It’s also why - from a business perspective - packaging and presentation are just as important as the good or service on offer. Why marketing is just as important as product.
  • Gladwell ends Blink optimistically. By paying closer attention to our powers of rapid cognition, he argues, we can avoid its pitfalls and harness its powers. We can blindly audition musicians behind a screen, look at a piece of art devoid of other context, and pay particular attention to possible unconscious bias in our performance reports.
  • But Gladwell’s success in demonstrating how many calculations our adaptive unconscious performs without our awareness undermines his hopeful message of consciously harnessing its power.
  • As a former world-class tennis player and coach of over 50 years, Braden is a perfect example of the ideas behind thin slicing. But if he can’t figure out what his unconscious is up to when he recognizes double faults, why should anyone else expect to be up to the task?
  • flawed judgment in fields like medicine and investing has more serious consequences. The fact that expertise is so tricky leads psychologist Daniel Kahneman to assert that most experts should seek the assistance of statistics and algorithms in making decisions.
  • In his book Thinking, Fast and Slow, he describes our two modes of thought: System 1, like the adaptive unconscious, is our “fast, instinctive, and emotional” intuition. System 2 is our “slower, more deliberative, and more logical” conscious thought. Kahneman believes that we often leave decisions up to System 1 and generally place far “too much confidence in human judgment” due to the pitfalls of our intuition described above.
  • Not every judgment will be made in a field that is stable and regular enough for an algorithm to help us make judgments or predictions. But in those cases, he notes, “Hundreds of studies have shown that wherever we have sufficient information to build a model, it will perform better than most people.”
  • Experts can avoid the pitfalls of intuition more easily than laypeople. But they need help too, especially as our collective confidence in expertise leads us to overconfidence in their judgments. 
  • This article has referred to the influence of price tags and context on products and experiences like wine and classical music concerts as tricks that skew our perception. But maybe we should consider them a real, actual part of the quality.
  • Losing ourselves in a universe of relativism, however, will lead us to miss out on anything new or unique. Take the example of the song “Hey Ya!” by Outkast. When the music industry heard it, they felt sure it would be a hit. When it premiered on the radio, however, listeners changed the channel. The song sounded too dissimilar from songs people liked, so they responded negatively. 
  • It took time for people to get familiar with the song and realize that they enjoyed it. Eventually “Hey Ya!” became the hit of the summer.
  • Many boorish people talking about the ethereal qualities of great wine probably can't even identify cork taint because their impressions are dominated by the price tag and the wine label. But the classic defense of wine - that you need to study it to appreciate it - is also vindicated. The open question - which is both editorial and empirical - is what it means for the industry that constant vigilance and substantial study are needed to dependably appreciate wine for the product quality alone. But the question is relevant to the enjoyment of many other products and experiences that we enjoy in life.
  • Maybe the most important conclusion is to not only recognize the fallibility of our judgments and impressions, but to recognize when it matters, and when it doesn’t
Javier E

Have Smartphones Destroyed a Generation? - The Atlantic - 0 views

  • She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
  • The arrival of the smartphone has radically changed every aspect of teenagers’ lives, from the nature of their social interactions to their mental health. These changes have affected young people in every corner of the nation and in every type of household
  • Around 2012, I noticed abrupt shifts in teen behaviors and emotional states. The gentle slopes of the line graphs became steep mountains and sheer cliffs, and many of the distinctive characteristics of the Millennial generation began to disappear. In all my analyses of generational data—some reaching back to the 1930s—I had never seen anything like it.
  • the trends persisted, across several years and a series of national surveys. The changes weren’t just in degree, but in kind.
  • The biggest difference between the Millennials and their predecessors was in how they viewed the world; teens today differ from the Millennials not just in their views but in how they spend their time. The experiences they have every day are radically different from those of the generation that came of age just a few years before them.
  • it was exactly the moment when the proportion of Americans who owned a smartphone surpassed 50 percent.
  • theirs is a generation shaped by the smartphone and by the concomitant rise of social media. I call them iGen
  • Born between 1995 and 2012, members of this generation are growing up with smartphones, have an Instagram account before they start high school, and do not remember a time before the internet.
  • iGen’s oldest members were early adolescents when the iPhone was introduced, in 2007, and high-school students when the iPad entered the scene, in 2010. A 2017 survey of more than 5,000 American teens found that three out of four owned an iPhone.
  • I had grown accustomed to line graphs of trends that looked like modest hills and valleys. Then I began studying Athena’s generation.
  • More comfortable in their bedrooms than in a car or at a party, today’s teens are physically safer than teens have ever been. They’re markedly less likely to get into a car accident and, having less of a taste for alcohol than their predecessors, are less susceptible to drinking’s attendant ills.
  • Psychologically, however, they are more vulnerable than Millennials were: Rates of teen depression and suicide have skyrocketed since 2011. It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades. Much of this deterioration can be traced to their phones.
  • the twin rise of the smartphone and social media has caused an earthquake of a magnitude we’ve not seen in a very long time, if ever. There is compelling evidence that the devices we’ve placed in young people’s hands are having profound effects on their lives—and making them seriously unhappy.
  • But the allure of independence, so powerful to previous generations, holds less sway over today’s teens, who are less likely to leave the house without their parents. The shift is stunning: 12th-graders in 2015 were going out less often than eighth-graders did as recently as 2009.
  • Today’s teens are also less likely to date. The initial stage of courtship, which Gen Xers called “liking” (as in “Ooh, he likes you!”), kids now call “talking”—an ironic choice for a generation that prefers texting to actual conversation. After two teens have “talked” for a while, they might start dating.
  • only about 56 percent of high-school seniors in 2015 went out on dates; for Boomers and Gen Xers, the number was about 85 percent.
  • The decline in dating tracks with a decline in sexual activity. The drop is the sharpest for ninth-graders, among whom the number of sexually active teens has been cut by almost 40 percent since 1991. The average teen now has had sex for the first time by the spring of 11th grade, a full year later than the average Gen Xer
  • The teen birth rate hit an all-time low in 2016, down 67 percent since its modern peak, in 1991.
  • Nearly all Boomer high-school students had their driver’s license by the spring of their senior year; more than one in four teens today still lack one at the end of high school.
  • In conversation after conversation, teens described getting their license as something to be nagged into by their parents—a notion that would have been unthinkable to previous generations.
  • In the late 1970s, 77 percent of high-school seniors worked for pay during the school year; by the mid-2010s, only 55 percent did. The number of eighth-graders who work for pay has been cut in half.
  • Beginning with Millennials and continuing with iGen, adolescence is contracting again—but only because its onset is being delayed. Across a range of behaviors—drinking, dating, spending time unsupervised— 18-year-olds now act more like 15-year-olds used to, and 15-year-olds more like 13-year-olds. Childhood now stretches well into high school.
  • In an information economy that rewards higher education more than early work history, parents may be inclined to encourage their kids to stay home and study rather than to get a part-time job. Teens, in turn, seem to be content with this homebody arrangement—not because they’re so studious, but because their social life is lived on their phone. They don’t need to leave home to spend time with their friends.
  • eighth-, 10th-, and 12th-graders in the 2010s actually spend less time on homework than Gen X teens did in the early 1990s.
  • The time that seniors spend on activities such as student clubs and sports and exercise has changed little in recent years. Combined with the decline in working for pay, this means iGen teens have more leisure time than Gen X teens did, not less.
  • So what are they doing with all that time? They are on their phone, in their room, alone and often distressed.
  • despite spending far more time under the same roof as their parents, today’s teens can hardly be said to be closer to their mothers and fathers than their predecessors were. “I’ve seen my friends with their families—they don’t talk to them,” Athena told me. “They just say ‘Okay, okay, whatever’ while they’re on their phones. They don’t pay attention to their family.” Like her peers, Athena is an expert at tuning out her parents so she can focus on her phone.
  • The number of teens who get together with their friends nearly every day dropped by more than 40 percent from 2000 to 2015; the decline has been especially steep recently.
  • Eighth-graders who are heavy users of social media increase their risk of depression by 27 percent, while those who play sports, go to religious services, or even do homework more than the average teen cut their risk significantly.
  • The roller rink, the basketball court, the town pool, the local necking spot—they’ve all been replaced by virtual spaces accessed through apps and the web.
  • The results could not be clearer: Teens who spend more time than average on screen activities are more likely to be unhappy, and those who spend more time than average on nonscreen activities are more likely to be happy.
  • There’s not a single exception. All screen activities are linked to less happiness, and all nonscreen activities are linked to more happiness
  • Eighth-graders who spend 10 or more hours a week on social media are 56 percent more likely to say they’re unhappy than those who devote less time to social media
  • If you were going to give advice for a happy adolescence based on this survey, it would be straightforward: Put down the phone, turn off the laptop, and do something—anything—that does not involve a screen
  • Social-networking sites like Facebook promise to connect us to friends. But the portrait of iGen teens emerging from the data is one of a lonely, dislocated generation. Teens who visit social-networking sites every day but see their friends in person less frequently are the most likely to agree with the statements “A lot of times I feel lonely,” “I often feel left out of things,” and “I often wish I had more good friends.” Teens’ feelings of loneliness spiked in 2013 and have remained high since.
  • This doesn’t always mean that, on an individual level, kids who spend more time online are lonelier than kids who spend less time online.
  • Teens who spend more time on social media also spend more time with their friends in person, on average—highly social teens are more social in both venues, and less social teens are less so.
  • The more time teens spend looking at screens, the more likely they are to report symptoms of depression.
  • It’s not only a matter of fewer kids partying; fewer kids are spending time simply hanging out
  • Teens who spend three hours a day or more on electronic devices are 35 percent more likely to have a risk factor for suicide, such as making a suicide plan. (That’s much more than the risk related to, say, watching TV.)
  • Since 2007, the homicide rate among teens has declined, but the suicide rate has increased. As teens have started spending less time together, they have become less likely to kill one another, and more likely to kill themselves. In 2011, for the first time in 24 years, the teen suicide rate was higher than the teen homicide rate.
  • For all their power to link kids day and night, social media also exacerbate the age-old teen concern about being left out.
  • Today’s teens may go to fewer parties and spend less time together in person, but when they do congregate, they document their hangouts relentlessly—on Snapchat, Instagram, Facebook. Those not invited to come along are keenly aware of it. Accordingly, the number of teens who feel left out has reached all-time highs across age groups.
  • Forty-eight percent more girls said they often felt left out in 2015 than in 2010, compared with 27 percent more boys. Girls use social media more often, giving them additional opportunities to feel excluded and lonely when they see their friends or classmates getting together without them.
  • Social media levy a psychic tax on the teen doing the posting as well, as she anxiously awaits the affirmation of comments and likes. When Athena posts pictures to Instagram, she told me, “I’m nervous about what people think and are going to say. It sometimes bugs me when I don’t get a certain amount of likes on a picture.”
  • Girls have also borne the brunt of the rise in depressive symptoms among today’s teens. Boys’ depressive symptoms increased by 21 percent from 2012 to 2015, while girls’ increased by 50 percent—more than twice as much
  • The rise in suicide, too, is more pronounced among girls. Although the rate increased for both sexes, three times as many 12-to-14-year-old girls killed themselves in 2015 as in 2007, compared with twice as many boys
  • Social media give middle- and high-school girls a platform on which to carry out the style of aggression they favor, ostracizing and excluding other girls around the clock.
  • I asked my undergraduate students at San Diego State University what they do with their phone while they sleep. Their answers were a profile in obsession. Nearly all slept with their phone, putting it under their pillow, on the mattress, or at the very least within arm’s reach of the bed. They checked social media right before they went to sleep, and reached for their phone as soon as they woke up in the morning
  • the smartphone is cutting into teens’ sleep: Many now sleep less than seven hours most nights. Sleep experts say that teens should get about nine hours of sleep a night; a teen who is getting less than seven hours a night is significantly sleep deprived
  • Fifty-seven percent more teens were sleep deprived in 2015 than in 1991. In just the four years from 2012 to 2015, 22 percent more teens failed to get seven hours of sleep.
  • Two national surveys show that teens who spend three or more hours a day on electronic devices are 28 percent more likely to get less than seven hours of sleep than those who spend fewer than three hours, and teens who visit social-media sites every day are 19 percent more likely to be sleep deprived.
  • Teens who read books and magazines more often than the average are actually slightly less likely to be sleep deprived—either reading lulls them to sleep, or they can put the book down at bedtime.
  • Sleep deprivation is linked to myriad issues, including compromised thinking and reasoning, susceptibility to illness, weight gain, and high blood pressure. It also affects mood: People who don’t sleep enough are prone to depression and anxiety.
  • correlations between depression and smartphone use are strong enough to suggest that more parents should be telling their kids to put down their phone.
  • What’s at stake isn’t just how kids experience adolescence. The constant presence of smartphones is likely to affect them well into adulthood. Among people who suffer an episode of depression, at least half become depressed again later in life. Adolescence is a key time for developing social skills; as teens spend less time with their friends face-to-face, they have fewer opportunities to practice them
  • Significant effects on both mental health and sleep time appear after two or more hours a day on electronic devices. The average teen spends about two and a half hours a day on electronic devices. Some mild boundary-setting could keep kids from falling into harmful habits.
Javier E

In Defense of Facts - The Atlantic - 1 views

  • over 13 years, he has published a series of anthologies—of the contemporary American essay, of the world essay, and now of the historical American essay—that misrepresents what the essay is and does, that falsifies its history, and that contains, among its numerous selections, very little one would reasonably classify within the genre. And all of this to wide attention and substantial acclaim
  • D’Agata’s rationale for his “new history,” to the extent that one can piece it together from the headnotes that preface each selection, goes something like this. The conventional essay, nonfiction as it is, is nothing more than a delivery system for facts. The genre, as a consequence, has suffered from a chronic lack of critical esteem, and thus of popular attention. The true essay, however, deals not in knowing but in “unknowing”: in uncertainty, imagination, rumination; in wandering and wondering; in openness and inconclusion
  • Every piece of this is false in one way or another.
  • There are genres whose principal business is fact—journalism, history, popular science—but the essay has never been one of them. If the form possesses a defining characteristic, it is that the essay makes an argument
  • That argument can rest on fact, but it can also rest on anecdote, or introspection, or cultural interpretation, or some combination of all these and more
  • what makes a personal essay an essay and not just an autobiographical narrative is precisely that it uses personal material to develop, however speculatively or intuitively, a larger conclusion.
  • Nonfiction is the source of the narcissistic injury that seems to drive him. “Nonfiction,” he suggests, is like saying “not art,” and if D’Agata, who has himself published several volumes of what he refers to as essays, desires a single thing above all, it is to be known as a maker of art.
  • D’Agata tells us that the term has been in use since about 1950. In fact, it was coined in 1867 by the staff of the Boston Public Library and entered widespread circulation after the turn of the 20th century. The concept’s birth and growth, in other words, did coincide with the rise of the novel to literary preeminence, and nonfiction did long carry an odor of disesteem. But that began to change at least as long ago as the 1960s, with the New Journalism and the “nonfiction novel.”
  • What we really seem to get in D’Agata’s trilogy, in other words, is a compendium of writing that the man himself just happens to like, or that he wants to appropriate as a lineage for his own work.
  • What it’s like is abysmal: partial to trivial formal experimentation, hackneyed artistic rebellion, opaque expressions of private meaning, and modish political posturing
  • If I bought a bag of chickpeas and opened it to find that it contained some chickpeas, some green peas, some pebbles, and some bits of goat poop, I would take it back to the store. And if the shopkeeper said, “Well, they’re ‘lyric’ chickpeas,” I would be entitled to say, “You should’ve told me that before I bought them.”
  • when he isn’t cooking quotes or otherwise fudging the record, he is simply indifferent to issues of factual accuracy, content to rely on a mixture of guesswork, hearsay, and his own rather faulty memory.
  • His rejoinders are more commonly a lot more hostile—not to mention juvenile (“Wow, Jim, your penis must be so much bigger than mine”), defensive, and in their overarching logic, deeply specious. He’s not a journalist, he insists; he’s an essayist. He isn’t dealing in anything as mundane as the facts; he’s dealing in “art, dickhead,” in “poetry,” and there are no rules in art.
  • D’Agata replies that there is something between history and fiction. “We all believe in emotional truths that could never hold water, but we still cling to them and insist on their relevance.” The “emotional truths” here, of course, are D’Agata’s, not Presley’s. If it feels right to say that tae kwon do was invented in ancient India (not modern Korea, as Fingal discovers it was), then that is when it was invented. The term for this is truthiness.
  • D’Agata clearly wants to have it both ways. He wants the imaginative freedom of fiction without relinquishing the credibility (and for some readers, the significance) of nonfiction. He has his fingers crossed, and he’s holding them behind his back. “John’s a different kind of writer,” an editor explains to Fingal early in the book. Indeed he is. But the word for such a writer isn’t essayist. It’s liar.
  • The point of all this nonsense, and a great deal more just like it, is to advance an argument about the essay and its history. The form, D’Agata’s story seems to go, was neglected during the long ages that worshiped “information” but slowly emerged during the 19th and 20th centuries as artists learned to defy convention and untrammel their imaginations, coming fully into its own over the past several decades with the dawning recognition of the illusory nature of knowledge.
  • Most delectable is when he speaks about “the essay’s traditional ‘five-paragraph’ form.” I almost fell off my chair when I got to that one. The five-paragraph essay—introduction, three body paragraphs, conclusion; stultifying, formulaic, repetitive—is the province of high-school English teachers. I have never met one outside of a classroom, and like any decent college writing instructor, I never failed to try to wean my students away from them. The five-paragraph essay isn’t an essay; it’s a paper.
  • What he fails to understand is that facts and the essay are not antagonists but siblings, offspring of the same historical moment
  • —by ignoring the actual contexts of his selections, and thus their actual intentions—D’Agata makes the familiar contemporary move of imposing his own conceits and concerns upon the past. That is how ethnography turns into “song,” Socrates into an essayist, and the whole of literary history into a single man’s “emotional truth.”
  • The history of the essay is indeed intertwined with “facts,” but in a very different way than D’Agata imagines. D’Agata’s mind is Manichaean. Facts bad, imagination good
  • When he refers to his selections as essays, he does more than falsify the essay as a genre. He also effaces all the genres that they do belong to: not only poetry, fiction, journalism, and travel, but, among his older choices, history, parable, satire, the sermon, and more—genres that possess their own particular traditions, conventions, and expectations.
  • one needs to recognize that facts themselves have a history.
  • Facts are not just any sort of knowledge, such as also existed in the ancient and medieval worlds. A fact is a unit of information that has been established through uniquely modern methods
  • Fact, etymologically, means “something done”—that is, an act or deed
  • It was only in the 16th century—an age that saw the dawning of a new empirical spirit, one that would issue not only in modern science, but also in modern historiography, journalism, and scholarship—that the word began to signify our current sense of “real state of things.”
  • It was at this exact time, and in this exact spirit, that the essay was born. What distinguished Montaigne’s new form—his “essays” or attempts to discover and publish the truth about himself—was not that it was personal (precursors like Seneca also wrote personally), but that it was scrupulously investigative. Montaigne was conducting research into his soul, and he was determined to get it right.
  • His famous motto, Que sais-je?—“What do I know?”—was an expression not of radical doubt but of the kind of skepticism that fueled the modern revolution in knowledge.
  • It is no coincidence that the first English essayist, Galileo’s contemporary Francis Bacon, was also the first great theorist of science.
  • That knowledge is problematic—difficult to establish, labile once created, often imprecise and always subject to the limitations of the human mind—is not the discovery of postmodernism. It is a foundational insight of the age of science, of fact and information, itself.
  • The point is not that facts do not exist, but that they are unstable (and are becoming more so as the pace of science quickens). Knowledge is always an attempt. Every fact was established by an argument—by observation and interpretation—and is susceptible to being overturned by a different one
  • A fact, you might say, is nothing more than a frozen argument, the place where a given line of investigation has come temporarily to rest.
  • Sometimes those arguments are scientific papers. Sometimes they are news reports, which are arguments with everything except the conclusions left out (the legwork, the notes, the triangulation of sources—the research and the reasoning).
  • When it comes to essays, though, we don’t refer to those conclusions as facts. We refer to them as wisdom, or ideas
  • the essay draws its strength not from separating reason and imagination but from putting them in conversation. A good essay moves fluidly between thought and feeling. It subjects the personal to the rigors of the intellect and the discipline of external reality. The truths it finds are more than just emotional.
kushnerha

BBC - Future - Will emoji become a new language? - 2 views

  • Emoji are now used in around half of every sentence on sites like Instagram, and Facebook looks set to introduce them alongside the famous “like” button as a way of expressing your reaction to a post.
  • If you were to believe the headlines, this is just the tipping point: some outlets have claimed that emoji are an emerging language that could soon compete with English in global usage. To many, this would be an exciting evolution of the way we communicate; to others, it is linguistic Armageddon.
  • Do emoji show the same characteristics of other communicative systems and actual languages? And what do they help us to express that words alone can’t say?When emoji appear with text, they often supplement or enhance the writing. This is similar to gestures that appear along with speech. Over the past three decades, research has shown that our hands provide important information that often transcends and clarifies the message in speech. Emoji serve this function too – for instance, adding a kissy or winking face can disambiguate whether a statement is flirtatiously teasing or just plain mean.
  • This is a key point about language use: rarely is natural language ever limited to speech alone. When we are speaking, we constantly use gestures to illustrate what we mean. For this reason, linguists say that language is “multi-modal”. Writing takes away that extra non-verbal information, but emoji may allow us to re-incorporate it into our text.
  • Emoji are not always used as embellishments, however – sometimes, strings of the characters can themselves convey meaning in a longer sequence on their own. But to constitute their own language, they would need a key component: grammar.
  • A grammatical system is a set of constraints that governs how the meaning of an utterance is packaged in a coherent way. Natural language grammars have certain traits that distinguish them. For one, they have individual units that play different roles in the sequence – like nouns and verbs in a sentence. Also, grammar is different from meaning
  • When emoji are isolated, they are primarily governed by simple rules related to meaning alone, without these more complex rules. For instance, according to research by Tyler Schnoebelen, people often create strings of emoji that share a common meaning
  • This sequence has little internal structure; even when it is rearranged, it still conveys the same message. These images are connected solely by their broader meaning. We might consider them to be a visual list: “here are all things related to celebrations and birthdays.” Lists are certainly a conventionalised way of communicating, but they don’t have grammar the way that sentences do.
  • What if the order did matter though? What if they conveyed a temporal sequence of events? Consider this example, which means something like “a woman had a party where they drank, and then opened presents and then had cake”:
  • In all cases, the doer of the action (the agent) precedes the action. In fact, this pattern is commonly found in both full languages and simple communication systems. For example, the majority of the world’s languages place the subject before the verb of a sentence.
  • These rules may seem like the seeds of grammar, but psycholinguist Susan Goldin-Meadow and colleagues have found this order appears in many other systems that would not be considered a language. For example, this order appears when people arrange pictures to describe events from an animated cartoon, or when speaking adults communicate using only gestures. It also appears in the gesture systems created by deaf children who cannot hear spoken languages and are not exposed to sign languages.
  • describes the children as lacking exposure to a language and thus inventing their own manual systems to communicate, called “homesigns”. These systems are limited in the size of their vocabularies and the types of sequences they can create. For this reason, the agent-act order seems not to be due to a grammar, but to basic heuristics – practical workarounds – based on meaning alone. Emoji seem to tap into this same system.
  • Nevertheless, some may argue that despite emoji’s current simplicity, this may be the groundwork for emerging complexity – that although emoji do not constitute a language at the present time, they could develop into one over time.
  • Could an emerging “emoji visual language” be developing in a similar way, with actual grammatical structure? To answer that question, you need to consider the intrinsic constraints on the technology itself.Emoji are created by typing into a computer like text. But, unlike text, most emoji are provided as whole units, except for the limited set of emoticons which convert to emoji, like :) or ;). When writing text, we use the building blocks (letters) to create the units (words), not by searching through a list of every whole word in the language.
  • emoji force us to convey information in a linear unit-unit string, which limits how complex expressions can be made. These constraints may mean that they will never be able to achieve even the most basic complexity that we can create with normal and natural drawings.
  • What’s more, these limits also prevent users from creating novel signs – a requisite for all languages, especially emerging ones. Users have no control over the development of the vocabulary. As the “vocab list” for emoji grows, it will become increasingly unwieldy: using them will require a conscious search process through an external list, not an easy generation from our own mental vocabulary, like the way we naturally speak or draw. This is a key point – it means that emoji lack the flexibility needed to create a new language.
  • we already have very robust visual languages, as can be seen in comics and graphic novels. As I argue in my book, The Visual Language of Comics, the drawings found in comics use a systematic visual vocabulary (such as stink lines to represent smell, or stars to represent dizziness). Importantly, the available vocabulary is not constrained by technology and has developed naturally over time, like spoken and written languages.
  • grammar of sequential images is more of a narrative structure – not of nouns and verbs. Yet, these sequences use principles of combination like any other grammar, including roles played by images, groupings of images, and hierarchic embedding.
  • measured participants’ brainwaves while they viewed sequences one image at a time where a disruption appeared either within the groupings of panels or at the natural break between groupings. The particular brainwave responses that we observed were similar to those that experimenters find when violating the syntax of sentences. That is, the brain responds the same way to violations of “grammar”, whether in sentences or sequential narrative images.
  • I would hypothesise that emoji can use a basic narrative structure to organise short stories (likely made up of agent-action sequences), but I highly doubt that they would be able to create embedded clauses like these. I would also doubt that you would see the same kinds of brain responses that we saw with the comic strip sequences.
clairemann

Raising the minimum wage is a health issue, too - 1 views

  • Congress just missed one of its best shots at improving health when the Senate failed to advance a bill that would have raised the minimum wage to US$15 an hour. Study after study has linked higher income to better health.
  • With that job, you’ll likely make more visits to primary care doctors, dentists and specialists who work in preventive care.
  • An inadequate income does none of these things. Instead, it increases susceptibility to psychological stress, malaise, illness and disease. This is one reason those who move off welfare benefits and gain employment improve their well-being.
  • Numerous studies show employment is linked to self-esteem, purpose and identity. It provides relationships, social connections, social status and regular productive activity; a job is an integral part of a person’s identity.
  • One study found that people with a disability who were employed were less likely to have frequent mental distress, including anxiety and depression, than those with a disability who were not employed (18% vs. 40%). This finding held up even when accounting for demographics and individual characteristics.
  • The average unemployment benefit is $320 weekly; the amount varies by state. The American Rescue Plan, recently passed to provide economic aid to millions of Americans hit hard by the pandemic, adds an additional $300 to unemployment benefits through Sept. 6.
  • Compare that to the current federal minimum wage: $7.25 an hour. That’s $290 for a 40-hour week, less than what unemployment benefits pay. That means, for millions of Americans, being employed means less income.
  • Why not increase the minimum wage – at least enough to make it more than unemployment benefits? That way, more people would be motivated to seek jobs.
  • That said, people who are fit to work should be encouraged to seek, not shun, employment. With unemployment benefits more than the basic minimum wage in many states, we are sending the wrong message to millions. There’s more to a higher minimum wage than just more money. It also means more happiness, better health and a longer life.
katedriscoll

What are Cognitive Biases? | Interaction Design Foundation (IxDF) - 0 views

  • Cognitive bias is an umbrella term that refers to the systematic ways in which the context and framing of information influence individuals’ judgment and decision-making. There are many kinds of cognitive biases that influence individuals differently, but their common characteristic is that—in step with human individuality—they lead to judgment and decision-making that deviates from rational objectivity.
  • In some cases, cognitive biases make our thinking and decision-making faster and more efficient. The reason is that we do not stop to consider all available information, as our thoughts proceed down some channels instead of others. In other cases, however, cognitive biases can lead to errors for exactly the same reason. An example is confirmation bias, where we tend to favor information that reinforces or confirms our pre-existing beliefs. For instance, if we believe that planes are dangerous, a handful of stories about plane crashes tend to be more memorable than millions of stories about safe, successful flights. Thus, the prospect of air travel equates to an avoidable risk of doom for a person inclined to think in this way, regardless of how much time has passed without news of an air catastrophe.
katedriscoll

Confirmation bias in the utilization of others' opinion strength | Nature Neuroscience - 0 views

  • Humans tend to discount information that undermines past choices and judgments. This confirmation bias has significant impact on domains ranging from politics to science and education. Little is known about the mechanisms underlying this fundamental characteristic of belief formation. Here we report a mechanism underlying the confirmation bias. Specifically, we provide evidence for a failure to use the strength of others’ disconfirming opinions to alter confidence in judgments, but adequate use when opinions are confirmatory. This bias is related to reduced neural sensitivity to the strength of others’ opinions in the posterior medial prefrontal cortex when opinions are disconfirming. Our results demonstrate that existing judgments alter the neural representation of information strength, leaving the individual less likely to alter opinions in the face of disagreement.
ilanaprincilus06

The 'Time Has Come' For A Global Pandemic Treaty, WHO's Tedros Says : Coronavirus Updates : NPR - 0 views

  • The COVID-19 pandemic proves that the world needs a pandemic treaty, says WHO Director-General Tedros Adhanom Ghebreyesus.
  • It's the one major change, Tedros said, that would do the most to boost global health security and also empower the World Health Organization.
  • More than two dozen world leaders said in March that they support an international treaty or framework on pandemic preparedness and response, signing a letter whose signatories notably did not include the U.S., China or Russia.
  • "The United States was one of the countries that supported the resolution to hold the special session," the WHO said Monday in response to an NPR inquiry. "That is not to say it has committed to support the treaty yet, as the process of moving forward was only confirmed today."
  • "The safety of the world's people cannot rely solely on the goodwill of governments."
  • A treaty would make countries more accountable to one another, he said.
  • The lack of sharing — of information, technology, resources and data — is the COVID-19 pandemic's defining characteristic, the WHO leader said.
  • "a monumental error for any country to think the danger has passed."
  • Tedros' remarks echoed the frustrations he raised last year, when he said the pandemic was presenting humanity with a test — one that we are failing.
  • "Are we unable to distinguish or identify the common enemy?"
caelengrubb

What Is A Paradigm Shift, Anyway? : 13.7: Cosmos And Culture : NPR - 0 views

  • Thomas Kuhn, the well-known physicist, philosopher and historian of science, was born 94 years ago today. He went on to become an important and broad-ranging thinker, and one of the most influential philosophers of the 20th century.
  • The Structure of Scientific Revolutions transformed the philosophy of science and changed the way many scientists think about their work. But his influence extended well beyond the academy: The book was widely read — and seeped into popular culture
  • One measure of his influence is the widespread use of the term "paradigm shift," which he introduced in articulating his views about how science changes over time.
  • Talk of paradigms and paradigm shifts has since become commonplace — not only in science, but also in business, social movements and beyond.
  • He suggested that scientific revolutions are not a matter of incremental advance; they involve "paradigm shifts."
  • Kuhn posited two kinds of scientific change: incremental developments in the course of what he called "normal science," and scientific revolutions that punctuate these more stable periods.
  • But what, exactly, is a paradigm shift? Or, for that matter, a paradigm?
  • Accordingly, a paradigm shift is defined as "an important change that happens when the usual way of thinking about or doing something is replaced by a new and different way."
  • It turns out this question is hard to answer — not because paradigm has an especially technical or obscure definition, but because it has many. In a paper published in 1970, Margaret Masterman presented a careful reading of Kuhn's 1962 book. She identified 21 distinct senses in which Kuhn used the term paradigm.
  • First, a paradigm could refer to a special kind of achievement
  • "Achievements that share these two characteristics I shall henceforth refer to as 'paradigms.' "
  • But in other parts of the text, paradigms cover more ground. Paradigms can offer general epistemological viewpoints, like the "philosophical paradigm initiated by Descartes," or define a broad sweep of reality, as when "Paradigms determine large areas of experience at the same time."
  • In the end, Masterman distills Kuhn's 21 senses of paradigm into a more respectable three, and she identifies what she sees as both novel and important aspects of Kuhn's "paradigm view" of science. But for our purposes, Masterman's analysis sheds light on two questions that turn out to be related: what Kuhn meant by paradigm in the first place, and how a single word managed to assume such a broad and expansive set of meanings after being unleashed by Kuhn's book.
caelengrubb

Why language might be the optimal self-regulating system | Aeon Essays - 0 views

  • Language changes all the time. Some changes really are chaotic, and disruptive.
  • Descriptivists – that is, virtually all academic linguists – will point out that semantic creep is how languages work. It’s just something words do: look up virtually any nontechnical word in the great historical Oxford English Dictionary (OED), which lists a word’s senses in historical order
  • here is another fact to bear in mind: no language has fallen apart from lack of care
  • Prescriptivists cannot point to a single language that became unusable or inexpressive as a result of people’s failure to uphold traditional vocabulary and grammar. Every language existing today is fantastically expressive
  • Nonetheless, despite potential harm done by an individual word’s change in meaning, cultures tend to have all the words they need for all the things they want to talk about.
  • Every language has a characteristic inventory of contrasting sounds, called phonemes.
  • The answer is that language is a system. Sounds, words and grammar do not exist in isolation: each of these three levels of language constitutes a system in itself.
  • During the Great Vowel Shift, ee and oo started to move towards the sounds they have today. Nobody knows why
  • Words also weaken with frequent use
  • At the level of grammar, change might seem the most unsettling, threatening a deeper kind of harm than a simple mispronunciation or new use for an old word
  • what are the objects without those crucial case endings? The answer is boring: word order
  • Language is self-regulating. It’s a genius system – with no genius
caelengrubb

The Linguistic Evolution of 'Like' - The Atlantic - 0 views

  • In our mouths or in print, in villages or in cities, in buildings or in caves, a language doesn’t sit still. It can’t. Language change has proceeded apace even in places known for preserving a language in amber
  • Because we think of like as meaning “akin to” or “similar to,” kids decorating every sentence or two with it seems like overuse. After all, how often should a coherently minded person need to note that something is similar to something rather than just being that something?
  • First, let’s take like in just its traditional, accepted forms. Even in its dictionary definition, like is the product of stark changes in meaning that no one would ever guess.
  • To an Old English speaker, the word that later became like was the word for, of all things, “body.”
  • The word was lic, and lic was part of a word, gelic, that meant “with the body,” as in “with the body of,” which was a way of saying “similar to”—as in like
  • It was just that, step by step, the syllable lic, which to an Old English speaker meant “body,” came to mean, when uttered by people centuries later, “similar to”—and life went on.
  • Like has become a piece of grammar: It is the source of the suffix -ly.
  • Like has become a part of compounds. Likewise began as like plus a word, wise, which was different from the one meaning “smart when either a child or getting old.”
  • Dictionaries tell us it’s pronounced “like-MINE-did,” but I, for one, say “LIKE- minded” and have heard many others do so
  • Therefore, like is ever so much more than some isolated thing clinically described in a dictionary with a definition like “(preposition) ‘having the same characteristics or qualities as; similar to.’”
  • What we are seeing in like’s transformations today are just the latest chapters in a story that began with an ancient word that was supposed to mean “body.”
  • It’s under this view of language—as something becoming rather than being, a film rather than a photo, in motion rather than at rest—that we should consider the way young people use (drum roll, please) like
  • The new like, then, is associated with hesitation.
  • So today’s like did not spring mysteriously from a crowd on the margins of unusual mind-set and then somehow jump the rails from them into the general population.
  • The problem with the hesitation analysis is that this was a thoroughly confident speaker.
  • It’s real-life usage of this kind—to linguists it is data, just like climate patterns are to meteorologists—that suggests that the idea of like as the linguistic equivalent to slumped shoulders is off.
  • Understandably so, of course—the meaning of like suggests that people are claiming that everything is “like” itself rather than itself.
  • The new like acknowledges unspoken objection while underlining one’s own point (the factuality). Like grandparents translates here as “There were, despite what you might think, actually grandparents.”
  • Then there is a second new like, which is closer to what people tend to think of all its new uses: it is indeed a hedge.
  • Then, the two likes I have mentioned must be distinguished from yet a third usage, the quotative like—as in “And she was like, ‘I didn’t even invite him.’
  • This is yet another way that like has become grammar. The meaning “similar to” is as natural a source here as it was for -ly: mimicking people’s utterances is talking similarly to, as in “like,” them.
  • Thus the modern American English speaker has mastered not just two, but actually three different new usages of like.
Javier E

We should know by now that progress isn't guaranteed - and often backfires - The Washin... - 1 views

  • We assume that progress is the natural order of things. Problems are meant to be solved. History is an upward curve of well-being. But what if all this is a fantasy?
  • our most powerful disruptions shared one characteristic: They were not widely foreseen
  • This was true of the terrorism of 9/11; the financial crisis of 2008-2009 and the parallel Great Recession; and now the coronavirus pandemic
  • ...13 more annotations...
  • In each case, there was a failure of imagination, as Tom Friedman has noted. Warnings found little receptiveness among the public or government officials. We didn’t think what happened could happen. The presumption of progress bred complacency.
  • We fooled ourselves into thinking we had engineered permanent improvements in our social and economic systems.
  • To be fair, progress as it’s commonly understood — higher living standards — has not been at a standstill. Many advances have made life better
  • Similar inconsistencies and ambiguities attach to economic growth. It raises some up and pushes others down.
  • What we should have learned by now is that progress is often grudging, incomplete or contradictory.
  • Still, the setbacks loom ever larger. Our governmental debt is high, and economic stability is low. Many of the claims of progress turn out to be exaggerated, superficial, delusional or unattainable,
  • Sure, the Internet enables marvelous things. But it also imposes huge costs on society
  • Global warming is another example. It is largely a result of the burning of fossil fuels, which has been the engine of our progress. Now, it is anti-progress.
  • the lesson of both economic growth and technologies is that they are double-edged swords and must be judged as such.
  • What connects these various problems is the belief that the future can be orchestrated.
  • The reality is that our control over the future is modest at best, nonexistent at worst. We react more to events than lead them.
  • We worship at the altar of progress without adequately acknowledging its limits.
  • it does mean that we should be more candid about what is possible. If not, we might yet again wander over the “border between reality and impossibility.”
ilanaprincilus06

Meet the neuroscientist shattering the myth of the gendered brain | Science | The Guardian - 0 views

  • Whatever its sex, this baby’s future is predetermined by the entrenched belief that males and females do all kinds of things differently, better or worse, because they have different brains.
  • how vital it is, how life-changing, that we finally unpack – and discard – the sexist stereotypes and binary coding that limit and harm us.
  • she is out in the world, debunking the “pernicious” sex differences myth: the idea that you can “sex” a brain or that there is such a thing as a male brain and a female brain.
  • ...18 more annotations...
  • since the 18th century “when people were happy to spout off about what men and women’s brains were like – before you could even look at them. They came up with these nice ideas and metaphors that fitted the status quo and society, and gave rise to different education for men and women.”
  • she couldn’t find any beyond the negligible, and other research was also starting to question the very existence of such differences. For example, once any differences in brain size were accounted for, “well-known” sex differences in key structures disappeared.
  • Are there any significant differences based on sex alone? The answer, she says, is no.
  • “The idea of the male brain and the female brain suggests that each is a characteristically homogenous thing and that whoever has got a male brain, say, will have the same kind of aptitudes, preferences and personalities as everyone else with that ‘type’ of brain. We now know that is not the case.
  • ‘Forget the male and female brain; it’s a distraction, it’s inaccurate.’ It’s possibly harmful, too, because it’s used as a hook to say, well, there’s no point girls doing science because they haven’t got a science brain, or boys shouldn’t be emotional or should want to lead.”
  • The next question was, what then is driving the differences in behaviour between girls and boys, men and women?
  • “that the brain is moulded from birth onwards and continues to be moulded through to the ‘cognitive cliff’ in old age when our grey cells start disappearing.
  • the brain is much more a function of experiences. If you learn a skill your brain will change, and it will carry on changing.”
  • The brain is also predictive and forward-thinking in a way we had never previously realised.
  • The rules will change how the brain works and how someone behaves.” The upshot of gendered rules? “The ‘gender gap’ becomes a self-fulfilling prophecy.”
  • The brain is a biological organ. Sex is a biological factor. But it is not the sole factor; it intersects with so many variables.”
  • Letting go of age-old certainties is frightening, concedes Rippon, who is both optimistic about the future, and fearful for it.
  • On the plus side, our plastic brains are good learners. All we need to do is change the life lessons.
  • One major breakthrough in recent years has been the realisation that, even in adulthood, our brains are continually being changed, not just by the education we receive, but also by the jobs we do, the hobbies we have, the sports we play.
  • Once we acknowledge that our brains are plastic and mouldable, then the power of gender stereotypes becomes evident.
  • Beliefs about sex differences (even if ill-founded) inform stereotypes, which commonly provide just two labels – girl or boy, female or male – which, in turn, historically carry with them huge amounts of “contents assured” information and save us having to judge each individual on their own merits
  • With input from exciting breakthroughs in neuroscience, the neat, binary distinctiveness of these labels is being challenged – we are coming to realise that nature is inextricably entangled with nurture.
  • The 21st century is not just challenging the old answers – it is challenging the question itself.
johnsonel7

Psychology's Bias Toward Rich Western Societies Limits Findings - 0 views

  • In the field of psychology, the image is canon: a child sitting in front of a marshmallow, resisting the temptation to eat it. If she musters up the willpower to resist long enough, she’ll be rewarded when the experimenter returns with a second marshmallow. Using this “marshmallow test,” the Austrian-born psychologist Walter Mischel demonstrated that children who could resist immediate gratification and wait for a second marshmallow went on to greater achievements in life. They did better in school, had better SAT scores, and even managed their stress more skillfully.
  • People reasoned from these studies of the 1970s and ’80s that there must be some deep individual characteristic, some personality feature, that set kids up for higher achievements throughout life. But what if that wasn’t the right conclusion to draw from these studies? What if patience, and maybe other personality features too, are more a product of where we are than who we are?
  • The other challenge concerns whom psychologists have been studying for the past century. While scholars know a fair amount about how traits develop, that knowledge derives from research on a very specific and peculiar subset of humans: those living in industrialized societies.
  • ...2 more annotations...
  • For uncertainty, they got to choose between a safe bag that always paid out one candy or a risky bag that gave them only a one-in-six chance of more candy. We found lots of variation, especially between the Shuar and the three other communities. Children in the U.S., Argentina, and India behaved similarly, tending to be more patient and more tolerant of uncertainty, while the Shuar showed a very different pattern of behavior. They were more impatient, and warier of uncertainty; they almost never picked the risky bag.
  • In a follow-up study the next year, we looked within Shuar communities and found the same patterns. Shuar kids living near the cities acted more like Americans than the Shuar kids in the rainforest. Something about living near cities — and perhaps something about industrialization more broadly — seemed to be shaping kids’ behavior
johnsonel7

Baidu has a new trick for teaching AI the meaning of language - MIT Technology Review - 0 views

  • Earlier this month, a Chinese tech giant quietly dethroned Microsoft and Google in an ongoing competition in AI. The company was Baidu, China’s closest equivalent to Google, and the competition was the General Language Understanding Evaluation, otherwise known as GLUE.
  • GLUE is a widely accepted benchmark for how well an AI system understands human language. It consists of nine different tests for things like picking out the names of people and organizations in a sentence and figuring out what a pronoun like “it” refers to when there are multiple potential antecedents. A language model that scores highly on GLUE, therefore, can handle diverse reading comprehension tasks. Out of a full score of 100, the average person scores around 87 points. Baidu is now the first team to surpass 90 with its model, ERNIE.
  • BERT, by contrast, considers the context before and after a word all at once, making it bidirectional. It does this using a technique known as “masking.” In a given passage of text, BERT randomly hides 15% of the words and then tries to predict them from the remaining ones. This allows it to make more accurate predictions because it has twice as many cues to work from. (A toy sketch of this masking step appears after this item's annotations.)
  • ...3 more annotations...
  • When Baidu researchers began developing their own language model, they wanted to build on the masking technique. But they realized they needed to tweak it to accommodate the Chinese language. In English, the word serves as the semantic unit—meaning a word pulled completely out of context still contains meaning. The same cannot be said for characters in Chinese.
  • It considers the ordering of sentences and the distances between them, for example, to understand the logical progression of a paragraph. Most important, however, it uses a method called continuous training that allows it to train on new data and new tasks without forgetting those it learned before. This allows it to get better and better at performing a broad range of tasks over time with minimal human interference.
  • “When we first started this work, we were thinking specifically about certain characteristics of the Chinese language,” says Hao Tian, the chief architect of Baidu Research. “But we quickly discovered that it was applicable beyond that.”
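A minimal Python sketch of the masking step described above; this is an illustration only, not Baidu's or Google's actual training code, and the function and token names are assumptions. The point is simply that a random ~15% of tokens are hidden so a model can be asked to recover them from the words on both sides:

```python
# Toy BERT-style masking (illustrative assumption, not the real BERT/ERNIE code):
# hide ~15% of tokens at random and record which positions were hidden, so a
# model could be trained to predict the original words from the remaining ones.
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=None):
    """Return a masked copy of `tokens` plus the positions that were hidden."""
    rng = random.Random(seed)
    masked = list(tokens)
    hidden = []
    for i in range(len(tokens)):
        if rng.random() < mask_prob:
            masked[i] = MASK_TOKEN
            hidden.append(i)
    return masked, hidden

sentence = "the cat sat on the mat because it was tired".split()
masked, hidden = mask_tokens(sentence, seed=7)
print(masked)  # some words replaced by [MASK]
print(hidden)  # positions the model would have to recover
```

Because the unmasked words sit on both sides of each gap, a model trained this way uses context before and after a word at once, which is the "bidirectional" property the excerpt describes.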
katherineharron

CES 2020: Toyota is building a 'smart' city to test AI, robots and self-driving cars - ... - 0 views

  • Carmaker Toyota has unveiled plans for a 2,000-person "city of the future," where it will test autonomous vehicles, smart technology and robot-assisted living.
  • "With people buildings and vehicles all connected and communicating with each other through data and sensors, we will be able to test AI technology, in both the virtual and the physical world, maximizing its potential," he said on stage during Tuesday's unveiling. "We want to turn artificial intelligence into intelligence amplified."
  • The project is a collaboration between the Japanese carmaker and Danish architecture firm Bjarke Ingels Group (BIG), which designed the city's master plan. Buildings on the site will be made primarily from wood, and partly constructed using robotics. But the designs also look to Japan's past for inspiration, incorporating traditional joinery techniques and the sweeping roofs characteristic of the country's architecture.
  • ...2 more annotations...
  • Smart technology will extend inside residents' homes, according to Ingels, whose firm also designed 2 World Trade Center in New York, and Google's headquarters in both London and Silicon Valley.
  • "In an age when technology, social media and online retail is replacing and eliminating our natural meeting places, the Woven City will explore ways to stimulate human interaction in the urban space," he said. "After all, human connectivity is the kind of connectivity that triggers wellbeing and happiness, productivity and innovation."
blythewallick

Why We Fear the Unknown | Psychology Today - 0 views

  • Despite our better nature, it seems, fear of foreigners or other strange-seeming people comes out when we are under stress. That fear, known as xenophobia, seems almost hardwired into the human psyche.
  • Researchers are discovering the extent to which xenophobia can be easily—even arbitrarily—turned on. In just hours, we can be conditioned to fear or discriminate against those who differ from ourselves by characteristics as superficial as eye color. Even ideas we believe are just common sense can have deep xenophobic underpinnings.
  • But other research shows that when it comes to whom we fear and how we react, we do have a choice. We can, it seems, choose not to give in to our xenophobic tendencies.
  • ...7 more annotations...
  • The targets of xenophobia—derived from the Greek word for stranger—are no longer the Japanese. Instead, they are Muslim immigrants. Or Mexicans. Or the Chinese. Or whichever group we have come to fear.
  • The teacher, Jane Elliott, divided her class into two groups—those with blue eyes and those with brown or green eyes. The brown-eyed group received privileges and treats, while the blue-eyed students were denied rewards and told they were inferior. Within hours, the once-harmonious classroom became two camps, full of mutual fear and resentment. Yet, what is especially shocking is that the students were only in the third grade.
  • The drive to completely and quickly divide the world into "us" and "them" is so powerful that it must surely come from some deep-seated need.
  • Once the division is made, the inferences and projections begin to occur. For one, we tend to think more highly of people in the in-group than those in the out-group, a belief based only on group identity. Also, a person tends to feel that others in the in-group are similar to one's self in ways that—although stereotypical—may have little to do with the original criteria used to split the groups.
  • The differences in reaction time are small but telling. Again and again, researchers found that subjects readily tie in-group images with pleasant words and out-group images with unpleasant words. One study compares such groups as whites and blacks, Jews and Christians, and young people and old people. And researchers found that if you identify yourself in one group, it's easier to pair images of that group with pleasant words—and easier to pair the opposite group with unpleasant imagery. This reveals the underlying biases and enables us to study how quickly they can form.
  • If categorization and bias come so easily, are people doomed to xenophobia and racism? It's pretty clear that we are susceptible to prejudice and that there is an unconscious desire to divide the world into "us" and "them." Fortunately, however, research also shows that prejudices are fluid and that when we become conscious of our biases we can take active—and successful—steps to combat them
  • Unfortunately, such stereotypes are reinforced so often that they can become ingrained. It is difficult to escape conventional wisdom and treat all people as individuals, rather than members of a group. But that seems to be the best way to avoid the trap of dividing the world in two—and discriminating against one part of humanity.
blythewallick

What the brains of people with excellent general knowledge look like: Some people seem ... - 0 views

  • "Although we can precisely measure the general knowledge of people and this wealth of knowledge is very important for an individual's journey through life, we currently know little about the links between general knowledge and the characteristics of the brain,"
  • This makes it possible to reconstruct the pathways of nerve fibres and thus gain an insight into the structural network properties of the brain. By means of mathematical algorithms, the researchers assigned an individual value to the brain of each participant, which reflected the efficiency of his or her structural fibre network. (A toy sketch of one such network-efficiency score follows this item's annotations.)
  • The participants also completed a general knowledge test called the Bochum Knowledge Test, which was developed in Bochum by Dr. Rüdiger Hossiep. It comprises over 300 questions from various fields of knowledge such as art and architecture or biology and chemistry. The team led by Erhan Genç finally investigated whether the efficiency of structural networking is associated with the amount of general knowledge stored.
  • ...2 more annotations...
  • "We assume that individual units of knowledge are dispersed throughout the entire brain in the form of pieces of information," explains Erhan Genç. "Efficient networking of the brain is essential in order to put together the information stored in various areas of the brain and successfully recall knowledge content."
  • To answer the question of which constants occur in Einstein's theory of relativity, you have to connect the meaning of the term "constant" with knowledge of the theory of relativity. "We assume that more efficient networking of the brain contributes to better integration of pieces of information and thus leads to better results in a general knowledge test,
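A minimal sketch of the kind of single "efficiency" value described above, assuming the reconstructed fibre network is represented as a graph of brain regions connected by pathways. The study's exact metric is not named in the excerpt, so networkx's global efficiency (the average inverse shortest-path length between all pairs of nodes) stands in as an illustrative choice:

```python
# Toy example (illustrative assumption, not the study's actual pipeline):
# brain regions become nodes, reconstructed fibre pathways become edges,
# and the whole network is summarised by one efficiency score per participant.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("frontal", "parietal"),
    ("parietal", "occipital"),
    ("frontal", "temporal"),
    ("temporal", "occipital"),
    ("parietal", "temporal"),
])

score = nx.global_efficiency(G)  # average inverse shortest-path length
print(f"Structural network efficiency: {score:.2f}")
```

Higher values mean that, on average, any two regions are linked by shorter paths, which matches the excerpt's idea that efficient networking helps the brain pull together pieces of information stored in different areas.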
Javier E

The Cancel Culture Checklist - Persuasion - 0 views

  • a third of Americans say that they are personally worried about losing their jobs or missing out on career opportunities if they express their real political opinions.
  • Cancel culture now poses a real threat to intellectual freedom in the United States.
  • Americans in all walks of life have been publicly shamed, pressured into ritualistic apologies or summarily fired
  • ...29 more annotations...
  • But critics of the critics of cancel culture make a powerful retort. Accusing others of canceling can, they claim, be a way to stigmatize legitimate criticism. As Hannah Giorgis writes in the Atlantic, “critical tweets are not censorship.”
  • So what, exactly, does a cancellation consist of? And how does it differ from the exercise of free speech and robust critical debate?
  • At a conceptual level, the difference is clear. Criticism marshals evidence and arguments in a rational effort to persuade.
  • Canceling, by contrast, seeks to organize and manipulate the social or media environment in order to isolate, deplatform or intimidate ideological opponents
  • its intent—or at least its predictable outcome—is to coerce conformity and reduce the scope for forms of criticism that are not sanctioned by the prevailing consensus of some local majority.
  • In practice, however, telling canceling apart from criticism can be difficult because both take the form of criticizing others.
  • A better approach might therefore be diagnostic. Like the symptoms of cancer, the hallmarks of a cancellation are many. Though not all instances involve every single characteristic, they all involve some of its key attributes.
  • The more signs you see, the more certain you can be that you are looking at a cancel campaign.
  • Six warning signs make up my personal checklist for cancel culture.
  • Punitiveness
  • A critical culture seeks to correct rather than punish. In science, the penalty for being wrong is not that you lose your job or your friends. Normally, the only penalty is that you lose the argument
  • Canceling, by contrast, seeks to punish rather than correct—and often for a single misstep rather than a long track record of failure
  • Deplatforming
  • A critical culture tolerates dissent rather than silencing it. It understands that dissent can seem obnoxious, harmful, hateful and, yes, unsafe.
  • Canceling, by contrast, seeks to shut up and shout down its targets. Cancelers often define the mere act of disagreeing with them as a threat to their safety or even an act of violence
  • Organization
  • Critical culture relies on persuasion. The way to win an argument is to convince others that you are right.
  • By contrast, it’s common to see cancelers organize hundreds of petition-signers or thousands of social media users to dig up and prosecute an indictment.
  • Secondary Boycotts
  • With its commitments to exploring a wide range of ideas and correcting rather than coercing the errant, a critical culture sees no value in instilling a climate of fear
  • But instilling fear is what canceling is all about. By choosing targets unpredictably (almost anything can trigger a campaign), providing no safe harbors (even conformists can get hit), and implicitly threatening anyone who sides with those who are targeted, canceling sends the message: “you could be next.”
  • Moral Grandstanding
  • Precisely because speech can be hurtful, critical culture discourages extreme rhetoric. It encourages people to listen to each other, to use evidence and argumentation, to behave reasonably and to avoid personal attacks.
  • Cancel culture is much more invested in what philosophers Justin Tosi and Brandon Warmke call “moral grandstanding”: the display of moral outrage to impress one’s peer group, dominate others, or both
  • Truthiness
  • Concern for accuracy is the north star of a critical culture. Not everyone gets every fact right, nor do people always agree on what is true; and yet people in a critical culture try to present their own and others’ viewpoints honestly and accurately.
  • canceling is not about seeking truth or persuading others; it is a form of information warfare, in which truthiness suffices if it serves the cause.
  • Those are my six warning signs. If you spot one or two, you should fear that a canceling may be happening; if you see five or six, you can be sure.
  • Though our critics like to claim that those of us who worry about cancel culture just don’t like being criticized on the internet, cancel culture is all too real. And though it may at times bear a superficial resemblance to critical culture, the two are diametrically opposed—and not so very difficult to tell apart.