TOK Friends / Group items tagged trends

Javier E

Facebook's Troubling One-Way Mirror - The New York Times - 1 views

  • If you bothered to read the fine print when you created your Facebook account, you would have noticed just how much of yourself you were giving over to Mark Zuckerberg and his $340 billion social network.
  • In exchange for an admittedly magical level of connectivity, you were giving them your life as content — the right to run ads around video from your daughter’s basketball game; pictures from your off-the-chain birthday party, or an emotional note about your return to health after serious illness. You also gave them the right to use your information to help advertisers market to you
  • at the heart of the relationship is a level of trust and a waiving of privacy that Facebook requires from its users as it pursues its mission to “make the world more open and connected.”
  • ...13 more annotations...
  • how open is Facebook willing to be in return?
  • not very.
  • that should concern anyone of any political persuasion as Facebook continues to gain influence over the national — and international — conversation
  • Increasingly, those users are spending time on Facebook not only to share personal nuggets with friends, but, for more than 40 percent of American adults, according to Pew Research Center, to stay on top of news
  • It now has an inordinate power to control a good part of the national discussion should it choose to do so, a role it shares with Silicon Valley
  • There was the initial statement that Facebook could find “no evidence” supporting the allegations; Facebook said it did not “insert stories artificially” into the Trending list, and that it had “rigorous guidelines” to ensure neutrality. But when journalists like my colleague Farhad Manjoo asked for more details about editorial guidelines, the company declined to discuss them.
  • Only after The Guardian newspaper obtained an old copy of the Trending Topics guidelines did Facebook provide more information, and an up-to-date copy of them. (They showed that humans work with algorithms to shape the lists and introduce headlines on their own under some circumstances, contradicting Facebook’s initial statement, Recode noted.) It was openness by way of a bullet to the foot.
  • a more important issue emerged during the meeting that had been lying beneath the surface, and has been for a while now: the power of the algorithms that determine what goes into individual Facebook pages.
  • “What they have is a disproportionate amount of power, and that’s the real story,” Mr. Carlson told me. “It’s just concentrated in a way you’ve never seen before in media.”
  • What most people don’t realize is that not everything they like or share necessarily gets a prominent place in their friends’ newsfeeds: The Facebook algorithm sends it to those it determines will find it most engaging.
  • For outlets like The Daily Caller, The Huffington Post, The Washington Post or The New York Times — for whom Facebook’s audience is vital to growth — any algorithmic change can affect how many people see their journalism.
  • This gives Facebook enormous influence over how newsrooms, almost universally eager for Facebook exposure, make decisions and money. Alan Rusbridger, a former editor of The Guardian, called this a “profound and alarming” development in a column in The New Statesman last week.
  • Facebook declines to talk in great detail about its algorithms, noting that it does not want to make it easy to game its system. That system, don’t forget, is devised to keep people on Facebook by giving them what they want
kushnerha

Facebook's Bias Is Built-In, and Bears Watching - The New York Times - 2 views

  • Facebook is the world’s most influential source of news. That’s true according to every available measure of size — the billion-plus people who devour its News Feed every day, the cargo ships of profit it keeps raking in, and the tsunami of online traffic it sends to other news sites.
  • But Facebook has also acquired a more subtle power to shape the wider news business. Across the industry, reporters, editors and media executives now look to Facebook the same way nesting baby chicks look to their engorged mother — as the source of all knowledge and nourishment, the model for how to behave in this scary new-media world. Case in point: The New York Times, among others, recently began an initiative to broadcast live video. Why do you suppose that might be? Yup, the F word. The deal includes payments from Facebook to news outlets, including The Times.
  • Yet few Americans think of Facebook as a powerful media organization, one that can alter events in the real world. When blowhards rant about the mainstream media, they do not usually mean Facebook, the mainstreamiest of all social networks. That’s because Facebook operates under a veneer of empiricism. Many people believe that what you see on Facebook represents some kind of data-mined objective truth unmolested by the subjective attitudes of fair-and-balanced human beings.
  • ...11 more annotations...
  • None of that is true. This week, Facebook rushed to deny a report in Gizmodo that said the team in charge of its “trending” news list routinely suppressed conservative points of view. Last month, Gizmodo also reported that Facebook employees asked Mark Zuckerberg, the social network’s chief executive, if the company had a responsibility to “help prevent President Trump in 2017.” Facebook denied it would ever try to manipulate elections.
  • Even if you believe that Facebook isn’t monkeying with the trending list or actively trying to swing the vote, the reports serve as timely reminders of the ever-increasing potential dangers of Facebook’s hold on the news.
  • The question isn’t whether Facebook has outsize power to shape the world — of course it does, and of course you should worry about that power. If it wanted to, Facebook could try to sway elections, favor certain policies, or just make you feel a certain way about the world, as it once proved it could do in an experiment devised to measure how emotions spread online.
  • There is no evidence Facebook is doing anything so alarming now. The danger is nevertheless real. The biggest worry is that Facebook doesn’t seem to recognize its own power, and doesn’t think of itself as a news organization with a well-developed sense of institutional ethics and responsibility, or even a potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.
  • That myth should die. It’s true that beyond the Trending box, most of the stories Facebook presents to you are selected by its algorithms, but those algorithms are as infused with bias as any other human editorial decision.
  • “With Facebook, humans are never not involved. Humans are in every step of the process — in terms of what we’re clicking on, who’s shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans.” Everything you see on Facebook is therefore the product of these people’s expertise and considered judgment, as well as their conscious and unconscious biases, apart from possible malfeasance or potential corruption. It’s often hard to know which, because Facebook’s editorial sensibilities are secret. So are its personalities: Most of the engineers, designers and others who decide what people see on Facebook will remain forever unknown to its audience.
  • Facebook also has an unmistakable corporate ethos and point of view. The company is staffed mostly by wealthy coastal Americans who tend to support Democrats, and it is wholly controlled by a young billionaire who has expressed policy preferences that many people find objectionable.
  • You could argue that none of this is unusual. Many large media outlets are powerful, somewhat opaque, operated for profit, and controlled by wealthy people who aren’t shy about their policy agendas — Bloomberg News, The Washington Post, Fox News and The New York Times, to name a few. But there are some reasons to be even more wary of Facebook’s bias. One is institutional. Many mainstream outlets have a rigorous set of rules and norms about what’s acceptable and what’s not in the news business.
  • Those algorithms could have profound implications for society. For instance, one persistent worry about algorithmic-selected news is that it might reinforce people’s previously held points of view. If News Feed shows news that we’re each likely to Like, it could trap us into echo chambers and contribute to rising political polarization. In a study last year, Facebook’s scientists asserted the echo chamber effect was muted.
  • are Facebook’s engineering decisions subject to ethical review? Nobody knows.
  • The other reason to be wary of Facebook’s bias has to do with sheer size. Ms. Caplan notes that when studying bias in traditional media, scholars try to make comparisons across different news outlets. To determine if The Times is ignoring a certain story unfairly, look at competitors like The Washington Post and The Wall Street Journal. If those outlets are covering a story and The Times isn’t, there could be something amiss about the Times’s news judgment. Such comparative studies are nearly impossible for Facebook. Facebook is personalized, in that what you see on your News Feed is different from what I see on mine, so the only entity in a position to look for systemic bias across all of Facebook is Facebook itself. Even if you could determine the spread of stories across all of Facebook’s readers, what would you compare it to?
kushnerha

Is That Even a Thing? - The New York Times - 3 views

  • Speakers and writers of American English have recently taken to identifying a staggering and constantly changing array of trends, events, memes, products, lifestyle choices and phenomena of nearly every kind with a single label — a thing.
  • It would be easy to call this a curiosity of the language and leave it at that. Linguistic trends come and go.
  • One could, on the other hand, consider the use of “a thing” a symptom of an entire generation’s linguistic sloth, general inarticulateness and penchant for cutesy, empty, half-ironic formulations that create a self-satisfied barrier preventing any form of genuine engagement with the world around them.
  • ...9 more annotations...
  • My assumption is that language and experience mutually influence each other. Language not only captures experience, it conditions it. It sets expectations for experience and gives shape to it as it happens. What might register as inarticulateness can reflect a different way of understanding and experiencing the world.
  • The word “thing” has of course long played a versatile and generic role in our language, referring both to physical objects and abstract matters. “The thing is …” “Here’s the thing.” “The play’s the thing.” In these examples, “thing” denotes the matter at hand and functions as stage setting to emphasize an important point. One new thing about “a thing,” then, is the typical use of the indefinite article “a” to precede it. We talk about a thing because we are engaged in cataloging. The question is whether something counts as a thing. “A thing” is not just stage setting. Information is conveyed.
  • What information? One definition of “a thing” that suggests itself right away is “cultural phenomenon.” A new app, an item of celebrity gossip, the practices of a subculture. It seems likely that “a thing” comes from the phrase the coolest/newest/latest thing. But now, in a society where everything, even the past, is new — “new thing” verges on the redundant. If they weren’t new they wouldn’t be things.
  • Clearly, cultural phenomena have long existed and been called “fads,” “trends,” “rages” or have been designated by the category they belong to — “product,” “fashion,” “lifestyle,” etc. So why the application of this homogenizing general term to all of them? I think there are four main reasons.
  • First, the flood of content into the cultural sphere. That we are inundated is well known. Information besieges us in waves that thrash us against the shore until we retreat to the solid ground of work or sleep or exercise or actual human interaction, only to wade cautiously back into our smartphones. As we spend more and more time online, it becomes the content of our experience, and in this sense “things” have earned their name. “A thing” has become the basic unit of cultural ontology.
  • Second, the fragmentation of this sphere. The daily barrage of culture requires that we choose a sliver of the whole in order to keep up. Netflix genres like “Understated Romantic Road Trip Movies” make it clear that the individual is becoming his or her own niche market — the converse of the celebrity as brand. We are increasingly a society of brands attuning themselves to markets, and markets evaluating brands. The specificity of the market requires a wider range of content — of things — to satisfy it
  • Third, the closing gap between satire and the real thing. The absurd excess of things has reached a point where the ironic detachment needed to cope with them is increasingly built into the things themselves, their marketing and the language we use to talk about them. The designator “a thing” is thus almost always tinged with ironic detachment. It puts the thing at arm’s length. You can hardly say “a thing” without a wary glint in your eye.
  • Finally, the growing sense that these phenomena are all the same. As we step back from “things,” they recede into the distance and begin to blur together. We call them all by the same name because they are the same at bottom: All are pieces of the Internet. A thing is for the most part experienced through this medium and generated by it. Even if they arise outside it, things owe their existence as things to the Internet. Google is thus always the arbiter of the question, “Is that a real thing?”
  • “A thing,” then, corresponds to a real need we have, to catalog and group together the items of cultural experience, while keeping them at a sufficient distance so that we can at least feign unified consciousness in the face of a world gone to pieces.
carolinewren

A Closer Look at the Global Warming Trend, Record Hot 2014 and What's Ahead - NYTimes.com - 1 views

  • that 2014 was the warmest year since careful record keeping began in 1880.
  • 2010 and 2014 are basically tied for warmest year.
  • The two agencies use slightly different methods, so they have different readings for the difference between 2014 and the previous warmest year, 2010, with N.O.A.A. putting it at 0.07 degrees Fahrenheit (0.04 degrees Celsius), while NASA got 0.036 degrees (0.02 Celsius) — which this analysis says is well “within uncertainty of measurement.”
  • ...2 more annotations...
  • changes in global temperature year to year, even decade to decade, have little meaning in tracking a long-term trend like the impact on temperature of rising concentrations of greenhouse gases.
  • the apparent slowdown has led to numerous assertions that “global warming has stopped.”
Javier E

In Defense of Big Love - The New York Times - 1 views

  • Beauty is what you experience when you look at a flower or a lovely face. It is contained, pleasurable, intimate and romantic. Sublime is what you feel when you look at a mountain range or a tornado. It involves awe, veneration, maybe even a touch of fear
  • neuroscientists have shown that the experiences of beauty and awe activate different parts of the brain.
  • I’d say that in America today some of the little loves are fraying, and big love is almost a foreign language. Almost nobody speaks about the American project in the same ardent tones that were once routine.
  • ...4 more annotations...
  • The distinction between the beautiful and the sublime is the distinction between the intimate and the transcendent. This sort of distinction doesn’t just happen in aesthetics, but in life in general. We have big and little loves.
  • Big love involves thinking in sweeping historical terms. But today the sense that America is pursuing a noble mission in the world has been humbled by failures and passivity. The country feels more divided than unified around common purpose
  • Big love involves politics, and thus compromise, competition and messiness. Americans today are less likely to discern the noble within the grittiness of reality. The very words that the founders used to describe their big love for their country sound archaic: glory, magnanimity, sacred honor and greatness.
  • There is, in sum, less animating desire in the country at the moment, and therefore less energy and daring. The share of Americans moving across state lines in search of opportunity has fallen by more than half since the 1970s. The rate of new business creation is down. Productivity is falling for the first time in three decades. Economic growth is anemic. There’s a spiritual and cultural element behind these trends
Javier E

Welcome, Robot Overlords. Please Don't Fire Us? | Mother Jones - 0 views

  • There will be no place to go but the unemployment line.
  • at this point our tale takes a darker turn. What do we do over the next few decades as robots become steadily more capable and steadily begin taking away all our jobs?
  • ...34 more annotations...
  • The economics community just hasn't spent much time over the past couple of decades focusing on the effect that machine intelligence is likely to have on the labor marke
  • The Digital Revolution is different because computers can perform cognitive tasks too, and that means machines will eventually be able to run themselves. When that happens, they won't just put individuals out of work temporarily. Entire classes of workers will be out of work permanently. In other words, the Luddites weren't wrong. They were just 200 years too early
  • Slowly but steadily, labor's share of total national income has gone down, while the share going to capital owners has gone up. The most obvious effect of this is the skyrocketing wealth of the top 1 percent, due mostly to huge increases in capital gains and investment income.
  • Robotic pets are growing so popular that Sherry Turkle, an MIT professor who studies the way we interact with technology, is uneasy about it: "The idea of some kind of artificial companionship," she says, "is already becoming the new normal."
  • robots will take over more and more jobs. And guess who will own all these robots? People with money, of course. As this happens, capital will become ever more powerful and labor will become ever more worthless. Those without money—most of us—will live on whatever crumbs the owners of capital allow us.
  • Economist Paul Krugman recently remarked that our long-standing belief in skills and education as the keys to financial success may well be outdated. In a blog post titled "Rise of the Robots," he reviewed some recent economic data and predicted that we're entering an era where the prime cause of income inequality will be something else entirely: capital vs. labor.
  • while it's easy to believe that some jobs can never be done by machines—do the elderly really want to be tended by robots?—that may not be true.
  • Third, as more people compete for fewer jobs, we'd expect to see middle-class incomes flatten in a race to the bottom.
  • The question we want to answer is simple: If CBTC is already happening—not a lot, but just a little bit—what trends would we expect to see? What are the signs of a computer-driven economy?
  • if automation were displacing labor, we'd expect to see a steady decline in the share of the population that's employed.
  • Second, we'd expect to see fewer job openings than in the past.
  • In the economics literature, the increase in the share of income going to capital owners is known as capital-biased technological change
  • Fourth, with consumption stagnant, we'd expect to see corporations stockpile more cash and, fearing weaker sales, invest less in new products and new factories
  • Fifth, as a result of all this, we'd expect to see labor's share of national income decline and capital's share rise.
  • We're already seeing them, and not just because of the crash of 2008. They started showing up in the statistics more than a decade ago. For a while, though, they were masked by the dot-com and housing bubbles, so when the financial crisis hit, years' worth of decline was compressed into 24 months. The trend lines dropped off the cliff.
  • Corporate executives should worry too. For a while, everything will seem great for them: Falling labor costs will produce heftier profits and bigger bonuses. But then it will all come crashing down. After all, robots might be able to produce goods and services, but they can't consume them
  • in another sense, we should be very alarmed. It's one thing to suggest that robots are going to cause mass unemployment starting in 2030 or so. We'd have some time to come to grips with that. But the evidence suggests that—slowly, haltingly—it's happening already, and we're simply not prepared for it.
  • the first jobs to go will be middle-skill jobs. Despite impressive advances, robots still don't have the dexterity to perform many common kinds of manual labor that are simple for humans—digging ditches, changing bedpans. Nor are they any good at jobs that require a lot of cognitive skill—teaching classes, writing magazine articles
  • in the middle you have jobs that are both fairly routine and require no manual dexterity. So that may be where the hollowing out starts: with desk jobs in places like accounting or customer support.
  • In fact, there's even a digital sports writer. It's true that a human being wrote this story—ask my mother if you're not sure—but in a decade or two I might be out of a job too
  • Doctors should probably be worried as well. Remember Watson, the Jeopardy!-playing computer? It's now being fed millions of pages of medical information so that it can help physicians do a better job of diagnosing diseases. In another decade, there's a good chance that Watson will be able to do this without any human help at all.
  • Take driverless cars.
  • The next step might be passenger vehicles on fixed routes, like airport shuttles. Then long-haul trucks. Then buses and taxis. There are 2.5 million workers who drive trucks, buses, and taxis for a living, and there's a good chance that, one by one, all of them will be displaced
  • There will be no place to go but the unemployment line.
  • we'll need to let go of some familiar convictions. Left-leaning observers may continue to think that stagnating incomes can be improved with better education and equality of opportunity. Conservatives will continue to insist that people without jobs are lazy bums who shouldn't be coddled. They'll both be wrong.
  • The modern economy is complex, and most of these trends have multiple causes.
  • we'll probably have only a few options open to us. The simplest, because it's relatively familiar, is to tax capital at high rates and use the money to support displaced workers. In other words, as The Economist's Ryan Avent puts it, "redistribution, and a lot of it."
  • would we be happy in a society that offers real work to a dwindling few and bread and circuses for the rest?
  • Most likely, owners of capital would strongly resist higher taxes, as they always have, while workers would be unhappy with their enforced idleness. Still, the ancient Romans managed to get used to it—with slave labor playing the role of robots—and we might have to, as well.
  •  economist Noah Smith suggests that we might have to fundamentally change the way we think about how we share economic growth. Right now, he points out, everyone is born with an endowment of labor by virtue of having a body and a brain that can be traded for income. But what to do when that endowment is worth a fraction of what it is today? Smith's suggestion: "Why not also an endowment of capital? What if, when each citizen turns 18, the government bought him or her a diversified portfolio of equity?"
  • In simple terms, if owners of capital are capturing an increasing fraction of national income, then that capital needs to be shared more widely if we want to maintain a middle-class society.
  • it's time to start thinking about our automated future in earnest. The history of mass economic displacement isn't encouraging—fascists in the '20s, Nazis in the '30s—and recent high levels of unemployment in Greece and Italy have already produced rioting in the streets and larger followings for right-wing populist parties. And that's after only a few years of misery.
  • When the robot revolution finally starts to happen, it's going to happen fast, and it's going to turn our world upside down. It's easy to joke about our future robot overlords—R2-D2 or the Terminator?—but the challenge that machine intelligence presents really isn't science fiction anymore. Like Lake Michigan with an inch of water in it, it's happening around us right now even if it's hard to see
  • A robotic paradise of leisure and contemplation eventually awaits us, but we have a long and dimly lit tunnel to navigate before we get there.
Javier E

Neither Hot Nor Cold on Climate - The New York Times - 0 views

  • this is where the second objection to lukewarmism comes in
  • in actual right-wing politics no serious assessment of the science and the risks is taking place to begin with. Instead there’s just a mix of business-class and blue-collar self-interest and a trollish, “If liberals are for it, we’re against it” anti-intellectualism. So while lukewarmers may fancy ourselves serious interlocutors for liberals, we’re actually just running interference on behalf of know-nothing and do-nothingism, attacking flawed policies on behalf of a Republican Party that will never, ever advance any policies of its own.
  • This critique is … not necessarily wrong. A Republican Party that was really shaped by lukewarmism would probably still oppose the Paris deal and shrink from sweeping carbon taxes. But it would be actively debating and budgeting for the two arenas — innovation and mitigation — where the smartest skeptics of regulatory solutions tend to place their faith.
  • ...7 more annotations...
  • This is not what the G.O.P. seems inclined to do. Instead it lets lukewarmers poke holes in liberal proposals for climate insurance policies, and then sits back satisfied that no insurance policy, no extra effort, is necessary at all.
  • the anti-Paris sentiments that moved Trump weren’t entirely reality-based either. And a clear Republican plan for how to “prepare for and adapt to whatever climate change brings” does not actually exist.
  • In its absence, lukewarmism is a critique without an affirmative agenda, a theory of the case without a party that’s prepared to ever act on it.
  • I also want to concede two problems with this approach. The first is that no less than alarmism, lukewarmism can be vulnerable to cherry-picking and selection bias
  • when you’re dealing with long-term trends, there’s a lot of evidence to choose from
  • This means that every lukewarmer, including especially those in positions of political authority, should be pressed to identify trends that would push them toward greater alarmism and a sharper focus on the issue.
  • the closer the real trend gets to the worst-case projections, the more my lukewarmism will look Pollyannish and require substantial reassessment.
blythewallick

You're Only as Old as You Feel - The New York Times - 0 views

  • Simply asking people how old they feel may tell you a lot about their health and well-being.
  • “I don’t know if she dropped something and had to pick it up, or if her shoe was untied,” Ms. Heller said, but she eagerly bounded over to help. The woman blamed old age for her incapacity, explaining that she was 70. But Ms. Heller was 71. “This woman felt every bit her age,” she recalled. “I don’t let age stop me. I think it’s a mind-set, really.”
  • People with a healthy lifestyle and living conditions and a fortunate genetic inheritance tend to score “younger” on these assessments and are said to have a lower “biological age.” But there’s a much easier way to determine the shape people are in. It’s called “subjective age.”
  • ...8 more annotations...
  • When scientists ask: “How old do you feel, most of the time?” the answer tends to reflect the state of people’s physical and mental health. “This simple question seems to be particularly powerful,” says Antonio Terracciano, a professor of geriatrics at Florida State University College of Medicine in Tallahassee.
  • Scientists are finding that people who feel younger than their chronological age are typically healthier and more psychologically resilient than those who feel older. They perform better on memory tasks and are at lower risk of cognitive decline.
  • If you’re over 40, chances are you feel younger than your driver’s license suggests. Some 80 percent of people do, according to Dr. Stephan. A small fraction of people — fewer than 10 percent — feel older.
  • At age 50, people may feel about five years, or 10 percent, younger, but by the time they’re 70 they may feel 15 percent or even 20 percent younger.
  • In a 2018 German study, investigators asked people in their 60s, 70s and early 80s how old they felt, then measured their walking speed in two settings. Participants walked 20 feet in the laboratory while being observed and timed. They also wore belts containing an accelerometer while out and about in their daily lives. Those who reported feeling younger tended to walk faster during the lab assessment. But feeling younger had no impact on their walking speed in real life.
  • Indeed, in cultures where elders are respected for their wisdom and experience, people don’t even understand the concept of subjective age, he said. When a graduate student of Dr. Weiss’s did research in Jordan, the people he spoke with “would say, ‘I’m 80. I don’t know what you mean by ‘How old do I feel?’”
  • As we age, we tend to become generally happier and more satisfied, said Dr. Tracey Gendron, a gerontologist at Virginia Commonwealth University who questions the whole notion of subjective age research
  • “Older age is a time that we can actually look forward to. People really just enjoy themselves more and are at peace with who they are. I would love for everyone to say their age at every year and celebrate it.”
kiraagne

Before Kyle Rittenhouse's Murder Trial, a Debate Over Terms Like 'Victim' - The New Yor... - 0 views

  • A judge’s decision that the word “victim” generally could not be used in court to refer to the people shot by Kyle Rittenhouse after protests in Kenosha, Wis., last year drew widespread attention and outrage this week.
  • Mr. Rittenhouse, who has been charged with six criminal counts, including first-degree reckless homicide, first-degree intentional homicide and attempted first-degree intentional homicide in the deaths of two men and the wounding of another, is expected to argue that he fired his gun because he feared for his life.
  • Prosecutors say he was a violent vigilante who illegally possessed the rifle and whose actions resulted in chaos and bloodshed.
  • ...6 more annotations...
  • This week, as Judge Schroeder ruled on a motion by the prosecution, he also said that he would allow the terms “looters” and “rioters” to be used to refer to the men who were shot
  • The experts said the term “victim” can appear prejudicial in a court of law, heavily influencing a jury by presupposing which people have been wronged.
  • State law in Wisconsin allows a person to fire in self-defense if the shooter “reasonably believes that such force is necessary to prevent imminent death or great bodily harm to himself or herself.”
  • “In a self-defense case, the people who were shot are to some extent on trial,
  • Prosecutors have repeatedly tried to introduce evidence of Mr. Rittenhouse’s associations with the far-right Proud Boys, as well as a cellphone video taken weeks before the shootings in Kenosha in which Mr. Rittenhouse suggested that he wished he had his rifle so he could shoot men leaving a pharmacy. The judge did not allow either as evidence for trial.
  • Thomas Binger, a prosecutor, argued that the judge was creating a “double standard” and said that the words he sought to have prohibited — relating to rioting and other damage — were “as loaded, if not more loaded, than the term ‘victim.’
Javier E

6 analog trends that are good for the soul - The Washington Post - 0 views

  • Sometimes it’s better to be inefficient
  • a countertrend growing as digital technologies began to take off with the advent of smartphones, streaming services and social media. The more we rely on digital technology for work, learning and socializing, “the more we seek out analog alternatives as a balance or a different way of engaging with the world,”
  • the trend isn’t driven by older generations seeking nostalgia but rather “by younger people who may have even never encountered this technology in the first place,
  • ...1 more annotation...
  • Choosing the less-efficient way of doing something, especially things we do for pleasure, can help us reassess our relationship with time and forgo the constant need for productivity.
nataliedepaulo1

In 'Science,' Obama Argues Trump Can't Undo the Clean-Energy Revolution - The Atlantic - 0 views

  • Obama in Science: The Renewable Revolution Will Outlast Trump
  • For the past five years, solar and wind energy have exploded in popularity in the United States. Since the election of Donald Trump, energy analysts have been trying to figure out if that trend will continue.
  • Some analysts argue that the trend is irreversible. The costs of solar and wind power are falling so fast that they will soon beat fossil fuels on price alone, regardless of what the federal government does. Since 2008, the price per watt of utility-scale solar energy has fallen by 64 percent. Even Walmart puts solar panels on its roofs now.
  • ...1 more annotation...
  • As Obama leaves office, he is communicating his message about the inevitability of clean energy again. Millions of Americans—including the more than 700,000 employed in the renewable-energy industry—are hoping he is right.
Javier E

The decline effect and the scientific method : The New Yorker - 3 views

  • The test of replicability, as it’s known, is the foundation of modern research. Replicability is how the community enforces itself. It’s a safeguard for the creep of subjectivity. Most of the time, scientists know what results they want, and that can influence the results they get. The premise of replicability is that the scientific community can correct for these flaws.
  • But now all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable.
  • This phenomenon doesn’t yet have an official name, but it’s occurring across a wide range of fields, from psychology to ecology.
  • ...39 more annotations...
  • If replication is what separates the rigor of science from the squishiness of pseudoscience, where do we put all these rigorously validated findings that can no longer be proved? Which results should we believe?
  • Schooler demonstrated that subjects shown a face and asked to describe it were much less likely to recognize the face when shown it later than those who had simply looked at it. Schooler called the phenomenon “verbal overshadowing.”
  • The most likely explanation for the decline is an obvious one: regression to the mean. As the experiment is repeated, that is, an early statistical fluke gets cancelled out. The extrasensory powers of Schooler’s subjects didn’t decline—they were simply an illusion that vanished over time.
  • yet Schooler has noticed that many of the data sets that end up declining seem statistically solid—that is, they contain enough data that any regression to the mean shouldn’t be dramatic. “These are the results that pass all the tests,” he says. “The odds of them being random are typically quite remote, like one in a million. This means that the decline effect should almost never happen. But it happens all the time!
  • this is why Schooler believes that the decline effect deserves more attention: its ubiquity seems to violate the laws of statistics
  • In 2001, Michael Jennions, a biologist at the Australian National University, set out to analyze “temporal trends” across a wide range of subjects in ecology and evolutionary biology. He looked at hundreds of papers and forty-four meta-analyses (that is, statistical syntheses of related studies), and discovered a consistent decline effect over time, as many of the theories seemed to fade into irrelevance.
  • Jennions admits that his findings are troubling, but expresses a reluctance to talk about them publicly. “This is a very sensitive issue for scientists,” he says. “You know, we’re supposed to be dealing with hard facts, the stuff that’s supposed to stand the test of time. But when you see these trends you become a little more skeptical of things.”
  • While publication bias almost certainly plays a role in the decline effect, it remains an incomplete explanation. For one thing, it fails to account for the initial prevalence of positive results among studies that never even get submitted to journals. It also fails to explain the experience of people like Schooler, who have been unable to replicate their initial data despite their best efforts.
  • Jennions, similarly, argues that the decline effect is largely a product of publication bias, or the tendency of scientists and scientific journals to prefer positive data over null results, which is what happens when no effect is found. The bias was first identified by the statistician Theodore Sterling, in 1959, after he noticed that ninety-seven per cent of all published psychological studies with statistically significant data found the effect they were looking for
  • Sterling saw that if ninety-seven per cent of psychology studies were proving their hypotheses, either psychologists were extraordinarily lucky or they published only the outcomes of successful experiments.
  • One of his most cited papers has a deliberately provocative title: “Why Most Published Research Findings Are False.”
  • suspects that an equally significant issue is the selective reporting of results—the data that scientists choose to document in the first place. Palmer’s most convincing evidence relies on a statistical tool known as a funnel graph. When a large number of studies have been done on a single subject, the data should follow a pattern: studies with a large sample size should all cluster around a common value—the true result—whereas those with a smaller sample size should exhibit a random scattering, since they’re subject to greater sampling error. This pattern gives the graph its name, since the distribution resembles a funnel.
  • after Palmer plotted every study of fluctuating asymmetry, he noticed that the distribution of results with smaller sample sizes wasn’t random at all but instead skewed heavily toward positive results. Palmer has since documented a similar problem in several other contested subject areas. “Once I realized that selective reporting is everywhere in science, I got quite depressed,” Palmer told me. “As a researcher, you’re always aware that there might be some nonrandom patterns, but I had no idea how widespread it is.”
  • Palmer summarized the impact of selective reporting on his field: “We cannot escape the troubling conclusion that some—perhaps many—cherished generalities are at best exaggerated in their biological significance and at worst a collective illusion nurtured by strong a-priori beliefs often repeated.”
  • Palmer emphasizes that selective reporting is not the same as scientific fraud. Rather, the problem seems to be one of subtle omissions and unconscious misperceptions, as researchers struggle to make sense of their results. Stephen Jay Gould referred to this as the “shoehorning” process.
  • “A lot of scientific measurement is really hard,” Simmons told me. “If you’re talking about fluctuating asymmetry, then it’s a matter of minuscule differences between the right and left sides of an animal. It’s millimetres of a tail feather. And so maybe a researcher knows that he’s measuring a good male”—an animal that has successfully mated—“and he knows that it’s supposed to be symmetrical. Well, that act of measurement is going to be vulnerable to all sorts of perception biases. That’s not a cynical statement. That’s just the way human beings work.”
  • For Simmons, the steep rise and slow fall of fluctuating asymmetry is a clear example of a scientific paradigm, one of those intellectual fads that both guide and constrain research: after a new paradigm is proposed, the peer-review process is tilted toward positive results. But then, after a few years, the academic incentives shift—the paradigm has become entrenched—so that the most notable results are now those that disprove the theory.
  • John Ioannidis, an epidemiologist at Stanford University, argues that such distortions are a serious issue in biomedical research. “These exaggerations are why the decline has become so common,” he says. “It’d be really great if the initial studies gave us an accurate summary of things. But they don’t. And so what happens is we waste a lot of money treating millions of patients and doing lots of follow-up studies on other themes based on results that are misleading.”
  • In 2005, Ioannidis published an article in the Journal of the American Medical Association that looked at the forty-nine most cited clinical-research studies in three major medical journals.
  • the data Ioannidis found were disturbing: of the thirty-four claims that had been subject to replication, forty-one per cent had either been directly contradicted or had their effect sizes significantly downgraded.
  • the most troubling fact emerged when he looked at the test of replication: out of four hundred and thirty-two claims, only a single one was consistently replicable. “This doesn’t mean that none of these claims will turn out to be true,” he says. “But, given that most of them were done badly, I wouldn’t hold my breath.”
  • According to Ioannidis, the main problem is that too many researchers engage in what he calls “significance chasing,” or finding ways to interpret the data so that it passes the statistical test of significance—the ninety-five-per-cent boundary invented by Ronald Fisher.
  • One of the classic examples of selective reporting concerns the testing of acupuncture in different countries. While acupuncture is widely accepted as a medical treatment in various Asian countries, its use is much more contested in the West. These cultural differences have profoundly influenced the results of clinical trials.
  • The problem of selective reporting is rooted in a fundamental cognitive flaw, which is that we like proving ourselves right and hate being wrong.
  • “It feels good to validate a hypothesis,” Ioannidis said. “It feels even better when you’ve got a financial interest in the idea or your career depends upon it. And that’s why, even after a claim has been systematically disproven”—he cites, for instance, the early work on hormone replacement therapy, or claims involving various vitamins—“you still see some stubborn researchers citing the first few studies
  • That’s why Schooler argues that scientists need to become more rigorous about data collection before they publish. “We’re wasting too much time chasing after bad studies and underpowered experiments,”
  • The current “obsession” with replicability distracts from the real problem, which is faulty design.
  • “Every researcher should have to spell out, in advance, how many subjects they’re going to use, and what exactly they’re testing, and what constitutes a sufficient level of proof. We have the tools to be much more transparent about our experiments.”
  • Schooler recommends the establishment of an open-source database, in which researchers are required to outline their planned investigations and document all their results. “I think this would provide a huge increase in access to scientific work and give us a much better way to judge the quality of an experiment,”
  • scientific research will always be shadowed by a force that can’t be curbed, only contained: sheer randomness. Although little research has been done on the experimental dangers of chance and happenstance, the research that exists isn’t encouraging.
  • The disturbing implication of the Crabbe study is that a lot of extraordinary scientific data are nothing but noise. The hyperactivity of those coked-up Edmonton mice wasn’t an interesting new fact—it was a meaningless outlier, a by-product of invisible variables we don’t understand.
  • The problem, of course, is that such dramatic findings are also the most likely to get published in prestigious journals, since the data are both statistically significant and entirely unexpected
  • This suggests that the decline effect is actually a decline of illusion. While Karl Popper imagined falsification occurring with a single, definitive experiment—Galileo refuted Aristotelian mechanics in an afternoon—the process turns out to be much messier than that.
  • Many scientific theories continue to be considered true even after failing numerous experimental tests.
  • Even the law of gravity hasn’t always been perfect at predicting real-world phenomena. (In one test, physicists measuring gravity by means of deep boreholes in the Nevada desert found a two-and-a-half-per-cent discrepancy between the theoretical predictions and the actual data.)
  • Such anomalies demonstrate the slipperiness of empiricism. Although many scientific ideas generate conflicting results and suffer from falling effect sizes, they continue to get cited in the textbooks and drive standard medical practice. Why? Because these ideas seem true. Because they make sense. Because we can’t bear to let them go. And this is why the decline effect is so troubling. Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren’t surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.)
  • The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe. ♦
Javier E

How 'Concept Creep' Made Americans So Sensitive to Harm - The Atlantic - 0 views

  • How did American culture arrive at these moments? A new research paper by Nick Haslam, a professor of psychology at the University of Melbourne, Australia, offers as useful a framework for understanding what’s going on as any I’ve seen. In “Concept Creep: Psychology's Expanding Concepts of Harm and Pathology,”
  • concepts like abuse, bullying, trauma, mental disorder, addiction, and prejudice, “now encompass a much broader range of phenomena than before,” expanded meanings that reflect “an ever-increasing sensitivity to harm.”
  • “they also have potentially damaging ramifications for society and psychology that cannot be ignored.”
  • ...20 more annotations...
  • He calls these expansions of meaning “concept creep.”
  • critics may hold concept creep responsible for damaging cultural trends, he writes, “such as supposed cultures of fear, therapy, and victimhood, the shifts I present have some positive implications.”
  • Concept creep is inevitable and vital if society is to make good use of new information. But why has the direction of concept creep, across so many different concepts, trended toward greater sensitivity to harm as opposed to lesser sensitivity?
  • The concept of abuse expanded too far.
  • Classically, psychological investigations recognized two forms of child abuse, physical and sexual, Haslam writes. In more recent decades, however, the concept of abuse has witnessed “horizontal creep” as new forms of abuse were recognized or studied. For example, “emotional abuse” was added as a new subtype of abuse. Neglect, traditionally a separate category, came to be seen as a type of abuse, too.
  • Meanwhile, the concept of abuse underwent “vertical creep.” That is, the behavior seen as qualifying for a given kind of abuse became steadily less extreme. Some now regard any spanking as physical abuse. Within psychology, “the boundary of neglect is indistinct,” Haslam writes. “As a consequence, the concept of neglect can become over-inclusive, identifying behavior as negligent that is substantially milder or more subtle than other forms of abuse. This is not to deny that some forms of neglect are profoundly damaging, merely to argue that the concept’s boundaries are sufficiently vague and elastic to encompass forms that are not severe.”
  • How did a working-class mom get arrested, lose her fast food job, and temporarily lose custody of her 9-year-old for letting the child play alone at a nearby park?
  • One concerns the field of psychology and its incentives. “It could be argued that just as successful species increase their territory, invading and adapting to new habitats, successful concepts and disciplines also expand their range into new semantic niches,” he theorizes. “Concepts that successfully attract the attention of researchers and practitioners are more likely to be applied in new ways and new contexts than those that do not.”
  • Concept creep can be necessary or needless. It can align concepts more or less closely with underlying realities. It can change society for better or worse. Yet many who push for more sensitivity to harm seem unaware of how oversensitivity can do harm.
  • The other theory posits an ideological explanation. “Psychology has played a role in the liberal agenda of sensitivity to harm and responsiveness to the harmed,” he writes, “and its increased focus on negative phenomena—harms such as abuse, addiction, bullying, mental disorder, prejudice, and trauma—has been symptomatic of the success of that social agenda.”
  • Jonathan Haidt, who believes it has gone too far, offers a fourth theory. “If an increasingly left-leaning academy is staffed by people who are increasingly hostile to conservatives, then we can expect that their concepts will shift, via motivated scholarship, in ways that will help them and their allies (e.g., university administrators) to prosecute and condemn conservatives,
  • While Haslam and Haidt appear to have meaningfully different beliefs about why concept creep arose within academic psychology and spread throughout society, they were in sufficient agreement about its dangers to co-author a Guardian op-ed on the subject.
  • It focuses on how greater sensitivity to harm has affected college campuses.
  • “Of course young people need to be protected from some kinds of harm, but overprotection is harmful, too, for it causes fragility and hinders the development of resilience,” they wrote. “As Nassim Taleb pointed out in his book Antifragile, muscles need resistance to develop, bones need stress and shock to strengthen and the growing immune system needs to be exposed to pathogens in order to function. Similarly, he noted, children are by nature anti-fragile – they get stronger when they learn to recover from setbacks, failures and challenges to their cherished ideas.”
  • police officers fearing harm from dogs kill them by the hundreds or perhaps thousands every year in what the DOJ calls an epidemic.
  • After the terrorist attacks of September 11, 2001, the Bush Administration and many Americans grew increasingly sensitive to harms, real and imagined, from terrorism
  • Dick Cheney declared, “If there's a 1% chance that Pakistani scientists are helping al-Qaeda build or develop a nuclear weapon, we have to treat it as a certainty in terms of our response. It's not about our analysis ... It's about our response.” The invasion of Iraq was predicated, in part, on the idea that 9/11 “changed everything,”
  • Before 9/11, the notion of torturing prisoners was verboten. After the Bush Administration’s torture was made public, popular debate focused on mythical “ticking time bomb” scenarios, in which a whole city would be obliterated but for torture. Now Donald Trump suggests that torture should be used more generally against terrorists. Torture is, as well, an instance in which people within the field of psychology pushed concept creep in the direction of less sensitivity to harm,
  • Haslam endorses two theories
  • there are many reasons to be concerned about excessive sensitivity to harm:
Javier E

Think Less, Think Better - The New York Times - 1 views

  • the capacity for original and creative thinking is markedly stymied by stray thoughts, obsessive ruminations and other forms of “mental load.”
  • Many psychologists assume that the mind, left to its own devices, is inclined to follow a well-worn path of familiar associations. But our findings suggest that innovative thinking, not routine ideation, is our default cognitive mode when our minds are clear.
  • We found that a high mental load consistently diminished the originality and creativity of the response: Participants with seven digits to recall resorted to the most statistically common responses (e.g., white/black), whereas participants with two digits gave less typical, more varied pairings (e.g., white/cloud).
  • ...8 more annotations...
  • In another experiment, we found that longer response times were correlated with less diverse responses, ruling out the possibility that participants with low mental loads simply took more time to generate an interesting response.
  • it seems that with a high mental load, you need more time to generate even a conventional thought. These experiments suggest that the mind’s natural tendency is to explore and to favor novelty, but when occupied it looks for the most familiar and inevitably least interesting solution.
  • Much of our lives are spent somewhere between those extremes. There are functional benefits to both modes: If we were not exploratory, we would never have ventured out of the caves; if we did not exploit the certainty of the familiar, we would have taken too many risks and gone extinct. But there needs to be a healthy balance
  • In general, there is a tension in our brains between exploration and exploitation. When we are exploratory, we attend to things with a wide scope, curious and desiring to learn. Other times, we rely on, or “exploit,” what we already know, leaning on our expectations, trusting the comfort of a predictable environment
  • All these loads can consume mental capacity, leading to dull thought and anhedonia — a flattened ability to experience pleasure.
  • ancient meditative practice helps free the mind to have richer experiences of the present
  • your life leaves too much room for your mind to wander. As a result, only a small fraction of your mental capacity remains engaged in what is before it, and mind-wandering and ruminations become a tax on the quality of your life
  • Honing an ability to unburden the load on your mind, be it through meditation or some other practice, can bring with it a wonderfully magnified experience of the world — and, as our study suggests, of your own mind.
oliviaodon

Sorry, climate change deniers, but the global warming 'pause' still never happened | Sy... - 0 views

  • Another day, another series of ridiculous and incorrect claims about global warming getting far more air than they deserve. The latest comes from none other than David Rose, a man who has serially misunderstood climate change so consistently that if he told me the sun would rise tomorrow, I'd be more inclined to believe the Earth had stopped rotating. He writes articles for the Daily Mail —it would be an insult to the fish to wrap them in this tabloid — and he uses a lot of typical techniques wielded by deniers, including cherry picking and misdirection. While he doesn't always deny global warming is happening, he does think it's not as bad as scientists say. I'll also note he has claimed the world is cooling, too, despite all the evidence (and I do mean all of it). But if you deny what the overwhelming majority of climate scientists are telling you, then in my opinion that makes you a denier.
  • Rose is, as usual, grossly exaggerating the death of global warming.
  • First, the "pause" is a claim that global warming has stopped since 1998 or so. This claim was never really true. 1998 was an unusually warm year, so if you start your measurements there it doesn't look like temperatures have risen much. But if you go back farther in time, the upward trend is very obvious. You have to look at the trend, and not short-term fluctuations! (A small numerical sketch of this cherry-picking effect follows these annotations.)
  • ...1 more annotation...
  • This shows that there can sometimes be a disconnect between the honest research of scientists and the way the public perceives that research. It's not anyone's fault really; the scientists are using the best methods and practices they have to understand reality, but the public gets and processes their information differently (not in a worse way, just different). It reminds me of the trouble we get using the word "theory"; to a scientist it means an extremely well-tested and reliable idea, but to the public it means more like a "guess." Same word, different uses, and it can give someone the wrong idea when used in the wrong context.
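To make the start-year point concrete, here is a minimal, hypothetical Python sketch (not from the article): it builds synthetic temperature anomalies with a fixed warming rate, adds an extra-warm 1998, and compares the slope fitted from 1970 with the slope fitted from 1998. The warming rate, noise level, and the fitted_trend helper are all illustrative assumptions, not real climate data.

```python
# Minimal illustrative sketch (assumed, synthetic numbers - not real data):
# temperature anomalies with a steady warming trend, year-to-year noise, and
# an artificially warm 1998, to show how the chosen start year changes the
# fitted trend.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1970, 2016)
true_rate = 0.017  # assumed warming rate in degrees C per year (illustrative)
anomalies = true_rate * (years - years[0]) + rng.normal(0.0, 0.08, years.size)
anomalies[years == 1998] += 0.25  # mimic an unusually warm El Nino year

def fitted_trend(start_year):
    """Least-squares slope, in degrees C per decade, using data from start_year on."""
    mask = years >= start_year
    return 10 * np.polyfit(years[mask], anomalies[mask], 1)[0]

print(f"Trend fitted from 1970: {fitted_trend(1970):+.3f} C/decade")
print(f"Trend fitted from 1998: {fitted_trend(1998):+.3f} C/decade")
# Starting the fit at the warm outlier tends to drag the slope down, and the
# short window makes the estimate noisy, even though the underlying warming
# rate never changed.
```

The point is not the particular numbers but the method: a trend should be judged over the full record, not from a window that begins at a cherry-picked warm year.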
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification (nudging, the quantified self, and gamification) and good old-fashioned financial incentive manipulation are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • ...20 more annotations...
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real-time information about what they're doing, whom they're doing it with, and how they feel about their monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute for or enhance some of our motivational powers.
  • beyond the practical issues lies a constellation of central ethical concerns.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • it is antithetical to the ideal of "resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that a lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that can only come when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions, can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

Young Women Often Trendsetters in Vocal Patterns - NYTimes.com - 0 views

  • vocal trends associated with young women are often seen as markers of immaturity or even stupidity.
  • such thinking is outmoded. Girls and women in their teens and 20s deserve credit for pioneering vocal trends and popular slang, they say, adding that young women use these embellishments in much more sophisticated ways than people tend to realize.
  • they’re not just using them because they’re girls. They’re using them to achieve some kind of interactional and stylistic end.”
  • ...7 more annotations...
  • “The truth is this: Young women take linguistic features and use them as power tools for building relationships.”
  • women tend to be maybe half a generation ahead of males on average.”
  • Less clear is why. Some linguists suggest that women are more sensitive to social interactions and hence more likely to adopt subtle vocal cues. Others say women use language to assert their power in a culture that, at least in days gone by, asked them to be sedate and decorous. Another theory is that young women are simply given more leeway by society to speak flamboyantly.
  • Several studies have shown that uptalk can be used for any number of purposes, even to dominate a listener.
  • by far the most common uptalkers were fathers of young women. For them, it was “a way of showing themselves to be friendly and not asserting power in the situation,” she said.
  • So what does the use of vocal fry denote?
  • a natural result of women’s lowering their voices to sound more authoritative. It can also be used to communicate disinterest, something teenage girls are notoriously fond of doing.
anonymous

Young Women Often Trendsetters in Vocal Patterns - NYTimes.com - 0 views

  • Whether it be uptalk (pronouncing statements as if they were questions? Like this?), creating slang words like “bitchin’ ” and “ridic,” or the incessant use of “like” as a conversation filler, vocal trends associated with young women are often seen as markers of immaturity or even stupidity. Right? But linguists — many of whom once promoted theories consistent with that attitude — now say such thinking is outmoded. Girls and women in their teens and 20s deserve credit for pioneering vocal trends and popular slang, they say, adding that young women use these embellishments in much more sophisticated ways than people tend to realize. “A lot of these really flamboyant things you hear are cute, and girls are supposed to be cute,” said Penny Eckert, a professor of linguistics at Stanford University. “But they’re not just using them because they’re girls. They’re using them to achieve some kind of interactional and stylistic end.”
Javier E

George Packer: Is Amazon Bad for Books? : The New Yorker - 0 views

  • Amazon is a global superstore, like Walmart. It’s also a hardware manufacturer, like Apple, and a utility, like Con Edison, and a video distributor, like Netflix, and a book publisher, like Random House, and a production studio, like Paramount, and a literary magazine, like The Paris Review, and a grocery deliverer, like FreshDirect, and someday it might be a package service, like U.P.S. Its founder and chief executive, Jeff Bezos, also owns a major newspaper, the Washington Post. All these streams and tributaries make Amazon something radically new in the history of American business
  • Amazon is not just the “Everything Store,” to quote the title of Brad Stone’s rich chronicle of Bezos and his company; it’s more like the Everything. What remains constant is ambition, and the search for new things to be ambitious about.
  • It wasn’t a love of books that led him to start an online bookstore. “It was totally based on the property of books as a product,” Shel Kaphan, Bezos’s former deputy, says. Books are easy to ship and hard to break, and there was a major distribution warehouse in Oregon. Crucially, there are far too many books, in and out of print, to sell even a fraction of them at a physical store. The vast selection made possible by the Internet gave Amazon its initial advantage, and a wedge into selling everything else.
  • ...38 more annotations...
  • it’s impossible to know for sure, but, according to one publisher’s estimate, book sales in the U.S. now make up no more than seven per cent of the company’s roughly seventy-five billion dollars in annual revenue.
  • A monopoly is dangerous because it concentrates so much economic power, but in the book business the prospect of a single owner of both the means of production and the modes of distribution is especially worrisome: it would give Amazon more control over the exchange of ideas than any company in U.S. history.
  • “The key to understanding Amazon is the hiring process,” one former employee said. “You’re not hired to do a particular job—you’re hired to be an Amazonian. Lots of managers had to take the Myers-Briggs personality tests. Eighty per cent of them came in two or three similar categories, and Bezos is the same: introverted, detail-oriented, engineer-type personality. Not musicians, designers, salesmen. The vast majority fall within the same personality type—people who graduate at the top of their class at M.I.T. and have no idea what to say to a woman in a bar.”
  • According to Marcus, Amazon executives considered publishing people "antediluvian losers with rotary phones and inventory systems designed in 1968 and warehouses full of crap." Publishers kept no data on customers, making their bets on books a matter of instinct rather than metrics. They were full of inefficiencies, starting with overpriced Manhattan offices.
  • For a smaller house, Amazon’s total discount can go as high as sixty per cent, which cuts deeply into already slim profit margins. Because Amazon manages its inventory so well, it often buys books from small publishers with the understanding that it can’t return them, for an even deeper discount
  • According to one insider, around 2008—when the company was selling far more than books, and was making twenty billion dollars a year in revenue, more than the combined sales of all other American bookstores—Amazon began thinking of content as central to its business. Authors started to be considered among the company’s most important customers. By then, Amazon had lost much of the market in selling music and videos to Apple and Netflix, and its relations with publishers were deteriorating
  • In its drive for profitability, Amazon did not raise retail prices; it simply squeezed its suppliers harder, much as Walmart had done with manufacturers. Amazon demanded ever-larger co-op fees and better shipping terms; publishers knew that they would stop being favored by the site’s recommendation algorithms if they didn’t comply. Eventually, they all did.
  • Brad Stone describes one campaign to pressure the most vulnerable publishers for better terms: internally, it was known as the Gazelle Project, after Bezos suggested “that Amazon should approach these small publishers the way a cheetah would pursue a sickly gazelle.”
  • Without dropping co-op fees entirely, Amazon simplified its system: publishers were asked to hand over a percentage of their previous year's sales on the site, as "marketing development funds."
  • The figure keeps rising, though less for the giant pachyderms than for the sickly gazelles. According to the marketing executive, the larger houses, which used to pay two or three per cent of their net sales through Amazon, now relinquish five to seven per cent of gross sales, pushing Amazon’s percentage discount on books into the mid-fifties. Random House currently gives Amazon an effective discount of around fifty-three per cent.
  • In December, 1999, at the height of the dot-com mania, Time named Bezos its Person of the Year. “Amazon isn’t about technology or even commerce,” the breathless cover article announced. “Amazon is, like every other site on the Web, a content play.” Yet this was the moment, Marcus said, when “content” people were “on the way out.”
  • By 2010, Amazon controlled ninety per cent of the market in digital books—a dominance that almost no company, in any industry, could claim. Its prohibitively low prices warded off competition
  • In 2004, he set up a lab in Silicon Valley that would build Amazon’s first piece of consumer hardware: a device for reading digital books. According to Stone’s book, Bezos told the executive running the project, “Proceed as if your goal is to put everyone selling physical books out of a job.”
  • Lately, digital titles have levelled off at about thirty per cent of book sales.
  • The literary agent Andrew Wylie (whose firm represents me) says, “What Bezos wants is to drag the retail price down as low as he can get it—a dollar-ninety-nine, even ninety-nine cents. That’s the Apple play—‘What we want is traffic through our device, and we’ll do anything to get there.’ ” If customers grew used to paying just a few dollars for an e-book, how long before publishers would have to slash the cover price of all their titles?
  • As Apple and the publishers see it, the ruling ignored the context of the case: when the key events occurred, Amazon effectively had a monopoly in digital books and was selling them so cheaply that it resembled predatory pricing—a barrier to entry for potential competitors. Since then, Amazon’s share of the e-book market has dropped, levelling off at about sixty-five per cent, with the rest going largely to Apple and to Barnes & Noble, which sells the Nook e-reader. In other words, before the feds stepped in, the agency model introduced competition to the market
  • But the court’s decision reflected a trend in legal thinking among liberals and conservatives alike, going back to the seventies, that looks at antitrust cases from the perspective of consumers, not producers: what matters is lowering prices, even if that goal comes at the expense of competition. Barry Lynn, a market-policy expert at the New America Foundation, said, “It’s one of the main factors that’s led to massive consolidation.”
  • Publishers sometimes pass on this cost to authors, by redefining royalties as a percentage of the publisher’s receipts, not of the book’s list price. Recently, publishers say, Amazon began demanding an additional payment, amounting to approximately one per cent of net sales
  • brick-and-mortar retailers employ forty-seven people for every ten million dollars in revenue earned; Amazon employs fourteen.
  • Since the arrival of the Kindle, the tension between Amazon and the publishers has become an open battle. The conflict reflects not only business antagonism amid technological change but a division between the two coasts, with different cultural styles and a philosophical disagreement about what techies call “disruption.”
  • Bezos told Charlie Rose, “Amazon is not happening to bookselling. The future is happening to bookselling.”
  • In Grandinetti's view, the Kindle "has helped the book business make a more orderly transition to a mixed print and digital world than perhaps any other medium." Compared with people who work in music, movies, and newspapers, he said, authors are well positioned to thrive. The old print world of scarcity—with a limited number of publishers and editors selecting which manuscripts to publish, and a limited number of bookstores selecting which titles to carry—is yielding to a world of digital abundance. Grandinetti told me that, in these new circumstances, a publisher's job "is to build a megaphone."
  • it offers an extremely popular self-publishing platform. Authors become Amazon partners, earning up to seventy per cent in royalties, as opposed to the fifteen per cent that authors typically make on hardcovers. Bezos touts the biggest successes, such as Theresa Ragan, whose self-published thrillers and romances have been downloaded hundreds of thousands of times. But one survey found that half of all self-published authors make less than five hundred dollars a year.
  • The business term for all this clear-cutting is “disintermediation”: the elimination of the “gatekeepers,” as Bezos calls the professionals who get in the customer’s way. There’s a populist inflection to Amazon’s propaganda, an argument against élitist institutions and for “the democratization of the means of production”—a common line of thought in the West Coast tech world
  • “Book publishing is a very human business, and Amazon is driven by algorithms and scale,” Sargent told me. When a house gets behind a new book, “well over two hundred people are pushing your book all over the place, handing it to people, talking about it. A mass of humans, all in one place, generating tremendous energy—that’s the magic potion of publishing. . . . That’s pretty hard to replicate in Amazon’s publishing world, where they have hundreds of thousands of titles.”
  • By producing its own original work, Amazon can sell more devices and sign up more Prime members—a major source of revenue. While the company was building the
  • Like the publishing venture, Amazon Studios set out to make the old “gatekeepers”—in this case, Hollywood agents and executives—obsolete. “We let the data drive what to put in front of customers,” Carr told the Wall Street Journal. “We don’t have tastemakers deciding what our customers should read, listen to, and watch.”
  • book publishers have been consolidating for several decades, under the ownership of media conglomerates like News Corporation, which squeeze them for profits, or holding companies such as Rivergroup, which strip them to service debt. The effect of all this corporatization, as with the replacement of independent booksellers by superstores, has been to privilege the blockbuster.
  • The combination of ceaseless innovation and low-wage drudgery makes Amazon the epitome of a successful New Economy company. It’s hiring as fast as it can—nearly thirty thousand employees last year.
  • the long-term outlook is discouraging. This is partly because Americans don’t read as many books as they used to—they are too busy doing other things with their devices—but also because of the relentless downward pressure on prices that Amazon enforces.
  • The digital market is awash with millions of barely edited titles, most of it dreck
  • Amazon believes that its approach encourages ever more people to tell their stories to ever more people, and turns writers into entrepreneurs; the price per unit might be cheap, but the higher number of units sold, and the accompanying royalties, will make authors wealthier
  • In Friedman’s view, selling digital books at low prices will democratize reading: “What do you want as an author—to sell books to as few people as possible for as much as possible, or for as little as possible to as many readers as possible?”
  • The real talent, the people who are writers because they happen to be really good at writing—they aren’t going to be able to afford to do it.”
  • Seven-figure bidding wars still break out over potential blockbusters, even though these battles often turn out to be follies. The quest for publishing profits in an economy of scarcity drives the money toward a few big books. So does the gradual disappearance of book reviewers and knowledgeable booksellers, whose enthusiasm might have rescued a book from drowning in obscurity. When consumers are overwhelmed with choices, some experts argue, they all tend to buy the same well-known thing.
  • These trends point toward what the literary agent called “the rich getting richer, the poor getting poorer.” A few brand names at the top, a mass of unwashed titles down below, the middle hollowed out: the book business in the age of Amazon mirrors the widening inequality of the broader economy.
  • "If they did, in my opinion they would save the industry. They'd lose thirty per cent of their sales, but they would have an additional thirty per cent for every copy they sold, because they'd be selling directly to consumers. The industry thinks of itself as Procter & Gamble. What gave publishers the idea that this was some big goddam business? It's not—it's a tiny little business, selling to a bunch of odd people who read."
  • Bezos is right: gatekeepers are inherently élitist, and some of them have been weakened, in no small part, because of their complacency and short-term thinking. But gatekeepers are also barriers against the complete commercialization of ideas, allowing new talent the time to develop and learn to tell difficult truths. When the last gatekeeper but one is gone, will Amazon care whether a book is any good? ♦