
TOK Friends / Group items tagged: Set


sanderk

How Procrastination Affects Your Health - Thrive Global - 0 views

  • fine line between procrastination and being “pressure prompted.” If you’re like me and pressure prompted, you are someone who often does your best work when faced with a looming deadline. While being pressure prompted may entail a bit of procrastination, it is procrastination within acceptable limits. In other words, it is a set of conditions that offers just enough pressure to ensure you’re at the top of your game without devolving into chaos or, most importantly, impacting other members of your team by preventing them from delivering their best work in a timely manner.
  • Procrastination is a condition that has consequences on one’s mental and physical health and performance at school and in the workplace.
  • Piers Steel defines procrastination as “a self-regulatory failure leading to poor performance and reduced well-being.” Notably, Steel further emphasizes that procrastination is both common (80% to 90% of college-age students suffer from it at least some of the time) and something most people (95%) wish to overcome.
  • Steel even argues that procrastination may now be on the rise as people increasingly turn to the immediate gratification made possible by information technologies and specifically, social media platforms.
  • for a small percentage of people, procrastination isn’t just a temporary or occasional problem but rather something that comes to structure their lives and ultimately limit their potential.
  • In a 2008 study, Peter Gröpel & Piers Steel investigated predictors of procrastination in a large Internet-based study that included over 9,000 participants. Their results revealed two important findings. First, goal setting reduced procrastination; second, procrastination was strongly associated with a lack of energy.
  • While it is true that intrinsically motivated people may have an easier time getting into flow, anyone, even a chronic procrastinator, can cultivate flow. The first step is easy—it simply entails coming up with a clear goal.
  • The second step is to stop feeling ashamed about your procrastinating tendencies.
  •  
    This article is interesting because it argues that procrastination is not necessarily bad. In small amounts, procrastination can even be useful, because the pressure of a looming deadline pushes people to actually do their work. There is a point, however, where procrastination becomes a real problem. I also find it interesting that phones and computers have made procrastination more severe: they offer instant gratification, which leads to more procrastination. As the article says, if people set goals for themselves and are disciplined, they can overcome procrastination.
Javier E

Why the Past 10 Years of American Life Have Been Uniquely Stupid - The Atlantic - 0 views

  • Social scientists have identified at least three major forces that collectively bind together successful democracies: social capital (extensive social networks with high levels of trust), strong institutions, and shared stories.
  • Social media has weakened all three.
  • gradually, social-media users became more comfortable sharing intimate details of their lives with strangers and corporations. As I wrote in a 2019 Atlantic article with Tobias Rose-Stockwell, they became more adept at putting on performances and managing their personal brand—activities that might impress others but that do not deepen friendships in the way that a private phone conversation will.
  • the stage was set for the major transformation, which began in 2009: the intensification of viral dynamics.
  • Before 2009, Facebook had given users a simple timeline––a never-ending stream of content generated by their friends and connections, with the newest posts at the top and the oldest ones at the bottom
  • That began to change in 2009, when Facebook offered users a way to publicly “like” posts with the click of a button. That same year, Twitter introduced something even more powerful: the “Retweet” button, which allowed users to publicly endorse a post while also sharing it with all of their followers.
  • “Like” and “Share” buttons quickly became standard features of most other platforms.
  • Facebook developed algorithms to bring each user the content most likely to generate a “like” or some other interaction, eventually including the “share” as well.
  • Later research showed that posts that trigger emotions––especially anger at out-groups––are the most likely to be shared.
  • By 2013, social media had become a new game, with dynamics unlike those in 2008. If you were skillful or lucky, you might create a post that would “go viral” and make you “internet famous”
  • If you blundered, you could find yourself buried in hateful comments. Your posts rode to fame or ignominy based on the clicks of thousands of strangers, and you in turn contributed thousands of clicks to the game.
  • This new game encouraged dishonesty and mob dynamics: Users were guided not just by their true preferences but by their past experiences of reward and punishment,
  • As a social psychologist who studies emotion, morality, and politics, I saw this happening too. The newly tweaked platforms were almost perfectly designed to bring out our most moralistic and least reflective selves. The volume of outrage was shocking.
  • It was just this kind of twitchy and explosive spread of anger that James Madison had tried to protect us from as he was drafting the U.S. Constitution.
  • The Framers of the Constitution were excellent social psychologists. They knew that democracy had an Achilles’ heel because it depended on the collective judgment of the people, and democratic communities are subject to “the turbulency and weakness of unruly passions.”
  • The key to designing a sustainable republic, therefore, was to build in mechanisms to slow things down, cool passions, require compromise, and give leaders some insulation from the mania of the moment while still holding them accountable to the people periodically, on Election Day.
  • The tech companies that enhanced virality from 2009 to 2012 brought us deep into Madison’s nightmare.
  • a less quoted yet equally important insight, about democracy’s vulnerability to triviality.
  • Madison notes that people are so prone to factionalism that “where no substantial occasion presents itself, the most frivolous and fanciful distinctions have been sufficient to kindle their unfriendly passions and excite their most violent conflicts.”
  • Social media has both magnified and weaponized the frivolous.
  • It’s not just the waste of time and scarce attention that matters; it’s the continual chipping-away of trust.
  • a democracy depends on widely internalized acceptance of the legitimacy of rules, norms, and institutions.
  • when citizens lose trust in elected leaders, health authorities, the courts, the police, universities, and the integrity of elections, then every decision becomes contested; every election becomes a life-and-death struggle to save the country from the other side
  • The most recent Edelman Trust Barometer (an international measure of citizens’ trust in government, business, media, and nongovernmental organizations) showed stable and competent autocracies (China and the United Arab Emirates) at the top of the list, while contentious democracies such as the United States, the United Kingdom, Spain, and South Korea scored near the bottom (albeit above Russia).
  • The literature is complex—some studies show benefits, particularly in less developed democracies—but the review found that, on balance, social media amplifies political polarization; foments populism, especially right-wing populism; and is associated with the spread of misinformation.
  • When people lose trust in institutions, they lose trust in the stories told by those institutions. That’s particularly true of the institutions entrusted with the education of children.
  • Facebook and Twitter make it possible for parents to become outraged every day over a new snippet from their children’s history lessons––and math lessons and literature selections, and any new pedagogical shifts anywhere in the country
  • The motives of teachers and administrators come into question, and overreaching laws or curricular reforms sometimes follow, dumbing down education and reducing trust in it further.
  • young people educated in the post-Babel era are less likely to arrive at a coherent story of who we are as a people, and less likely to share any such story with those who attended different schools or who were educated in a different decade.
  • former CIA analyst Martin Gurri predicted these fracturing effects in his 2014 book, The Revolt of the Public. Gurri’s analysis focused on the authority-subverting effects of information’s exponential growth, beginning with the internet in the 1990s. Writing nearly a decade ago, Gurri could already see the power of social media as a universal solvent, breaking down bonds and weakening institutions everywhere it reached.
  • he notes a constructive feature of the pre-digital era: a single “mass audience,” all consuming the same content, as if they were all looking into the same gigantic mirror at the reflection of their own society.
  • The digital revolution has shattered that mirror, and now the public inhabits those broken pieces of glass. So the public isn’t one thing; it’s highly fragmented, and it’s basically mutually hostile
  • Facebook, Twitter, YouTube, and a few other large platforms unwittingly dissolved the mortar of trust, belief in institutions, and shared stories that had held a large and diverse secular democracy together.
  • I think we can date the fall of the tower to the years between 2011 (Gurri’s focal year of “nihilistic” protests) and 2015, a year marked by the “great awokening” on the left and the ascendancy of Donald Trump on the right.
  • Twitter can overpower all the newspapers in the country, and stories cannot be shared (or at least trusted) across more than a few adjacent fragments—so truth cannot achieve widespread adherence.
  • After Babel, nothing really means anything anymore––at least not in a way that is durable and on which people widely agree.
  • Politics After Babel
  • “Politics is the art of the possible,” the German statesman Otto von Bismarck said in 1867. In a post-Babel democracy, not much may be possible.
  • The ideological distance between the two parties began increasing faster in the 1990s. Fox News and the 1994 “Republican Revolution” converted the GOP into a more combative party.
  • So cross-party relationships were already strained before 2009. But the enhanced virality of social media thereafter made it more hazardous to be seen fraternizing with the enemy or even failing to attack the enemy with sufficient vigor.
  • What changed in the 2010s? Let’s revisit that Twitter engineer’s metaphor of handing a loaded gun to a 4-year-old. A mean tweet doesn’t kill anyone; it is an attempt to shame or punish someone publicly while broadcasting one’s own virtue, brilliance, or tribal loyalties. It’s more a dart than a bullet
  • from 2009 to 2012, Facebook and Twitter passed out roughly 1 billion dart guns globally. We’ve been shooting one another ever since.
  • The “devoted conservatives” comprised 6 percent of the U.S. population.
  • the warped “accountability” of social media has also brought injustice—and political dysfunction—in three ways.
  • First, the dart guns of social media give more power to trolls and provocateurs while silencing good citizens.
  • a small subset of people on social-media platforms are highly concerned with gaining status and are willing to use aggression to do so.
  • Across eight studies, Bor and Petersen found that being online did not make most people more aggressive or hostile; rather, it allowed a small number of aggressive people to attack a much larger set of victims. Even a small number of jerks were able to dominate discussion forums,
  • Additional research finds that women and Black people are harassed disproportionately, so the digital public square is less welcoming to their voices.
  • Second, the dart guns of social media give more power and voice to the political extremes while reducing the power and voice of the moderate majority.
  • The “Hidden Tribes” study, by the pro-democracy group More in Common, surveyed 8,000 Americans in 2017 and 2018 and identified seven groups that shared beliefs and behaviors.
  • Social media has given voice to some people who had little previously, and it has made it easier to hold powerful people accountable for their misdeeds
  • The group furthest to the left, the “progressive activists,” comprised 8 percent of the population. The progressive activists were by far the most prolific group on social media: 70 percent had shared political content over the previous year. The devoted conservatives followed, at 56 percent.
  • These two extreme groups are similar in surprising ways. They are the whitest and richest of the seven groups, which suggests that America is being torn apart by a battle between two subsets of the elite who are not representative of the broader society.
  • they are the two groups that show the greatest homogeneity in their moral and political attitudes.
  • likely a result of thought-policing on social media:
  • political extremists don’t just shoot darts at their enemies; they spend a lot of their ammunition targeting dissenters or nuanced thinkers on their own team.
  • Finally, by giving everyone a dart gun, social media deputizes everyone to administer justice with no due process. Platforms like Twitter devolve into the Wild West, with no accountability for vigilantes.
  • Enhanced-virality platforms thereby facilitate massive collective punishment for small or imagined offenses, with real-world consequences, including innocent people losing their jobs and being shamed into suicide
  • we don’t get justice and inclusion; we get a society that ignores context, proportionality, mercy, and truth.
  • Since the tower fell, debates of all kinds have grown more and more confused. The most pervasive obstacle to good thinking is confirmation bias, which refers to the human tendency to search only for evidence that confirms our preferred beliefs
  • search engines were supercharging confirmation bias, making it far easier for people to find evidence for absurd beliefs and conspiracy theorie
  • The most reliable cure for confirmation bias is interaction with people who don’t share your beliefs. They confront you with counterevidence and counterargument.
  • In his book The Constitution of Knowledge, Jonathan Rauch describes the historical breakthrough in which Western societies developed an “epistemic operating system”—that is, a set of institutions for generating knowledge from the interactions of biased and cognitively flawed individuals
  • English law developed the adversarial system so that biased advocates could present both sides of a case to an impartial jury.
  • Newspapers full of lies evolved into professional journalistic enterprises, with norms that required seeking out multiple sides of a story, followed by editorial review, followed by fact-checking.
  • Universities evolved from cloistered medieval institutions into research powerhouses, creating a structure in which scholars put forth evidence-backed claims with the knowledge that other scholars around the world would be motivated to gain prestige by finding contrary evidence.
  • Part of America’s greatness in the 20th century came from having developed the most capable, vibrant, and productive network of knowledge-producing institutions in all of human history
  • But this arrangement, Rauch notes, “is not self-maintaining; it relies on an array of sometimes delicate social settings and understandings, and those need to be understood, affirmed, and protected.”
  • This, I believe, is what happened to many of America’s key institutions in the mid-to-late 2010s. They got stupider en masse because social media instilled in their members a chronic fear of getting darted
  • it was so pervasive that it established new behavioral norms backed by new policies seemingly overnight
  • Participants in our key institutions began self-censoring to an unhealthy degree, holding back critiques of policies and ideas—even those presented in class by their students—that they believed to be ill-supported or wrong.
  • The stupefying process plays out differently on the right and the left because their activist wings subscribe to different narratives with different sacred values.
  • The “Hidden Tribes” study tells us that the “devoted conservatives” score highest on beliefs related to authoritarianism. They share a narrative in which America is eternally under threat from enemies outside and subversives within; they see life as a battle between patriots and traitors.
  • they are psychologically different from the larger group of “traditional conservatives” (19 percent of the population), who emphasize order, decorum, and slow rather than radical change.
  • The traditional punishment for treason is death, hence the battle cry on January 6: “Hang Mike Pence.”
  • Right-wing death threats, many delivered by anonymous accounts, are proving effective in cowing traditional conservatives
  • The wave of threats delivered to dissenting Republican members of Congress has similarly pushed many of the remaining moderates to quit or go silent, giving us a party ever more divorced from the conservative tradition, constitutional responsibility, and reality.
  • The stupidity on the right is most visible in the many conspiracy theories spreading across right-wing media and now into Congress.
  • The Democrats have also been hit hard by structural stupidity, though in a different way. In the Democratic Party, the struggle between the progressive wing and the more moderate factions is open and ongoing, and often the moderates win.
  • The problem is that the left controls the commanding heights of the culture: universities, news organizations, Hollywood, art museums, advertising, much of Silicon Valley, and the teachers’ unions and teaching colleges that shape K–12 education. And in many of those institutions, dissent has been stifled:
  • Liberals in the late 20th century shared a belief that the sociologist Christian Smith called the “liberal progress” narrative, in which America used to be horrifically unjust and repressive, but, thanks to the struggles of activists and heroes, has made (and continues to make) progress toward realizing the noble promise of its founding.
  • It is also the view of the “traditional liberals” in the “Hidden Tribes” study (11 percent of the population), who have strong humanitarian values, are older than average, and are largely the people leading America’s cultural and intellectual institutions.
  • when the newly viralized social-media platforms gave everyone a dart gun, it was younger progressive activists who did the most shooting, and they aimed a disproportionate number of their darts at these older liberal leaders.
  • Confused and fearful, the leaders rarely challenged the activists or their nonliberal narrative in which life at every institution is an eternal battle among identity groups over a zero-sum pie, and the people on top got there by oppressing the people on the bottom. This new narrative is rigidly egalitarian––focused on equality of outcomes, not of rights or opportunities. It is unconcerned with individual rights.
  • The universal charge against people who disagree with this narrative is not “traitor”; it is “racist,” “transphobe,” “Karen,” or some related scarlet letter marking the perpetrator as one who hates or harms a marginalized group.
  • The punishment that feels right for such crimes is not execution; it is public shaming and social death.
  • anyone on Twitter had already seen dozens of examples teaching the basic lesson: Don’t question your own side’s beliefs, policies, or actions. And when traditional liberals go silent, as so many did in the summer of 2020, the progressive activists’ more radical narrative takes over as the governing narrative of an organization.
  • This is why so many epistemic institutions seemed to “go woke” in rapid succession that year and the next, beginning with a wave of controversies and resignations at The New York Times and other newspapers, and continuing on to social-justice pronouncements by groups of doctors and medical associations
  • The problem is structural. Thanks to enhanced-virality social media, dissent is punished within many of our institutions, which means that bad ideas get elevated into official policy.
  • In a 2018 interview, Steve Bannon, the former adviser to Donald Trump, said that the way to deal with the media is “to flood the zone with shit.” He was describing the “firehose of falsehood” tactic pioneered by Russian disinformation programs to keep Americans confused, disoriented, and angry.
  • artificial intelligence is close to enabling the limitless spread of highly believable disinformation. The AI program GPT-3 is already so good that you can give it a topic and a tone and it will spit out as many essays as you like, typically with perfect grammar and a surprising level of coherence.
  • Renée DiResta, the research manager at the Stanford Internet Observatory, explained that spreading falsehoods—whether through text, images, or deep-fake videos—will quickly become inconceivably easy. (She co-wrote the essay with GPT-3.)
  • American factions won’t be the only ones using AI and social media to generate attack content; our adversaries will too.
  • In the 20th century, America’s shared identity as the country leading the fight to make the world safe for democracy was a strong force that helped keep the culture and the polity together.
  • In the 21st century, America’s tech companies have rewired the world and created products that now appear to be corrosive to democracy, obstacles to shared understanding, and destroyers of the modern tower.
  • What changes are needed?
  • I can suggest three categories of reforms––three goals that must be achieved if democracy is to remain viable in the post-Babel era.
  • We must harden democratic institutions so that they can withstand chronic anger and mistrust, reform social media so that it becomes less socially corrosive, and better prepare the next generation for democratic citizenship in this new age.
  • Harden Democratic Institutions
  • we must reform key institutions so that they can continue to function even if levels of anger, misinformation, and violence increase far above those we have today.
  • Reforms should reduce the outsize influence of angry extremists and make legislators more responsive to the average voter in their district.
  • One example of such a reform is to end closed party primaries, replacing them with a single, nonpartisan, open primary from which the top several candidates advance to a general election that also uses ranked-choice voting
  • A second way to harden democratic institutions is to reduce the power of either political party to game the system in its favor, for example by drawing its preferred electoral districts or selecting the officials who will supervise elections
  • These jobs should all be done in a nonpartisan way.
  • Reform Social Media
  • Social media’s empowerment of the far left, the far right, domestic trolls, and foreign agents is creating a system that looks less like democracy and more like rule by the most aggressive.
  • it is within our power to reduce social media’s ability to dissolve trust and foment structural stupidity. Reforms should limit the platforms’ amplification of the aggressive fringes while giving more voice to what More in Common calls “the exhausted majority.”
  • the main problem with social media is not that some people post fake or toxic stuff; it’s that fake and outrage-inducing content can now attain a level of reach and influence that was not possible before
  • Perhaps the biggest single change that would reduce the toxicity of existing platforms would be user verification as a precondition for gaining the algorithmic amplification that social media offers.
  • One of the first orders of business should be compelling the platforms to share their data and their algorithms with academic researchers.
  • Prepare the Next Generation
  • Childhood has become more tightly circumscribed in recent generations––with less opportunity for free, unstructured play; less unsupervised time outside; more time online. Whatever else the effects of these shifts, they have likely impeded the development of abilities needed for effective self-governance for many young adults
  • Depression makes people less likely to want to engage with new people, ideas, and experiences. Anxiety makes new things seem more threatening. As these conditions have risen and as the lessons on nuanced social behavior learned through free play have been delayed, tolerance for diverse viewpoints and the ability to work out disputes have diminished among many young people
  • Students did not just say that they disagreed with visiting speakers; some said that those lectures would be dangerous, emotionally devastating, a form of violence. Because rates of teen depression and anxiety have continued to rise into the 2020s, we should expect these views to continue in the generations to follow, and indeed to become more severe.
  • The most important change we can make to reduce the damaging effects of social media on children is to delay entry until they have passed through puberty.
  • The age should be raised to at least 16, and companies should be held responsible for enforcing it.
  • Let them out to play. Stop starving children of the experiences they most need to become good citizens: free play in mixed-age groups of children with minimal adult supervision
  • while social media has eroded the art of association throughout society, it may be leaving its deepest and most enduring marks on adolescents. A surge in rates of anxiety, depression, and self-harm among American teens began suddenly in the early 2010s. (The same thing happened to Canadian and British teens, at the same time.) The cause is not known, but the timing points to social media as a substantial contributor—the surge began just as the large majority of American teens became daily users of the major platforms.
  • What would it be like to live in Babel in the days after its destruction? We know. It is a time of confusion and loss. But it is also a time to reflect, listen, and build.
  • In recent years, Americans have started hundreds of groups and organizations dedicated to building trust and friendship across the political divide, including BridgeUSA, Braver Angels (on whose board I serve), and many others listed at BridgeAlliance.us. We cannot expect Congress and the tech companies to save us. We must change ourselves and our communities.
  • when we look away from our dysfunctional federal government, disconnect from social media, and talk with our neighbors directly, things seem more hopeful. Most Americans in the More in Common report are members of the “exhausted majority,” which is tired of the fighting and is willing to listen to the other side and compromise. Most Americans now see that social media is having a negative impact on the country, and are becoming more aware of its damaging effects on children.
kortanekev

How scientific is political science? | David Wearing | Opinion | The Guardian - 0 views

  • The prevailing view within the discipline is that scholars should set aside moral values and political concerns in favour of detached enquiry into the mechanics of how the political world functions.
  • But I have yet to be convinced by the idea that the study of politics can be apolitical and value-neutral. Our choice of research topics will inevitably reflect our own political and moral priorities, and the way in which that research is framed and conducted is bound to reflect assumptions which – whether held consciously, semi-consciously or unconsciously – remain of a moral and political nature.
  •  
    Good example of the way our biases affect our ability to set aside preconceived notions and beliefs, and our ability to objectively analyze things and conduct good science. This bias is especially prevalent in political "science," since almost all researchers go in with strong personal opinions. (Evie 12/7/16)
kushnerha

Is That Even a Thing? - The New York Times - 3 views

  • Speakers and writers of American English have recently taken to identifying a staggering and constantly changing array of trends, events, memes, products, lifestyle choices and phenomena of nearly every kind with a single label — a thing.
  • It would be easy to call this a curiosity of the language and leave it at that. Linguistic trends come and go.
  • One could, on the other hand, consider the use of “a thing” a symptom of an entire generation’s linguistic sloth, general inarticulateness and penchant for cutesy, empty, half-ironic formulations that create a self-satisfied barrier preventing any form of genuine engagement with the world around them.
  • My assumption is that language and experience mutually influence each other. Language not only captures experience, it conditions it. It sets expectations for experience and gives shape to it as it happens. What might register as inarticulateness can reflect a different way of understanding and experiencing the world.
  • The word “thing” has of course long played a versatile and generic role in our language, referring both to physical objects and abstract matters. “The thing is …” “Here’s the thing.” “The play’s the thing.” In these examples, “thing” denotes the matter at hand and functions as stage setting to emphasize an important point. One new thing about “a thing,” then, is the typical use of the indefinite article “a” to precede it. We talk about a thing because we are engaged in cataloging. The question is whether something counts as a thing. “A thing” is not just stage setting. Information is conveyed.
  • What information? One definition of “a thing” that suggests itself right away is “cultural phenomenon.” A new app, an item of celebrity gossip, the practices of a subculture. It seems likely that “a thing” comes from the phrase the coolest/newest/latest thing. But now, in a society where everything, even the past, is new — “new thing” verges on the redundant. If they weren’t new they wouldn’t be things.
  • Clearly, cultural phenomena have long existed and been called “fads,” “trends,” “rages” or have been designated by the category they belong to — “product,” “fashion,” “lifestyle,” etc. So why the application of this homogenizing general term to all of them? I think there are four main reasons.
  • First, the flood of content into the cultural sphere. That we are inundated is well known. Information besieges us in waves that thrash us against the shore until we retreat to the solid ground of work or sleep or exercise or actual human interaction, only to wade cautiously back into our smartphones. As we spend more and more time online, it becomes the content of our experience, and in this sense “things” have earned their name. “A thing” has become the basic unit of cultural ontology.
  • Second, the fragmentation of this sphere. The daily barrage of culture requires that we choose a sliver of the whole in order to keep up. Netflix genres like “Understated Romantic Road Trip Movies” make it clear that the individual is becoming his or her own niche market — the converse of the celebrity as brand. We are increasingly a society of brands attuning themselves to markets, and markets evaluating brands. The specificity of the market requires a wider range of content — of things — to satisfy it
  • Third, the closing gap between satire and the real thing. The absurd excess of things has reached a point where the ironic detachment needed to cope with them is increasingly built into the things themselves, their marketing and the language we use to talk about them. The designator “a thing” is thus almost always tinged with ironic detachment. It puts the thing at arm’s length. You can hardly say “a thing” without a wary glint in your eye.
  • Finally, the growing sense that these phenomena are all the same. As we step back from “things,” they recede into the distance and begin to blur together. We call them all by the same name because they are the same at bottom: All are pieces of the Internet. A thing is for the most part experienced through this medium and generated by it. Even if they arise outside it, things owe their existence as things to the Internet. Google is thus always the arbiter of the question, “Is that a real thing?”
  • “A thing,” then, corresponds to a real need we have, to catalog and group together the items of cultural experience, while keeping them at a sufficient distance so that we can at least feign unified consciousness in the face of a world gone to pieces.
Emily Freilich

In Fiery Protest, Italian Museum Sets Art Ablaze : NPR - 0 views

  • Manfredi's "art war" consists of setting works of art on fire to protest cuts to Italy's arts budget. He's pledged to incinerate two or three pieces of art each week from a museum collection housing about 1,000 exhibits.
  • The budgets of state-run museums, archaeological sites and libraries are among the hardest hit.
  • not just about funding, but also an appeal for moral help and attention from authorities.
  • "We want the institutions in Italy and around the world to understand that the culture is very important," he says. "And it's not possible when there is an economic problem in the world, [that] the first that the government destroys is the art."
  • Italian government spending on the arts has been slashed by some 76 percent over the past two years.
  • during the recession — when people don’t have money to buy gasoline — the number of visitors to museums and archaeological sites is actually growing. Resca looks to the Greek philosopher Aristotle to explain the phenomenon. “He said that during successful period[s], culture was an ornament,” Resca says. “In bad periods, culture is a big shelter.”
Javier E

Walmart's Visible Hand - NYTimes.com - 1 views

  • Conservatives — with the backing, I have to admit, of many economists — normally argue that the market for labor is like the market for anything else. The law of supply and demand, they say, determines the level of wages, and the invisible hand of the market will punish anyone who tries to defy this law.
  • Specifically, this view implies that any attempt to push up wages will either fail or have bad consequences. Setting a minimum wage, it’s claimed, will reduce employment and create a labor surplus, the same way attempts to put floors under the prices of agricultural commodities used to lead to butter mountains, wine lakes and so on
  • Pressuring employers to pay more, or encouraging workers to organize into unions, will have the same effect.
  • But labor economists have long questioned this view
  • the labor force — is people. And because workers are people, wages are not, in fact, like the price of butter, and how much workers are paid depends as much on social forces and political power as it does on simple supply and demand.
  • What’s the evidence? First, there is what actually happens when minimum wages are increased. Many states set minimum wages above the federal level, and we can look at what happens when a state raises its minimum while neighboring states do not.
  • the overwhelming conclusion from studying these natural experiments is that moderate increases in the minimum wage have little or no negative effect on employment.
  • Then there’s history. It turns out that the middle-class society we used to have didn’t evolve as a result of impersonal market forces — it was created by political action, and in a brief period of time
  • America was still a very unequal society in 1940, but by 1950 it had been transformed by a dramatic reduction in income disparities, which the economists Claudia Goldin and Robert Margo labeled the Great Compression.
  • How did that happen?
  • Part of the answer is direct government intervention, especially during World War II, when government wage-setting authority was used to narrow gaps between the best paid and the worst paid. Part of it, surely, was a sharp increase in unionization. Part of it was the full-employment economy of the war years, which created very strong demand for workers and empowered them to seek higher pay.
  • the Great Compression didn’t go away as soon as the war was over. Instead, full employment and pro-worker politics changed pay norms, and a strong middle class endured for more than a generation. Oh, and the decades after the war were also marked by unprecedented economic growth.
  • Walmart is under political pressure over wages so low that a substantial number of employees are on food stamps and Medicaid. Meanwhile, workers are gaining clout thanks to an improving labor market, reflected in increasing willingness to quit bad jobs.
  • its justification for the move echoes what critics of its low-wage policy have been saying for years: Paying workers better will lead to reduced turnover, better morale and higher productivity.
  • What this means, in turn, is that engineering a significant pay raise for tens of millions of Americans would almost surely be much easier than conventional wisdom suggests. Raise minimum wages by a substantial amount; make it easier for workers to organize, increasing their bargaining power; direct monetary and fiscal policy toward full employment, as opposed to keeping the economy depressed out of fear that we’ll suddenly turn into Weimar Germany. It’s not a hard list to implement — and if we did these things we could make major strides back toward the kind of society most of us want to live in.
  • The point is that extreme inequality and the falling fortunes of America’s workers are a choice, not a destiny imposed by the gods of the market. And we can change that choice if we want to.
demetriar

What Faces Can't Tell Us - NYTimes.com - 0 views

  • Research subjects were asked to look at photographs of facial expressions (smiling, scowling and so on) and match them to a limited set of emotion words (happiness, anger and so on) or to stories with phrases
  • In recent years, however, at my laboratory we began to worry that this research method was flawed. In particular, we suspected that by providing subjects with a preselected set of emotion words, these experiments had inadvertently “primed” the subjects — in effect, hinting at the answers — and thus skewed the results.
  • preliminary studies, some of which were later published in the journal Emotion, in which subjects were not given any clues and instead were asked to freely describe the emotion on a face (or to view two faces and answer yes or no as to whether they expressed the same emotion). The subjects’ performance plummeted.
  • If the emotional content of facial expressions were in fact universal, the Himba subjects would have sorted the photographs into six piles by expression, but they did not.
  • These findings strongly suggest that emotions are not universally recognized in facial expressions, challenging the theory, attributed to Charles Darwin, that facial movements might be evolved behaviors for expressing emotion.
  • The answer is that we don’t passively recognize emotions but actively perceive them, drawing heavily (if unwittingly) on a wide variety of contextual clues — a body position, a hand gesture, a vocalization, the social setting and so on.
Javier E

Specs that see right through you - tech - 05 July 2011 - New Scientist - 0 views

  • a number of "social X-ray specs" that are set to transform how we interact with each other. By sensing emotions that we would otherwise miss, these technologies can thwart disastrous social gaffes and help us understand each other better.
  • In conversation, we pantomime certain emotions that act as social lubricants. We unconsciously nod to signal that we are following the other person's train of thought, for example, or squint a bit to indicate that we are losing track. Many of these signals can be misinterpreted - sometimes because different cultures have their own specific signals.
  • In 2005, she enlisted Simon Baron-Cohen, also at Cambridge, to help her identify a set of more relevant emotional facial states. They settled on six: thinking, agreeing, concentrating, interested - and, of course, the confused and disagreeing expressions
  • More often, we fail to spot them altogether.
  • To create this lexicon, they hired actors to mime the expressions, then asked volunteers to describe their meaning, taking the majority response as the accurate one.
  • The camera tracks 24 "feature points" on your conversation partner's face, and software developed by Picard analyses their myriad micro-expressions, how often they appear and for how long. It then compares that data with its bank of known expressions (see diagram).
  • Eventually, she thinks the system could be incorporated into a pair of augmented-reality glasses, which would overlay computer graphics onto the scene in front of the wearer.
  • the average person only managed to interpret, correctly, 54 per cent of Baron-Cohen's expressions on real, non-acted faces. This suggested to them that most people - not just those with autism - could use some help sensing the mood of people they are talking to.
  • set up a company called Affectiva, based in Waltham, Massachusetts, which is selling their expression recognition software. Their customers include companies that, for example, want to measure how people feel about their adverts or movies.
  • it's hard to fool the machine for long
  • In addition to facial expressions, we radiate a panoply of involuntary "honest signals", a term identified by MIT Media Lab researcher Alex Pentland in the early 2000s to describe the social signals that we use to augment our language. They include body language such as gesture mirroring, and cues such as variations in the tone and pitch of the voice. We do respond to these cues, but often not consciously. If we were more aware of them in others and ourselves, then we would have a fuller picture of the social reality around us, and be able to react more deliberately.
  • develop a small electronic badge that hangs around the neck. Its audio sensors record how aggressive the wearer is being, the pitch, volume and clip of their voice, and other factors. They called it the "jerk-o-meter".
  • it helped people realise when they were being either obnoxious or unduly self-effacing.
  • By the end of the experiment, all the dots had gravitated towards more or less the same size and colour. Simply being able to see their role in a group made people behave differently, and caused the group dynamics to become more even. The entire group’s emotional intelligence had increased.
  • Some of our body's responses during a conversation are not designed for broadcast to another person - but it's possible to monitor those too. Your temperature and skin conductance can also reveal secrets about your emotional state, and Picard can tap them with a glove-like device called the Q Sensor. In response to stresses, good or bad, our skin becomes clammy, increasing its conductance, and the Q Sensor picks this up.
  • Physiological responses can now even be tracked remotely, in principle without your consent. Last year, Picard and one of her graduate students showed that it was possible to measure heart rate without any surface contact with the body. They used software linked to an ordinary webcam to read information about heart rate, blood pressure and skin temperature based on, among other things, colour changes in the subject's face
  • In Rio de Janeiro and São Paulo, police officers can decide whether someone is a criminal just by looking at them. Their glasses scan the features of a face, and match them against a database of criminal mugshots. A red light blinks if there’s a match.
  • Thad Starner at Georgia Institute of Technology in Atlanta wears a small device he has built that looks like a monocle. It can retrieve video, audio or text snippets of past conversations with people he has spoken with, and even provide real-time links between past chats and topics he is currently discussing.
  • The US military has built a radar-imaging device that can see through walls to capture 3D images of people and objects beyond.
Javier E

Liu Cixin's War of the Worlds | The New Yorker - 0 views

  • he briskly dismissed the idea that fiction could serve as commentary on history or on current affairs. “The whole point is to escape the real world!” he said.
  • Chinese tech entrepreneurs discuss the Hobbesian vision of the trilogy as a metaphor for cutthroat competition in the corporate world; other fans include Barack Obama, who met Liu in Beijing two years ago, and Mark Zuckerberg. Liu’s international career has become a source of national pride. In 2015, China’s then Vice-President, Li Yuanchao, invited Liu to Zhongnanhai—an off-limits complex of government accommodation sometimes compared to the Kremlin—to discuss the books and showed Liu his own copies, which were dense with highlights and annotations.
  • In China, one of his stories has been a set text in the gao kao—the notoriously competitive college-entrance exams that determine the fate of ten million pupils annually; another has appeared in the national seventh-grade-curriculum textbook. When a reporter recently challenged Liu to answer the middle-school questions about the “meaning” and the “central themes” of his story, he didn’t get a single one right. “I’m a writer,” he told me, with a shrug.
  • Liu’s tomes—they tend to be tomes—have been translated into more than twenty languages, and the trilogy has sold some eight million copies worldwide. He has won China’s highest honor for science-fiction writing, the Galaxy Award, nine times, and in 2015 he became the first Asian writer to win the Hugo Award, the most prestigious international science-fiction prize
  • “The Three-Body Problem” takes its title from an analytical problem in orbital mechanics which has to do with the unpredictable motion of three bodies under mutual gravitational pull. Reading an article about the problem, Liu thought, What if the three bodies were three suns? How would intelligent life on a planet in such a solar system develop? From there, a structure gradually took shape that almost resembles a planetary system, with characters orbiting the central conceit like moons. For better or worse, the characters exist to support the framework of the story rather than to live as individuals on the page.
  • Concepts that seemed abstract to others took on, for him, concrete forms; they were like things he could touch, inducing a “druglike euphoria.” Compared with ordinary literature, he came to feel, “the stories of science are far more magnificent, grand, involved, profound, thrilling, strange, terrifying, mysterious, and even emotional.”
  • Pragmatic choices like this one, or like the decision his grandparents made when their sons were conscripted, recur in his fiction—situations that present equally unconscionable choices on either side of a moral fulcrum
  • The great flourishing of science fiction in the West at the end of the nineteenth century occurred alongside unprecedented technological progress and the proliferation of the popular press—transformations that were fundamental to the development of the genre
  • Joel Martinsen, the translator of the second volume of Liu’s trilogy, sees the series as a continuation of this tradition. “It’s not hard to read parallels between the Trisolarans and imperialist designs on China, driven by hunger for resources and fear of being wiped out,” he told me. Even Liu, unwilling as he is to endorse comparisons between the plot and China’s current face-off with the U.S., did at one point let slip that “the relationship between politics and science fiction cannot be underestimated.”
  • Speculative fiction is the art of imagining alternative worlds, and the same political establishment that permits it to be used as propaganda for the existing regime is also likely to recognize its capacity to interrogate the legitimacy of the status quo.
  • Liu has been criticized for peopling his books with characters who seem like cardboard cutouts installed in magnificent dioramas. Liu readily admits to the charge. “I did not begin writing for love of literature,” he told me. “I did so for love of science.”
  • Liu believes that this trend signals a deeper shift in the Chinese mind-set—that technological advances have spurred a new excitement about the possibilities of cosmic exploration.
  • Liu’s imagination is dauntingly capacious, his narratives conceived on a scale that feels, at times, almost hallucinogenic. The time line of the trilogy spans 18,906,450 years, encompassing ancient Egypt, the Qin dynasty, the Byzantine Empire, the Cultural Revolution, the present, and a time eighteen million years in the future
  • The first book is set on Earth, though some of its scenes take place in virtual reality; by the end of the third book, the scope of the action is interstellar and annihilation unfolds across several dimensions. The London Review of Books has called the trilogy “one of the most ambitious works of science fiction ever written.”
  • Although physics furnishes the novels’ premises, it is politics that drives the plots. At every turn, the characters are forced to make brutal calculations in which moral absolutism is pitted against the greater good
  • In Liu’s fictional universe, idealism is fatal and kindness an exorbitant luxury. As one general says in the trilogy, “In a time of war, we can’t afford to be too scrupulous.” Indeed, it is usually when people do not play by the rules of Realpolitik that the most lives are lost.
  • “I know what you are thinking,” he told me with weary clarity. “What about individual liberty and freedom of governance?” He sighed, as if exhausted by a debate going on in his head. “But that’s not what Chinese people care about. For ordinary folks, it’s the cost of health care, real-estate prices, their children’s education. Not democracy.”
  • Liu closed his eyes for a long moment and then said quietly, “This is why I don’t like to talk about subjects like this. The truth is you don’t really—I mean, can’t truly—understand.”
  • Liu explained to me, the existing regime made the most sense for today’s China, because to change it would be to invite chaos. “If China were to transform into a democracy, it would be hell on earth,”
  • It was an opinion entirely consistent with his systems-level view of human societies, just as mine reflected a belief in democracy and individualism as principles to be upheld regardless of outcomes
  • “I cannot escape and leave behind reality, just like I cannot leave behind my shadow. Reality brands each of us with its indelible mark. Every era puts invisible shackles on those who have lived through it, and I can only dance in my chains.”
  • Chinese people of his generation were lucky, he said. The changes they had seen were so huge that they now inhabited a world entirely different from that of their childhood. “China is a futuristic country,” he said. “I realized that the world around me became more and more like science fiction, and this process is speeding up.”
  • “We have statues of a few martyrs, but we never—We don’t memorialize those, the individuals.” He took off his glasses and blinked, peering into the wide expanse of green and concrete. “This is how we Chinese have always been,” he said. “When something happens, it passes, and time buries the stories.”
Javier E

The Navy's USS Gabrielle Giffords and the Future of Work - The Atlantic - 0 views

  • Minimal manning—and with it, the replacement of specialized workers with problem-solving generalists—isn’t a particularly nautical concept. Indeed, it will sound familiar to anyone in an organization who’s been asked to “do more with less”—which, these days, seems to be just about everyone.
  • Ten years from now, the Deloitte consultant Erica Volini projects, 70 to 90 percent of workers will be in so-called hybrid jobs or superjobs—that is, positions combining tasks once performed by people in two or more traditional roles.
  • If you ask Laszlo Bock, Google’s former culture chief and now the head of the HR start-up Humu, what he looks for in a new hire, he’ll tell you “mental agility.”
  • ...40 more annotations...
  • “What companies are looking for,” says Mary Jo King, the president of the National Résumé Writers’ Association, “is someone who can be all, do all, and pivot on a dime to solve any problem.”
  • The phenomenon is sped by automation, which usurps routine tasks, leaving employees to handle the nonroutine and unanticipated—and the continued advance of which throws the skills employers value into flux
  • Or, for that matter, on the relevance of the question What do you want to be when you grow up?
  • By 2020, a 2016 World Economic Forum report predicted, “more than one-third of the desired core skill sets of most occupations” will not have been seen as crucial to the job when the report was published
  • I asked John Sullivan, a prominent Silicon Valley talent adviser, why should anyone take the time to master anything at all? “You shouldn’t!” he replied.
  • Minimal manning—and the evolution of the economy more generally—requires a different kind of worker, with not only different acquired skills but different inherent abilities
  • It has implications for the nature and utility of a college education, for the path of careers, for inequality and employability—even for the generational divide.
  • Then, in 2001, Donald Rumsfeld arrived at the Pentagon. The new secretary of defense carried with him a briefcase full of ideas from the corporate world: downsizing, reengineering, “transformational” technologies. Almost immediately, what had been an experimental concept became an article of faith
  • But once cadets got into actual command environments, which tend to be fluid and full of surprises, a different picture emerged. “Psychological hardiness”—a construct that includes, among other things, a willingness to explore “multiple possible response alternatives,” a tendency to “see all experience as interesting and meaningful,” and a strong sense of self-confidence—was a better predictor of leadership ability in officers after three years in the field.
  • Because there really is no such thing as multitasking—just a rapid switching of attention—I began to feel overstrained, put upon, and finally irked by the impossible set of concurrent demands. Shouldn’t someone be giving me a hand here? This, Hambrick explained, meant I was hitting the limits of working memory—basically, raw processing power—which is an important aspect of “fluid intelligence” and peaks in your early 20s. This is distinct from “crystallized intelligence”—the accumulated facts and know-how on your hard drive—which peaks in your 50s.
  • Others noticed the change but continued to devote equal attention to all four tasks. Their scores fell. This group, Hambrick found, was high in “conscientiousness”—a trait that’s normally an overwhelming predictor of positive job performance. We like conscientious people because they can be trusted to show up early, double-check the math, fill the gap in the presentation, and return your car gassed up even though the tank was nowhere near empty to begin with. What struck Hambrick as counterintuitive and interesting was that conscientiousness here seemed to correlate with poor performance.
  • he discovered another correlation in his test: The people who did best tended to score high on “openness to new experience”—a personality trait that is normally not a major job-performance predictor and that, in certain contexts, roughly translates to “distractibility.”
  • To borrow the management expert Peter Drucker’s formulation, people with this trait are less focused on doing things right, and more likely to wonder whether they’re doing the right things.
  • High in fluid intelligence, low in experience, not terribly conscientious, open to potential distraction—this is not the classic profile of a winning job candidate. But what if it is the profile of the winning job candidate of the future?
  • One concerns “grit”—a mind-set, much vaunted these days in educational and professional circles, that allows people to commit tenaciously to doing one thing well
  • These ideas are inherently appealing; they suggest that dedication can be more important than raw talent, that the dogged and conscientious will be rewarded in the end.
  • he studied West Point students and graduates.
  • Traditional measures such as SAT scores and high-school class rank “predicted leader performance in the stable, highly regulated environment of West Point” itself.
  • It would be supremely ironic if the advance of the knowledge economy had the effect of devaluing knowledge. But that’s what I heard, recurrently.
  • “Fluid, learning-intensive environments are going to require different traits than classical business environments,” I was told by Frida Polli, a co-founder of an AI-powered hiring platform called Pymetrics. “And they’re going to be things like ability to learn quickly from mistakes, use of trial and error, and comfort with ambiguity.”
  • “We’re starting to see a big shift,” says Guy Halfteck, a people-analytics expert. “Employers are looking less at what you know and more and more at your hidden potential” to learn new things
  • advice to employers? Stop hiring people based on their work experience. Because in these environments, expertise can become an obstacle.
  • “The Curse of Expertise.” The more we invest in building and embellishing a system of knowledge, they found, the more averse we become to unbuilding it.
  • All too often experts, like the mechanic in LePine’s garage, fail to inspect their knowledge structure for signs of decay. “It just didn’t occur to him,” LePine said, “that he was repeating the same mistake over and over.”
  • The devaluation of expertise opens up ample room for different sorts of mistakes—and sometimes creates a kind of helplessness.
  • Aboard littoral combat ships, the crew lacks the expertise to carry out some important tasks, and instead has to rely on civilian help
  • Meanwhile, the modular “plug and fight” configuration was not panning out as hoped. Converting a ship from sub-hunter to minesweeper or minesweeper to surface combatant, it turned out, was a logistical nightmare
  • So in 2016 the concept of interchangeability was scuttled for a “one ship, one mission” approach, in which the extra 20-plus sailors became permanent crew members
  • “As equipment breaks, [sailors] are required to fix it without any training,” a Defense Department Test and Evaluation employee told Congress. “Those are not my words. Those are the words of the sailors who were doing the best they could to try to accomplish the missions we gave them in testing.”
  • These results were, perhaps, predictable given the Navy’s initial, full-throttle approach to minimal manning—and are an object lesson on the dangers of embracing any radical concept without thinking hard enough about the downsides
  • a world in which mental agility and raw cognitive speed eclipse hard-won expertise is a world of greater exclusion: of older workers, slower learners, and the less socially adept.
  • if you keep going down this road, you end up with one really expensive ship with just a few people on it who are geniuses … That’s not a future we want to see, because you need a large enough crew to conduct multiple tasks in combat.
  • What does all this mean for those of us in the workforce, and those of us planning to enter it? It would be wrong to say that the 10,000-hours-of-deliberate-practice idea doesn’t hold up at all. In some situations, it clearly does.
  • A spinal surgery will not be performed by a brilliant dermatologist. A criminal-defense team will not be headed by a tax attorney. And in tech, the demand for specialized skills will continue to reward expertise handsomely.
  • But in many fields, the path to success isn’t so clear. The rules keep changing, which means that highly focused practice has a much lower return
  • In uncertain environments, Hambrick told me, “specialization is no longer the coin of the realm.”
  • It leaves us with lifelong learning,
  • I found myself the target of career suggestions. “You need to be a video guy, an audio guy!” the Silicon Valley talent adviser John Sullivan told me, alluding to the demise of print media
  • I found the prospect of starting over just plain exhausting. Building a professional identity takes a lot of resources—money, time, energy. After it’s built, we expect to reap gains from our investment, and—let’s be honest—even do a bit of coasting. Are we equipped to continually return to apprentice mode? Will this burn us out?
  • Everybody I met on the Giffords seemed to share that mentality. They regarded every minute on board—even during a routine transit back to port in San Diego Harbor—as a chance to learn something new.
Javier E

The Moral Instinct - The New York Times - 2 views

  • Today, a new field is using illusions to unmask a sixth sense, the moral sense. Moral intuitions are being drawn out of people in the lab, on Web sites and in brain scanners, and are being explained with tools from game theory, neuroscience and evolutionary biology.
  • The other hallmark is that people feel that those who commit immoral acts deserve to be punished
  • If morality is a mere trick of the brain, some may fear, our very grounds for being moral could be eroded. Yet as we shall see, the science of the moral sense can instead be seen as a way to strengthen those grounds, by clarifying what morality is and how it should steer our actions.
  • ...13 more annotations...
  • The starting point for appreciating that there is a distinctive part of our psychology for morality is seeing how moral judgments differ from other kinds of opinions we have on how people ought to behave.
  • Moralization is a psychological state that can be turned on and off like a switch, and when it is on, a distinctive mind-set commandeers our thinking. This is the mind-set that makes us deem actions immoral (“killing is wrong”), rather than merely disagreeable (“I hate brussels sprouts”), unfashionable (“bell-bottoms are out”) or imprudent (“don’t scratch mosquito bites”).
  • The first hallmark of moralization is that the rules it invokes are felt to be universal
  • Many of these moralizations, like the assault on smoking, may be understood as practical tactics to reduce some recently identified harm. But whether an activity flips our mental switches to the “moral” setting isn’t just a matter of how much harm it does
  • We all know what it feels like when the moralization switch flips inside us — the righteous glow, the burning dudgeon, the drive to recruit others to the cause.
  • The human moral sense turns out to be an organ of considerable complexity, with quirks that reflect its evolutionary history and its neurobiological foundations.
  • At the same time, many behaviors have been amoralized, switched from moral failings to lifestyle choices. They include divorce, illegitimacy, being a working mother, marijuana use and homosexuality.
  • This wave of amoralization has led the cultural right to lament that morality itself is under assault, as we see in the group that anointed itself the Moral Majority. In fact there seems to be a Law of Conservation of Moralization, so that as old behaviors are taken out of the moralized column, new ones are added to it.
  • Much of our recent social history, including the culture wars between liberals and conservatives, consists of the moralization or amoralization of particular kinds of behavior.
  • People don’t generally engage in moral reasoning, Haidt argues, but moral rationalization: they begin with the conclusion, coughed up by an unconscious emotion, and then work backward to a plausible justification.
  • When psychologists say “most people” they usually mean “most of the two dozen sophomores who filled out a questionnaire for beer money.” But in this case it means most of the 200,000 people from a hundred countries who shared their intuitions on a Web-based experiment conducted by the psychologists Fiery Cushman and Liane Young and the biologist Marc Hauser. A difference between the acceptability of switch-pulling and man-heaving, and an inability to justify the choice, was found in respondents from Europe, Asia and North and South America; among men and women, blacks and whites, teenagers and octogenarians, Hindus, Muslims, Buddhists, Christians, Jews and atheists; people with elementary-school educations and people with Ph.D.’s.
  • Joshua Greene, a philosopher and cognitive neuroscientist, suggests that evolution equipped people with a revulsion to manhandling an innocent person. This instinct, he suggests, tends to overwhelm any utilitarian calculus that would tot up the lives saved and lost
  • the findings corroborate Greene’s theory that our nonutilitarian intuitions come from the victory of an emotional impulse over a cost-benefit analysis.
Javier E

The Story Behind the SAT Overhaul - NYTimes.com - 2 views

  • “When you cover too many topics,” Coleman said, “the assessments designed to measure those standards are inevitably superficial.” He pointed to research showing that more students entering college weren’t prepared and were forced into “remediation programs from which they never escape.” In math, for example, if you examined data from top-performing countries, you found an approach that emphasized “far fewer topics, far deeper,” the opposite of the curriculums he found in the United States, which he described as “a mile wide and an inch deep.”
  • The lessons he brought with him from thinking about the Common Core were evident — that American education needed to be more focused and less superficial, and that it should be possible to test the success of the newly defined standards through an exam that reflected the material being taught in the classroom.
  • she and her team had extensive conversations with students, teachers, parents, counselors, admissions officers and college instructors, asking each group to tell them in detail what they wanted from the test. What they arrived at above all was that a test should reflect the most important skills that were imparted by the best teachers
  • ...12 more annotations...
  • for example, a good instructor would teach Martin Luther King Jr.’s “I Have a Dream” speech by encouraging a conversation that involved analyzing the text and identifying the evidence, both factual and rhetorical, that makes it persuasive. “The opposite of what we’d want is a classroom where a teacher might ask only: ‘What was the year the speech was given? Where was it given?’ ”
  • in the past, assembling the SAT focused on making sure the questions performed on technical grounds, meaning: Were they appropriately easy or difficult among a wide range of students, and were they free of bias when tested across ethnic, racial and religious subgroups? The goal was “maximizing differentiation” among kids, which meant finding items that were answered correctly by those students who were expected to get them right and incorrectly by the weaker students. A simple way of achieving this, Coleman said, was to test the kind of obscure vocabulary words for which the SAT was famous
  • In redesigning the test, the College Board shifted its emphasis. It prioritized content, measuring each question against a set of specifications that reflect the kind of reading and math that students would encounter in college and their work lives. Schmeiser and others then spent much of early last year watching students as they answered a set of 20 or so problems, discussing the questions with the students afterward. “The predictive validity is going to come out the same,” she said of the redesigned test. “But in the new test, we have much more control over the content and skills that are being measured.”
  • Evidence-based reading and writing, he said, will replace the current sections on reading and writing. It will use as its source materials pieces of writing — from science articles to historical documents to literature excerpts — which research suggests are important for educated Americans to know and understand deeply. “The Declaration of Independence, the Constitution, the Bill of Rights and the Federalist Papers,” Coleman said, “have managed to inspire an enduring great conversation about freedom, justice, human dignity in this country and the world” — therefore every SAT will contain a passage from either a founding document or from a text (like Lincoln’s Gettysburg Address) that is part of the “great global conversation” the founding documents inspired.
  • The idea is that the test will emphasize words students should be encountering, like “synthesis,” which can have several meanings depending on their context. Instead of encouraging students to memorize flashcards, the test should promote the idea that they must read widely throughout their high-school years.
  • The Barbara Jordan vocabulary question would have a follow-up — “How do you know your answer is correct?” — to which students would respond by identifying lines in the passage that supported their answer.
  • “No longer will it be good enough to focus on tricks and trying to eliminate answer choices. We are not interested in students just picking an answer, but justifying their answers.”
  • the essay portion of the test will also be reformulated so that it will always be the same, some version of: “As you read the passage in front of you, consider how the author uses evidence such as facts or examples; reasoning to develop ideas and to connect claims and evidence; and stylistic or persuasive elements to add power to the ideas expressed. Write an essay in which you explain how the author builds an argument to persuade an audience.”
  • The math section, too, will be predicated on research that shows that there are “a few areas of math that are a prerequisite for a wide range of college courses” and careers. Coleman conceded that some might treat the news that they were shifting away from more obscure math problems to these fewer fundamental skills as a dumbing down of the test, but he was adamant that this was not the case. He explained that there will be three areas of focus: problem solving and data analysis, which will include ratios and percentages and other mathematical reasoning used to solve problems in the real world; the “heart of algebra,” which will test how well students can work with linear equations (“a powerful set of tools that echo throughout many fields of study”); and what will be called the “passport to advanced math,” which will focus on the student’s familiarity with complex equations and their applications in science and social science.
  • “Sometimes in the past, there’s been a feeling that tests were measuring some sort of ineffable entity such as intelligence, whatever that might mean. Or ability, whatever that might mean. What this is is a clear message that good hard work is going to pay off and achievement is going to pay off. This is one of the most significant developments that I have seen in the 40-plus years that I’ve been working in admissions in higher education.”
  • The idea of creating a transparent test and then providing a free website that any student could use — not to learn gimmicks but to get a better grounding and additional practice in the core knowledge that would be tested — was appealing to Coleman.
  • (The College Board won’t pay Khan Academy.) They talked about a hypothetical test-prep experience in which students would log on to a personal dashboard, indicate that they wanted to prepare for the SAT and then work through a series of preliminary questions to demonstrate their initial skill level and identify the gaps in their knowledge. Khan said he could foresee a way to estimate the amount of time it would take to achieve certain benchmarks. “It might go something like, ‘O.K., we think you’ll be able to get to this level within the next month and this level within the next two months if you put in 30 minutes a day,’ ” he said. And he saw no reason the site couldn’t predict for anyone, anywhere the score he or she might hope to achieve with a commitment to a prescribed amount of work.
Javier E

What Romantic Regime Are You In? - The New York Times - 0 views

  • the American model involves too much calculation and gamesmanship
  • “The greatest problem with the Regime of Choice stems from its misconception of maturity as absolute self-sufficiency,” Aronson writes. “Attachment is infantilized. The desire for recognition is rendered as ‘neediness.’ Intimacy must never challenge ‘personal boundaries.’”
  • The dating market becomes a true market, where people carefully appraise each other, looking for red flags. The emphasis is on the prudential choice, selecting the right person who satisfies your desires.
  • ...9 more annotations...
  • But somehow as people pragmatically “select” each other, marriage as an institution has gone into crisis. Marriage rates have plummeted at every age level. Most children born to women under 30 are born outside of wedlock. The choice mind-set seems to be self-defeating.
  • see a different set of attitudes and presuppositions, which you might call a Regime of Covenants. A covenant is not a choice, but a life-altering promise and all the binding the promise entails.
  • The Regime of Covenants acknowledges the fact that we don’t really choose our most important attachments the way you choose a toaster. In the flux of life you meet some breathtakingly amazing people, usually in the swirl of complex circumstances. There is a sense of being blown around by currents more astounding than you can predict and control
  • When you are drawn together and make a pledge with a person, the swirl doesn’t end; it’s just that you’ll ride it together. In the Regime of Covenants, making the right one-time selection is less important than the ongoing action to serve the relationship.
  • The Covenant people tend to have a “we” consciousness. The good of the relationship itself comes first and the needs of the partner are second and the individual needs are third. The covenant only works if each partner, as best as possible, puts the other’s needs above his or her own, with the understanding that the other will reciprocate.
  • The underlying truth of a Covenantal Regime is that you have to close off choice if you want to get to the promised land. The people one sees in long, successful marriages have walked the stations of vulnerability. They’ve overthrown the proud ego and learned to be utterly dependent on the other.
  • You only do all this if you’ve set up a framework in which exit is not an easy option, in which you’re assured the other person’s love is not going away, and in which the only way to survive the crises is to go deeper into the relationship itself.
  • The final feature of a covenant is that the relationship is not just about itself; it serves some larger purpose. The obvious one in many cases is raising children. But the deeper one is transformation. People in such a covenant try to love the other in a way that brings out their loveliness.
  • The Covenant Regime is based on the idea that our current formula is a conspiracy to make people unhappy. Love is realistically a stronger force than self-interest. Detached calculation in such matters is self-strangulating. The deepest joy sneaks in the back door when you are surrendering to some sacred promise.
Javier E

The Problem With History Classes - The Atlantic - 3 views

  • The passion and urgency with which these battles are fought reflect the misguided way history is taught in schools. Currently, most students learn history as a set narrative—a process that reinforces the mistaken idea that the past can be synthesized into a single, standardized chronicle of several hundred pages. This teaching pretends that there is a uniform collective story, which is akin to saying everyone remembers events the same.
  • Yet, history is anything but agreeable. It is not a collection of facts deemed to be "official" by scholars on high. It is a collection of historians exchanging different, often conflicting analyses.
  • rather than vainly seeking to transcend the inevitable clash of memories, American students would be better served by descending into the bog of conflict and learning the many "histories" that compose the American national story.
  • ...18 more annotations...
  • Perhaps Fisher offers the nation an opportunity to divorce, once and for all, memory from history. History may be an attempt to memorialize and preserve the past, but it is not memory; memories can serve as primary sources, but they do not stand alone as history. A history is essentially a collection of memories, analyzed and reduced into meaningful conclusions—but that collection depends on the memories chosen.
  • Memories make for a risky foundation: As events recede further into the past, the facts are distorted or augmented by entirely new details
  • people construct unique memories while informing perfectly valid histories. Just as there is a plurality of memories, so, too, is there a plurality of histories.
  • Scholars who read a diverse set of historians who are all focused on the same specific period or event are engaging in historiography
  • This approach exposes textbooks as nothing more than a compilation of histories that the authors deemed to be most relevant and useful.
  • In historiography, the barrier between historian and student is dropped, exposing a conflict-ridden landscape. A diplomatic historian approaches an event from the perspective of the most influential statesmen (who are most often white males), analyzing the context, motives, and consequences of their decisions. A cultural historian peels back the objects, sights, and sounds of a period to uncover humanity’s underlying emotions and anxieties. A Marxist historian adopts the lens of class conflict to explain the progression of events. There are intellectual historians, social historians, and gender historians, among many others. Historians studying the same topic will draw different interpretations—sometimes radically so, depending on the sources they draw from
  • Jacoba Urist points out that history is "about explaining and interpreting past events analytically." If students are really to learn and master these analytical tools, then it is absolutely essential that they read a diverse set of historians and learn how brilliant men and women who are scrutinizing the same topic can reach different conclusions
  • Rather than constructing a curriculum based on the muddled consensus of boards, legislatures, and think tanks, schools should teach students history through historiography. The shortcomings of one historian become apparent after reading the work of another one on the list.
  • Although, as Urist notes, the AP course is "designed to teach students to think like historians," my own experience in that class suggests that it fails to achieve that goal.
  • The course’s framework has always served as an outline of important concepts aiming to allow educators flexibility in how to teach; it makes no reference to historiographical conflicts. Historiography was an epiphany for me because I had never before come face-to-face with how historians think and reason
  • When I took AP U.S. History, I jumbled these diverse histories into one indistinct narrative. Although the test involved open-ended essay questions, I was taught that graders were looking for a firm thesis—forcing students to adopt a side. The AP test also, unsurprisingly, rewards students who cite a wealth of supporting details
  • By the time I took the test in 2009, I was a master at "checking boxes," weighing political factors equally against those involving socioeconomics and ensuring that previously neglected populations like women and ethnic minorities received their due. I did not know that I was pulling ideas from different historiographical traditions. I still subscribed to the idea of a prevailing national narrative and served as an unwitting sponsor of synthesis, oblivious to the academic battles that made such synthesis impossible.
  • Although there may be an inclination to seek to establish order where there is chaos, that urge must be resisted in teaching history. Public controversies over memory are hardly new. Students must be prepared to confront divisiveness, not conditioned to shoehorn agreement into situations where none is possible
  • When conflict is accepted rather than resisted, it becomes possible for different conceptions of American history to co-exist. There is no longer a need to appoint a victor.
  • More importantly, the historiographical approach avoids pursuing truth for the sake of satisfying a national myth
  • The country’s founding fathers crafted some of the finest expressions of personal liberty and representative government the world has ever seen; many of them also held fellow humans in bondage. This paradox is only a problem if the goal is to view the founding fathers as faultless, perfect individuals. If multiple histories are embraced, no one needs to fear that one history will be lost.
  • History is not indoctrination. It is a wrestling match. For too long, the emphasis has been on pinning the opponent. It is time to shift the focus to the struggle itself
  • There is no better way to use the past to inform the present than by accepting the impossibility of a definitive history—and by ensuring that current students are equipped to grapple with the contested memories in their midst.
kushnerha

BBC - Future - Will emoji become a new language? - 2 views

  • Emoji are now used in around half of every sentence on sites like Instagram, and Facebook looks set to introduce them alongside the famous “like” button as a way of expressing your reaction to a post.
  • If you were to believe the headlines, this is just the tipping point: some outlets have claimed that emoji are an emerging language that could soon compete with English in global usage. To many, this would be an exciting evolution of the way we communicate; to others, it is linguistic Armageddon.
  • Do emoji show the same characteristics of other communicative systems and actual languages? And what do they help us to express that words alone can’t say? When emoji appear with text, they often supplement or enhance the writing. This is similar to gestures that appear along with speech. Over the past three decades, research has shown that our hands provide important information that often transcends and clarifies the message in speech. Emoji serve this function too – for instance, adding a kissy or winking face can disambiguate whether a statement is flirtatiously teasing or just plain mean.
  • ...17 more annotations...
  • This is a key point about language use: rarely is natural language ever limited to speech alone. When we are speaking, we constantly use gestures to illustrate what we mean. For this reason, linguists say that language is “multi-modal”. Writing takes away that extra non-verbal information, but emoji may allow us to re-incorporate it into our text.
  • Emoji are not always used as embellishments, however – sometimes, strings of the characters can themselves convey meaning in a longer sequence on their own. But to constitute their own language, they would need a key component: grammar.
  • A grammatical system is a set of constraints that governs how the meaning of an utterance is packaged in a coherent way. Natural language grammars have certain traits that distinguish them. For one, they have individual units that play different roles in the sequence – like nouns and verbs in a sentence. Also, grammar is different from meaning
  • When emoji are isolated, they are primarily governed by simple rules related to meaning alone, without these more complex rules. For instance, according to research by Tyler Schnoebelen, people often create strings of emoji that share a common meaning
  • This sequence has little internal structure; even when it is rearranged, it still conveys the same message. These images are connected solely by their broader meaning. We might consider them to be a visual list: “here are all things related to celebrations and birthdays.” Lists are certainly a conventionalised way of communicating, but they don’t have grammar the way that sentences do.
  • What if the order did matter though? What if they conveyed a temporal sequence of events? Consider this example, which means something like “a woman had a party where they drank, and then opened presents and then had cake”:
  • In all cases, the doer of the action (the agent) precedes the action. In fact, this pattern is commonly found in both full languages and simple communication systems. For example, the majority of the world’s languages place the subject before the verb of a sentence.
  • These rules may seem like the seeds of grammar, but psycholinguist Susan Goldin-Meadow and colleagues have found this order appears in many other systems that would not be considered a language. For example, this order appears when people arrange pictures to describe events from an animated cartoon, or when speaking adults communicate using only gestures. It also appears in the gesture systems created by deaf children who cannot hear spoken languages and are not exposed to sign languages.
  • The children described lack exposure to a language and thus invent their own manual systems to communicate, called “homesigns”. These systems are limited in the size of their vocabularies and the types of sequences they can create. For this reason, the agent-act order seems to arise not from a grammar, but from basic heuristics – practical workarounds – based on meaning alone. Emoji seem to tap into this same system.
  • Nevertheless, some may argue that despite emoji’s current simplicity, this may be the groundwork for emerging complexity – that although emoji do not constitute a language at the present time, they could develop into one over time.
  • Could an emerging “emoji visual language” be developing in a similar way, with actual grammatical structure? To answer that question, you need to consider the intrinsic constraints on the technology itself.Emoji are created by typing into a computer like text. But, unlike text, most emoji are provided as whole units, except for the limited set of emoticons which convert to emoji, like :) or ;). When writing text, we use the building blocks (letters) to create the units (words), not by searching through a list of every whole word in the language.
  • emoji force us to convey information in a linear unit-unit string, which limits how complex expressions can be made. These constraints may mean that they will never be able to achieve even the most basic complexity that we can create with normal and natural drawings.
  • What’s more, these limits also prevent users from creating novel signs – a requisite for all languages, especially emerging ones. Users have no control over the development of the vocabulary. As the “vocab list” for emoji grows, it will become increasingly unwieldy: using them will require a conscious search process through an external list, not an easy generation from our own mental vocabulary, like the way we naturally speak or draw. This is a key point – it means that emoji lack the flexibility needed to create a new language.
  • we already have very robust visual languages, as can be seen in comics and graphic novels. As I argue in my book, The Visual Language of Comics, the drawings found in comics use a systematic visual vocabulary (such as stink lines to represent smell, or stars to represent dizziness). Importantly, the available vocabulary is not constrained by technology and has developed naturally over time, like spoken and written languages.
  • grammar of sequential images is more of a narrative structure – not of nouns and verbs. Yet, these sequences use principles of combination like any other grammar, including roles played by images, groupings of images, and hierarchic embedding.
  • measured participants’ brainwaves while they viewed sequences one image at a time where a disruption appeared either within the groupings of panels or at the natural break between groupings. The particular brainwave responses that we observed were similar to those that experimenters find when violating the syntax of sentences. That is, the brain responds the same way to violations of “grammar”, whether in sentences or sequential narrative images.
  • I would hypothesise that emoji can use a basic narrative structure to organise short stories (likely made up of agent-action sequences), but I highly doubt that they would be able to create embedded clauses like these. I would also doubt that you would see the same kinds of brain responses that we saw with the comic strip sequences.
Javier E

How to Fight the Man - NYTimes.com - 0 views

  • This seems to be a moment when many people — in religion, economics and politics — are disgusted by current institutions, but then they are vague about what sorts of institutions should replace them. This seems to be a moment of fervent protest movements that are ultimately vague and ineffectual.
  • We can all theorize why the intense desire for change has so far produced relatively few coherent recipes for change.
  • My own theory revolves around a single bad idea. For generations people have been told: Think for yourself; come up with your own independent worldview. Unless your name is Nietzsche, that’s probably a bad idea. Very few people have the genius or time to come up with a comprehensive and rigorous worldview.
  • ...5 more annotations...
  • The paradox of reform movements is that, if you want to defy authority, you probably shouldn’t think entirely for yourself. You should attach yourself to a counter-tradition and school of thought that has been developed over the centuries and that seems true.
  • The old leftists had dialectical materialism and the Marxist view of history. Libertarians have Hayek and von Mises. Various spiritual movements have drawn from Transcendentalism, Stoicism, Gnosticism, Thomism, Augustine, Tolstoy, or the Catholic social teaching that inspired Dorothy Day.
  • These belief systems helped people envision alternate realities. They helped people explain why the things society values are not the things that should be valued. They gave movements a set of organizing principles. Joining a tradition doesn’t mean suppressing your individuality. Applying an ancient tradition to a new situation is a creative, stimulating and empowering act. Without a tradition, everything is impermanence and flux.
  • If I could offer advice to a young rebel, it would be to rummage the past for a body of thought that helps you understand and address the shortcomings you see. Give yourself a label.
  • Effective rebellion isn’t just expressing your personal feelings. It means replacing one set of authorities and institutions with a better set of authorities and institutions.
  •  
    An intellectual tradition: a neglected but essential ingredient of social change.
Javier E

Accelerationism: how a fringe philosophy predicted the future we live in | World news |... - 1 views

  • Roger Zelazny published his third novel. In many ways, Lord of Light was of its time, shaggy with imported Hindu mythology and cosmic dialogue. Yet there were also glints of something more forward-looking and political.
  • accelerationism has gradually solidified from a fictional device into an actual intellectual movement: a new way of thinking about the contemporary world and its potential.
  • Accelerationists argue that technology, particularly computer technology, and capitalism, particularly the most aggressive, global variety, should be massively sped up and intensified – either because this is the best way forward for humanity, or because there is no alternative.
  • ...31 more annotations...
  • Accelerationists favour automation. They favour the further merging of the digital and the human. They often favour the deregulation of business, and drastically scaled-back government. They believe that people should stop deluding themselves that economic and technological progress can be controlled.
  • Accelerationism, therefore, goes against conservatism, traditional socialism, social democracy, environmentalism, protectionism, populism, nationalism, localism and all the other ideologies that have sought to moderate or reverse the already hugely disruptive, seemingly runaway pace of change in the modern world
  • Robin Mackay and Armen Avanessian in their introduction to #Accelerate: The Accelerationist Reader, a sometimes baffling, sometimes exhilarating book, published in 2014, which remains the only proper guide to the movement in existence.
  • “We all live in an operating system set up by the accelerating triad of war, capitalism and emergent AI,” says Steve Goodman, a British accelerationist
  • A century ago, the writers and artists of the Italian futurist movement fell in love with the machines of the industrial era and their apparent ability to invigorate society. Many futurists followed this fascination into war-mongering and fascism.
  • One of the central figures of accelerationism is the British philosopher Nick Land, who taught at Warwick University in the 1990s
  • Land has published prolifically on the internet, not always under his own name, about the supposed obsolescence of western democracy; he has also written approvingly about “human biodiversity” and “capitalistic human sorting” – the pseudoscientific idea, currently popular on the far right, that different races “naturally” fare differently in the modern world; and about the supposedly inevitable “disintegration of the human species” when artificial intelligence improves sufficiently.
  • In our politically febrile times, the impatient, intemperate, possibly revolutionary ideas of accelerationism feel relevant, or at least intriguing, as never before. Noys says: “Accelerationists always seem to have an answer. If capitalism is going fast, they say it needs to go faster. If capitalism hits a bump in the road, and slows down” – as it has since the 2008 financial crisis – “they say it needs to be kickstarted.”
  • On alt-right blogs, Land in particular has become a name to conjure with. Commenters have excitedly noted the connections between some of his ideas and the thinking of both the libertarian Silicon Valley billionaire Peter Thiel and Trump’s iconoclastic strategist Steve Bannon.
  • “In Silicon Valley,” says Fred Turner, a leading historian of America’s digital industries, “accelerationism is part of a whole movement which is saying, we don’t need [conventional] politics any more, we can get rid of ‘left’ and ‘right’, if we just get technology right. Accelerationism also fits with how electronic devices are marketed – the promise that, finally, they will help us leave the material world, all the mess of the physical, far behind.”
  • In 1972, the philosopher Gilles Deleuze and the psychoanalyst Félix Guattari published Anti-Oedipus. It was a restless, sprawling, appealingly ambiguous book, which suggested that, rather than simply oppose capitalism, the left should acknowledge its ability to liberate as well as oppress people, and should seek to strengthen these anarchic tendencies, “to go still further … in the movement of the market … to ‘accelerate the process’”.
  • By the early 90s Land had distilled his reading, which included Deleuze and Guattari and Lyotard, into a set of ideas and a writing style that, to his students at least, were visionary and thrillingly dangerous. Land wrote in 1992 that capitalism had never been properly unleashed, but instead had always been held back by politics, “the last great sentimental indulgence of mankind”. He dismissed Europe as a sclerotic, increasingly marginal place, “the racial trash-can of Asia”. And he saw civilisation everywhere accelerating towards an apocalypse: “Disorder must increase... Any [human] organisation is ... a mere ... detour in the inexorable death-flow.”
  • With the internet becoming part of everyday life for the first time, and capitalism seemingly triumphant after the collapse of communism in 1989, a belief that the future would be almost entirely shaped by computers and globalisation – the accelerated “movement of the market” that Deleuze and Guattari had called for two decades earlier – spread across British and American academia and politics during the 90s. The Warwick accelerationists were in the vanguard.
  • In the US, confident, rainbow-coloured magazines such as Wired promoted what became known as “the Californian ideology”: the optimistic claim that human potential would be unlocked everywhere by digital technology. In Britain, this optimism influenced New Labour
  • The Warwick accelerationists saw themselves as participants, not traditional academic observers
  • The CCRU gang formed reading groups and set up conferences and journals. They squeezed into the narrow CCRU room in the philosophy department and gave each other impromptu seminars.
  • The main result of the CCRU’s frantic, promiscuous research was a conveyor belt of cryptic articles, crammed with invented terms, sometimes speculative to the point of being fiction.
  • At Warwick, however, the prophecies were darker. “One of our motives,” says Plant, “was precisely to undermine the cheery utopianism of the 90s, much of which seemed very conservative” – an old-fashioned male desire for salvation through gadgets, in her view.
  • K-punk was written by Mark Fisher, formerly of the CCRU. The blog retained some Warwick traits, such as quoting reverently from Deleuze and Guattari, but it gradually shed the CCRU’s aggressive rhetoric and pro-capitalist politics for a more forgiving, more left-leaning take on modernity. Fisher increasingly felt that capitalism was a disappointment to accelerationists, with its cautious, entrenched corporations and endless cycles of essentially the same products. But he was also impatient with the left, which he thought was ignoring new technology
  • Nick Srnicek and Alex Williams co-wrote a Manifesto for an Accelerationist Politics. “Capitalism has begun to constrain the productive forces of technology,” they wrote. “[Our version of] accelerationism is the basic belief that these capacities can and should be let loose … repurposed towards common ends … towards an alternative modernity.”
  • What that “alternative modernity” might be was barely, but seductively, sketched out, with fleeting references to reduced working hours, to technology being used to reduce social conflict rather than exacerbate it, and to humanity moving “beyond the limitations of the earth and our own immediate bodily forms”. On politics and philosophy blogs from Britain to the US and Italy, the notion spread that Srnicek and Williams had founded a new political philosophy: “left accelerationism”.
  • Two years later, in 2015, they expanded the manifesto into a slightly more concrete book, Inventing the Future. It argued for an economy based as far as possible on automation, with the jobs, working hours and wages lost replaced by a universal basic income. The book attracted more attention than a speculative leftwing work had for years, with interest and praise from intellectually curious leftists
  • Even the thinking of the arch-accelerationist Nick Land, who is 55 now, may be slowing down. Since 2013, he has become a guru for the US-based far-right movement neoreaction, or NRx as it often calls itself. Neoreactionaries believe in the replacement of modern nation-states, democracy and government bureaucracies by authoritarian city states, which on neoreaction blogs sound as much like idealised medieval kingdoms as they do modern enclaves such as Singapore.
  • Land argues now that neoreaction, like Trump and Brexit, is something that accelerationists should support, in order to hasten the end of the status quo.
  • In 1970, the American writer Alvin Toffler, an exponent of accelerationism’s more playful intellectual cousin, futurology, published Future Shock, a book about the possibilities and dangers of new technology. Toffler predicted the imminent arrival of artificial intelligence, cryonics, cloning and robots working behind airline check-in desks
  • Land left Britain. He moved to Taiwan “early in the new millennium”, he told me, then to Shanghai “a couple of years later”. He still lives there now.
  • In a 2004 article for the Shanghai Star, an English-language paper, he described the modern Chinese fusion of Marxism and capitalism as “the greatest political engine of social and economic development the world has ever known”
  • Once he lived there, Land told me, he realised that “to a massive degree” China was already an accelerationist society: fixated by the future and changing at speed. Presented with the sweeping projects of the Chinese state, his previous, libertarian contempt for the capabilities of governments fell away
  • Without a dynamic capitalism to feed off, as Deleuze and Guattari had in the early 70s, and the Warwick philosophers had in the 90s, it may be that accelerationism just races up blind alleys. In his 2014 book about the movement, Malign Velocities, Benjamin Noys accuses it of offering “false” solutions to current technological and economic dilemmas. With accelerationism, he writes, a breakthrough to a better future is “always promised and always just out of reach”.
  • “The pace of change accelerates,” concluded a documentary version of the book, with a slightly hammy voiceover by Orson Welles. “We are living through one of the greatest revolutions in history – the birth of a new civilisation.”
  • Shortly afterwards, the 1973 oil crisis struck. World capitalism did not accelerate again for almost a decade. For much of the “new civilisation” Toffler promised, we are still waiting
Javier E

FaceApp helped a middle-aged man become a popular younger woman. His fan base has never... - 1 views

  • Soya’s fame illustrated a simple truth: that social media is less a reflection of who we are, and more a performance of who we want to be.
  • It also seemed to herald a darker future where our fundamental senses of reality are under siege: The AI that allows anyone to fabricate a face can also be used to harass women with “deepfake” pornography, invent fraudulent LinkedIn personas and digitally impersonate political enemies.
  • As the photos began receiving hundreds of likes, Soya’s personality and style began to come through. She was relentlessly upbeat. She never sneered or bickered or trolled. She explored small towns, savored scenic vistas, celebrated roadside restaurants’ simple meals.
  • ...25 more annotations...
  • She took pride in the basic things, like cleaning engine parts. And she only hinted at the truth: When one fan told her in October, “It’s great to be young,” Soya replied, “Youth does not mean a certain period of life, but how to hold your heart.”
  • She seemed, well, happy, and FaceApp had made her that way. Creating the lifelike impostor had taken only a few taps: He changed the “Gender” setting to “Female,” the “Age” setting to “Teen,” and the “Impression” setting — a mix of makeup filters — to a glamorous look the app calls “Hollywood.”
  • Soya pouted and scowled on rare occasions when Nakajima himself felt frustrated. But her baseline expression was an extra-wide smile, activated with a single tap.
  • Nakajima grew his shimmering hair below his shoulders and raided his local convenience store for beauty supplies he thought would make the FaceApp images more convincing: blushes, eyeliners, concealers, shampoos.
  • “When I compare how I feel when I started to tweet as a woman and now, I do feel that I’m gradually gravitating toward this persona … this fantasy world that I created,” Nakajima said. “When I see photos of what I tweeted, I feel like, ‘Oh. That’s me.’ ”
  • The sensation Nakajima was feeling is so common that there’s a term for it: the Proteus effect, named for the shape-shifting Greek god. Stanford University researchers first coined it in 2007 to describe how people inhabiting the body of a digital avatar began to act the part
  • People made to appear taller in virtual-reality simulations acted more assertively, even after the experience ended. Prettier characters began to flirt.
  • What is it about online disguises? Why are they so good at bending people’s sense of self-perception?
  • they tap into this “very human impulse to play with identity and pretend to be someone you’re not.”
  • Users in the Internet’s early days rarely had any presumptions of authenticity, said Melanie C. Green, a University of Buffalo professor who studies technology and social trust. Most people assumed everyone else was playing a character clearly distinguished from their real life.
  • “This identity play was considered one of the huge advantages of being online,” Green said. “You could switch your gender and try on all of these different personas. It was a playground for people to explore.”
  • It wasn’t until the rise of giant social networks like Facebook — which used real identities to, among other things, supercharge targeted advertising — that this big game of pretend gained an air of duplicity. Spaces for playful performance shrank, and the biggest Internet watering holes began demanding proof of authenticity as a way to block out malicious intent.
  • The Web’s big shift from text to visuals — the rise of photo-sharing apps, live streams and video calls — seemed at first to make that unspoken rule of real identities concrete. It seemed too difficult to fake one’s appearance when everyone’s face was on constant display.
  • Now, researchers argue, advances in image-editing artificial intelligence have done for the modern Internet what online pseudonyms did for the world’s first chat rooms. Facial filters have allowed anyone to mold themselves into the character they want to play.
  • researchers fear these augmented reality tools could end up distorting the beauty standards and expectations of actual reality.
  • Some political and tech theorists worry this new world of synthetic media threatens to detonate our concept of truth, eroding our shared experiences and infusing every online relationship with suspicion and self-doubt.
  • Deceptive political memes, conspiracy theories, anti-vaccine hoaxes and other scams have torn the fabric of our democracy, culture and public health.
  • But she also thinks about her kids, who assume “that everything online is fabricated,” and wonders whether the rules of online identity require a bit more nuance — and whether that generational shift is already underway.
  • “Bots pretending to be people, automated representations of humanity — that, they perceive as exploitative,” she said. “But if it’s just someone engaging in identity experimentation, they’re like: ‘Yeah, that’s what we’re all doing.’”
  • To their generation, “authenticity is not about: ‘Does your profile picture match your real face?’ Authenticity is: ‘Is your voice your voice?’”
  • “Their feeling is: ‘The ideas are mine. The voice is mine. The content is mine. I’m just looking for you to receive it without all the assumptions and baggage that comes with it.’ That’s the essence of a person’s identity. That’s who they really are.”
  • But wasn’t this all just a big con? Nakajima had tricked people with a “cool girl” stereotype to boost his Twitter numbers. He hadn’t elevated the role of women in motorcycling; if anything, he’d supplanted them. And the character he’d created was paper thin: Soya had no internal complexity outside of what Nakajima had projected, just that eternally superimposed smile.
  • Perhaps he should have accepted his irrelevance and faded into the digital sunset, sharing his life for few to see. But some of Soya’s followers have said they never felt deceived: It was Nakajima — his enthusiasm, his attitude about life — they’d been charmed by all along. “His personality,” as one Twitter follower said, “shined through.”
  • In Nakajima’s mind, he’d used the tools of a superficial medium to craft genuine connections. He had not felt real until he had become noticed for being fake.
  • Nakajima said he doesn’t know how long he’ll keep Soya alive. But he said he’s grateful for the way she helped him feel: carefree, adventurous, seen.
tongoscar

US still out front in tech race, China experts say in response to Pentagon claim | Sout... - 0 views

  • these technologies had not only military applications but were also critical for long-term economic prosperity, making them important to the future of US-China
  • China exhibited hypersonic missiles and drones at last month’s National Day parade, and has just launched a commercial 5G – fifth generation mobile network – service on Friday, which is the biggest in the world.
  • Despite breakthroughs in certain fields like 5G, there was more generally a clear gap between China’s digital information and electronics technologies and the world’s technological leaders, according to Beijing-based naval expert Li Jie.
  • ...5 more annotations...
  • “Imagine what the world would look like if China was setting standards,” he said. “Over time, that means we have fewer levers to shape what the US wants to do, both from a global technology standpoint and also what are the values that are highlighted around the world as ones to be looked up to.”
  • Chinese experts have rejected the claim by a senior Pentagon official that the US is lagging behind China in some key dual-use technologies.
  • But, Brown warned, for China to set the pace for these technologies would be “game-changing”.
  • For the past 50 to 80 years, the US had led the way and set the standards in almost all important technologies and industries,
  • Huawei, China’s telecommunication giant has won contracts to construct the 5G infrastructures for many countries, despite the US campaign to ban Huawei equipment over security concerns.
manhefnawi

Trailblazing Scottish Mountaineer and Poet Nan Shepherd on the Transcendent Rewards of ... - 0 views

  • To place one foot in front of the other in a steady rhythm is to allow self and world to cohere, to set the mind itself into motion. We walk for different reasons and to different ends — for Thoreau, every walk was “a sort of crusade”; for artist Maira Kalman, it is “the glory of life.” “Nature’s particular gift to the walker,” Kenneth Grahame wrote in his splendid 1913 manifesto for walking as creative fuel, “is to set the mind jogging, to make it garrulous, exalted, a little mad maybe — certainly creative and suprasensitive.”