History Readings / Group items tagged hyperpartisan


Javier E

The Real Story About Fake News Is Partisanship - The New York Times - 0 views

  • Partisan bias now operates more like racism than mere political disagreement, academic research on the subject shows
  • Americans’ deep bias against the political party they oppose is so strong that it acts as a kind of partisan prism for facts, refracting a different reality to Republicans than to Democrats.
  • the repercussions go far beyond stories shared on Facebook and Reddit, affecting Americans’ faith in government — and the government’s ability to function.
  • ...24 more annotations...
  • until a few decades ago, people’s feelings about their party and the opposing party were not too different. But starting in the 1980s, Americans began to report increasingly negative opinions of their opposing party.
  • Not only did party identity turn out to affect people’s behavior and decision making broadly, even on apolitical subjects, but according to their data it also had more influence on the way Americans behaved than race did.
  • “Partisanship, for a long period of time, wasn’t viewed as part of who we are,” he said. “It wasn’t core to our identity. It was just an ancillary trait. But in the modern era we view party identity as something akin to gender, ethnicity or race — the core traits that we use to describe ourselves to others.”
  • That has made the personal political. “Politics has become so important that people select relationships on that basis,”
  • it has become quite rare for Democrats to marry Republicans,
  • a 2009 survey of married couples found that only 9 percent consisted of Democrat-Republican pairs
  • And it has become more rare for children to have a different party affiliation from their parents.
  • it has also made the political personal. Today, political parties are no longer just the people who are supposed to govern the way you want. They are a team to support, and a tribe to feel a part of
  • And the public’s view of politics is becoming more and more zero-sum: It’s about helping their team win, and making sure the other team loses.
  • Partisan tribalism makes people more inclined to seek out and believe stories that justify their pre-existing partisan biases, whether or not they are true.
  • “There are many, many decades of research on communication on the importance of source credibility,
  • “You want to show that you’re a good member of your tribe,” Mr. Westwood said. “You want to show others that Republicans are bad or Democrats are bad, and your tribe is good. Social media provides a unique opportunity to publicly declare to the world what your beliefs are
  • Partisan bias fuels fake news because people of all partisan stripes are generally quite bad at figuring out what news stories to believe. Instead, they use trust as a shortcut. Rather than evaluate a story directly, people look to see if someone credible believes it, and rely on that person’s judgment to fill in the gaps in their knowledge.
  • Sharing those stories on social media is a way to show public support for one’s partisan team — roughly the equivalent of painting your face with team colors on game day.
  • They found that participants gave more money if they were told the other player supported the same political party as they did.
  • Partisanship’s influence on trust means that when there is a partisan divide among experts, Mr. Sides said, “you get people believing wildly different sets of facts.”
  • the bigger concern is that the natural consequence of this growing national divide will be a feedback loop in which the public’s bias encourages extremism among politicians, undermining public faith in government institutions and their ability to function.
  • “This is an incentive for Republicans and Democrats in Congress to behave in a hyperpartisan manner in order to excite their base.”
  • That feeds partisan bias among the public by reinforcing the idea that the opposition is made up of bad or dangerous people, which then creates more demand for political extremism.
  • The result is an environment in which compromise and collaboration with the opposing party are seen as signs of weakness, and of being a bad member of the tribe.
  • “It’s a vicious cycle,” Mr. Iyengar said. “All of this is going to make policy-making and fact-finding more problematic.”
  • Now, “you have quite a few people who are willing to call into question an institution that for centuries has been sacrosanct,”
  • “The consequences of that are insane,” he said, “and potentially devastating to the norms of democracy.”
  • “I don’t think things are going to get better in the short term; I don’t think they’re going to get better in the long term. I think this is the new normal.”
Javier E

Inside Facebook's (Totally Insane, Unintentionally Gigantic, Hyperpartisan) Political-M... - 1 views

  • According to the company, its site is used by more than 200 million people in the United States each month, out of a total population of 320 million. A 2016 Pew study found that 44 percent of Americans read or watch news on Facebook.
  • we can know, based on these facts alone, that Facebook is hosting a huge portion of the political conversation in America.
  • Using a tool called CrowdTangle, which tracks engagement for Facebook pages across the network, you can see which pages are most shared, liked and commented on, and which pages dominate the conversation around election topics.
  • ...22 more annotations...
  • Individually, these pages have meaningful audiences, but cumulatively, their audience is gigantic: tens of millions of people. On Facebook, they rival the reach of their better-funded counterparts in the political media, whether corporate giants like CNN or The New York Times, or openly ideological web operations like Breitbart or Mic.
  • these new publishers are happy to live inside the world that Facebook has created. Their pages are accommodated but not actively courted by the company and are not a major part of its public messaging about media. But they are, perhaps, the purest expression of Facebook’s design and of the incentives coded into its algorithm — a system that has already reshaped the web and has now inherited, for better or for worse, a great deal of America’s political discourse.
  • In 2010, Facebook released widgets that publishers could embed on their sites, reminding readers to share, and these tools were widely deployed. By late 2012, when Facebook passed a billion users, referrals from the social network were sending visitors to publishers’ websites at rates sometimes comparable to Google, the web’s previous de facto distribution hub. Publishers took note of what worked on Facebook and adjusted accordingly.
  • While web publishers have struggled to figure out how to take advantage of Facebook’s audience, these pages have thrived. Unburdened of any allegiance to old forms of news media and the practice, or performance, of any sort of ideological balance, native Facebook page publishers have a freedom that more traditional publishers don’t: to engage with Facebook purely on its terms.
  • Rafael Rivero is an acquaintance of Provost’s who, with his twin brother, Omar, runs a page called Occupy Democrats, which passed three million followers in June. Rivero, like nearly every left-leaning page operator I spoke with, attributes this accelerating growth not just to interest in the election but especially to one campaign in particular: “Bernie Sanders is the Facebook candidate,
  • Now that the nomination contest is over, Rivero has turned to making anti-Trump content. A post from earlier this month got straight to the point: “Donald Trump is unqualified, unstable and unfit to lead. Share if you agree!” More than 40,000 people did. “It’s like a meme war,” Rivero says, “and politics is being won and lost on social media.”
  • truly Facebook-native political pages have begun to create and refine a new approach to political news: cherry-picking and reconstituting the most effective tactics and tropes from activism, advocacy and journalism into a potent new mixture. This strange new class of media organization slots seamlessly into the news feed and is especially notable in what it asks, or doesn’t ask, of its readers. The point is not to get them to click on more stories or to engage further with a brand. The point is to get them to share the post that’s right in front of them. Everything else is secondary.
  • All have eventually run up against the same reality: A company that can claim nearly every internet-using adult as a user is less a partner than a context — a self-contained marketplace to which you have been granted access but which functions according to rules and incentives that you cannot control.
  • For media companies, the ability to reach an audience is fundamentally altered, made greater in some ways and in others more challenging. For a dedicated Facebook user, a vast array of sources, spanning multiple media and industries, is now processed through the same interface and sorting mechanism, alongside updates from friends, family, brands and celebrities.
  • The flood of visitors aligned with two core goals of most media companies: to reach people and to make money. But as Facebook’s growth continued, its influence was intensified by broader trends in internet use, primarily the use of smartphones, on which Facebook became more deeply enmeshed with users’ daily routines. Soon, it became clear that Facebook wasn’t just a source of readership; it was, increasingly, where readers lived.
  • It is a framework built around personal connections and sharing, where value is both expressed and conferred through the concept of engagement. Of course, engagement, in one form or another, is what media businesses have always sought, and provocation has always sold news. But now the incentives are literalized in buttons and written into software.
  • Each day, according to Facebook’s analytics, posts from the Make America Great page are seen by 600,000 to 1.7 million people. In July, articles posted to the page, which has about 450,000 followers, were shared, commented on or liked more than four million times, edging out, for example, the Facebook page of USA Today
  • Nicoloff’s business model is not dissimilar from the way most publishers use Facebook: build a big following, post links to articles on an outside website covered in ads and then hope the math works out in your favor. For many, it doesn’t: Content is expensive, traffic is unpredictable and website ads are both cheap and alienating to readers.
  • In July, visitors arriving to Nicoloff’s website produced a little more than $30,000 in revenue. His costs, he said, total around $8,000, partly split between website hosting fees and advertising buys on Facebook itself.
  • of course, there’s the content, which, at a few dozen posts a day, Nicoloff is far too busy to produce himself. “I have two people in the Philippines who post for me,” Nicoloff said, “a husband-and-wife combo.” From 9 a.m. Eastern time to midnight, the contractors scour the internet for viral political stories, many explicitly pro-Trump. If something seems to be going viral elsewhere, it is copied to their site and promoted with an urgent headline.
  • In the end, Nicoloff takes home what he jokingly described as a “doctor’s salary” — in a good month, more than $20,000.
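The economics described in the annotations above can be sanity-checked with simple arithmetic. A minimal sketch, using only the July figures quoted in the passage (the exact revenue and cost numbers are approximations reported by the article, not new data):

```python
# Illustrative sanity check of the page economics described in the passage.
# Figures come from the article's reporting for July.
revenue = 30_000  # "a little more than $30,000" from site visitors
costs = 8_000     # "around $8,000": website hosting fees plus Facebook ad buys

profit = revenue - costs
print(f"Estimated monthly take-home: ${profit:,}")  # → Estimated monthly take-home: $22,000
```

The result, about $22,000, is consistent with the "more than $20,000 in a good month" figure Nicoloff gives.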
  • In their angry, cascading comment threads, Make America Great’s followers express no such ambivalence. Nearly every page operator I spoke to was astonished by the tone their commenters took, comparing them to things like torch-wielding mobs and sharks in a feeding frenzy
  • A dozen or so of the sites are published in-house, but posts from the company’s small team of writers are free to be shared among the entire network. The deal for a would-be Liberty Alliance member is this: You bring the name and the audience, and the company will build you a prefab site, furnish it with ads, help you fill it with content and keep a cut of the revenue. Coca told me the company brought in $12 million in revenue last year.
  • Because the pages are run independently, the editorial product is varied. But it is almost universally tuned to the cadences and styles that seem to work best on partisan Facebook. It also tracks closely to conservative Facebook media’s big narratives, which, in turn, track with the Trump campaign’s messaging: Hillary Clinton is a crook and possibly mentally unfit; ISIS is winning; Black Lives Matter is the real racist movement; Donald Trump alone can save us; the system — all of it — is rigged.
  • It’s an environment that’s at best indifferent and at worst hostile to traditional media brands; but for this new breed of page operator, it’s mostly upside. In front of largely hidden and utterly sympathetic audiences, incredible narratives can take shape, before emerging, mostly formed, into the national discourse.
  • How much of what happens on the platform is a reflection of a political mood and widely held beliefs, simply captured in a new medium, and how much of it might be created, or intensified, by the environment it provides? What is Facebook doing to our politics?
  • for the page operators, the question is irrelevant to the task at hand. Facebook’s primacy is a foregone conclusion, and the question of Facebook’s relationship to political discourse is absurd — they’re one and the same. As Rafael Rivero put it to me, “Facebook is where it’s all happening.”
Javier E

The Resilience Of Republican Christianism - 0 views

  • I tried to sketch out the essence of an actual conservative sensibility and politics: one of skepticism, limited government and an acceptance of human imperfection.
  • My point was that this conservative tradition had been lost in America, in so far as it had ever been found, because it had been hijacked by religious and political fundamentalism
  • I saw the fundamentalist psyche — rigid, abstract, authoritarian — as integral to the GOP in the Bush years and beyond, a phenomenon that, if sustained, would render liberal democracy practically moribund. It was less about the policy details, which change over time, than an entire worldview.
  • ...26 more annotations...
  • the intellectual right effectively dismissed the book
  • Here is David Brooks, echoing the conservative consensus in 2006:
  • As any number of historians, sociologists and pollsters can tell you, the evangelical Protestants who now exercise a major influence on the Republican Party are an infinitely diverse and contradictory group, and their relationship to these hyperpartisans is extremely ambivalent.
  • The idea that members of the religious right form an “infinitely diverse and contradictory group” and were in no way “hyperpartisan” is now clearly absurd. Christianism, in fact, turned out to be the central pillar of Trump’s success, with white evangelicals giving unprecedented and near-universal support — 84 percent — to a shameless, disgusting pagan, because and only because he swore to smite their enemies.
  • The fusion of Trump and Christianism is an unveiling of a sort — proof of principle that, in its core, Christianism is not religious but political, a reactionary cult susceptible to authoritarian preachers
  • Christianism is to the American right what critical theory is to the American left: a reductionist, totalizing creed that “others” half the country, and deeply misreads the genius of the American project.
  • Christianism starts, as critical theory does, by attacking the core of the Founding: in particular, its Enlightenment defense of universal reason, and its revolutionary removal of religion from the state.
  • Mike Johnson’s guru, pseudo-historian David Barton, claims that the Founders were just like evangelicals today, and intended the government at all levels to enforce “Christian values” — primarily, it seems, with respect to the private lives of others. As Pete Wehner notes, “If you listen to Johnson speak on the ‘so-called separation of Church and state’ and claim that ‘the Founders wanted to protect the church from an encroaching state, not the other way around,’ you will hear echoes of Barton.”
  • Christianism is a way to think about politics without actually thinking. Johnson expressed this beautifully last week: “I am a Bible-believing Christian. Someone asked me today in the media, they said, ‘It’s curious, people are curious: What does Mike Johnson think about any issue under the sun?’ I said, ‘Well, go pick up a Bible off your shelf and read it. That’s my worldview.’”
  • this tells us nothing, of course. The Bible demands interpretation in almost every sentence and almost every word; it contains universes of moral thought and thesauri of ambiguous words in a different ancient language; it has no clear blueprint for contemporary American politics, period
  • Yet Johnson uses it as an absolute authority to back up any policy he might support
  • The submission to (male) authority is often integral to fundamentalism
  • Trump was an authority figure, period. He was a patriarch. He was the patriarch of their tribe. And he was in power, which meant that God put him there. After which nothing needs to be said. So of course if the patriarch says the election is rigged, you believe him.
  • And of course you do what you can to make sure that God’s will be done — by attempting to overturn the election results if necessary.
  • Christianism is a just-so story, with no deep moral conflicts. Material wealth does not pose a moral challenge, for example, as it has done for Christians for millennia. For Christianists, it’s merely proof that God has blessed you and you deserve it.
  • “I believe that scripture, the Bible is very clear: that God is the one that raises up those in authority. And I believe that God has ordained and allowed each one of us to be brought here for this specific moment.” That means that Trump was blessed by God, and not just by the Electoral College in 2016. And because he was blessed by God, it was impossible that Biden beat him fairly in 2020.
  • More than three-quarters of those representing the most evangelical districts are election deniers, compared to just half of those in the remaining districts. Fully three-quarters of the deniers in the caucus hail from evangelical districts.
  • since the Tea Party, the turnover in primary challenges in these evangelical districts has been historic — a RINO-shredding machine. No wonder there were crosses being carried on Capitol Hill on January 6, 2021. The insurrectionists were merely following God’s will. And Trump’s legal team was filled with the faithful.
  • Tom Edsall shows the skew that has turned American politics into something of a religious war: “When House districts are ranked by the percentage of voters who are white evangelicals, the top quintile is represented by 81 Republicans and 6 Democrats and the second quintile by 68 Republicans and 19 Democrats.”
  • the overwhelming majority of the Republican House Caucus (70%) represents the Most Evangelical districts (top two quintiles). Thus, we can see that a group that represents less than 15% of the US population commands 70% of the districts comprising the majority party in the House of Representatives.
  • And almost all those districts are safe as houses. When you add Christianism to gerrymandering, you get a caucus that has no incentive to do anything but perform for the cable shows.
  • This is not a caucus interested in actually doing anything.
  • I don’t know how we best break the grip of the fundamentalist psyche on the right. It’s a deep human tendency — to give over control to a patriarch or a holy book rather than engage in the difficult process of democratic interaction with others, compromise, and common ground.
  • The phenomenon has been given new life by a charismatic con-man in Donald Trump, preternaturally able to corral the cultural fears and anxieties of those with brittle, politicized faith.
  • What I do know is that, unchecked, this kind of fundamentalism is a recipe not for civil peace but for civil conflict
  • It’s a mindset, a worldview, as deep in the human psyche as the racial tribalism now endemic on the left. It controls one of our two major parties. And in so far as it has assigned all decisions to one man, Donald Trump, it is capable of supporting the overturning of an election — or anything else, for that matter, that the patriarch wants. Johnson is a reminder of that.
Javier E

Facebook, in Cross Hairs After Election, Is Said to Question Its Influence - The New Yo... - 0 views

  • Facebook has been in the eye of a postelection storm for the last few days, embroiled in accusations that it helped spread misinformation and fake news stories that influenced how the American electorate voted.
  • — many company executives and employees have been asking one another if, or how, they shaped the minds, opinions and votes of Americans.
  • Some employees are worried about the spread of racist and so-called alt-right memes across the network, according to interviews with 10 current and former Facebook employees. Others are asking whether they contributed to a “filter bubble” among users who largely interact with people who share the same beliefs.
  • ...6 more annotations...
  • “A fake story claiming Pope Francis — actually a refugee advocate — endorsed Mr. Trump was shared almost a million times, likely visible to tens of millions,” Zeynep Tufekci, an associate professor at the University of North Carolina who studies the social impact of technology, said of a recent post on Facebook. “Its correction was barely heard. Of course Facebook had significant influence in this last election’s outcome.”
  • Chris Cox, a senior vice president of product and one of Mr. Zuckerberg’s top lieutenants, has long described Facebook as an unbiased and blank canvas to give people a voice.
  • “Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes,” Mr. Zuckerberg wrote. “Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”
  • Almost half of American adults rely on Facebook as a source of news, according to a study by the Pew Research Center. And Facebook often emphasizes its ability to sway its users with advertisers, portraying itself as an effective mechanism to help promote their products.
  • More recently, issues with fake news on the site have mushroomed. Multiple Facebook employees were particularly disturbed last week when a fake news site called The Denver Guardian spread across the social network with negative and false messages about Mrs. Clinton, including a claim that an F.B.I. agent connected to Mrs. Clinton’s email disclosures had murdered his wife and shot himself.
  • Even in private, Mr. Zuckerberg has continued to resist the notion that Facebook can unduly affect how people think and behave. In a Facebook post circulated on Wednesday to a small group of his friends, which was obtained by The New York Times, Mr. Zuckerberg challenged the idea that Facebook had a direct effect on the way people voted.
Javier E

Lies in the Guise of News in the Trump Era - The New York Times - 0 views

  • You may not realize that our Kenyan-born Muslim president was plotting to serve a third term as our illegitimate president, by allowing Hillary Clinton to win and then indicting her; Pope Francis’ endorsement of Donald Trump helped avert the election-rigging.
  • Freedom Daily had the most inaccurate Facebook page reviewed, and also produced the right-wing content most likely to go viral.
  • alt-right websites are both far more pernicious and increasingly influential. President-Elect Trump was, after all, propelled into politics partly as a champion of the lie that President Obama was born abroad and ineligible for the White House.
  • ...10 more annotations...
  • alt-right websites will continue to spew misinformation that undermines tolerance and democracy. I find them particularly loathsome because they do their best to magnify prejudice against blacks, Muslims and Latinos, tearing our social fabric.
  • A BuzzFeed investigation found that of the Facebook posts it examined from three major right-wing websites, 38 percent were either false or a mixture of truth and falsehood.
  • More discouraging, it was the lies that readers were particularly eager to share and thus profitable to publish.
  • one takeaway from this astonishing presidential election is that fake news is gaining ground, empowering nuts and undermining our democracy.
  • Alt-right and fake news sites for some reason have emerged in particular in Macedonia, in the former Yugoslavia. BuzzFeed found more than 100 sites about U.S. politics from a single town, Veles, population 45,000, in Macedonia. “I started the site for a easy way to make money,” a 17-year-old Macedonian who runs DailyNewsPolitics.com told BuzzFeed.
  • Facebook has been a powerful platform to disseminate these lies. If people see many articles on their Facebook feed, shared by numerous conservative friends, all indicating that Hillary Clinton is about to be indicted for crimes she committed, they may believe it.
  • These sites were dubbed “alt-right” because they originally were an alternative to mainstream conservatism. Today they have morphed into the mainstream: After all, Steve Bannon, the head of Breitbart, one of these sites full of misinformation, ran Trump’s campaign.
  • alt-right sites agitate for racial hatred. Freedom Daily lately has had “trending now” headlines like “Black Lives Thug Rapes/Kills 69 y/o White Woman.
  • There are also hyperpartisan left-wing websites with inaccuracies, but they are less prone to fabrication than the right-wing sites. Indeed, the Macedonian entrepreneurs originally came up with leftist websites targeting Bernie Sanders supporters but didn’t find much reader interest in them.
  • The landscape ahead looks grim to me. While the business model for mainstream journalism is in crisis, these alt-right websites expand as they monetize false “news” that promotes racism and undermines democracy
Javier E

Facebook's problem isn't Trump - it's the algorithm - Popular Information - 0 views

  • Facebook is in the business of making money. And it's very good at it. In the first three months of 2021, Facebook raked in over $11 billion in profits, almost entirely from displaying targeted advertising to its billions of users. 
  • In order to keep the money flowing, Facebook also needs to moderate content. When people use Facebook to livestream a murder, incite a genocide, or plan a white supremacist rally, it is not a good look.
  • But content moderation is a tricky business. This is especially true on Facebook where billions of pieces of content are posted every day. In a lot of cases, it is difficult to determine what content is truly harmful. No matter what you do, someone is unhappy. And it's a distraction from Facebook's core business of selling ads.
  • ...17 more annotations...
  • In 2019, Facebook came up with a solution to offload the most difficult content moderation decisions. The company created the "Oversight Board," a quasi-judicial body that Facebook claims is independent. The Board, stocked with impressive thinkers from around the world, would issue "rulings" about whether certain Facebook content moderation decisions were correct.
  • the decision, which is nearly 12,000 words long, illustrates that whether Trump is ultimately allowed to return to Facebook is of limited significance. The more important questions are about the nature of the algorithm that gives people with views like Trump such a powerful voice on Facebook. 
  • The Oversight Board was Facebook's idea. It spent years constructing the organization, selected its chairs, and funded its endowment. But now that the Oversight Board is finally up and running and taking on high-profile cases, Facebook is choosing to ignore questions that the Oversight Board believes are essential to doing its job.
  • This is a key passage (emphasis added): 
  • The Daily Wire produces no original reporting. But, on Facebook in April, The Daily Wire received more than double the distribution of the Washington Post and the New York Times combined:
  • A critical issue, as the Oversight Board suggests, is not simply Trump's posts but how those kinds of posts are amplified by Facebook's algorithms. Equally important is how Facebook's algorithms amplify false, paranoid, violent, right-wing content from people other than Trump — including those that follow Trump on Facebook.
  • The jurisdiction of the Oversight Board excludes both the algorithm and Facebook's business practices.
  • Facebook stated to the Board that it considered Mr. Trump’s “repeated use of Facebook and other platforms to undermine confidence in the integrity of the election (necessitating repeated application by Facebook of authoritative labels correcting the misinformation) represented an extraordinary abuse of the platform.” The Board sought clarification from Facebook about the extent to which the platform’s design decisions, including algorithms, policies, procedures and technical features, amplified Mr. Trump’s posts after the election and whether Facebook had conducted any internal analysis of whether such design decisions may have contributed to the events of January 6. Facebook declined to answer these questions. This makes it difficult for the Board to assess whether less severe measures, taken earlier, may have been sufficient to protect the rights of others.
  • Donald Trump’s Facebook page is a symptom, not the cause, of the problem. Facebook’s algorithm favors low-quality, far-right content. Trump is just one of many beneficiaries.
  • NewsWhip is a social media analytics service which tracks which websites get the most engagement on Facebook. It just released its analysis for April and it shows low-quality right-wing aggregation sites dominate major news organizations.
  • The Oversight Board has no power to compel Facebook to answer. It's an important reminder that, for all the pomp and circumstance, the Oversight Board is not a court. The scope of its authority is limited by Facebook executives' willingness to play along. 
  • This actually understates how much better The Daily Wire's content performs on Facebook than the Washington Post and the New York Times. The Daily Wire published just 1,385 pieces of content in April compared to over 6,000 by the Washington Post and the New York Times. Each piece of content The Daily Wire published in April received 54,084 engagements on Facebook, compared to 2,943 for the New York Times and 1,973 for the Washington Post. 
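The per-piece comparison above can be reproduced with back-of-the-envelope arithmetic. A minimal sketch using only the engagement-per-piece figures quoted in the passage (illustrative only; the outlet names and numbers are taken directly from the annotation, nothing new is computed from outside data):

```python
# Per-piece Facebook engagement for April, as quoted in the passage.
engagement_per_piece = {
    "The Daily Wire": 54_084,
    "New York Times": 2_943,
    "Washington Post": 1_973,
}

# Express each outlet's rate as a multiple of the New York Times rate.
baseline = engagement_per_piece["New York Times"]
for outlet, eng in engagement_per_piece.items():
    print(f"{outlet}: {eng:,} engagements/piece ({eng / baseline:.1f}x the NYT rate)")
```

The Daily Wire's per-piece rate works out to roughly 18 times that of the New York Times, which is why raw distribution totals understate the gap: The Daily Wire posted far fewer pieces yet dominated on engagement.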
  • It's important to note here that Facebook's algorithm is not reflecting reality — it's creating a reality that doesn't exist anywhere else. In the rest of the world, Western Journal is not more popular than the New York Times, NBC News, the BBC, and the Washington Post. That's only true on Facebook.
  • Facebook has made a conscious decision to surface low-quality content and recognizes its dangers.
  • Shortly after the November election, Facebook temporarily tweaked its algorithm to emphasize "'news ecosystem quality' scores, or N.E.Q., a secret internal ranking it assigns to news publishers based on signals about the quality of their journalism." The purpose was to attempt to cut down on election misinformation being spread on the platform by Trump and his allies. The result was "a spike in visibility for big, mainstream publishers like CNN, The New York Times and NPR, while posts from highly engaged hyperpartisan pages, such as Breitbart and Occupy Democrats, became less visible." 
  • BuzzFeed reported that some Facebook staff members wanted to make the change permanent. But that suggestion was opposed by Joel Kaplan, a top Facebook executive and Republican operative who frequently intervenes on behalf of right-wing publishers. The algorithm change was quickly rolled back.
  • Other proposed changes to the Facebook algorithm over the years have been rejected or altered because of their potential negative impact on right-wing sites like The Daily Wire. 
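The per-piece comparison in the notes above reduces to simple division. A minimal sketch: the per-piece engagement figures come from the excerpt, and the derived multipliers are plain arithmetic, not additional CrowdTangle data.

```python
# Per-piece Facebook engagement figures quoted in the excerpt (April
# engagement totals divided by pieces published). The multipliers show
# why raw totals understate The Daily Wire's advantage.
per_piece = {
    "The Daily Wire": 54_084,
    "New York Times": 2_943,
    "Washington Post": 1_973,
}

baseline = per_piece["The Daily Wire"]
for outlet, engagements in per_piece.items():
    print(f"{outlet}: {engagements:,} engagements/piece "
          f"({baseline / engagements:.1f}x gap)")
```

On the excerpt's numbers, The Daily Wire's per-piece engagement is roughly 18x the New York Times' and 27x the Washington Post's, despite publishing less than a quarter as many pieces.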
Javier E

Here's a Look Inside Facebook's Data Wars - The New York Times - 0 views

  • On one side were executives, including Mr. Silverman and Brian Boland, a Facebook vice president in charge of partnerships strategy, who argued that Facebook should publicly share as much information as possible about what happens on its platform — good, bad or ugly.
  • On the other side were executives, including the company’s chief marketing officer and vice president of analytics, Alex Schultz, who worried that Facebook was already giving away too much.
  • One day in April, the people behind CrowdTangle, a data analytics tool owned by Facebook, learned that transparency had limits.

  • They argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.
  • These executives argued that Facebook should selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves. Team Selective Disclosure won, and CrowdTangle and its supporters lost.
  • the CrowdTangle story is important, because it illustrates the way that Facebook’s obsession with managing its reputation often gets in the way of its attempts to clean up its platform
  • The company, blamed for everything from election interference to vaccine hesitancy, badly wants to rebuild trust with a skeptical public. But the more it shares about what happens on its platform, the more it risks exposing uncomfortable truths that could further damage its image.
  • Facebook’s executives were more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it actually was amplifying harmful content. Transparency, they said, ultimately took a back seat to image management.
  • the executives who pushed hardest for transparency appear to have been sidelined. Mr. Silverman, CrowdTangle’s co-founder and chief executive, has been taking time off and no longer has a clearly defined role at the company, several people with knowledge of the situation said. (Mr. Silverman declined to comment about his status.) And Mr. Boland, who spent 11 years at Facebook, left the company in November.
  • “One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products,” Mr. Boland said, in his first interview since departing. “And it doesn’t want to make the data available for others to do the hard work and hold them accountable.”
  • Mr. Boland, who oversaw CrowdTangle as well as other Facebook transparency efforts, said the tool fell out of favor with influential Facebook executives around the time of last year’s presidential election, when journalists and researchers used it to show that pro-Trump commentators were spreading misinformation and hyperpartisan commentary with stunning success.
  • “People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he said. “Then, the tone at the executive level changed.”
  • Facebook was happy that I and other journalists were finding its tool useful. With only about 25,000 users, CrowdTangle is one of Facebook’s smallest products, but it has become a valuable resource for power users including global health organizations, election officials and digital marketers, and it has made Facebook look transparent compared with rival platforms like YouTube and TikTok, which don’t release nearly as much data.
  • Last fall, the leaderboard was full of posts by Mr. Trump and pro-Trump media personalities. Since Mr. Trump was barred from Facebook in January, it has been dominated by a handful of right-wing polemicists like Mr. Shapiro, Mr. Bongino and Sean Hannity, with the occasional mainstream news article, cute animal story or K-pop fan blog sprinkled in.
  • But the mood shifted last year when I started a Twitter account called @FacebooksTop10, on which I posted a daily leaderboard showing the sources of the most-engaged link posts by U.S. pages, based on CrowdTangle data.
  • The account went semi-viral, racking up more than 35,000 followers. Thousands of people retweeted the lists, including conservatives who were happy to see pro-Trump pundits beating the mainstream media and liberals who shared them with jokes like “Look at all this conservative censorship!” (If you’ve been under a rock for the past two years, conservatives in the United States frequently complain that Facebook is censoring them.)
  • Inside Facebook, the account drove executives crazy. Some believed that the data was being misconstrued and worried that it was painting Facebook as a far-right echo chamber. Others worried that the lists might spook investors by suggesting that Facebook’s U.S. user base was getting older and more conservative. Every time a tweet went viral, I got grumpy calls from Facebook executives who were embarrassed by the disparity between what they thought Facebook was — a clean, well-lit public square where civility and tolerance reign — and the image they saw reflected in the Twitter lists.
  • Mr. Boland, the former Facebook vice president, said that was a convenient deflection. He said that in internal discussions, Facebook executives were less concerned about the accuracy of the data than about the image of Facebook it presented. “It told a story they didn’t like,” he said of the Twitter account, “and frankly didn’t want to admit was true.”
  • Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad. But Mr. Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists. “Reach leaderboard isn’t a total win from a comms point of view,” Mr. Silverman wrote.
  • executives argued that my Top 10 lists were misleading. They said CrowdTangle measured only “engagement,” while the true measure of Facebook popularity would be based on “reach,” or the number of people who actually see a given post. (With the exception of video views, reach data isn’t public, and only Facebook employees and page owners have access to it.)
  • Mr. Schultz, Facebook’s chief marketing officer, had the dimmest view of CrowdTangle. He wrote that he thought “the only way to avoid stories like this” would be for Facebook to publish its own reports about the most popular content on its platform, rather than releasing data through CrowdTangle. “If we go down the route of just offering more self-service data you will get different, exciting, negative stories in my opinion,” he wrote.
  • there’s a problem with reach data: Most of it is inaccessible and can’t be vetted or fact-checked by outsiders. We simply have to trust that Facebook’s own, private data tells a story that’s very different from the data it shares with the public.
  • Mr. Zuckerberg is right about one thing: Facebook is not a giant right-wing echo chamber. But it does contain a giant right-wing echo chamber — a kind of AM talk radio built into the heart of Facebook’s news ecosystem, with a hyper-engaged audience of loyal partisans who love liking, sharing and clicking on posts from right-wing pages, many of which have gotten good at serving up Facebook-optimized outrage bait at a consistent clip.
  • CrowdTangle’s data made this echo chamber easier for outsiders to see and quantify. But it didn’t create it, or give it the tools it needed to grow — Facebook did — and blaming a data tool for these revelations makes no more sense than blaming a thermometer for bad weather.
  • It’s worth noting that these transparency efforts are voluntary, and could disappear at any time. There are no regulations that require Facebook or any other social media companies to reveal what content performs well on their platforms, and American politicians appear to be more interested in fighting over claims of censorship than getting access to better data.
  • It’s also worth noting that Facebook can turn down the outrage dials and show its users calmer, less divisive news any time it wants. (In fact, it briefly did so after the 2020 election, when it worried that election-related misinformation could spiral into mass violence.) And there is some evidence that it is at least considering more permanent changes.
  • The project, which some employees refer to as the “Top 10” project, is still underway, the people said, and it’s unclear whether its findings have been put in place. Mr. Osborne, the Facebook spokesman, said that the team looks at a variety of ranking changes, and that the experiment wasn’t driven by a desire to change the Top 10 lists.
  • This year, Mr. Hegeman, the executive in charge of Facebook’s news feed, asked a team to figure out how tweaking certain variables in the core news feed ranking algorithm would change the resulting Top 10 lists, according to two people with knowledge of the project.
  • As for CrowdTangle, the tool is still available, and Facebook is not expected to cut off access to journalists and researchers in the short term, according to two people with knowledge of the company’s plans.
  • Mr. Boland, however, said he wouldn’t be surprised if Facebook executives decided to kill off CrowdTangle entirely or starve it of resources, rather than dealing with the headaches its data creates.
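The @FacebooksTop10 leaderboard described above boils down to grouping link posts by page and ranking pages by summed engagement. A minimal sketch: the record layout and the sample numbers are illustrative assumptions, not the actual CrowdTangle API schema or real data.

```python
from collections import defaultdict

# Hypothetical CrowdTangle-style records: (page name, engagement on one
# link post, where engagement = likes + comments + shares). Numbers are
# made up for illustration.
posts = [
    ("Ben Shapiro", 120_000),
    ("Dan Bongino", 95_000),
    ("New York Times", 30_000),
    ("Ben Shapiro", 80_000),
]

def top_pages(posts, n=10):
    """Rank pages by total engagement across their link posts."""
    totals = defaultdict(int)
    for page, engagement in posts:
        totals[page] += engagement
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

for rank, (page, total) in enumerate(top_pages(posts), start=1):
    print(f"{rank}. {page}: {total:,}")
```

Note that this ranks by engagement only; as the excerpt explains, Facebook argued the "true" measure would be reach, but reach data is private and cannot be vetted by outsiders.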
aleija

YouTube Cut Down Misinformation. Then It Boosted Fox News. - The New York Times - 0 views

  • That algorithm decided which videos YouTube recommended that users watch next; the company said it was responsible for 70 percent of the one billion hours a day people spent on YouTube. But it had become clear that those recommendations tended to steer viewers toward videos that were hyperpartisan, divisive, misleading or downright false.
  • In the weeks leading up to Tuesday’s election, YouTube recommended far fewer fringe channels alongside news videos than it did in 2016, which helped it to reduce its spread of disinformation, according to research by Guillaume Chaslot, a former Google engineer who helped build YouTube’s recommendation engine and now studies it.
  • The ascent of Fox News on the social media platforms was a reminder that tech companies have been walking a tricky line between limiting misinformation and appeasing politicians complaining that Silicon Valley is biased — all while still keeping people clicking, watching and sharing on their sites.
  • “The channel most recommended in our data set in 2016 was Alex Jones,” the notorious internet conspiracy theorist, who has since been barred from YouTube, Mr. Chaslot said. “Now it’s Fox News.”
  • YouTube’s promotion of Fox News’s unabashedly conservative pundits also undercut arguments from some of those same pundits that the biggest tech companies are trying to silence them.
Javier E

Facebook Executives Shut Down Efforts to Make the Site Less Divisive - WSJ - 0 views

  • A Facebook Inc. team had a blunt message for senior executives. The company’s algorithms weren’t bringing people together. They were driving people apart.
  • “Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide from a 2018 presentation. “If left unchecked,” it warned, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”
  • That presentation went to the heart of a question dogging Facebook almost since its founding: Does its platform aggravate polarization and tribal behavior? The answer it found, in some cases, was yes.
  • in the end, Facebook’s interest was fleeting. Mr. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products.
  • At Facebook, “There was this soul-searching period after 2016 that seemed to me this period of really sincere, ‘Oh man, what if we really did mess up the world?’
  • Another concern, they and others said, was that some proposed changes would have disproportionately affected conservative users and publishers, at a time when the company faced accusations from the right of political bias.
  • Americans were drifting apart on fundamental societal issues well before the creation of social media, decades of Pew Research Center surveys have shown. But 60% of Americans think the country’s biggest tech companies are helping further divide the country, while only 11% believe they are uniting it, according to a Gallup-Knight survey in March.
  • Facebook policy chief Joel Kaplan, who played a central role in vetting proposed changes, argued at the time that efforts to make conversations on the platform more civil were “paternalistic,” said people familiar with his comments.
  • The high number of extremist groups was concerning, the presentation says. Worse was Facebook’s realization that its algorithms were responsible for their growth. The 2016 presentation states that “64% of all extremist group joins are due to our recommendation tools” and that most of the activity came from the platform’s “Groups You Should Join” and “Discover” algorithms: “Our recommendation systems grow the problem.”
  • In a sign of how far the company has moved, Mr. Zuckerberg in January said he would stand up “against those who say that new types of communities forming on social media are dividing us.” People who have heard him speak privately said he argues social media bears little responsibility for polarization.
  • Fixing the polarization problem would be difficult, requiring Facebook to rethink some of its core products. Most notably, the project forced Facebook to consider how it prioritized “user engagement”—a metric involving time spent, likes, shares and comments that for years had been the lodestar of its system.
  • Even before the teams’ 2017 creation, Facebook researchers had found signs of trouble. A 2016 presentation that names as author a Facebook researcher and sociologist, Monica Lee, found extremist content thriving in more than one-third of large German political groups on the platform.
  • Swamped with racist, conspiracy-minded and pro-Russian content, the groups were disproportionately influenced by a subset of hyperactive users, the presentation notes. Most of them were private or secret.
  • One proposal Mr. Uribe’s team championed, called “Sparing Sharing,” would have reduced the spread of content disproportionately favored by hyperactive users, according to people familiar with it. Its effects would be heaviest on content favored by users on the far right and left. Middle-of-the-road users would gain influence.
  • The Common Ground team sought to tackle the polarization problem directly, said people familiar with the team. Data scientists involved with the effort found some interest groups—often hobby-based groups with no explicit ideological alignment—brought people from different backgrounds together constructively. Other groups appeared to incubate impulses to fight, spread falsehoods or demonize a population of outsiders.
  • Mr. Pariser said that started to change after March 2018, when Facebook got in hot water after disclosing that Cambridge Analytica, the political-analytics startup, improperly obtained Facebook data about tens of millions of people. The shift has gained momentum since, he said: “The internal pendulum swung really hard to ‘the media hates us no matter what we do, so let’s just batten down the hatches.’ ”
  • Building these features and combating polarization might come at a cost of lower engagement, the Common Ground team warned in a mid-2018 document, describing some of its own proposals as “antigrowth” and requiring Facebook to “take a moral stance.”
  • Taking action would require Facebook to form partnerships with academics and nonprofits to give credibility to changes affecting public conversation, the document says. This was becoming difficult as the company slogged through controversies after the 2016 presidential election.
  • Asked to combat fake news, spam, clickbait and inauthentic users, the employees looked for ways to diminish the reach of such ills. One early discovery: Bad behavior came disproportionately from a small pool of hyperpartisan users.
  • A second finding in the U.S. saw a larger infrastructure of accounts and publishers on the far right than on the far left. Outside observers were documenting the same phenomenon. The gap meant even seemingly apolitical actions such as reducing the spread of clickbait headlines—along the lines of “You Won’t Believe What Happened Next”—affected conservative speech more than liberal content in aggregate.
  • Every significant new integrity-ranking initiative had to seek the approval of not just engineering managers but also representatives of the public policy, legal, marketing and public-relations departments.
  • “Engineers that were used to having autonomy maybe over-rotated a bit” after the 2016 election to address Facebook’s perceived flaws, she said. The meetings helped keep that in check. “At the end of the day, if we didn’t reach consensus, we’d frame up the different points of view, and then they’d be raised up to Mark.”
  • Disapproval from Mr. Kaplan’s team or Facebook’s communications department could scuttle a project, said people familiar with the effort. Negative policy-team reviews killed efforts to build a classification system for hyperpolarized content. Likewise, the Eat Your Veggies process shut down efforts to suppress clickbait about politics more than on other topics.
  • Under Facebook’s engagement-based metrics, a user who likes, shares or comments on 1,500 pieces of content has more influence on the platform and its algorithms than one who interacts with just 15 posts, allowing “super-sharers” to drown out less-active users
  • Accounts with hyperactive engagement were far more partisan on average than normal Facebook users, and they were more likely to behave suspiciously, sometimes appearing on the platform as much as 20 hours a day and engaging in spam-like behavior. The behavior suggested some were either people working in shifts or bots.
  • “We’re explicitly not going to build products that attempt to change people’s beliefs,” one 2018 document states. “We’re focused on products that increase empathy, understanding, and humanization of the ‘other side.’ ”
  • The debate got kicked up to Mr. Zuckerberg, who heard out both sides in a short meeting, said people briefed on it. His response: Do it, but cut the weighting by 80%. Mr. Zuckerberg also signaled he was losing interest in the effort to recalibrate the platform in the name of social good, they said, asking that they not bring him something like that again.
  • Mr. Uribe left Facebook and the tech industry within the year. He declined to discuss his work at Facebook in detail but confirmed his advocacy for the Sparing Sharing proposal. He said he left Facebook because of his frustration with company executives and their narrow focus on how integrity changes would affect American politics
  • While proposals like his did disproportionately affect conservatives in the U.S., he said, in other countries the opposite was true.
  • The tug of war was resolved in part by the growing furor over the Cambridge Analytica scandal. In a September 2018 reorganization of Facebook’s newsfeed team, managers told employees the company’s priorities were shifting “away from societal good to individual value,” said people present for the discussion. If users wanted to routinely view or post hostile content about groups they didn’t like, Facebook wouldn’t suppress it if the content didn’t specifically violate the company’s rules.
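The "Sparing Sharing" proposal the documents describe, which would discount the reach of content disproportionately favored by hyperactive users, can be sketched as a saturating weight on each user's interactions. Both the curve and the saturation constant below are illustrative assumptions, not Facebook's actual formula.

```python
def total_influence(interactions: int, saturation: int = 100) -> float:
    """Total weight a user's interactions carry under a discounted scheme.

    Each interaction is weighted by saturation / (interactions + saturation),
    so total influence grows sub-linearly and approaches a cap instead of
    scaling one-to-one with activity. Illustrative only.
    """
    per_interaction = saturation / (interactions + saturation)
    return interactions * per_interaction

# Under plain engagement counting, a user with 1,500 interactions carries
# 100x the influence of one with 15. Under the discounted weighting, the
# gap collapses to about 7x.
heavy = total_influence(1500)   # 93.75
light = total_influence(15)     # ~13.04
print(f"influence ratio: {heavy / light:.1f}x instead of 100x")
```

This matches the effect the excerpt attributes to the proposal: middle-of-the-road users gain relative influence, while "super-sharers" (and spam-like accounts active 20 hours a day) no longer drown them out.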