
History Readings / Group items tagged Misinformation


aniyahbarnett

Twitter adds climate change topic under rising pressure to combat lies - 0 views

  • Twitter is adding a topic that directs users to credible information about climate change in a new effort to combat the spread of misinformation
  • they will see posts from global environmental and sustainability organizations, environmental activists, environmental researchers and environmental institutions in their feed
  • Twitter has no policy to label or take down climate change misinformation
  • Twitter and the nation’s leading social media companies including Facebook, Google’s YouTube and TikTok are increasingly on the hot seat over climate change misinformation
  • This year, there have been 83,590
  • yet all of the top five Twitter accounts pushing climate change denial promote claims it does.
Javier E

How Misinformation Threatened a Montana National Heritage Area - The New York Times - 0 views

  • Ms. Grulkowski had just heard about a years-in-the-making effort to designate her corner of central Montana a national heritage area, celebrating its role in the story of the American West. A small pot of federal matching money was there for the taking, to help draw more visitors and preserve underfunded local tourist attractions.
  • She collected addresses from a list of voters and spent $1,300 sending a packet denouncing the proposed heritage area to 1,498 farmers and ranchers. She told them the designation would forbid landowners to build sheds, drill wells or use fertilizers and pesticides. It would alter water rights, give tourists access to private property, create a new taxation district and prohibit new septic systems and burials on private land, she said.
  • From the vantage point of informed democratic decision making, it’s a haunting tale about how a sustained political campaign can succeed despite — or perhaps as a result of — being divorced from reality.
  • “Misinformation is the new playbook,” Bob Kelly, the mayor of Great Falls, said. “You don’t like something? Create alternative facts and figures as a way to undermine reality.”
  • “We’ve run into the uneducable,” Ellen Sievert, a retired historic preservation officer for Great Falls and surrounding Cascade County, said. “I don’t know how we get through that.”
  • Steve Taylor, a former mayor of Neihart (pop. 43) whose family owns a car dealership in Great Falls, is a conservative who voted for Donald J. Trump twice, though he said he has regretted those votes since the Jan. 6 Capitol riot. Fellow Republicans, he said, have painted the heritage area as a liberal plot
  • “They make it a political thing because if you have a Democrat involved, then they are all against it,” he said. “It’s so hard to build something and so easy to tear it down. It’s maddening. It’s so easy to destroy something with untruths.”
  • And she came across a vein of conspiratorial accusations that national heritage areas were a kind of Trojan horse that could open the door to future federal land grabs.
  • Beginning in 2013, Ms. Weber teamed up with local preservationists, formed a nonprofit, enlisted local businesses and raised $50,000 for a required feasibility study. In 2014, the Great Falls City Commission included the heritage area as part of its official growth policy.
  • The proposal would take in four National Historic Landmarks: Lewis and Clark’s portage route around Great Falls; Fort Benton, a pioneer town along the Missouri River that was the last stop for steamships heading west from St. Louis in the 1800s; the First Peoples Buffalo Jump, a steep cliff over which Blackfoot hunters herded buffalo to their deaths; and the home and studio of C.M. Russell, the turn-of-the-century “cowboy artist” whose paintings of the American West shaped the popular image of frontier life.
  • The park service requires demonstrations of public support, which Ms. Weber and her allies solicited. For six years, the process went on largely undisturbed. Ms. Weber hosted dozens of public meetings and was a regular on local radio stations. Opponents made scarcely a peep.
  • The proposal for the Big Sky Country National Heritage Area, encompassing most of two central Montana counties that are together roughly the size of Connecticut, was the brainchild of Jane Weber, a U.S. Forest Service retiree who spent a decade on the Cascade County Commission.
  • Ms. Grulkowski’s interest was piqued. At the time, she was becoming engrossed in the online world of far-right media. From her home on 34 acres in Stockett, a farming community of 157 people south of Great Falls, she watched videos from outlets like His Glory TV, where hosts refer to President Biden as “the so-called president.” She subscribed to the Telegram messaging channel of Seth Keshel, a prolific disinformation spreader.
  • Then the 2020 political season arrived.
  • By May, their campaign had reached the state capital, where Mr. Gianforte signed the bill barring any national heritage area in Montana after it passed on a near-party-line vote. A heritage area, the bill’s text asserted, would “interfere with state and private property rights.”
  • In two hours of talking at his farm, Mr. Bandel could offer no evidence to back up that claim. He said he distrusted assurances that there were no such designs. “They say, ‘Don’t worry, we’re going to do it right. Don’t worry, we’ll take care of you.’ I think Adolf Hitler said that, too, didn’t he?” Mr. Bandel said. “The fear of the unknown is a huge fear.”
  • Mr. Bandel said he trusted Ms. Grulkowski with the details.
  • But when pressed, Ms. Grulkowski, too, was unable to identify a single instance of a property owner’s being adversely affected by a heritage area. “It’s not that there are a lot of specific instances,” she said. “There’s a lot of very wide open things that could happen.”
  • That somewhat amorphous fear was more the point.
  • “We didn’t believe in any of that stuff until last July,” Ms. Grulkowski said. “Then we stumbled on something on the internet, and we watched it, and it took us two days to get over that. And it had to do with the child trafficking that leads to everything. It just didn’t seem right, and that was just over the top. And then we started seeing things that are lining up with that everywhere.”
  • One thing Ms. Grulkowski does not do — because she refuses to pay — is read The Great Falls Tribune, the local daily. It’s not what it once was, with just eight journalists, down from 45 in 2000, said Richard Ecke, who spent 38 years at the paper before the owner, Gannett, laid him off as opinion editor in 2016. He is vice chairman of the proposed heritage area’s board.
  • In the paper’s place, information and misinformation about the heritage area spread on Facebook and in local outlets that parroted Ms. Grulkowski. Last winter, a glossy magazine distributed to Montana farmers put the subject on its cover, headlined “Intrusive Raid on Private Property Rights.”
  • Ms. Grulkowski now has ambitions beyond Montana. She wants to push Congress not to renew heritage areas that already exist. Buoyed by the trust her neighbors have placed in her, she has begun campaigning for Ms. Weber’s old seat on the county commission, in part to avenge the way she feels: mistreated by those in power. She doesn’t feel she’s been told the whole truth.
Javier E

Stanford's top disinformation research group collapses under pressure - The Washington ... - 0 views

  • The collapse of the five-year-old Observatory is the latest and largest of a series of setbacks to the community of researchers who try to detect propaganda and explain how false narratives are manufactured, gather momentum and become accepted by various groups
  • It follows Harvard’s dismissal of misinformation expert Joan Donovan, who in a December whistleblower complaint alleged the university’s close and lucrative ties with Facebook parent Meta led the university to clamp down on her work, which was highly critical of the social media giant’s practices.
  • Starbird said that while most academic studies of online manipulation look backward from much later, the Observatory’s “rapid analysis” helped people around the world understand what they were seeing on platforms as it happened.
  • Brown University professor Claire Wardle said the Observatory had created innovative methodology and trained the next generation of experts.
  • “Closing down a lab like this would always be a huge loss, but doing so now, during a year of global elections, makes absolutely no sense,” said Wardle, who previously led research at anti-misinformation nonprofit First Draft. “We need universities to use their resources and standing in the community to stand up to criticism and headlines.”
  • The study of misinformation has become increasingly controversial, and Stamos, DiResta and Starbird have been besieged by lawsuits, document requests and threats of physical harm. Leading the charge has been Rep. Jim Jordan (R-Ohio), whose House subcommittee alleges the Observatory improperly worked with federal officials and social media companies to violate the free-speech rights of conservatives.
  • In a joint statement, Stamos and DiResta said their work involved much more than elections, and that they had been unfairly maligned.
  • “The politically motivated attacks against our research on elections and vaccines have no merit, and the attempts by partisan House committee chairs to suppress First Amendment-protected research are a quintessential example of the weaponization of government,” they said.
  • Stamos founded the Observatory after publicizing that Russia had attempted to influence the 2016 election by sowing division on Facebook, causing a clash with the company’s top executives. Special counsel Robert S. Mueller III later cited the Facebook operation in indicting a Kremlin contractor. At Stanford, Stamos and his team deepened his study of influence operations from around the world, including one it traced to the Pentagon.
  • Stamos told associates he stepped back from leading the Observatory last year in part because the political pressure had taken a toll. Stamos had raised most of the money for the project, and the remaining faculty have not been able to replicate his success, as many philanthropic groups shift their focus to artificial intelligence and other, fresher topics.
  • In supporting the project further, the university would have risked alienating conservative donors, Silicon Valley figures, and members of Congress, who have threatened to stop all federal funding for disinformation research or cut back general support.
  • The Observatory’s non-election work has included developing curriculum for teaching college students about how to handle trust and safety issues on social media platforms and launching the first peer-reviewed journal dedicated to that field. It has also investigated rings publishing child sexual exploitation material online and flaws in the U.S. system for reporting it, helping to prepare platforms to handle an influx of computer-generated material.
Javier E

Opinion | The India-Pakistan Conflict Was a Parade of Lies - The New York Times - 0 views

  • Social networks are now so deeply embedded into global culture that it feels irresponsible to think of them as some exogenous force. Instead, when it comes to misinformation, the internet is a mere cog in the larger machinery of deceit.
  • There are other important gears in that machine: politicians and celebrities; parts of the news media (especially television, where most people still get their news); and motivated actors of all sorts, from governments to scammers to multinational brands.
  • It is in the confluence of all these forces that you come upon the true nightmare: a society in which small and big lies pervade every discussion, across every medium; where deceit is assumed, trust is naïve, and a consensus view of reality begins to feel frighteningly anachronistic.
  • It’s easier to appreciate the simmering pot when you’re looking at it from the outside
  • India conducted airstrikes against Pakistan. After I learned about them, I tried to follow the currents of misinformation in the unfolding conflict between two nuclear-armed nations on the brink of hot war.
  • What I found was alarming; it should terrify the world, not just Indians and Pakistanis. Whether you got your news from outlets based in India or Pakistan during the conflict, you would have struggled to find your way through a miasma of lies. The lies flitted across all media: there was lying on Facebook, Twitter and WhatsApp; there was lying on TV; there were lies from politicians; there were lies from citizens.
  • just about everyone, including many journalists, played fast and loose with facts. Many discussions were tinged with rumor and supposition. Pictures were doctored, doctored pictures were shared and aired, and real pictures were dismissed as doctored.
  • Many of the lies were directed and weren’t innocent slip-ups in the fog of war but efforts to discredit the enemy, to boost nationalistic pride, to shame anyone who failed to toe a jingoistic line. The lies fit a pattern, clamoring for war, and on both sides they suggested a society that had slipped the bonds of rationality and fallen completely to the post-fact order.
  • If you dive into the tireless fact-checking sites policing the region, you’ll find scores more lies from last week, some that flow across both sides of the conflict and many so intricate they defy easy explanation.
  • And you will be filled with a sense of despair.
  • The Indian government recently introduced a set of draconian digital restrictions meant, it says, to reduce misinformation. But when mendacity crosses all media and all social institutions, when it becomes embedded in the culture, focusing on digital platforms misses the point.
  • In India, Pakistan and everywhere else, addressing digital mendacity will require a complete social overhaul. “The battle is going to be long and difficult,” Govindraj Ethiraj, a journalist who runs the Indian fact-checking site Boom, told me. The information war is a forever war. We’re just getting started.
Javier E

For Two Months, I Got My News From Print Newspapers. Here's What I Learned. - The New Y... - 0 views

  • In January, after the breaking-newsiest year in recent memory, I decided to travel back in time. I turned off my digital news notifications, unplugged from Twitter and other social networks, and subscribed to home delivery of three print newspapers — The Times, The Wall Street Journal and my local paper, The San Francisco Chronicle — plus a weekly newsmagazine, The Economist.
  • I have spent most days since then getting the news mainly from print, though my self-imposed asceticism allowed for podcasts, email newsletters and long-form nonfiction (books and magazine articles). Basically, I was trying to slow-jam the news — I still wanted to be informed, but was looking to formats that prized depth and accuracy over speed.
  • It has been life changing. Turning off the buzzing breaking-news machine I carry in my pocket was like unshackling myself from a monster who had me on speed dial, always ready to break into my day with half-baked bulletins.
  • Most of all, I realized my personal role as a consumer of news in our broken digital news environment.
  • And I’m embarrassed about how much free time I have — in two months, I managed to read half a dozen books, took up pottery and (I think) became a more attentive husband and father.
  • Now I am not just less anxious and less addicted to the news, I am more widely informed
  • We have spent much of the past few years discovering that the digitization of news is ruining how we collectively process information. Technology allows us to burrow into echo chambers, exacerbating misinformation and polarization and softening up society for propaganda.
  • With artificial intelligence making audio and video as easy to fake as text, we’re entering a hall-of-mirrors dystopia, what some are calling an “information apocalypse.”
  • the experiment taught me several lessons about the pitfalls of digital news and how to avoid them.
  • I distilled those lessons into three short instructions, the way the writer Michael Pollan once boiled down nutrition advice: Get news. Not too quickly. Avoid social.
  • The Times has about 3.6 million paying subscribers, but about three-quarters of them pay for just the digital version. During the 2016 election, fewer than 3 percent of Americans cited print as their most important source of campaign news; for people under 30, print was their least important source.
  • What do you get for all that dough? News. That sounds obvious until you try it — and you realize how much of what you get online isn’t quite news, and more like a never-ending stream of commentary, one that does more to distort your understanding of the world than illuminate it.
  • On social networks, every news story comes to you predigested. People don’t just post stories — they post their takes on stories, often quoting key parts of a story to underscore how it proves them right, so readers are never required to delve into the story to come up with their own view.
  • the prominence of commentary over news online and on cable news feels backward, and dangerously so. It is exactly our fealty to the crowd — to what other people are saying about the news, rather than the news itself — that makes us susceptible to misinformation.
  • Real life is slow; it takes professionals time to figure out what happened, and how it fits into context. Technology is fast. Smartphones and social networks are giving us facts about the news much faster than we can make sense of them, letting speculation and misinformation fill the gap.
  • I was getting news a day old, but in the delay between when the news happened and when it showed up on my front door, hundreds of experienced professionals had done the hard work for me.
  • I was left with the simple, disconnected and ritualistic experience of reading the news, mostly free from the cognitive load of wondering whether the thing I was reading was possibly a blatant lie.
  • One weird aspect of the past few years is how a “tornado of news-making has scrambled Americans’ grasp of time and memory,” as my colleague Matt Flegenheimer put it last year. By providing a daily digest of the news, the newspaper alleviates this sense. Sure, there’s still a lot of news — but when you read it once a day, the world feels contained and comprehensible
  • What’s important is choosing a medium that highlights deep stories over quickly breaking ones.
  • And, more important, you can turn off news notifications. They distract and feed into a constant sense of fragmentary paranoia about the world
  • Avoid social. This is the most important rule of all. After reading newspapers for a few weeks, I began to see it wasn’t newspapers that were so great, but social media that was so bad.
  • The built-in incentives on Twitter and Facebook reward speed over depth, hot takes over facts and seasoned propagandists over well-meaning analyzers of news.
  • for goodness’ sake, please stop getting your news mainly from Twitter and Facebook. In the long run, you and everyone else will be better off.
Javier E

Where Countries Are Tinderboxes and Facebook Is a Match - The New York Times - 0 views

  • For months, we had been tracking riots and lynchings around the world linked to misinformation and hate speech on Facebook, which pushes whatever content keeps users on the site longest — a potentially damaging practice in countries with weak institutions.
  • Time and again, communal hatreds overrun the newsfeed — the primary portal for news and information for many users — unchecked as local media are displaced by Facebook and governments find themselves with little leverage over the company
  • Some users, energized by hate speech and misinformation, plot real-world attacks.
  • A reconstruction of Sri Lanka’s descent into violence, based on interviews with officials, victims and ordinary users caught up in online anger, found that Facebook’s newsfeed played a central role in nearly every step from rumor to killing.
  • Facebook officials, they say, ignored repeated warnings of the potential for violence, resisting pressure to hire moderators or establish emergency points of contact.
  • Sri Lankans say they see little evidence of change. And in other countries, as Facebook expands, analysts and activists worry they, too, may see violence.
  • As Facebook pushes into developing countries, it tends to be initially received as a force for good. In Sri Lanka, it keeps families in touch even as many work abroad. It provides for unprecedented open expression and access to information. Government officials say it was essential for the democratic transition that swept them into office in 2015.
  • where institutions are weak or undeveloped, Facebook’s newsfeed can inadvertently amplify dangerous tendencies. Designed to maximize user time on site, it promotes whatever wins the most attention. Posts that tap into negative, primal emotions like anger or fear, studies have found, produce the highest engagement, and so proliferate.
  • In developing countries, Facebook is often perceived as synonymous with the internet and reputable sources are scarce, allowing emotionally charged rumors to run rampant. Shared among trusted friends and family members, they can become conventional wisdom.
  • “There needs to be some kind of engagement with countries like Sri Lanka by big companies who look at us only as markets,” he said. “We’re a society, we’re not just a market.”
  • Last year, in rural Indonesia, rumors spread on Facebook and WhatsApp, a Facebook-owned messaging tool, that gangs were kidnapping local children and selling their organs. Some messages included photos of dismembered bodies or fake police fliers. Almost immediately, locals in nine villages lynched outsiders they suspected of coming for their children.
  • Near-identical social media rumors have also led to attacks in India and Mexico. Lynchings are increasingly filmed and posted back to Facebook, where they go viral as grisly tutorials.
  • One post declared, “Kill all Muslims, don’t even save an infant.” A prominent extremist urged his followers to descend on the city of Kandy to “reap without leaving an iota behind.”
  • where people do not feel they can rely on the police or courts to keep them safe, research shows, panic over a perceived threat can lead some to take matters into their own hands — to lynch.
  • “You report to Facebook, they do nothing,” one of the researchers, Amalini De Sayrah, said. “There’s incitements to violence against entire communities and Facebook says it doesn’t violate community standards.”
  • In government offices across town, officials “felt a sense of helplessness,” Sudarshana Gunawardana, the head of public information, recounted. Before Facebook, he said, officials facing communal violence “could ask media heads to be sensible, they could have their own media strategy.”
  • now it was as if his country’s information policies were set at Facebook headquarters in Menlo Park, Calif. The officials rushed out statements debunking the sterilization rumors but could not match Facebook’s influence
  • Desperate, the researchers flagged the video and subsequent posts using Facebook’s on-site reporting tool. Though they and government officials had repeatedly asked Facebook to establish direct lines, the company had insisted this tool would be sufficient, they said. But nearly every report got the same response: the content did not violate Facebook’s standards.
  • Facebook’s most consequential impact may be in amplifying the universal tendency toward tribalism. Posts dividing the world into “us” and “them” rise naturally, tapping into users’ desire to belong.
  • Its gamelike interface rewards engagement, delivering a dopamine boost when users accrue likes and responses, training users to indulge behaviors that win affirmation
  • And because its algorithm unintentionally privileges negativity, the greatest rush comes by attacking outsiders: The other sports team. The other political party. The ethnic minority.
  • Mass media has long been used to mobilize mass violence. Facebook, by democratizing communication tools, gives anyone with a smartphone the ability to broadcast hate.
  • Facebook did not create Sri Lanka’s history of ethnic distrust any more than it created anti-Rohingya sentiment in Myanmar.
  • In India, Facebook-based misinformation has been linked repeatedly to religious violence, including riots in 2012 that left several dead, foretelling what has since become a wider trend.
  • “We don’t completely blame Facebook,” said Harindra Dissanayake, a presidential adviser in Sri Lanka. “The germs are ours, but Facebook is the wind, you know?”
  • When Mr. Kumarasinghe died on March 3, online emotions surged into calls for action: attend the funeral to show support. Sinhalese arrived by the busload, fanning out to nearby towns. Online, they migrated from Facebook to private WhatsApp groups, where they could plan in secret.
katherineharron

Feds on high alert Thursday after warnings about potential threats to US Capitol - CNNP... - 0 views

  • Federal law enforcement is on high alert Thursday in the wake of an intelligence bulletin issued earlier this week about a group of violent militia extremists having discussed plans to take control of the US Capitol and remove Democratic lawmakers on or around March 4 -- a date when some conspiracy theorists believe former President Donald Trump will be returning to the presidency.
  • The House changed its schedule in light of warnings from US Capitol Police, moving a vote planned for Thursday to Wednesday night to avoid being in session on March 4. The Senate is still expected to be in session debating the Covid-19 relief bill.
  • Those intelligence sharing and planning failures have been laid bare over the last two months in several hearings and have been a focal point of criticism from lawmakers investigating the violent attack that left several people dead.
  • The violent extremists also discussed plans to persuade thousands to travel to Washington, DC, to participate in the March 4 plot, according to the joint intelligence bulletin.
  • it is mostly online talk and not necessarily an indication anyone is coming to Washington to act on it.
  • Some of the conspiracy theorists believe that the former President will be inaugurated on March 4, according to the joint bulletin. Between 1793 and 1933, inauguration often fell on March 4 or a surrounding date.
  • Pittman assured lawmakers, though, that her department is in an "enhanced" security posture and that the National Guard and Capitol Police have been briefed on what to expect in the coming days.
  • The effort to improve preparation extends to communicating with state and local officials. DHS held a call Wednesday with state and local law enforcement officials from around the country to discuss current threats posed by domestic extremists, including concerns about potential violence surrounding March 4 and beyond, according to two sources familiar with the matter. While specific details from the call remain unclear, both sources said the overarching message from DHS officials is that addressing threats posed by domestic extremists requires increased communication and intelligence sharing across federal and state and local entities, as well as a shift in how law enforcement officials interpret the information they receive.
  • Federal officials are emphasizing the point that gaps in intelligence sharing left law enforcement unprepared for the chaos that unfolded on January 6, even though they were notified of potential violence days before the attack, and that going forward, bulletins issued by DHS and FBI indicate a threat is serious enough to be communicated to relevant entities, even if the intelligence is based primarily on online chatter or other less definitive indicators, the sources said.
  • Perceived election fraud and other conspiracy theories associated with the presidential transition may contribute to violence with little or no warning, according to the bulletin, which is part of a series of intelligence products to highlight potential domestic violent extremist threats to the Washington, DC, region. "Given that the Capitol complex is currently fortified like a military installation, I don't anticipate any successful attacks against the property," said Brian Harrell, the former assistant secretary for infrastructure protection at DHS. "However, all threats should be taken seriously and investigations launched against those who would call for violence. We continue to see far-right extremist groups that are fueled by misinformation and conspiracy theories quickly become the most dangerous threat to society."
  • "You really cannot underestimate the potential that an individual or a small group of individuals will engage in violence because they believe a false narrative that they're seeing online,"
  • Although March 4 is a concern to law enforcement, it's not a "standalone event," the official said; rather, it's part of a "continuum of violence" based on domestic extremist conspiracy theories. "It's a threat that continues to be of concern to law enforcement. And I suspect that we are going to have to be focused on it for months to come," the official said.
  • Pittman warned last month that militia groups involved in the January 6 insurrection want to "blow up the Capitol" and "kill as many members as possible" when President Joe Biden addresses a joint session of Congress.
Javier E

Facebook's problem isn't Trump - it's the algorithm - Popular Information - 0 views

  • Facebook is in the business of making money. And it's very good at it. In the first three months of 2021, Facebook raked in over $11 billion in profits, almost entirely from displaying targeted advertising to its billions of users. 
  • In order to keep the money flowing, Facebook also needs to moderate content. When people use Facebook to livestream a murder, incite a genocide, or plan a white supremacist rally, it is not a good look.
  • But content moderation is a tricky business. This is especially true on Facebook where billions of pieces of content are posted every day. In a lot of cases, it is difficult to determine what content is truly harmful. No matter what you do, someone is unhappy. And it's a distraction from Facebook's core business of selling ads.
  • In 2019, Facebook came up with a solution to offload the most difficult content moderation decisions. The company created the "Oversight Board," a quasi-judicial body that Facebook claims is independent. The Board, stocked with impressive thinkers from around the world, would issue "rulings" about whether certain Facebook content moderation decisions were correct.
  • the decision, which is nearly 12,000 words long, illustrates that whether Trump is ultimately allowed to return to Facebook is of limited significance. The more important questions are about the nature of the algorithm that gives people with views like Trump such a powerful voice on Facebook. 
  • The Oversight Board was Facebook's idea. It spent years constructing the organization, selected its chairs, and funded its endowment. But now that the Oversight Board is finally up and running and taking on high-profile cases, Facebook is choosing to ignore questions that the Oversight Board believes are essential to doing its job.
  • This is a key passage (emphasis added): 
  • The Daily Wire produces no original reporting. But, on Facebook in April, The Daily Wire received more than double the distribution of the Washington Post and the New York Times combined.
  • A critical issue, as the Oversight Board suggests, is not simply Trump's posts but how those kinds of posts are amplified by Facebook's algorithms. Equally important is how Facebook's algorithms amplify false, paranoid, violent, right-wing content from people other than Trump — including those that follow Trump on Facebook.
  • The jurisdiction of the Oversight Board excludes both the algorithm and Facebook's business practices.
  • Facebook stated to the Board that it considered Mr. Trump’s “repeated use of Facebook and other platforms to undermine confidence in the integrity of the election (necessitating repeated application by Facebook of authoritative labels correcting the misinformation) represented an extraordinary abuse of the platform.” The Board sought clarification from Facebook about the extent to which the platform’s design decisions, including algorithms, policies, procedures and technical features, amplified Mr. Trump’s posts after the election and whether Facebook had conducted any internal analysis of whether such design decisions may have contributed to the events of January 6. Facebook declined to answer these questions. This makes it difficult for the Board to assess whether less severe measures, taken earlier, may have been sufficient to protect the rights of others.
  • Donald Trump's Facebook page is a symptom, not the cause, of the problem. Its algorithm favors low-quality, far-right content. Trump is just one of many beneficiaries.
  • NewsWhip is a social media analytics service which tracks which websites get the most engagement on Facebook. It just released its analysis for April and it shows low-quality right-wing aggregation sites dominate major news organizations.
  • The Oversight Board has no power to compel Facebook to answer. It's an important reminder that, for all the pomp and circumstance, the Oversight Board is not a court. The scope of its authority is limited by Facebook executives' willingness to play along. 
  • This actually understates how much better The Daily Wire's content performs on Facebook than the Washington Post and the New York Times. The Daily Wire published just 1,385 pieces of content in April compared to over 6,000 by the Washington Post and the New York Times. Each piece of content The Daily Wire published in April received 54,084 engagements on Facebook, compared to 2,943 for the New York Times and 1,973 for the Washington Post. 
  • It's important to note here that Facebook's algorithm is not reflecting reality — it's creating a reality that doesn't exist anywhere else. In the rest of the world, Western Journal is not more popular than the New York Times, NBC News, the BBC, and the Washington Post. That's only true on Facebook.
  • Facebook has made a conscious decision to surface low-quality content and recognizes its dangers.
  • Shortly after the November election, Facebook temporarily tweaked its algorithm to emphasize "'news ecosystem quality' scores, or N.E.Q., a secret internal ranking it assigns to news publishers based on signals about the quality of their journalism." The purpose was to attempt to cut down on election misinformation being spread on the platform by Trump and his allies. The result was "a spike in visibility for big, mainstream publishers like CNN, The New York Times and NPR, while posts from highly engaged hyperpartisan pages, such as Breitbart and Occupy Democrats, became less visible." 
  • BuzzFeed reported that some Facebook staff members wanted to make the change permanent. But that suggestion was opposed by Joel Kaplan, a top Facebook executive and Republican operative who frequently intervenes on behalf of right-wing publishers. The algorithm change was quickly rolled back.
  • Other proposed changes to the Facebook algorithm over the years have been rejected or altered because of their potential negative impact on right-wing sites like The Daily Wire. 
Javier E

Here's a Look Inside Facebook's Data Wars - The New York Times - 0 views

  • On one side were executives, including Mr. Silverman and Brian Boland, a Facebook vice president in charge of partnerships strategy, who argued that Facebook should publicly share as much information as possible about what happens on its platform — good, bad or ugly.
  • On the other side were executives, including the company’s chief marketing officer and vice president of analytics, Alex Schultz, who worried that Facebook was already giving away too much.
  • One day in April, the people behind CrowdTangle, a data analytics tool owned by Facebook, learned that transparency had limits.
  • They argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.
  • These executives argued that Facebook should selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves.Team Selective Disclosure won, and CrowdTangle and its supporters lost.
  • the CrowdTangle story is important, because it illustrates the way that Facebook’s obsession with managing its reputation often gets in the way of its attempts to clean up its platform
  • The company, blamed for everything from election interference to vaccine hesitancy, badly wants to rebuild trust with a skeptical public. But the more it shares about what happens on its platform, the more it risks exposing uncomfortable truths that could further damage its image.
  • Facebook’s executives were more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it actually was amplifying harmful content. Transparency, they said, ultimately took a back seat to image management.
  • the executives who pushed hardest for transparency appear to have been sidelined. Mr. Silverman, CrowdTangle’s co-founder and chief executive, has been taking time off and no longer has a clearly defined role at the company, several people with knowledge of the situation said. (Mr. Silverman declined to comment about his status.) And Mr. Boland, who spent 11 years at Facebook, left the company in November.
  • “One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products,” Mr. Boland said, in his first interview since departing. “And it doesn’t want to make the data available for others to do the hard work and hold them accountable.”
  • Mr. Boland, who oversaw CrowdTangle as well as other Facebook transparency efforts, said the tool fell out of favor with influential Facebook executives around the time of last year’s presidential election, when journalists and researchers used it to show that pro-Trump commentators were spreading misinformation and hyperpartisan commentary with stunning success.
  • “People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he said. “Then, the tone at the executive level changed.”
  • Facebook was happy that I and other journalists were finding its tool useful. With only about 25,000 users, CrowdTangle is one of Facebook’s smallest products, but it has become a valuable resource for power users including global health organizations, election officials and digital marketers, and it has made Facebook look transparent compared with rival platforms like YouTube and TikTok, which don’t release nearly as much data.
  • Last fall, the leaderboard was full of posts by Mr. Trump and pro-Trump media personalities. Since Mr. Trump was barred from Facebook in January, it has been dominated by a handful of right-wing polemicists like Mr. Shapiro, Mr. Bongino and Sean Hannity, with the occasional mainstream news article, cute animal story or K-pop fan blog sprinkled in.
  • But the mood shifted last year when I started a Twitter account called @FacebooksTop10, on which I posted a daily leaderboard showing the sources of the most-engaged link posts by U.S. pages, based on CrowdTangle data.
  • The account went semi-viral, racking up more than 35,000 followers. Thousands of people retweeted the lists, including conservatives who were happy to see pro-Trump pundits beating the mainstream media and liberals who shared them with jokes like “Look at all this conservative censorship!” (If you’ve been under a rock for the past two years, conservatives in the United States frequently complain that Facebook is censoring them.)
  • Inside Facebook, the account drove executives crazy. Some believed that the data was being misconstrued and worried that it was painting Facebook as a far-right echo chamber. Others worried that the lists might spook investors by suggesting that Facebook’s U.S. user base was getting older and more conservative. Every time a tweet went viral, I got grumpy calls from Facebook executives who were embarrassed by the disparity between what they thought Facebook was — a clean, well-lit public square where civility and tolerance reign — and the image they saw reflected in the Twitter lists.
  • Mr. Boland, the former Facebook vice president, said that was a convenient deflection. He said that in internal discussions, Facebook executives were less concerned about the accuracy of the data than about the image of Facebook it presented.“It told a story they didn’t like,” he said of the Twitter account, “and frankly didn’t want to admit was true.”
  • Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad. But Mr. Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists. “Reach leaderboard isn’t a total win from a comms point of view,” Mr. Silverman wrote.
  • executives argued that my Top 10 lists were misleading. They said CrowdTangle measured only “engagement,” while the true measure of Facebook popularity would be based on “reach,” or the number of people who actually see a given post. (With the exception of video views, reach data isn’t public, and only Facebook employees and page owners have access to it.)
  • Mr. Schultz, Facebook’s chief marketing officer, had the dimmest view of CrowdTangle. He wrote that he thought “the only way to avoid stories like this” would be for Facebook to publish its own reports about the most popular content on its platform, rather than releasing data through CrowdTangle. “If we go down the route of just offering more self-service data you will get different, exciting, negative stories in my opinion,” he wrote.
  • there’s a problem with reach data: Most of it is inaccessible and can’t be vetted or fact-checked by outsiders. We simply have to trust that Facebook’s own, private data tells a story that’s very different from the data it shares with the public.
  • Mr. Zuckerberg is right about one thing: Facebook is not a giant right-wing echo chamber. But it does contain a giant right-wing echo chamber — a kind of AM talk radio built into the heart of Facebook’s news ecosystem, with a hyper-engaged audience of loyal partisans who love liking, sharing and clicking on posts from right-wing pages, many of which have gotten good at serving up Facebook-optimized outrage bait at a consistent clip.
  • CrowdTangle’s data made this echo chamber easier for outsiders to see and quantify. But it didn’t create it, or give it the tools it needed to grow — Facebook did — and blaming a data tool for these revelations makes no more sense than blaming a thermometer for bad weather.
  • It’s worth noting that these transparency efforts are voluntary, and could disappear at any time. There are no regulations that require Facebook or any other social media companies to reveal what content performs well on their platforms, and American politicians appear to be more interested in fighting over claims of censorship than getting access to better data.
  • It’s also worth noting that Facebook can turn down the outrage dials and show its users calmer, less divisive news any time it wants. (In fact, it briefly did so after the 2020 election, when it worried that election-related misinformation could spiral into mass violence.) And there is some evidence that it is at least considering more permanent changes.
  • The project, which some employees refer to as the “Top 10” project, is still underway, the people said, and it’s unclear whether its findings have been put in place. Mr. Osborne, the Facebook spokesman, said that the team looks at a variety of ranking changes, and that the experiment wasn’t driven by a desire to change the Top 10 lists.
  • This year, Mr. Hegeman, the executive in charge of Facebook’s news feed, asked a team to figure out how tweaking certain variables in the core news feed ranking algorithm would change the resulting Top 10 lists, according to two people with knowledge of the project.
  • As for CrowdTangle, the tool is still available, and Facebook is not expected to cut off access to journalists and researchers in the short term, according to two people with knowledge of the company’s plans.
  • Mr. Boland, however, said he wouldn’t be surprised if Facebook executives decided to kill off CrowdTangle entirely or starve it of resources, rather than dealing with the headaches its data creates.
liamhudgings

To Predict the Role of Fake News in 2020, Look to Canada | JSTOR Daily - 0 views

  • How will online misinformation (“fake news”) affect America’s 2020 elections? It’s the kind of question that might send voters scurrying to the nearest stack of political science journals. But you’d be better off looking north—to Canada, and its impending federal elections, which will be held on October 21, 2019.
  • In part that’s because Canada has taken steps to address the potential for misinformation in this election cycle, developing what Politico recently characterized as “the most detailed plan anywhere in the Western world to combat foreign meddling in its upcoming election.” The government’s plan includes transparency guidelines for political advertising online, the establishment of a cybersecurity task force dedicated to monitoring for potential election threats, and the allocation of $7 million Canadian dollars to digital and civic literacy initiatives.
  • if all that fails, the country also has a non-partisan panel that’s empowered to alert the public in the event of significant foreign interference in the election.
  • …data about disinformation campaigns are spotty at best. Many of these activities occur in secretive military contexts, or behind the proprietary walls of private actors. Thus, painting a complete picture of these activities online by government actors is extremely difficult, and there will be gaps in the data and cases collected.
  • Precisely because the internet re-invents itself so quickly, each election cycle takes place in what is effectively a brand-new online context.
  • That difficulty has led to some amusingly off-base predictions.
  • Today, that challenge takes the form of wrestling with online misinformation—a challenge I’ll resist characterizing as unprecedented, even though it really is tempting to argue that these factors take the political significance of the internet to a whole new level. The experience of repeatedly encountering brand-new territory has left me not just skeptical of the hyperbole, but also skeptical of political scientists’ ability to inform our efforts at grappling with each successive online challenge. Their work tends to be useful only in retrospect.
delgadool

How misinformation overwhelmed our democracy - Vox - 0 views

  • some people simply refuse to acknowledge inconvenient facts about their own side.
  • We live in a media ecosystem that overwhelms people with information. Some of that information is accurate, some of it is bogus, and much of it is intentionally misleading. The result is a polity that has increasingly given up on finding out the truth.
  • “epistemic crisis.”
  • We’re in an age of manufactured nihilism.
  • The issue for many people isn’t exactly a denial of truth as such. It’s more a growing weariness over the process of finding the truth at all.
  • I call this “manufactured” because it’s the consequence of a deliberate strategy
  • What we’re facing is a new form of propaganda that wasn’t really possible until the digital age. And it works not by creating a consensus around any particular narrative but by muddying the waters so that consensus isn’t achievable.
  • For most of recent history, the goal of propaganda was to reinforce a consistent narrative. But zone-flooding takes a different approach: It seeks to disorient audiences with an avalanche of competing stories.
  • Yet CNN and MSNBC have shown zero hesitation in giving her a platform to lie because they see their job as giving government officials — even ones who lie — a platform.
  • And we know that false claims, if they’re repeated enough, become more plausible the more often they’re shared, something psychologists have called the “illusory truth” effect. Our brains, it turns out, tend to associate repetition with truthfulness. Some interesting new research, moreover, found that the more people encounter information the more likely they are to feel justified in spreading it, whether it’s true or not.
  • It’s worth noting that this polarization is asymmetric. The left overwhelmingly receives its news from organizations like the New York Times, the Washington Post, or cable news networks like MSNBC or CNN. Some of the reporting is surely biased, and probably biased in favor of liberals, but it’s still (mostly) anchored to basic journalistic ethics.
  • The fact is, Trump did what Democrats have accused him of doing. We know, with absolute certainty, that the president tried to get a foreign government to investigate a family member of one of his political rivals
  • The way impeachment has played out underscores just how the new media ecosystem is a problem for our democracy.
  • Trump can dictate an entire news cycle with a few unhinged tweets or an absurd press conference. The media cycle is easily commandeered by misinformation, innuendo, and outrageous content. These are problems because of the norms that govern journalism and because the political economy of media makes it very hard to ignore or dispel bullshit stories. This is at the root of our nihilism problem, and a solution is nowhere in sight.
  • As is often the case, the diagnosis is much easier than the cure. But liberal democracy cannot function without a shared understanding of reality. As long as the zone is flooded with shit, that shared understanding is impossible.
mimiterranova

Black And Latino Voters Flooded With Disinformation In Election's Final Days : NPR - 0 views

  • Someone started posting memes full of false claims that seemed designed to discourage people from voting.
  • 'Democrats and Republicans are the same. There's no point in voting.' 'Obama didn't do anything for you during his term, why should you vote for a Democrat this time around?' "
  • Black and Latino voters are being flooded with similar messages in the final days of the election, according to voting rights activists and experts who track disinformation
  • "We are now talking about this misinformation as a part of the same trajectory as a poll tax, as a literacy test," he said. "A sustained campaign targeted at Black Americans — and often brown Americans as well — to limit our political power, to limit our ability to shape the decisions tha
  • t are made in this country."
  • Their strategy, Banks said, was "masquerading as black Americans, drawing people into conversation and ultimately turning that conversation toward bad information and often toward a sort of deep cynicism that made people sort of less inspired to participate."
  • Groups tracking disinformation have also noted attacks on Sen. Kamala Harris, the Democratic vice presidential nominee, such as false claims about her racial identity and her history as a prosecutor in California. "There's been rampant misinformation about her record, who she is, what she's about," the New Florida Majority's Bullard said.
dytonka

Most important 2020 election misinformation threat not from overseas - 0 views

  • But the cybersecurity expert, who now is director of the Stanford Internet Observatory, says the toughest misinformation threat technology companies face, and can’t solve, is from within the U.S. — disinformation sowed by U.S. politicians, and one figure, in particular.
  • “The most important disinformation this cycle is coming from domestic sources, and that is huge problem for technology companies,” Stamos said. “They are loathe to wade into democratic processes in the U.S.”
  • If President Trump goes to a White House podium and declares victory, it will be covered by every major news organization.
kaylynfreeman

How Three Election-Related Falsehoods Spread - The New York Times - 0 views

  • The data showed how a single rumor pushing a false narrative could rapidly gain traction on Facebook and Twitter, generating tens of thousands of shares and comments. That has made the misinformation particularly hard for elections officials to fight.
  • 1. False claims of ballot “harvesting”: This misinformation features the unproven assertion that ballots are being “harvested,” or collected and dropped off in bulk by unauthorized people.
  • Representative Ilhan Omar, a Minnesota Democrat, was falsely accused last month of being engaged in or connected to systematic illegal ballot harvesting.
  • There were 3,959 public Facebook posts sharing this rumor, according to our analysis. Those posts generated 953,032 likes, comments and shares. Among those who shared the lie were two pro-Trump Facebook groups targeting Minnesota residents, as well as President Trump himself. At least 26,300 tweets also discussed the falsehood.
  • 2. False claims of mail-in ballots being dumped or shredded: Mail-in ballots and related materials being tossed was another popular falsehood that election officials said they were hearing.
    • kaylynfreeman
       
      i heard that as well
  • 3. False claims of planned violence at polling sites by Antifa and Black Lives Matter protesters
  • Election officials also said people were confronting them with false assertions that antifa, the loose collection of left-wing activists, and Black Lives Matter protesters were coordinating riots at polling places across the country.
  • He said in an email that his post was not a call for violence and that The New York Times should focus on “the key planners and financiers of all the rioting, arson, looting and murder” instead.
kaylynfreeman

Disinformation in the 2020 Presidential Election: Latest Updates - The New York Times - 0 views

  • No, George Soros doesn’t control voting machines. Claim: The billionaire George Soros owns Smartmatic, a company that makes voting machines. He can manipulate the machines toward a candidate of his choosing. Fact: Mr. Soros does not own Smartmatic. Background: Rumors that Mr. Soros, a well-known donor to liberal causes, owns Smartmatic have circulated for years, including during the 2016 presidential election and the 2018 midterm elections.
  • No, ballots aren’t being thrown away. Claim: There are photographs of ballots being thrown away, providing proof of problems with mail-in voting in California. Fact: The photographs depict old, empty envelopes from the November 2018 midterm elections that were discarded after the vote was counted. Background: The images have been circulating in recent months to back claims made by President Trump that mail-in voting, which is expected to nearly double because of the pandemic, will increase voter fraud. Republicans in Congress as well as right-wing outlets have shared the photographs.
  • No, people aren’t voting more than once. Claim: People are casting multiple votes using mail-in ballots or absentee ballots. Fact: Election experts have calculated that, in a 20-year period, fraud involving mailed ballots has affected 0.00006 percent of votes, or one case per state every six or seven years. Background: Several viral Twitter posts have claimed that mail-in ballots cannot be “verified” or have already been cast. Mr. Trump, who has repeatedly attacked state efforts to expand voting by mail, has falsely said mail-in ballots are “dangerous,” “unconstitutional,” “a scam” or rife with “fraud.”
  • ...3 more annotations...
  • No, there aren’t any new online-voting options. Claim: People can vote by text message, by email or on a state-run website. Fact: Outside of a small number of overseas absentee voters, no state allows Americans to vote by email, website or text message. Background: In 2016 and 2018, posts on Facebook, Twitter and other social media sites claimed that voters could cast their ballots through newly formed websites, or through text-messaging services.
  • No, voting machines aren’t doing strange things. Claim: Voting machines are malfunctioning and causing votes to be improperly recorded. Fact: A handful of voting-machine malfunctions are reported every election cycle in most states. The errors are most often due to mistakes by users. Background: Frequently circulated videos purport to show machines malfunctioning or refusing to let people cast their vote for a particular candidate. A 2016 video shot by a woman in Pennsylvania and posted to Twitter claimed that a voting machine was not allowing her to vote for Mr. Trump. The video, which is likely to resurface this year, was provided as evidence that machines were rigged. But as ProPublica reported, the problem with the machine was user error.
  • No, ICE isn’t monitoring polling locations. Claim: U.S. Immigration and Customs Enforcement agents will be at polling stations. Fact: ICE will not be at polling stations. Background: This rumor has made the rounds for a decade. During the 2018 election, claims that ICE would be at polling stations proliferated on Twitter, making misinformation aimed at suppressing the vote one of the most prevalent forms of misinformation on the platform, according to Twitter.
aleija

YouTube Cut Down Misinformation. Then It Boosted Fox News. - The New York Times - 0 views

  • That algorithm decided which videos YouTube recommended that users watch next; the company said it was responsible for 70 percent of the one billion hours a day people spent on YouTube. But it had become clear that those recommendations tended to steer viewers toward videos that were hyperpartisan, divisive, misleading or downright false.
  • In the weeks leading up to Tuesday’s election, YouTube recommended far fewer fringe channels alongside news videos than it did in 2016, which helped reduce the spread of disinformation, according to research by Guillaume Chaslot, a former Google engineer who helped build YouTube’s recommendation engine and now studies it.
  • “The channel most recommended in our data set in 2016 was Alex Jones,” the notorious internet conspiracy theorist, who has since been barred from YouTube, Mr. Chaslot said. “Now it’s Fox News.”
  • ...2 more annotations...
  • The ascent of Fox News on the social media platforms was a reminder that tech companies have been walking a tricky line between limiting misinformation and appeasing politicians complaining that Silicon Valley is biased — all while still keeping people clicking, watching and sharing on their sites.
  • YouTube’s promotion of Fox News’s unabashedly conservative pundits also undercut arguments from some of those same pundits that the biggest tech companies are trying to silence them.
osichukwuocha

This Misinformation Was Coming From Inside the House - The New York Times - 0 views

  • that wearing masks has “little to no medical value” and could do more “harm” than wearing no mask at all.
  • But it was especially remarkable given the source. Published on the right-wing website RedState, it turned out to have been written under a pseudonym by William B. Crews, a public affairs officer at the National Institutes of Health,
  • Mr. Crews had published a slew of incorrect claims about this virus this year
  • ...6 more annotations...
  • Mr. Crews was especially focused on undermining efforts to persuade the public to wear masks, saying that “math tells you the diameter of the virus is orders of magnitude smaller than the smallest opening between mask fibers.”
  • “While we are assured that it is essential to bring our lives to a screeching halt in order to prevent the spread of Wuhan virus, there is actually no real evidence of the measurable kind that backs up that supposition,” he wrote.
  • writing that lockdowns and social distancing rules imposed in other countries had severe economic consequences without any public health benefit.
  • Numerous studies have shown that while the virus itself is small enough to pass through cloth, it travels within particles and respiratory droplets that masks can catch, helping to reduce community spread.
  • delays in imposing lockdowns in the United States most likely resulted in tens of thousands of preventable deaths
  • Misleading and even maliciously false narratives about the coronavirus have been shared widely across online networks
Javier E

How a Kennedy became a 'superspreader' of hoaxes on COVID-19, vaccines, 5G and more - T... - 0 views

  • In 2017, after a meeting with then president-elect Mr. Trump in New York, Mr. Kennedy Jr. announced that he had been asked to chair a commission to review vaccine safety. The move alarmed doctors, epidemiologists and public health experts, who pointed out that Mr. Trump had previously raised concerns that vaccines cause autism.
  • Even though the commission never materialized, to Mr. Kennedy Jr.’s bitter disappointment, the fact that the meeting took place at all signals how closely conspiracy theories and misinformation have been interwoven in everyday politics.
  • “To some extent, conspiracy theories rule the day,” Prof. Offit told us.
  • ...5 more annotations...
  • “You have [U.S. Republican senator] Lindsey Graham talking about the deep state; you could argue the President was elected around conspiracy theories. So Kennedy’s well placed to fit into that trend. He appeals to the notion that there are dark forces working against us.”
  • Larry Sabato, one of America’s leading political scientists, believes the confusion created by the President will find its denouement on Nov. 3, presidential election day, when “we’ll find out whether the truth matters in American politics.” Mr. Sabato said: “What is disturbing is that for tens of millions, it doesn’t matter anymore. We are in the postfactual era, not just in America, but around the world.
  • “Almost all of these theories are pretty, pretty darn boring. And I hate to complain about my job. It’s the same crap over and over again. Same theories, different nouns. There’s nothing to even QAnon, which people look at and say, ‘Oh my God, that’s so wacky.’ Well, the idea of a pedophile deep state working against the president is the plot of Oliver Stone’s JFK movie that came out 30 years ago. ... The idea that your enemies are pedophiles and Satanists and sex traffickers goes back millennia. So there’s really even nothing new there.”
  • Mr. Kennedy Jr.’s siblings Kathleen Kennedy Townsend and former congressman Joseph Kennedy, as well as niece Maeve Kennedy McKean, published an excoriating article in Politico claiming that “he has helped to spread dangerous misinformation over social media and is complicit in sowing distrust of the science behind vaccines.”
  • “We love Bobby,” they said, and praised his record on environmental issues. “However, on vaccines he is wrong.”
lilyrashkind

Jan. 6 committee subpoenas tech giants after 'inadequate responses' - 0 views

  • The House committee investigating the Jan. 6 riot at the U.S. Capitol subpoenaed Reddit, Twitter and the parent companies of Google and Facebook on Thursday after their "inadequate responses" to requests for information about what they did and didn't do in the lead-up to the deadly attack.
  • "It's disappointing that after months of engagement, we still do not have the documents and information necessary to answer those basic questions," committee Chair Bennie Thompson, D-Miss., said in a statement. "The Select Committee is working to get answers for the American people and help ensure nothing like January 6th ever happens again. We cannot allow our important work to be delayed any further."
  • "Additionally, Meta has failed to provide critical internal and external analyses conducted by the company regarding misinformation, disinformation, and malinformation relating to the 2020 election, efforts to challenge or overturn the election, and the use of Meta by domestic violent extremists to affect the 2020 election," the letter said.
  • ...8 more annotations...
  • Andy Stone, a spokesperson for Meta, responded in a statement saying, "As Chairman Thompson said recently, 'Facebook is working with [the committee] to provide the necessary information we requested.'"
  • "To this day, YouTube is a platform on which user video spreads misinformation about the election," the committee said.
  • In a statement, Google said: “We’ve been actively cooperating with the Select Committee since they started their investigation, responding substantively to their requests for documents, and are committed to working with Congress through this process.
  • A Reddit spokesperson said, "We received the subpoena and will continue to work with the committee on their requests." Twitter declined to comment.
  • The panel first sought records from the four companies and others in August, asking for information related to "the spread of misinformation
  • The committee is seeking information dating to the spring of 2020.
  • FBI officials have acknowledged that there were calls for violence at the Jan. 6 "Stop the Steal" rally by Trump supporters, which was held just before the Capitol attack, but they have said the calls did not add up to specific, credible intelligence.
  • Testifying before a Senate committee in March, FBI Director Christopher Wray suggested that the amount of vitriol online makes it difficult to sort out.