Group items matching "zuckerberg" in title, tags, annotations or url

Opinion | Now Social Media Grows a Conscience? - The New York Times

  • Propelled by the nation’s stunned reaction to last week’s violent siege of the U.S. Capitol, social media companies have sought to separate themselves from President Trump and lawmakers who were complicit in the riots.
  • The actions, a long time coming, are sure to limit the appearance of some of the most inflammatory posts and tweets, particularly leading up to next week’s presidential inauguration.
  • Facebook, Twitter and YouTube are trying to claim the mantle of champions of free speech and impartial loudspeakers for whoever has a deeply held conviction.
  • There’s nothing wrong with making a buck, of course. But until Facebook, Twitter and the rest view their platforms as something more than just businesses, policing the sites will be a perpetual game of Whac-a-Mole.
  • Consider, for instance, that it was only on Monday that Facebook announced a purge of content promoting the false election fraud claims behind the campaign known as “stop the steal.” That’s been a rallying cry since Election Day, more than two months ago. Facebook’s co-founder and chief executive, Mark Zuckerberg, who is the controlling shareholder of the company, has said he believes that politicians should be allowed to knowingly lie on Facebook.
  • These companies have consistently ignored warnings about how their very structure foments misinformation and division. A Facebook-ordered civil rights audit released in July effectively gave the company a failing grade.
  • Mr. Lehrich said Facebook should make a chronological news feed the default, rather than an algorithm that shows users what it thinks is most relevant. And it ought not to thrust users unwittingly into groups or toward certain pages that align with what the software thinks will interest them. Users could still opt into those services.
  • With the shuttering of Mr. Trump’s accounts, some will point to Big Tech’s tremendous reach as well as concerns about curtailing free speech. But these companies have throttled speech for years, when it serves their purposes.
  • The companies aren’t likely to surrender the power they’ve accumulated any time soon — that’s why Facebook, which also owns Instagram, faces twin antitrust lawsuits from the Federal Trade Commission and 48 attorneys general.

As Trump Clashes With Big Tech, China's Censored Internet Takes His Side - The New York Times

  • After Twitter and Facebook kicked President Trump off their platforms, and his supporters began comparing his social media muzzling to Chinese censorship, the president won support from an unexpected source: China.
  • Mr. Trump’s expulsion from American social media for spurring the violent crowd at the Capitol last week has consumed the Chinese internet, one of the most harshly censored forums on earth. Overwhelmingly, people who face prison for what they write are condemning what they regard as censorship elsewhere.
  • Much of the condemnation is being driven by China’s propaganda arms. By highlighting the decisions by Twitter and Facebook, they believe they are reinforcing their message to the Chinese people that nobody in the world truly enjoys freedom of speech. That gives the party greater moral authority to crack down on Chinese speech.
  • Chinese internet companies conduct their own censorship, but they do so out of fear of what Beijing officials might do to them. Last February, ifeng.com, a news portal, was punished for running original content about the coronavirus outbreak. Under the Chinese regulations, these websites can’t produce original news content.
  • For those reasons, many Chinese are dumbfounded by the idea that private companies such as Twitter and Facebook have the power to reject a sitting American president.
  • But when Mr. Kuang created two cartoons to express his displeasure at Mr. Trump’s bans, China’s censors did nothing. In one of them, President Trump’s mouth was brutally sewn up. In another, the Facebook founder Mark Zuckerberg is portrayed as Qin Shi Huang, China’s first emperor, a brutal tyrant who burned books and executed scholars more than 2,000 years ago.
  • The journalist Zhao Jing, who goes by the name Michael Anti, is puzzled why Chinese Trump supporters so zealously defend his freedom of speech. Mr. Trump has the White House, executive orders and Fox News, he wrote: “What else do you want for him to have freedom of speech?”

Here's a Look Inside Facebook's Data Wars - The New York Times

  • On one side were executives, including Mr. Silverman and Brian Boland, a Facebook vice president in charge of partnerships strategy, who argued that Facebook should publicly share as much information as possible about what happens on its platform — good, bad or ugly.
  • On the other side were executives, including the company’s chief marketing officer and vice president of analytics, Alex Schultz, who worried that Facebook was already giving away too much.
  • One day in April, the people behind CrowdTangle, a data analytics tool owned by Facebook, learned that transparency had limits.
  • They argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.
  • These executives argued that Facebook should selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves. Team Selective Disclosure won, and CrowdTangle and its supporters lost.
  • the CrowdTangle story is important, because it illustrates the way that Facebook’s obsession with managing its reputation often gets in the way of its attempts to clean up its platform
  • The company, blamed for everything from election interference to vaccine hesitancy, badly wants to rebuild trust with a skeptical public. But the more it shares about what happens on its platform, the more it risks exposing uncomfortable truths that could further damage its image.
  • Facebook’s executives were more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it actually was amplifying harmful content. Transparency, they said, ultimately took a back seat to image management.
  • the executives who pushed hardest for transparency appear to have been sidelined. Mr. Silverman, CrowdTangle’s co-founder and chief executive, has been taking time off and no longer has a clearly defined role at the company, several people with knowledge of the situation said. (Mr. Silverman declined to comment about his status.) And Mr. Boland, who spent 11 years at Facebook, left the company in November.
  • “One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products,” Mr. Boland said, in his first interview since departing. “And it doesn’t want to make the data available for others to do the hard work and hold them accountable.”
  • Mr. Boland, who oversaw CrowdTangle as well as other Facebook transparency efforts, said the tool fell out of favor with influential Facebook executives around the time of last year’s presidential election, when journalists and researchers used it to show that pro-Trump commentators were spreading misinformation and hyperpartisan commentary with stunning success.
  • “People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he said. “Then, the tone at the executive level changed.”
  • Facebook was happy that I and other journalists were finding its tool useful. With only about 25,000 users, CrowdTangle is one of Facebook’s smallest products, but it has become a valuable resource for power users including global health organizations, election officials and digital marketers, and it has made Facebook look transparent compared with rival platforms like YouTube and TikTok, which don’t release nearly as much data.
  • But the mood shifted last year when I started a Twitter account called @FacebooksTop10, on which I posted a daily leaderboard showing the sources of the most-engaged link posts by U.S. pages, based on CrowdTangle data.
  • Last fall, the leaderboard was full of posts by Mr. Trump and pro-Trump media personalities. Since Mr. Trump was barred from Facebook in January, it has been dominated by a handful of right-wing polemicists like Mr. Shapiro, Mr. Bongino and Sean Hannity, with the occasional mainstream news article, cute animal story or K-pop fan blog sprinkled in.
  • The account went semi-viral, racking up more than 35,000 followers. Thousands of people retweeted the lists, including conservatives who were happy to see pro-Trump pundits beating the mainstream media and liberals who shared them with jokes like “Look at all this conservative censorship!” (If you’ve been under a rock for the past two years, conservatives in the United States frequently complain that Facebook is censoring them.)
  • Inside Facebook, the account drove executives crazy. Some believed that the data was being misconstrued and worried that it was painting Facebook as a far-right echo chamber. Others worried that the lists might spook investors by suggesting that Facebook’s U.S. user base was getting older and more conservative. Every time a tweet went viral, I got grumpy calls from Facebook executives who were embarrassed by the disparity between what they thought Facebook was — a clean, well-lit public square where civility and tolerance reign — and the image they saw reflected in the Twitter lists.
  • Mr. Boland, the former Facebook vice president, said that was a convenient deflection. He said that in internal discussions, Facebook executives were less concerned about the accuracy of the data than about the image of Facebook it presented. “It told a story they didn’t like,” he said of the Twitter account, “and frankly didn’t want to admit was true.”
  • executives argued that my Top 10 lists were misleading. They said CrowdTangle measured only “engagement,” while the true measure of Facebook popularity would be based on “reach,” or the number of people who actually see a given post. (With the exception of video views, reach data isn’t public, and only Facebook employees and page owners have access to it.)
  • Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad. But Mr. Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists. “Reach leaderboard isn’t a total win from a comms point of view,” Mr. Silverman wrote.
  • Mr. Schultz, Facebook’s chief marketing officer, had the dimmest view of CrowdTangle. He wrote that he thought “the only way to avoid stories like this” would be for Facebook to publish its own reports about the most popular content on its platform, rather than releasing data through CrowdTangle. “If we go down the route of just offering more self-service data you will get different, exciting, negative stories in my opinion,” he wrote.
  • there’s a problem with reach data: Most of it is inaccessible and can’t be vetted or fact-checked by outsiders. We simply have to trust that Facebook’s own, private data tells a story that’s very different from the data it shares with the public.
  • Mr. Zuckerberg is right about one thing: Facebook is not a giant right-wing echo chamber. But it does contain a giant right-wing echo chamber — a kind of AM talk radio built into the heart of Facebook’s news ecosystem, with a hyper-engaged audience of loyal partisans who love liking, sharing and clicking on posts from right-wing pages, many of which have gotten good at serving up Facebook-optimized outrage bait at a consistent clip.
  • CrowdTangle’s data made this echo chamber easier for outsiders to see and quantify. But it didn’t create it, or give it the tools it needed to grow — Facebook did — and blaming a data tool for these revelations makes no more sense than blaming a thermometer for bad weather.
  • It’s worth noting that these transparency efforts are voluntary, and could disappear at any time. There are no regulations that require Facebook or any other social media companies to reveal what content performs well on their platforms, and American politicians appear to be more interested in fighting over claims of censorship than getting access to better data.
  • It’s also worth noting that Facebook can turn down the outrage dials and show its users calmer, less divisive news any time it wants. (In fact, it briefly did so after the 2020 election, when it worried that election-related misinformation could spiral into mass violence.) And there is some evidence that it is at least considering more permanent changes.
  • This year, Mr. Hegeman, the executive in charge of Facebook’s news feed, asked a team to figure out how tweaking certain variables in the core news feed ranking algorithm would change the resulting Top 10 lists, according to two people with knowledge of the project.
  • The project, which some employees refer to as the “Top 10” project, is still underway, the people said, and it’s unclear whether its findings have been put in place. Mr. Osborne, the Facebook spokesman, said that the team looks at a variety of ranking changes, and that the experiment wasn’t driven by a desire to change the Top 10 lists.
  • As for CrowdTangle, the tool is still available, and Facebook is not expected to cut off access to journalists and researchers in the short term, according to two people with knowledge of the company’s plans.
  • Mr. Boland, however, said he wouldn’t be surprised if Facebook executives decided to kill off CrowdTangle entirely or starve it of resources, rather than dealing with the headaches its data creates.

The Joe Biden campaign intensifies its criticism of Facebook - CNNPolitics

  • The Biden campaign ramped up its criticism of Facebook in a second letter to the social media giant, calling for the platform to reject another false anti-Joe Biden ad.
  • In Thursday's letter, Biden campaign manager Greg Schultz blasted Facebook for what he called a "deeply flawed" policy, saying it gives "blanket permission" for candidates to use the platform to "mislead American voters all while Facebook profits from their advertising dollars." The ad was paid for by a political action committee called The Committee to Defend the President.
  • "The ad you wrote us about is currently inactive on our platform and is not running. Should it become active it would then be sent to third party fact checkers where they would have the option to review it," Harbath said.
  • The ad, which is now inactive, accuses the 2020 Democratic presidential candidate of "blackmailing" US allies, alleging impropriety while he was vice president and his son Hunter served on the board of a Ukrainian energy company. There is no evidence of wrongdoing by Joe or Hunter Biden. The ad ends with the narrator saying, "Send Quid Pro Joe Biden into retirement."
  • Warren recently ran a false ad on Facebook deliberately to draw attention to the issue. Facebook then tweeted at Warren that the "FCC doesn't want broadcast companies censoring candidates' speech." It continued, "We agree it's better to let voters -- not companies -- decide."
  • Under current policy, Facebook exempts ads by politicians from third-party fact-checking -- a loophole, both Biden and Warren allege, that allows Zuckerberg to continue taking money from President Donald Trump's campaign and its supporters despite Trump's ads spreading lies about Biden and his son. While it allows politicians to run ads with false information, its policy does not allow PACs to do so. Facebook does not fact check ads from PACS before they run on the platform.

Air travel shows what happens when we give companies ruinous power over us - The Washington Post

  • Like 40 percent of U.S. adults, I regularly wouldn’t be able to scrounge $400 in a crisis. But if you don’t have $400 (or considerably more) on hand, your poverty can trouble you in all sorts of other, more mundane ways, thanks to the abusive nature of the companies that provide us with services.
  • odysseys like mine are not — or are not merely — tales of airline villainy. They are stories about the background radiation of our rapacious economy, one in which customer and corporate desperation unwittingly amplify each other, accelerating the mutual distrust.
  • Nowhere is this cycle more apparent than airports, where holidays, weekends and rush hours are attacks on the notion that our time has value
  • What is most galling about this economy is that we are supposed to proffer compliance and complicity as companies profit amorally off of us. Facebook unveils supposedly robust privacy protections on the same day it launches a service to connect you with your “secret crush.”
  • You’re supposed to pay whatever rent landlords want, whatever bills hospitals charge, whatever price surge the car-share makes up.
  • From Apple to John Deere, digital-rights-management technology has made us “tenants on our own devices.” The terms of service turn us into the servants. And what recourse do we have? We ask to speak with the manager, vent to Yelp, endure the hold muzak and hack our way to rival bargains. But let’s be honest: We don’t have power.
  • “How can you treat us like this? Do you think that this is normal?” Hundreds in the line broke into applause. At no point in those 12 hours did a United employee walk up and down the line to see how we were doing, offer blankets or water, or get our customer service session started early, the way they do in long lines at, say, Starbucks.
  • “What you need to do,” Benilda said, “is buy a new ticket. Because now you’ll just be on standby for the next flight and the next. That could last for days.”
  • For those of us living hand-to-mouth — which is to say, most of us — it takes years of nothing going wrong to earn your way out of poverty. I had gone wrong: I had slept, awaking back at square one
  • Maybe a few of us were in dire straits because we were confused or uninformed or lazy or irresponsible, a common argument about why people remain poor. But not all of us. Besides, personal fortitude is no match for structural inequalities.
  • Fifty-three hours after arriving at the airport in Newark, I landed in San Francisco; I’d scored a standby seat. My trip took almost triple the time it would have in 1933, when the transcontinental Boeing 247 debuted. Driving across the country would have been nine hours faster.
  • What is strangest and saddest about the broad brokenness of America is that, actually, this is the way it works. Have-not consumers pay to be complicit in our own fleecing. That is the toxic marrow in America’s bones. More than a century after conquering the onetime impossibility of flight, we have yet to master the long-time impossibility of fairness.

As the US descends into chaos, what better time for Britain to go the same way? | US Ca...

  • It was nice to hear from Mark Zuckerberg, who grandly announced he’d blocked Trump’s Facebook and Instagram accounts. This is not so much a case of shutting the stable door after the horse has bolted as doping the horse, whipping it into a frenzy, encouraging it to bolt, fostering a world in which humans are subjugated by horses, monetising every snort and whinny, allowing the very existence of “humans” and “horses” to become just one of a bunch of competing opinions, and then – only when that one particular horse has outlived its usefulness and seems destined for the glue factory – gently closing the stable door with a self-satisfied little “click”.
  • there would never have been a Trump presidency without Fox News, with the channel spending years before his election pushing his birtherism, boomer-bait and belief that the news is really just another TV show whose ratings were his primary obsession. Doubling down on all its worst instincts from the moment Barack Obama was elected, Fox News terrified and radicalised with wild disinformation, creating a post-fact black hole so powerful that even previously mild-mannered rivals got sucked into it.

Opinion | Facebook Has Been a Disaster for the World - The New York Times

  • Facebook has been incredibly lucrative for its founder, Mark Zuckerberg, who ranks among the wealthiest men in the world. But it’s been a disaster for the world itself, a powerful vector for paranoia, propaganda and conspiracy-theorizing as well as authoritarian crackdowns and vicious attacks on the free press. Wherever it goes, chaos and destabilization follow.
  • The most disturbing revelations from Zhang’s memo relate to the failure of Facebook to take swift action against coordinated activity in countries like Honduras and Azerbaijan, where political leaders used armies of fake accounts to attack opponents and undermine independent media. “We simply didn’t care enough to stop them.”
  • “In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions,” Zhang wrote. “I have personally made decisions that affected national presidents without oversight and taken action to enforce against so many prominent politicians globally that I’ve lost count.”
  • “There are five major ways that authoritarian regimes exploit Facebook and other social media services,” Siva Vaidhyanathan, a media scholar at the University of Virginia, writes in “Antisocial Media: How Facebook Disconnects Us and Undermines Democracy.” They can “organize countermovements to emerging civil society or protest movements,” “frame the public debate along their terms,” let citizens “voice complaints without direct appeal or protest” and “coordinate among elites to rally support.” They can also use social media to aid in the “surveillance and harassment of opposition activists and journalists.”
  • Facebook, according to the company’s own investigation, is home to thousands of QAnon groups and pages with millions of members and followers. Its recommendation algorithms push users to engage with QAnon content, spreading the conspiracy to people who may never have encountered it otherwise
  • Similarly, a report from the German Marshall Fund pegs the recent spate of fire conspiracies — false claims of arson in Oregon by antifa or Black Lives Matter — to the uncontrolled spread of rumors and disinformation on Facebook.

Opinion | What Years of Emails and Texts Reveal About Your Friendly Tech Companies - The New York Times

  • The picture that emerges from these documents is not one of steady entrepreneurial brilliance. Rather, at points where they might have been vulnerable to hotter, newer start-ups, Big Tech companies have managed to avoid the rigors of competition. Their two main tools — buying their way out of the problem and a willingness to lose money — are both made possible by sky-high Wall Street valuations, which go only higher with acquisitions of competitors, fueling a cycle of enrichment and consolidation of power.
  • As Mr. Zuckerberg bluntly boasted in an email, because of its immense wealth Facebook “can likely always just buy any competitive start-ups.”
  • The greater scandal here may be that the federal government has let these companies get away with this
  • the government in the 2010s allowed more than 500 start-up acquisitions to go unchallenged. This hands-off approach effectively gave tech executives a green light to consolidate the industry.
  • It may be profitable and savvy to eliminate rivals to maintain a monopoly, but it remains illegal in this country under the Sherman Antitrust Act and Standard Oil v. United States. Unless we re-establish that legal fact, Big Tech will continue to fight dirty and keep on winning.

There's Rich, And There's Jeff Bezos Rich: Meet The World's Centibillionaires : NPR

  • You probably think 2020 has turned out to be a pretty lousy year, what with the coronavirus pandemic, a global recession and unceasing partisan warfare in Washington. Then again, you're not Jeff Bezos or Elon Musk.
  • Bezos: With a net worth of $182 billion, the Amazon founder is by far the wealthiest person on the planet.
  • Forbes magazine once called Bezos the richest human being who has ever lived.
  • "It is bigger than the GDP of most countries in the world. I mean, it is larger than the market cap of many companies that are on the S&P 500,"
  • Rounding out the group of five centibillionaires are Musk, Bill Gates of Microsoft with a net worth of $129 billion
  • In fact, Musk and Zuckerberg each ascended to the $100 billion club this year after shares of Tesla and Facebook rose 677% and 39%, respectively.
  • "It's not anything to celebrate. It's kind of a disturbing milestone," he says. "But I think it's a predictable outcome of four decades of flat wages and steadily concentrating wealth and power."
  • Bezos, for example, has such a staggering fortune that despite going through a messy divorce and giving up one quarter of his Amazon shares to his ex-wife, MacKenzie Scott, he still remains the world's richest person.

Election Fraud Attack: Ex-Houston Police Captain Charged With Assaulting Man : NPR

  • The suspect, Mark Anthony Aguirre, told police he was part of a group of private citizens investigating claims of the massive fraud allegedly funded by Facebook CEO Mark Zuckerberg and involving election ballots forged by Hispanic children. He said the plot was underway in Harris County, Texas, prior to the Nov. 3 election.
    • carolinehayter: the absurdity of that statement...
  • Aguirre said he was working for the group Liberty Center for God and Country when, on Oct. 19, he pulled a gun on a man who he believed was the mastermind of the scheme.
  • Authorities found no evidence that he was involved in any fraud scheme claimed by Aguirre.
  • Harris County District Attorney Kim Ogg said Aguirre "crossed the line from dirty politics to commission of a violent crime and we are lucky no one was killed."
  • "His alleged investigation was backward from the start — first alleging a crime had occurred and then trying to prove it happened," Ogg said.
  • Claims of voter fraud during this year's election — by President Trump, Aguirre and others — have been debunked. Evidence that President-elect Joe Biden won the election hasn't stopped Trump and others from challenging the results in court — an effort that has also repeatedly failed. This week, the Electoral College made Biden's victory official.
  • Aguirre's scheme was reportedly part of a paid investigation by the Liberty Center group, whose CEO is Republican activist Steven Hotze. It was later discovered that Aguirre was paid $266,400 by the organization for his involvement.
  • Liberty Center for God and Country's Facebook page says the organization's goal "is to provide the bold and courageous leadership necessary to restore our nation to its Godly heritage by following the strategy that our pilgrim forefathers gave us."
  • …should be "tarred and feathered" for coronavirus lockdown measures in the state.
  • he had raised more than $600,000 over a three-week period
  • That fundraising push, Hotze said, "prevented the Democrats from carrying out their massive election fraud scheme in Harris County, and prevented them from carrying Texas for Biden. Our efforts saved Texas."
  • Aguirre and two other unidentified companions with the Liberty Center watched the victim for four days prior to the Oct. 19 attack, according to police records. They were convinced that there were 750,000 fraudulent ballots in the man's vehicle and home.
  • Aguirre said the victim was using Hispanic children to sign the ballots because children's fingerprints wouldn't appear on any database, according to the affidavit. He also claimed Facebook's founder gave $9.37 billion for "ballot harvesting."
  • The victim was driving his box truck during the early morning hours of Oct. 19, when he noticed a black SUV pull into his lane, almost hitting him. A few seconds later, the driver of the SUV, later identified as Aguirre, allegedly slammed into the back of the man's vehicle. When the victim pulled over and got out to check on Aguirre, the former police officer allegedly pointed a gun at the victim and demanded he get on the ground.
  • While Aguirre had his knee in the man's back, according to the affidavit, he ordered two other people who arrived on the scene to search the victim's truck. One of them then drove the truck as Aguirre kept the man pinned to the ground. The truck was found abandoned a few blocks away about 30 minutes after the incident. When police searched the victim's truck, only air-conditioner parts and tools were found. No ballots were discovered in the truck or in the man's home. Aguirre was charged with aggravated assault with a deadly weapon, a second-degree felony punishable by up to 20 years in prison.
  • An ex-captain in the Houston Police Department was arrested Tuesday for allegedly running a man off the road and assaulting him in an attempt to prove a bizarre voter-fraud conspiracy pushed by a right-wing organization.

Review: 'The Contrarian,' Max Chafkin's Biography of Peter Thiel - The Atlantic

  • He came under the influence of the Stanford philosopher René Girard, who placed the imitative instinct at the center of human behavior.
  • In Girard’s telling, imitation generated conflict, as people fought for the same things—the same jobs, schools, and material possessions—even though such trophies would fail to make them happier. Life, Thiel eventually would come to realize, could be cast as a struggle to escape the false siren of copycat cravings. To be free, you had to carve your own path. You had to be a contrarian.

Facebook flounders in the court of public opinion | The Economist

  • “YOU ARE a 21st-century American hero,” gushed Ed Markey, a Democratic senator from Massachusetts. He was not addressing the founder of one of the country’s largest companies, Facebook, but the woman who found fault with it
  • Frances Haugen, who had worked at the social-media giant before becoming a whistleblower, testified in front of a Senate subcommittee for over three hours on October 5th, highlighting Facebook’s “moral bankruptcy” and the firm’s downplaying of its harmful impact, including fanning teenage depression and ethnic violence.
  • Facebook’s own private research, for example, found that its photo-sharing site, Instagram, worsened teens’ suicidal thoughts and eating disorders. Yet it still made a point of sending young users engaging content that stoked their anxiety—while proceeding to develop a version of its site for those under the age of 13.
  • In 2018 a different whistleblower outed Facebook for its sketchy collaboration with Cambridge Analytica, a research organisation that allowed users’ data to be collected without their consent and used for political profiling by Donald Trump’s campaign. Facebook’s founder, Mark Zuckerberg, went to Washington, DC to apologise, and in 2019 America’s consumer-protection agency, the Federal Trade Commission, agreed to a $5bn settlement with Facebook. That is the largest fine ever levied against a tech firm.
  • Congress has repeatedly called in tech bosses for angry questioning and public shaming without taking direct action afterwards.
  • Senators, who cannot agree on such uncontroversial things as paying for the government’s expenses, united against a common enemy and promised Ms Haugen that they would hold Facebook to account.
  • Congress could update and strengthen the Children’s Online Privacy Protection Act (COPPA), which was passed in 1998 and bars the collection of data from children under the age of 13.
  • If Congress does follow through with legislation, it is likely to focus narrowly on protecting children online, as opposed to broader reforms, for which there is still no political consensus.
  • Social media’s harm to children and teenagers is a concern that transcends partisanship and is easier to understand than sneaky data-gathering, viral misinformation and other social-networking sins.
  • Other legislative proposals take aim at manipulative marketing and design features that make social media so addictive for the young.
  • However, Ms Haugen’s most significant impact on big tech may be inspiring others to come forward and blow the whistle on their employers’ malfeasance.
  • “A case like this one opens the floodgates and will trigger hundreds more cases,” predicts Steve Kohn, a lawyer who has represented several high-profile whistleblowers.
  • One is the industry’s culture of flouting rules and a history of non-compliance. Another is a legal framework that makes whistleblowing less threatening and more attractive than it used to be.
  • The Dodd-Frank Act, which was enacted in 2010, gives greater protections to whistleblowers by preventing retaliation from employers and by offering rewards of 10-30% of the money collected from sanctions against a firm in successful cases.
  • If the threat of public shaming encourages corporate accountability, that is a good thing. But it could also make tech firms less inclusive and transparent, predicts Matt Perault, a former Facebook executive who is director of the Centre for Technology Policy at the University of North Carolina at Chapel Hill.
  • People may become less willing to share off-the-wall ideas if they worry about public leaks; companies may become less open with their staff; and executives could start including only a handful of trusted senior staff in meetings that might have otherwise been less restricted.
  • Facebook and other big tech firms, which have been criticised for violating people’s privacy online, can no longer count on any privacy either.

How Facebook Failed the World - The Atlantic - 0 views

  • In the United States, Facebook has facilitated the spread of misinformation, hate speech, and political polarization. It has algorithmically surfaced false information about conspiracy theories and vaccines, and was instrumental in the ability of an extremist mob to attempt a violent coup at the Capitol. That much is now painfully familiar.
  • these documents show that the Facebook we have in the United States is actually the platform at its best. It’s the version made by people who speak our language and understand our customs, who take our civic problems seriously because those problems are theirs too. It’s the version that exists on a free internet, under a relatively stable government, in a wealthy democracy. It’s also the version to which Facebook dedicates the most moderation resources.
  • Elsewhere, the documents show, things are different. In the most vulnerable parts of the world—places with limited internet access, where smaller user numbers mean bad actors have undue influence—the trade-offs and mistakes that Facebook makes can have deadly consequences.
  • According to the documents, Facebook is aware that its products are being used to facilitate hate speech in the Middle East, violent cartels in Mexico, ethnic cleansing in Ethiopia, extremist anti-Muslim rhetoric in India, and sex trafficking in Dubai. It is also aware that its efforts to combat these things are insufficient. A March 2021 report notes, “We frequently observe highly coordinated, intentional activity … by problematic actors” that is “particularly prevalent—and problematic—in At-Risk Countries and Contexts”; the report later acknowledges, “Current mitigation strategies are not enough.”
  • As recently as late 2020, an internal Facebook report found that only 6 percent of Arabic-language hate content on Instagram was detected by Facebook’s systems. Another report that circulated last winter found that, of material posted in Afghanistan that was classified as hate speech within a 30-day range, only 0.23 percent was taken down automatically by Facebook’s tools. In both instances, employees blamed company leadership for insufficient investment.
  • last year, according to the documents, only 13 percent of Facebook’s misinformation-moderation staff hours were devoted to the non-U.S. countries in which it operates, whose populations comprise more than 90 percent of Facebook’s users.
  • Among the consequences of that pattern, according to the memo: The Hindu-nationalist politician T. Raja Singh, who posted to hundreds of thousands of followers on Facebook calling for India’s Rohingya Muslims to be shot—in direct violation of Facebook’s hate-speech guidelines—was allowed to remain on the platform despite repeated requests to ban him, including from the very Facebook employees tasked with monitoring hate speech.
  • The granular, procedural, sometimes banal back-and-forth exchanges recorded in the documents reveal, in unprecedented detail, how the most powerful company on Earth makes its decisions. And they suggest that, all over the world, Facebook’s choices are consistently driven by public perception, business risk, the threat of regulation, and the specter of “PR fires,” a phrase that appears over and over in the documents.
  • “It’s an open secret … that Facebook’s short-term decisions are largely motivated by PR and the potential for negative attention,” an employee named Sophie Zhang wrote in a September 2020 internal memo about Facebook’s failure to act on global misinformation threats.
  • In a memo dated December 2020 and posted to Workplace, Facebook’s very Facebooklike internal message board, an employee argued that “Facebook’s decision-making on content policy is routinely influenced by political considerations.”
  • To hear this employee tell it, the problem was structural: Employees who are primarily tasked with negotiating with governments over regulation and national security, and with the press over stories, were empowered to weigh in on conversations about building and enforcing Facebook’s rules regarding questionable content around the world. “Time and again,” the memo quotes a Facebook researcher saying, “I’ve seen promising interventions … be prematurely stifled or severely constrained by key decisionmakers—often based on fears of public and policy stakeholder responses.”
  • And although Facebook users post in at least 160 languages, the company has built robust AI detection in only a fraction of those languages, the ones spoken in large, high-profile markets such as the U.S. and Europe—a choice, the documents show, that means problematic content is seldom detected.
  • A 2020 Wall Street Journal article reported that Facebook’s top public-policy executive in India had raised concerns about backlash if the company were to do so, saying that cracking down on leaders from the ruling party might make running the business more difficult.
  • Employees weren’t placated. In dozens and dozens of comments, they questioned the decisions Facebook had made regarding which parts of the company to involve in content moderation, and raised doubts about its ability to moderate hate speech in India. They called the situation “sad” and Facebook’s response “inadequate,” and wondered about the “propriety of considering regulatory risk” when it comes to violent speech.
  • “I have a very basic question,” wrote one worker. “Despite having such strong processes around hate speech, how come there are so many instances that we have failed? It does speak on the efficacy of the process.”
  • Two other employees said that they had personally reported certain Indian accounts for posting hate speech. Even so, one of the employees wrote, “they still continue to thrive on our platform spewing hateful content.”
  • Taken together, Frances Haugen’s leaked documents show Facebook for what it is: a platform racked by misinformation, disinformation, conspiracy thinking, extremism, hate speech, bullying, abuse, human trafficking, revenge porn, and incitements to violence
  • It is a company that has pursued worldwide growth since its inception—and then, when called upon by regulators, the press, and the public to quell the problems its sheer size has created, it has claimed that its scale makes completely addressing those problems impossible.
  • Instead, Facebook’s 60,000-person global workforce is engaged in a borderless, endless, ever-bigger game of whack-a-mole, one with no winners and a lot of sore arms.
  • Zhang details what she found in her nearly three years at Facebook: coordinated disinformation campaigns in dozens of countries, including India, Brazil, Mexico, Afghanistan, South Korea, Bolivia, Spain, and Ukraine. In some cases, such as in Honduras and Azerbaijan, Zhang was able to tie accounts involved in these campaigns directly to ruling political parties. In the memo, posted to Workplace the day Zhang was fired from Facebook for what the company alleged was poor performance, she says that she made decisions about these accounts with minimal oversight or support, despite repeated entreaties to senior leadership. On multiple occasions, she said, she was told to prioritize other work.
  • A Facebook spokesperson said that the company tries “to keep people safe even if it impacts our bottom line,” adding that the company has spent $13 billion on safety since 2016. “​​Our track record shows that we crack down on abuse abroad with the same intensity that we apply in the U.S.”
  • Zhang's memo, though, paints a different picture. “We focus upon harm and priority regions like the United States and Western Europe,” she wrote. But eventually, “it became impossible to read the news and monitor world events without feeling the weight of my own responsibility.”
  • Indeed, Facebook explicitly prioritizes certain countries for intervention by sorting them into tiers, the documents show. Zhang “chose not to prioritize” Bolivia, despite credible evidence of inauthentic activity in the run-up to the country’s 2019 election. That election was marred by claims of fraud, which fueled widespread protests; more than 30 people were killed and more than 800 were injured.
  • “I have blood on my hands,” Zhang wrote in the memo. By the time she left Facebook, she was having trouble sleeping at night. “I consider myself to have been put in an impossible spot—caught between my loyalties to the company and my loyalties to the world as a whole.”
  • What happened in the Philippines—and in Honduras, and Azerbaijan, and India, and Bolivia—wasn’t just that a very large company lacked a handle on the content posted to its platform. It was that, in many cases, a very large company knew what was happening and failed to meaningfully intervene.
  • solving problems for users should not be surprising. The company is under the constant threat of regulation and bad press. Facebook is doing what companies do, triaging and acting in its own self-interest.

Opinion | What to Do About Facebook, and What Not to Do - The New York Times - 0 views

  • Facebook’s alarming power. The company is among the largest collectors of humanity’s most private information, one of the planet’s most-trafficked sources of news, and it seems to possess the ability, in some degree, to alter public discourse. Worse, essentially all of Facebook’s power is vested in Zuckerberg alone.
  • This feels intolerable; as the philosopher Kanye West put it, “No one man should have all that power.”
  • Persily proposes piercing the black box before we do anything else. He has written draft legislation that would compel large tech platforms to provide to outside researchers a range of data about what users see on the service, how they engage with it, and what information the platform provides to advertisers and governments.
  • Nathaniel Persily, a professor at Stanford Law School, has a neat way of describing the most basic problem in policing Facebook: “At present,” Persily has written, “we do not know even what we do not know” about social media’s effect on the world.
  • Rashad Robinson, president of the civil rights advocacy group Color of Change, favored another proposed law, the Algorithmic Justice and Online Platform Transparency Act, which would also require that platforms release data about how they collect and use personal information about, among other demographic categories, users’ race, ethnicity, sex, religion, gender identity, sexual orientation and disability status, in order to show whether their systems are being applied in discriminatory ways.
  • one idea as “unsexy but important”: Educating the public to resist believing everything they see online.
  • What we need, then, is something like a society-wide effort to teach people how to process digital information.
  • In his new book, “Tech Panic: Why We Shouldn’t Fear Facebook and the Future,” Robby Soave, an editor at Reason magazine, argues that the media and lawmakers have become too worked up about the dangers posed by Facebook. He doesn’t disagree that the company’s rise has had some terrible effects, but he worries that some proposals could exacerbate Facebook’s dominance — a point with which I agree.
  • But Soave will probably get what he wants. As long as there’s wide disagreement among politicians about how to address Facebook’s ills, doing nothing might be the likeliest outcome.

In India, Facebook Struggles to Combat Misinformation and Hate Speech - The New York Times - 0 views

  • On Feb. 4, 2019, a Facebook researcher created a new user account to see what it was like to experience the social media site as a person living in Kerala, India. For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook’s algorithms to join groups, watch videos and explore new pages on the site.
  • The result was an inundation of hate speech, misinformation and celebrations of violence, which were documented in an internal Facebook report published later that month. “Following this test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total,” the Facebook researcher wrote.
  • The report was one of dozens of studies and memos written by Facebook employees grappling with the effects of the platform on India. They provide stark evidence of one of the most serious criticisms levied by human rights activists and politicians against the world-spanning company: It moves into a country without fully understanding its potential impact on local culture and politics, and fails to deploy the resources to act on issues once they occur.
  • Facebook’s problems on the subcontinent present an amplified version of the issues it has faced throughout the world, made worse by a lack of resources and a lack of expertise in India’s 22 officially recognized languages.
  • The documents include reports on how bots and fake accounts tied to the country’s ruling party and opposition figures were wreaking havoc on national elections
  • They also detail how a plan championed by Mark Zuckerberg, Facebook’s chief executive, to focus on “meaningful social interactions,” or exchanges between friends and family, was leading to more misinformation in India, particularly during the pandemic.
  • Facebook did not have enough resources in India and was unable to grapple with the problems it had introduced there, including anti-Muslim posts,
  • Eighty-seven percent of the company’s global budget for time spent on classifying misinformation is earmarked for the United States, while only 13 percent is set aside for the rest of the world — even though North American users make up only 10 percent of the social network’s daily active users
  • That lopsided focus on the United States has had consequences in a number of countries besides India. Company documents showed that Facebook installed measures to demote misinformation during the November election in Myanmar, including disinformation shared by the Myanmar military junta.
  • In Sri Lanka, people were able to automatically add hundreds of thousands of users to Facebook groups, exposing them to violence-inducing and hateful content
  • In India, “there is definitely a question about resourcing” for Facebook, but the answer is not “just throwing more money at the problem,” said Katie Harbath, who spent 10 years at Facebook as a director of public policy, and worked directly on securing India’s national elections. Facebook, she said, needs to find a solution that can be applied to countries around the world.
  • Two months later, after India’s national elections had begun, Facebook put in place a series of steps to stem the flow of misinformation and hate speech in the country, according to an internal document called Indian Election Case Study.
  • After the attack, anti-Pakistan content began to circulate in the Facebook-recommended groups that the researcher had joined. Many of the groups, she noted, had tens of thousands of users. A different report by Facebook, published in December 2019, found Indian Facebook users tended to join large groups, with the country’s median group size at 140,000 members.
  • Graphic posts, including a meme showing the beheading of a Pakistani national and dead bodies wrapped in white sheets on the ground, circulated in the groups she joined. After the researcher shared her case study with co-workers, her colleagues commented on the posted report that they were concerned about misinformation about the upcoming elections in India
  • According to a memo written after the trip, one of the key requests from users in India was that Facebook “take action on types of misinfo that are connected to real-world harm, specifically politics and religious group tension.”
  • The case study painted an optimistic picture of Facebook’s efforts, including adding more fact-checking partners — the third-party network of outlets with which Facebook works to outsource fact-checking — and increasing the amount of misinformation it removed.
  • The study did not note the immense problem the company faced with bots in India, nor issues like voter suppression. During the election, Facebook saw a spike in bots — or fake accounts — linked to various political groups, as well as efforts to spread misinformation that could have affected people’s understanding of the voting process.
  • Facebook found that over 40 percent of top views, or impressions, in the Indian state of West Bengal were “fake/inauthentic.” One inauthentic account had amassed more than 30 million impressions.
  • A report published in March 2021 showed that many of the problems cited during the 2019 elections persisted.
  • Much of the material circulated around Facebook groups promoting Rashtriya Swayamsevak Sangh, an Indian right-wing and nationalist paramilitary group. The groups took issue with an expanding Muslim minority population in West Bengal and near the Pakistani border, and published posts on Facebook calling for the ouster of Muslim populations from India and promoting a Muslim population control law.
  • Facebook also hesitated to designate R.S.S. as a dangerous organization because of “political sensitivities” that could affect the social network’s operation in the country.
  • Of India’s 22 officially recognized languages, Facebook said it has trained its A.I. systems on five. (It said it had human reviewers for some others.) But in Hindi and Bengali, it still did not have enough data to adequately police the content, and much of the content targeting Muslims “is never flagged or actioned,” the Facebook report said.

Jan. 6 Committee Subpoenas Twitter, Meta, Alphabet and Reddit - The New York Times - 0 views

  • The House committee investigating the Jan. 6 attack on the Capitol issued subpoenas on Thursday to four major social media companies — Alphabet, Meta, Reddit and Twitter — criticizing them for allowing extremism to spread on their platforms and saying they have failed to cooperate adequately with the inquiry.
  • In letters accompanying the subpoenas, the panel named Facebook, a unit of Meta, and YouTube, which is owned by Alphabet’s Google subsidiary, as among the worst offenders that contributed to the spread of misinformation and violent extremism.
  • The committee sent letters in August to 15 social media companies — including sites where misinformation about election fraud spread, such as the pro-Trump website TheDonald.win — seeking documents pertaining to efforts to overturn the election and any domestic violent extremists associated with the Jan. 6 rally and attack.
  • “It’s disappointing that after months of engagement, we still do not have the documents and information necessary to answer those basic questions,”
  • In the days after the attack, Reddit banned a discussion forum dedicated to former President Donald J. Trump, where tens of thousands of Mr. Trump’s supporters regularly convened to express solidarity with him.
  • In the year since the events of Jan. 6, social media companies have been heavily scrutinized for whether their sites played an instrumental role in organizing the attack.
  • In the months surrounding the 2020 election, employees inside Meta raised warning signs that Facebook posts and comments containing “combustible election misinformation” were spreading quickly across the social network, according to a cache of documents and photos reviewed by The New York Times.
  • Frances Haugen, a former Facebook employee turned whistle-blower, said the company relaxed its safeguards too quickly after the election, which then led it to be used in the storming of the Capitol.
  • On Twitter, many of Mr. Trump’s followers used the site to amplify and spread false allegations of election fraud, while connecting with other Trump supporters and conspiracy theorists using the site. And on YouTube, some users broadcast the events of Jan. 6 using the platform’s video streaming technology.
  • Meta said that it had “produced documents to the committee on a schedule committee staff requested — and we will continue to do so.”
  • The committee said letters to the four firms accompanied the subpoenas. The panel said YouTube served as a platform for “significant communications by its users that were relevant to the planning and execution of Jan. 6 attack on the United States Capitol,” including livestreams of the attack as it was taking place.
  • The panel said Facebook and other Meta platforms were used to share messages of “hate, violence and incitement; to spread misinformation, disinformation and conspiracy theories around the election; and to coordinate or attempt to coordinate the Stop the Steal movement.”
  • “Meta has declined to commit to a deadline for producing or even identifying these materials,” Mr. Thompson wrote to Mark Zuckerberg, Meta’s chief executive.
  • The panel said it was focused on Reddit because the platform hosted the r/The_Donald subreddit community that grew significantly before migrating in 2020 to the website TheDonald.win, which ultimately hosted significant discussion and planning related to the Jan. 6 attack.
  • “Unfortunately, the select committee believes Twitter has failed to disclose critical information,” the panel stated.
  • In recent years, Big Tech and Washington have had a history of butting heads. Some Republicans have accused sites including Facebook, Instagram and Twitter of silencing conservative voices.
  • The Federal Trade Commission is investigating whether a number of tech companies have grown too big, and in the process abused their market power to stifle competition. And a bipartisan group of senators and representatives continues to say sites like Facebook and YouTube are not doing enough to curb the spread of misinformation and conspiracy theories.
  • After months of discussions with the companies, only the four large corporations were issued subpoenas on Thursday, because the committee said the firms were “unwilling to commit to voluntarily and expeditiously” cooperating with its work.
  • The panel has interviewed more than 340 witnesses and issued dozens of subpoenas, including for bank and phone records.

Facebook whistleblower '60 Minutes' interview: Frances Haugen says the company prioriti... - 0 views

  • The identity of the Facebook whistleblower who released tens of thousands of pages of internal research and documents — leading to a firestorm for the social media company in recent weeks — was revealed on "60 Minutes" Sunday night as Frances Haugen.
  • The 37-year-old former Facebook product manager who worked on civic integrity issues at the company says the documents show that Facebook knows its platforms are used to spread hate, violence and misinformation
  • "Facebook over and over again chose to optimize for its own interests, like making more money," Haugen told "60 Minutes."
  • Haugen filed at least eight complaints with the Securities and Exchange Commission alleging that the company is hiding research about its shortcomings from investors and the public.
  • Haugen, who started at Facebook in 2019 after previously working for other tech giants like Google (GOOGL) and Pinterest (PINS), is set to testify on Tuesday before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security.
  • Facebook has aggressively pushed back against the reports, calling many of the claims "misleading" and arguing that its apps do more good than harm.
  • Facebook spokesperson Lena Pietsch said in a statement to CNN Business immediately following the "60 Minutes" interview: "We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true."
  • Pietsch released a more than 700-word statement laying out what it called "missing facts" from the segment
  • Haugen said she believes Facebook founder and CEO Mark Zuckerberg "never set out to make a hateful platform, but he has allowed choices to be made where the side effects of those choices are that hateful and polarizing content gets more distribution and more reach."
  • Haugen said she was recruited by Facebook in 2019 and took the job to work on addressing misinformation. But after the company decided to dissolve its civic integrity team shortly after the 2020 Presidential Election, her feelings about the company started to change.
  • The social media company's algorithm that's designed to show users content that they're most likely to engage with is responsible for many of its problems
  • Haugen said that while "no one at Facebook is malevolent ... the incentives are misaligned."
  • the more anger that they get exposed to, the more they interact and the more they consume."

The Ugly Honesty of Elon Musk's Twitter Rebrand - The Atlantic - 0 views

  • Sexual desire and frustration, familiar feelings for the outcast teenage nerd, pervade the social internet. S3xy-ness is everywhere. Posts by women are dismayingly likely to produce advances, or threats, from creepers on all platforms; at the same time, sex appeal is a pillar for the influencer economy, or else a viable and even noble way to win financial independence. The internet is for porn, as the song goes.
  • In all these ways, online life today descends from where it started, as a safe harbor for the computer nerds who made it. They were socially awkward, concerned with machines instead of people, and devoted to the fantasy of converting their impotence into power.
  • Musk’s obsession with X as a brand, and his childish desire to broadcast that obsession from the rooftops in hoggish, bright pulsations, calls attention to this baggage. It reminds us that the world’s richest man is a computer geek, but one with enormous power instead of none
  • When that conversion was achieved, and the nerds took over the world, they adopted the bravado of the jocks they once despised. (Zuck-Musk cage match, anyone?) But they didn’t stop being nerds. We, the public, never agreed to adopt their worldview as the basis for political, social, or aesthetic life. We got it nevertheless.
  • It calls attention to the putrid smell that suffuses the history of the internet
  • I’m kind of tired of pretending that the stench does not exist, as if doing otherwise would be tantamount to expressing prejudice against neurodivergence. This is a bad culture, and it always has been.
  • If the X rebrand disgusts you—if, like me, you’ve been made a little queasy by having the new logo thrust upon your phone via automatic update—that feeling is about more than Musk alone. He has merely surfaced what has been there all along. The internet is magical and empowering. The internet is childish and disgusting.