History Readings: Group items tagged zuckerberg

Javier E

The Mark Zuckerberg Manifesto: Great for Facebook, Bad for Journalism - The Atlantic

  • 85 percent of all online advertising revenue is funneled to either Facebook or Google—leaving a paltry 15 percent for news organizations to fight over.
  • Now, Zuckerberg is making it clear that he wants Facebook to take over many of the actual functions—not just ad dollars—that traditional news organizations once had.
  • Zuckerberg uses abstract language in his memo—he wants Facebook to develop “the social infrastructure for community,” he writes—but what he’s really describing is building a media company with classic journalistic goals: The Facebook of the future, he writes, will be “for keeping us safe, for informing us, for civic engagement, and for inclusion of all.”
  • In the past, the deaths of news organizations have jeopardized the prospect of a safe, well-informed, civically engaged community
  • One 2014 paper found a substantial drop-off in civic engagement in both Seattle and Denver from 2008 to 2009, after both cities saw the closure of longstanding daily newspapers
  • The problem is that Zuckerberg lays out concrete ideas about how to build community on Facebook, how to encourage civic engagement, and how to improve the quality and inclusiveness of discourse—but he bakes in an assumption that news, which has always been subsidized by the advertising dollars his company now commands, will continue to feed into Facebook’s system at little to no cost to Facebook
  • In some ways, Zuckerberg is building a news organization without journalists. The uncomfortable truth for journalists, though, is that Facebook is much better at community building in the digital age than news organizations are.
  • Facebook is asking its users to act as unpaid publishers and curators of content
  • for context: The Japanese newspaper Yomiuri Shimbun claims that its circulation of 9 million copies daily makes it the largest in the world
  • Last quarter, Facebook counted nearly 1.9 billion monthly active users.
  • The New York Times had about 1.6 million digital subscribers as of last fall.
  • you can see how Zuckerberg is continuing to push Facebook’s hands-off approach to editorial responsibility. Facebook is outsourcing its decision-making power about what’s in your News Feed. Instead of a newspaper editor deciding what goes on the front page, the user will decide.
  • “For those who don’t make a decision, the default will be whatever the majority of people in your region selected, like a referendum,” Zuckerberg wrote. Which makes some sense. There are all kinds of issues with an American company imposing its cultural values uniformly on 1.9 billion individuals all over the world.
  • In the United States, the combined daily prime time average viewership for CNN, Fox News, and MSNBC was 3.1 million people in 2015,
  • and now also to act as unpaid editors, volunteering to teach Facebook’s algorithmic editors how and when to surface the content Facebook does not pay for.
  • In other words, Facebook is building a global newsroom run by robot editors and its own readers.
  • he must also realize that what he’s building is a grave threat to journalism
  • Lip service to the crucial function of the Fourth Estate is not enough to sustain it. All of this is the news industry’s problem, not Zuckerberg’s. But it’s also a problem for anyone who believes in and relies on quality journalism to make sense of the world.
  • Zuckerberg doesn’t want Facebook to kill journalism as we know it. He really, really doesn’t. But that doesn’t mean he won’t.
Javier E

Opinion | George Soros: Mark Zuckerberg Should Not Be in Control of Facebook - The New York Times

  • I believe that Mr. Trump and Facebook’s chief executive, Mark Zuckerberg, realize that their interests are aligned — the president’s in winning elections, Mr. Zuckerberg’s in making money.
  • In 2016, Facebook provided the Trump campaign with embedded staff who helped to optimize its advertising program. (Hillary Clinton’s campaign was also approached, but it declined to embed a Facebook team in her campaign’s operations.)
  • Brad Parscale, the digital director of Mr. Trump’s 2016 campaign and now his campaign manager for 2020, said that Facebook helped Mr. Trump and gave him the edge. This seems to have marked the beginning of a special relationship.
  • Mr. Zuckerberg met with Mr. Trump in the Oval Office on Sept. 19, 2019. We don’t know what was said. But from an interview on the sidelines at the World Economic Forum on Jan. 22, we do know what Mr. Trump said about the meeting: Mr. Zuckerberg “told me that I’m No. 1 in the world in Facebook.”
  • Mr. Trump apparently had no problem with Facebook’s decision not to fact-check political ads. “I’d rather have him just do whatever he is going to do,” Mr. Trump said of Mr. Zuckerberg. “He’s done a hell of a job, when you think of it.”
  • Facebook’s decision not to require fact-checking for political candidates’ advertising in 2020 has flung open the door for false, manipulated, extreme and incendiary statements. Such content is rewarded with prime placement and promotion if it meets Facebook-designed algorithmic standards for popularity and engagement.
  • Facebook’s design tends to obscure the sources of inflammatory and false content, and fails to adequately punish those who spread false information. Nor does the company effectively warn those who are exposed to lies.
  • Facebook has been used to cause worse damage in other countries than in the United States. In Myanmar, for example, military personnel used Facebook to help incite the public against the Rohingya, who were targeted in a military assault of incredible cruelty, including murder, rape and the burning of entire villages. Around 700,000 Rohingya fled to Bangladesh.
  • within the last year, Facebook has introduced new features on its mobile app that actually intensify the fire of incendiary political attacks — making them easier and quicker to propagate. The system is cost-free to the poster and revenue-generating for Facebook.
  • Facebook is a publisher, not just a neutral moderator or “platform.” It should be held accountable for the content that appears on its site
  • I repeat and reaffirm my accusation against Facebook under the leadership of Mr. Zuckerberg and Ms. Sandberg. They follow only one guiding principle: maximize profits irrespective of the consequences
Javier E

A Revealing Look At Zuckerberg | Talking Points Memo

  • these tradeoffs get to the heart of Facebook’s problem and the heart of what the site is. The harm is inherent to Facebook’s business model. When you find ways to reduce harm, they almost always come at the expense of engagement metrics, the maximization of which is the goal of basically everything Facebook does. The comparison may be a loaded or contentious one. But it is a bit like the tobacco companies. The product is the problem, not how it’s used or abused. It’s the product. That’s a challenging place for a company to be.
  • Facebook now makes up a very big part of the whole global information ecosystem. In many countries around the world, Facebook for all intents and purposes is the Internet. The weather patterns of information, as we might call them, are heavily shaped by Facebook’s algorithms and the various tweaks and adjustments it makes to them in different countries. Facebook may not create the misinformation or hate speech or hyper-nationalist frenzies, but its algorithms help drive them.
  • the guiding light for those algorithms is first to maximize engagement.
  • That part we know. That’s the business model. But in a different way they are driven by the goals and drives of this one guy, Mark Zuckerberg
  • my read is that it was more the ‘winning’ part than the money, though of course the two become somewhat indistinguishable. So Zuckerberg is a near free speech absolutist, as the story conveys. Except when it might mean going dark in a medium-to-large-sized country.
  • One interesting anecdote in the article comes out of Vietnam, where Facebook is estimated to make about $1 billion a year. A few years ago, Vietnam demanded that Facebook start censoring anti-government posts, or really any criticism of the government, or be taken offline in the country. Essentially Vietnam insisted that Facebook delegate content moderation within Vietnam to the government of Vietnam. Zuckerberg personally made the decision to agree to the demands.
  • This article and much else make pretty clear that it really is still Mark Zuckerberg who runs the show. And what drives him? This article and much else suggest that what shapes Zuckerberg’s goals are perhaps three things, in descending order: 1) to win (in all its dimensions), 2) to maximize profits and 3) to cater to the complaints of the right, which complains most effectively and aggressively about purported mistreatment.
  • He apparently justified this on the reasoning that Facebook disappearing in Vietnam would take away the speech rights of more people than the censorship would. If that sounds like self-justifying nonsense, thank you for reading closely.
  • this is just too much power for one person to have. But it’s more that the win, win, win!!! mentality, which lots of CEOs and especially founder-CEOs certainly have in spades, is here harnessed to an engine that does a lot of damage.
  • Back in 2018 I wrote about a distinct but related issue. No big tech company has been worse about launching new ventures or ideas, letting whole cottage industries grow up around those ventures, and then shifting gears and leaving countless partner businesses to go belly up
  • there is a related indifference or obliviousness to the impact or social costs of what Facebook does, if in many cases only because of its sheer scale.
  • This isn’t just corporate culture, or perhaps Zuckerberg himself. A lot of it is tied to Facebook’s relationship to the rest of the web. Google is structurally much more connected to and reliant on the open web. Facebook is much more a closed system which remains highly profitable regardless of the chaos it may create around it.
zachcutler

Why Mark Zuckerberg (probably) won't run for office - Jan. 12, 2017

  • When a public figure hires a top political campaign manager and announces plans to tour the country, you assume he or she plans to run for office.
  • "After a tumultuous last year, my hope for this challenge is to get out and talk to more people about how they're living, working and thinking about the future," Zuckerberg wrote in a post, sounding curiously like a politician.
  • Beyond that, Callahan doubts Zuckerberg is ready to put any public office ahead of Facebook.
  • Zuckerberg will be 36 in 2020, just barely old enough to legally serve as president. Apart from the age issue, Tusk questions whether the tech executive has the "temperament and personality" to wage a successful presidential campaign.
  • Yet Tusk thinks this probably isn't true for Zuckerberg because most political offices would be a step down for him.
  • "I like my current job at the Foundation better than I would being President. Also I wouldn't be good at doing what you need to do to get elected,"
  • The most likely politician in Facebook's C-Suite is COO Sheryl Sandberg.
  • During the presidential campaign, Sandberg was rumored to be on the short list to serve in Hillary Clinton's cabinet as Treasury Secretary.
Javier E

Facebook's Mark Zuckerberg, at a Turning Point - NYTimes.com

  • interviews with dozens of venture capitalists and entrepreneurs in Silicon Valley, as well as with Facebook colleagues and outsiders who have mentored him along his climb, paint a promising picture. Beneath that hoodie, these people say, is an increasingly assured leader, one tempered by failures — and there have been some big ones — as well as astonishing successes.
  • He cultivated as advisers such tech giants as Bill Gates and Steve Jobs, as well as others as varied as Marc Andreessen, the co-founder of Netscape, and Donald E. Graham, the chairman and chief executive of the Washington Post Company.
  • “Not only did he have an incredible vision for the industry, but he had an incredible vision for himself.”
  • When Facebook goes public, he will own a minority stake in the company — but will control more than half of the voting power.
  • Mr. Zuckerberg is fascinated by ancient Greece and Rome. As a boy, one of his favorite video games was Civilization, the object of which is to “build an empire to stand the test of time.” Civilization, one friend says, was “training wheels for starting Facebook.”
  • particularly in the early days, Mr. Zuckerberg was so confident that he often came across as aloof. He wasn’t the best communicator, Mr. Green says. “You can see that as a bad thing, but you have to have an irrational level of self-confidence to start something like Facebook.”
  • Eager to protect Mr. Zuckerberg, he helped come up with legal documents that guaranteed Mr. Zuckerberg two Facebook board seats. (Mr. Parker got one.) As long as Mr. Zuckerberg held a seat, his shares couldn’t be taken from him
Javier E

Opinion | Zuckerberg's So-Called Shift Toward Privacy - The New York Times

  • The platitudes were there, as I expected, but the evasions were worse than I anticipated: The plan, in effect, is to entrench Facebook’s interests while sidestepping all the important issues.
  • Here are four pressing questions about privacy that Mr. Zuckerberg conspicuously did not address: Will Facebook stop collecting data about people’s browsing behavior, which it does extensively? Will it stop purchasing information from data brokers who collect or “scrape” vast amounts of data about billions of people, often including information related to our health and finances? Will it stop creating “shadow profiles” — collections of data about people who aren’t even on Facebook? And most important: Will it change its fundamental business model, which is based on charging advertisers to take advantage of this widespread surveillance to “micro-target” consumers?
  • Mr. Zuckerberg said that the company would expand end-to-end encryption of messaging, which prevents Facebook — or anyone other than the participants in a conversation — from seeing the content of messages. I’m certainly in favor of messaging privacy: It is a cornerstone of the effort to push back against the cloud of surveillance that has descended over the globe.
  • But what we really need — and it is not clear what Facebook has in mind — is privacy for true person-to-person messaging apps, not messaging apps that also allow for secure mass messaging.
  • Once end-to-end encryption is put in place, Facebook can wash its hands of the content. We don’t want to end up with all the same problems we now have with viral content online — only with less visibility and nobody to hold responsible for it.
  • encrypted messaging, in addition to releasing Facebook from the obligation to moderate content, wouldn’t interfere with the surveillance that Facebook conducts for the benefit of advertisers. As Mr. Zuckerberg admitted in an interview after he posted his plan, Facebook isn’t “really using the content of messages to target ads today anyway.” In other words, he is happy to bolster privacy when doing so would decrease Facebook’s responsibilities, but not when doing so would decrease its advertising revenue.
  • What Mr. Zuckerberg emphasized in his post was his intention to make Facebook’s messaging platforms, Messenger, WhatsApp and Instagram, “interoperable.” He described this decision as part of his “privacy-focused vision,” though it is not clear how doing so — which would presumably involve sharing user data — would serve privacy interests.
  • Merging those apps just might, however, serve Facebook’s interest in avoiding antitrust remedies. Just as regulators are realizing that allowing Facebook to gobble up all its competitors (including WhatsApp and Instagram) may have been a mistake, Mr. Zuckerberg decides to scramble the eggs to make them harder to separate into independent entities. What a coincidence.
  • This supposed shift toward a “privacy-focused vision” looks more to me like shrewd competitive positioning, dressed up in privacy rhetoric.
  • Sheryl Sandberg, Facebook's chief operating officer, likes to say that the company’s problem is that it has been “way too idealistic.” I think the problem is the invasive way it makes its money and its lack of meaningful oversight
Javier E

Mark Zuckerberg and Sheryl Sandberg's Partnership Did Not Survive Trump - The New York Times

  • Mr. Zuckerberg wasn’t interested in politics and didn’t keep up with the news. The year before, while Mr. Zuckerberg was visiting Donald Graham, then the chairman of The Washington Post, a reporter handed the young C.E.O. a book on politics that the reporter had written. Mr. Zuckerberg said to Mr. Graham, “I’m never going to have time to read this.”
  • “I teased him because there were very few things where you’ll find unanimity about, and one of those things is that reading books is a good way to learn. There is no dissent on that point,” Mr. Graham said. “Mark eventually came to agree with me on that, and like everything he did, he picked it up very quickly and became a tremendous reader.”
Grace Gannon

Mark Zuckerberg says he believes in freedom of speech. Does Facebook?

  • Facebook's Mark Zuckerberg is dealing with the contradictions of promoting free speech while running the world's biggest social network, after being called out on the gap between his words and Facebook policy. Zuckerberg, who has been outspoken in his support of Charlie Hebdo and freedom of speech in general following the attack on the magazine, posted a Facebook message about Sunday's vigil in the heart of Paris.
carolinehayter

'Stop Lying': Muslim Rights Group Sues Facebook Over Claims It Removes Hate Groups : NPR

  • Frustrated with what it sees as a lack of progress, Muslim Advocates on Thursday filed a consumer protection lawsuit against Facebook, Zuckerberg and Sandberg, among other executives, demanding the social network start taking anti-Muslim activity more seriously.
  • The suit alleges that statements made by the executives about the removal of hateful and violent content have misled people into believing that Facebook is doing more than it actually is to combat anti-Muslim bigotry on the world's largest social network.
  • The suit cites research from Elon University professor Megan Squire, who found that anti-Muslim bias serves "as a common denominator among hate groups around the world" on Facebook. Squire, in 2018, alerted the company to more than 200 anti-Muslim groups on its platform. According to the suit, half of them remain active.
  • "We do not allow hate groups on Facebook overall. So if there is a group that their primary purpose or a large part of what they do is spreading hate, we will ban them from the platform overall," Zuckerberg told Congress in 2018. Facebook's Community Standards ban hate speech, violent and graphic content and "dangerous individuals and organizations," like an organized hate group.
  • Lawyers for Muslim Advocates say Facebook's passivity flies in the face of statements Zuckerberg has made to Congress that if something runs afoul of Facebook's rules, the company will remove it.
  • A year earlier, Muslim Advocates provided Facebook a list of 26 anti-Muslim hate groups. Nineteen of them remain active today, according to the suit.
  • "This is not, 'Oh a couple of things are falling through the cracks,'" Bauer said. "This is pervasive content that persists despite academics pointing it out, nonprofits pointing it out. Facebook has made a decision to not take this material down."
  • The lawsuit is asking a judge to declare the statements made by Facebook executives about its content moderation policies fraudulent misrepresentations.
  • It seeks an order preventing Facebook officials from making such remarks.
  • "A corporation is not entitled to exaggerate or misrepresent the safety of a product to drive up sales,
  • Since 2013, officials from Muslim Advocates have met with Facebook leadership, including Zuckerberg, "to educate them about the dangers of allowing anti-Muslim content to flourish on the platform," the suit says. But in the group's view, Facebook never lived up to its promises. Had the company done so, the group alleges in the lawsuit, "it would have significantly reduced the extent to which its platform encouraged and enabled anti-Muslim violence."
  • In the lawsuit, the group says it told Facebook that a militia group, the Texas Patriot Network, was using the platform to organize an armed protest at a Muslim convention in Houston in 2019. It took Facebook 24 hours to take the event down. The Texas Patriot Network is still active on the social network.
  • The suit also referenced an August 2020 event in Milwaukee, Wis. People gathered in front of a mosque and yelled hateful, threatening slurs against Muslims. It was broadcast live on Facebook. The video was removed days later after Muslim Advocates alerted Facebook to the content.
  • It pointed to the Christchurch mass shooting in New Zealand, which left 51 people dead. The shooter live-streamed the massacre on Facebook.
  • “Civil rights advocates have expressed alarm,” the outside auditors wrote, “that Muslims feel under siege on Facebook.”
katherineharron

Facebook is allowing politicians to lie openly. It's time to regulate (Opinion) - CNN

  • At the center of the exchange was a tussle between Sen. Elizabeth Warren, who has been pushing for the break-up of tech giants like Facebook and Google, and Sen. Kamala Harris, who pointedly asked whether Warren would join her in demanding that Twitter suspend President Donald Trump's account on the platform.
  • This is a highly charged and heavily politicized question, particularly for Democratic candidates. Last month, Facebook formalized a bold new policy that shocked many observers, announcing that the company would not seek to fact-check or censor politicians -- including in the context of paid political advertising, and even during an election season. Over the past few days, this decree has pushed US political advertising into something like the Wild West: President Donald Trump, who will likely face the Democratic candidate in next year's general election, has already taken the opportunity to spread political lies with no accountability.
  • Warren responded to the Trump ad with a cheeky point: In an ad she has circulated over Facebook, she claims that "Mark Zuckerberg and Facebook just endorsed Donald Trump for re-election." Later in the ad, she acknowledges this is a falsehood, and contends that "what [Mark] Zuckerberg has done is given Donald Trump free rein to lie on his platform — and then to pay Facebook gobs of money to push out their lies to American voters."
  • Should our politicians fail to reform regulations for internet platforms and digital advertising, our political future will be at risk. The 2016 election revealed the tremendous harm to the American democratic process that can result from coordinated misinformation campaigns; 2020 will be far worse if we do nothing to contain the capacity for politicians to lie on social media.
  • This new Facebook policy opens a frightening new world for political communication — and for national politics. It is now the case that leading politicians can openly spread political lies without repercussion. Indeed, the Trump campaign was already spreading other falsehoods through online advertising immediately before Facebook made its announcement — and as one might predict, most of those advertisements have not been removed from the platform.
  • It is disconcerting to think that by fiat, Facebook can deem a political ad to be dishonest because it contains fake buttons (which can deceive the viewer into clicking on a survey button when in fact there is no interactive feature in the ad), but the company will refuse to take action against ads containing widely-debunked political lies, even during an American presidential election.
  • Facebook has one principal counterargument against regulation: that the company must maintain strong commitments to free speech and freedom of political expression. This came across in Mark Zuckerberg's speech at Georgetown University on Thursday, in which he described social media as a kind of "Fifth Estate" and characterized politicians' calls to take action as an attempt to restrict freedom of expression. Quoting at times from Frederick Douglass and Supreme Court jurisprudence, Zuckerberg said "we are at a crossroads" and asserted: "When it's not absolutely clear what to do, we should err on the side of free expression."
  • Unfortunately for Facebook, this argument holds little water. If you determine that an ad containing a fake button is non-compliant because it "[entices] users to select an answer," then you certainly should not knowingly broadcast ads that entice voters to unwittingly consume publicly-known lies -- whether they are distributed by the President or any other politician. Indeed, as one official in Biden's presidential campaign has noted, Zuckerberg's argumentation amounts to an insidious "choice to cloak Facebook's policy in a feigned concern for free expression" to "use the Constitution as a shield for his company's bottom line."
  • If Facebook cannot take appropriate action and remove paid political lies from its platform, the only answer must be earnest regulation of the company -- regulation that forces Facebook to be transparent about the nature of political ads and prevents it from propagating political falsehoods, even if they are enthusiastically distributed by President Trump.
Javier E

Opinion | It's Time to Break Up Facebook - The New York Times

  • For many people today, it’s hard to imagine government doing much of anything right, let alone breaking up a company like Facebook. This isn’t by coincidence.
  • Starting in the 1970s, a small but dedicated group of economists, lawyers and policymakers sowed the seeds of our cynicism. Over the next 40 years, they financed a network of think tanks, journals, social clubs, academic centers and media outlets to teach an emerging generation that private interests should take precedence over public ones
  • Their gospel was simple: “Free” markets are dynamic and productive, while government is bureaucratic and ineffective. By the mid-1980s, they had largely managed to relegate energetic antitrust enforcement to the history books.
  • This shift, combined with business-friendly tax and regulatory policy, ushered in a period of mergers and acquisitions that created megacorporations
  • In the past 20 years, more than 75 percent of American industries, from airlines to pharmaceuticals, have experienced increased concentration, and the average size of public companies has tripled. The results are a decline in entrepreneurship, stalled productivity growth, and higher prices and fewer choices for consumers.
  • Because Facebook so dominates social networking, it faces no market-based accountability. This means that every time Facebook messes up, we repeat an exhausting pattern: first outrage, then disappointment and, finally, resignation.
  • Over a decade later, Facebook has earned the prize of domination. It is worth half a trillion dollars and commands, by my estimate, more than 80 percent of the world’s social networking revenue. It is a powerful monopoly, eclipsing all of its rivals and erasing competition from the social networking category.
  • Facebook’s monopoly is also visible in its usage statistics. About 70 percent of American adults use social media, and a vast majority are on Facebook products
  • Over two-thirds use the core site, a third use Instagram, and a fifth use WhatsApp.
  • As a result of all this, would-be competitors can’t raise the money to take on Facebook. Investors realize that if a company gets traction, Facebook will copy its innovations, shut it down or acquire it for a relatively modest sum
  • Facebook’s dominance is not an accident of history. The company’s strategy was to beat every competitor in plain view, and regulators and the government tacitly — and at times explicitly — approved
  • The F.T.C.’s biggest mistake was to allow Facebook to acquire Instagram and WhatsApp. In 2012, the newer platforms were nipping at Facebook’s heels because they had been built for the smartphone, where Facebook was still struggling to gain traction. Mark responded by buying them, and the F.T.C. approved.
  • Neither Instagram nor WhatsApp had any meaningful revenue, but both were incredibly popular. The Instagram acquisition guaranteed Facebook would preserve its dominance in photo networking, and WhatsApp gave it a new entry into mobile real-time messaging.
  • When it hasn’t acquired its way to dominance, Facebook has used its monopoly position to shut out competing companies or has copied their technology.
  • In 2014, the rules favored curiosity-inducing “clickbait” headlines. In 2016, they enabled the spread of fringe political views and fake news, which made it easier for Russian actors to manipulate the American electorate.
  • As markets become more concentrated, the number of new start-up businesses declines. This holds true in other high-tech areas dominated by single companies, like search (controlled by Google) and e-commerce (taken over by Amazon)
  • I don’t blame Mark for his quest for domination. He has demonstrated nothing more nefarious than the virtuous hustle of a talented entrepreneur
  • It’s on our government to ensure that we never lose the magic of the invisible hand. How did we allow this to happen?
  • a narrow reliance on whether or not consumers have experienced price gouging fails to take into account the full cost of market domination
  • It doesn’t recognize that we also want markets to be competitive to encourage innovation and to hold power in check. And it is out of step with the history of antitrust law. Two of the last major antitrust suits, against AT&T and IBM in the 1980s, were grounded in the argument that they had used their size to stifle innovation and crush competition.
  • “It is a disservice to the laws and their intent to retain such a laserlike focus on price effects as the measure of all that antitrust was meant to do.”
  • Facebook is the perfect case on which to reverse course, precisely because Facebook makes its money from targeted advertising, meaning users do not pay to use the service. But it is not actually free, and it certainly isn’t harmless.
  • We pay for Facebook with our data and our attention, and by either measure it doesn’t come cheap.
  • The choice is mine, but it doesn’t feel like a choice. Facebook seeps into every corner of our lives to capture as much of our attention and data as possible and, without any alternative, we make the trade.
  • The vibrant marketplace that once drove Facebook and other social media companies to compete to come up with better products has virtually disappeared. This means there’s less chance of start-ups developing healthier, less exploitative social media platforms. It also means less accountability on issues like privacy.
  • The most problematic aspect of Facebook’s power is Mark’s unilateral control over speech. There is no precedent for his ability to monitor, organize and even censor the conversations of two billion people.
  • Facebook engineers write algorithms that select which users’ comments or experiences end up displayed in the News Feeds of friends and family. These rules are proprietary and so complex that many Facebook employees themselves don’t understand them.
  • What started out as lighthearted entertainment has become the primary way that people of all ages communicate online.
  • In January 2018, Mark announced that the algorithms would favor non-news content shared by friends and news from “trustworthy” sources, which his engineers interpreted — to the confusion of many — as a boost for anything in the category of “politics, crime, tragedy.”
  • As if Facebook’s opaque algorithms weren’t enough, last year we learned that Facebook executives had permanently deleted their own messages from the platform, erasing them from the inboxes of recipients; the justification was corporate security concerns.
  • No one at Facebook headquarters is choosing what single news story everyone in America wakes up to, of course. But they do decide whether it will be an article from a reputable outlet or a clip from “The Daily Show,” a photo from a friend’s wedding or an incendiary call to kill others.
  • Mark knows that this is too much power and is pursuing a twofold strategy to mitigate it. He is pivoting Facebook’s focus toward encouraging more private, encrypted messaging that Facebook’s employees can’t see, let alone control
  • Second, he is hoping for friendly oversight from regulators and other industry executives.
  • In an op-ed essay in The Washington Post in March, he wrote, “Lawmakers often tell me we have too much power over speech, and I agree.” And he went even further than before, calling for more government regulation — not just on speech, but also on privacy and interoperability, the ability of consumers to seamlessly leave one network and transfer their profiles, friend connections, photos and other data to another.
  • I don’t think these proposals were made in bad faith. But I do think they’re an attempt to head off the argument that regulators need to go further and break up the company. Facebook isn’t afraid of a few more rules. It’s afraid of an antitrust case and of the kind of accountability that real government oversight would bring.
  • We don’t expect calcified rules or voluntary commissions to work to regulate drug companies, health care companies, car manufacturers or credit card providers. Agencies oversee these industries to ensure that the private market works for the public good. In these cases, we all understand that government isn’t an external force meddling in an organic market; it’s what makes a dynamic and fair market possible in the first place. This should be just as true for social networking as it is for air travel or pharmaceuticals.
  • Just breaking up Facebook is not enough. We need a new agency, empowered by Congress to regulate tech companies. Its first mandate should be to protect privacy.
  • First, Facebook should be separated into multiple companies. The F.T.C., in conjunction with the Justice Department, should enforce antitrust laws by undoing the Instagram and WhatsApp acquisitions and banning future acquisitions for several years.
  • How would a breakup work? Facebook would have a brief period to spin off the Instagram and WhatsApp businesses, and the three would become distinct companies, most likely publicly traded.
  • Facebook is indeed more valuable when there are more people on it: There are more connections for a user to make and more content to be shared. But the cost of entering the social network business is not that high. And unlike with pipes and electricity, there is no good argument that the country benefits from having only one dominant social networking company.
  • others worry that the breakup of Facebook or other American tech companies could be a national security problem. Because advancements in artificial intelligence require immense amounts of data and computing power, only large companies like Facebook, Google and Amazon can afford these investments, they say. If American companies become smaller, the Chinese will outpace us.
  • The American government needs to do two things: break up Facebook’s monopoly and regulate the company to make it more accountable to the American people.
  • But the biggest winners would be the American people. Imagine a competitive market in which they could choose among one network that offered higher privacy standards, another that cost a fee to join but had little advertising and another that would allow users to customize and tweak their feeds as they saw fit
  • The cost of breaking up Facebook would be next to zero for the government, and lots of people stand to gain economically. A ban on short-term acquisitions would ensure that competitors, and the investors who take a bet on them, would have the space to flourish. Digital advertisers would suddenly have multiple companies vying for their dollars.
  • The Europeans have made headway on privacy with the General Data Protection Regulation, a law that guarantees users a minimal level of protection. A landmark privacy bill in the United States should specify exactly what control Americans have over their digital information, require clearer disclosure to users and provide enough flexibility to the agency to exercise effective oversight over time
  • The agency should also be charged with guaranteeing basic interoperability across platforms.
  • Finally, the agency should create guidelines for acceptable speech on social media
  • We will have to create similar standards that tech companies can use. These standards should of course be subject to the review of the courts, just as any other limits on speech are. But there is no constitutional right to harass others or live-stream violence.
  • These are difficult challenges. I worry that government regulators will not be able to keep up with the pace of digital innovation
  • I worry that more competition in social networking might lead to a conservative Facebook and a liberal one, or that newer social networks might be less secure if government regulation is weak
  • Professor Wu has written that this “policeman at the elbow” led IBM to steer clear “of anything close to anticompetitive conduct, for fear of adding to the case against it.”
  • Finally, an aggressive case against Facebook would persuade other behemoths like Google and Amazon to think twice about stifling competition in their own sectors, out of fear that they could be next.
  • The alternative is bleak. If we do not take action, Facebook’s monopoly will become even more entrenched. With much of the world’s personal communications in hand, it can mine that data for patterns and trends, giving it an advantage over competitors for decades to come.
  • This movement of public servants, scholars and activists deserves our support. Mark Zuckerberg cannot fix Facebook, but our government can.
Javier E

Facebook Is a Doomsday Machine - The Atlantic

  • megadeath is not the only thing that makes the Doomsday Machine petrifying. The real terror is in its autonomy, this idea that it would be programmed to detect a series of environmental inputs, then to act, without human interference. “There is no chance of human intervention, control, and final decision,” wrote the military strategist Herman Kahn in his 1960 book, On Thermonuclear War, which laid out the hypothetical for a Doomsday Machine. The concept was to render nuclear war unwinnable, and therefore unthinkable.
  • No machine should be that powerful by itself—but no one person should be either.
  • so far, somewhat miraculously, we have figured out how to live with the bomb. Now we need to learn how to survive the social web.
  • There’s a notion that the social web was once useful, or at least that it could have been good, if only we had pulled a few levers: some moderation and fact-checking here, a bit of regulation there, perhaps a federal antitrust lawsuit. But that’s far too sunny and shortsighted a view.
  • Today’s social networks, Facebook chief among them, were built to encourage the things that make them so harmful. It is in their very architecture.
  • I realized only recently that I’ve been thinking far too narrowly about the problem.
  • Megascale is nearly the existential threat that megadeath is. No single machine should be able to control the fate of the world’s population—and that’s what both the Doomsday Machine and Facebook are built to do.
  • Facebook does not exist to seek truth and report it, or to improve civic health, or to hold the powerful to account, or to represent the interests of its users, though these phenomena may be occasional by-products of its existence.
  • The company’s early mission was to “give people the power to share and make the world more open and connected.” Instead, it took the concept of “community” and sapped it of all moral meaning.
  • Facebook—along with Google and YouTube—is perfect for amplifying and spreading disinformation at lightning speed to global audiences.
  • Facebook decided that it needed not just a very large user base, but a tremendous one, unprecedented in size. That decision set Facebook on a path to escape velocity, to a tipping point where it can harm society just by existing.
  • No one, not even Mark Zuckerberg, can control the product he made. I’ve come to realize that Facebook is not a media company. It’s a Doomsday Machine.
  • Scale and engagement are valuable to Facebook because they’re valuable to advertisers. These incentives lead to design choices such as reaction buttons that encourage users to engage easily and often, which in turn encourage users to share ideas that will provoke a strong response.
  • Every time you click a reaction button on Facebook, an algorithm records it, and sharpens its portrait of who you are.
  • The hyper-targeting of users, made possible by reams of their personal data, creates the perfect environment for manipulation—by advertisers, by political campaigns, by emissaries of disinformation, and of course by Facebook itself, which ultimately controls what you see and what you don’t see on the site.
  • there aren’t enough moderators speaking enough languages, working enough hours, to stop the biblical flood of shit that Facebook unleashes on the world, because 10 times out of 10, the algorithm is faster and more powerful than a person.
  • At megascale, this algorithmically warped personalized informational environment is extraordinarily difficult to moderate in a meaningful way, and extraordinarily dangerous as a result.
  • These dangers are not theoretical, and they’re exacerbated by megascale, which makes the platform a tantalizing place to experiment on people
  • Even after U.S. intelligence agencies identified Facebook as a main battleground for information warfare and foreign interference in the 2016 election, the company has failed to stop the spread of extremism, hate speech, propaganda, disinformation, and conspiracy theories on its site.
  • it wasn’t until October of this year, for instance, that Facebook announced it would remove groups, pages, and Instagram accounts devoted to QAnon, as well as any posts denying the Holocaust.
  • In the days after the 2020 presidential election, Zuckerberg authorized a tweak to the Facebook algorithm so that high-accuracy news sources such as NPR would receive preferential visibility in people’s feeds, and hyper-partisan pages such as Breitbart News’s and Occupy Democrats’ would be buried, according to The New York Times, offering proof that Facebook could, if it wanted to, turn a dial to reduce disinformation—and offering a reminder that Facebook has the power to flip a switch and change what billions of people see online.
  • reducing the prevalence of content that Facebook calls “bad for the world” also reduces people’s engagement with the site. In its experiments with human intervention, the Times reported, Facebook calibrated the dial so that just enough harmful content stayed in users’ news feeds to keep them coming back for more.
  • Facebook’s stated mission—to make the world more open and connected—has always seemed, to me, phony at best, and imperialist at worst.
  • Facebook is a borderless nation-state, with a population of users nearly as big as China and India combined, and it is governed largely by secret algorithms
  • How much real-world violence would never have happened if Facebook didn’t exist? One of the people I’ve asked is Joshua Geltzer, a former White House counterterrorism official who is now teaching at Georgetown Law. In counterterrorism circles, he told me, people are fond of pointing out how good the United States has been at keeping terrorists out since 9/11. That’s wrong, he said. In fact, “terrorists are entering every single day, every single hour, every single minute” through Facebook.
  • Evidence of real-world violence can be easily traced back to both Facebook and 8kun. But 8kun doesn’t manipulate its users or the informational environment they’re in. Both sites are harmful. But Facebook might actually be worse for humanity.
  • In previous eras, U.S. officials could at least study, say, Nazi propaganda during World War II, and fully grasp what the Nazis wanted people to believe. Today, “it’s not a filter bubble; it’s a filter shroud,” Geltzer said. “I don’t even know what others with personalized experiences are seeing.”
  • Mary McCord, the legal director at the Institute for Constitutional Advocacy and Protection at Georgetown Law, told me that she thinks 8kun may be more blatant in terms of promoting violence but that Facebook is “in some ways way worse” because of its reach. “There’s no barrier to entry with Facebook,” she said. “In every situation of extremist violence we’ve looked into, we’ve found Facebook postings. And that reaches tons of people. The broad reach is what brings people into the fold and normalizes extremism and makes it mainstream.” In other words, it’s the megascale that makes Facebook so dangerous.
  • Facebook’s megascale gives Zuckerberg an unprecedented degree of influence over the global population. If he isn’t the most powerful person on the planet, he’s very near the top.
  • “The thing he oversees has such an effect on cognition and people’s beliefs, which can change what they do with their nuclear weapons or their dollars.”
  • Facebook’s new oversight board, formed in response to backlash against the platform and tasked with making decisions concerning moderation and free expression, is an extension of that power. “The first 10 decisions they make will have more effect on speech in the country and the world than the next 10 decisions rendered by the U.S. Supreme Court,” Geltzer said. “That’s power. That’s real power.”
  • Facebook is also a business, and a place where people spend time with one another. Put it this way: If you owned a store and someone walked in and started shouting Nazi propaganda or recruiting terrorists near the cash register, would you, as the shop owner, tell all of the other customers you couldn’t possibly intervene?
  • In 2004, Zuckerberg said Facebook ran advertisements only to cover server costs. But over the next two years Facebook completely upended and redefined the entire advertising industry. The pre-social web destroyed classified ads, but the one-two punch of Facebook and Google decimated local news and most of the magazine industry—publications fought in earnest for digital pennies, which had replaced print dollars, and social giants scooped them all up anyway.
  • In other words, if the Dunbar number for running a company or maintaining a cohesive social life is 150 people, the magic number for a functional social platform is maybe 20,000 people. Facebook now has 2.7 billion monthly users.
  • in 2007, Zuckerberg said something in an interview with the Los Angeles Times that now takes on a much darker meaning: “The things that are most powerful aren’t the things that people would have done otherwise if they didn’t do them on Facebook. Instead, it’s the things that would never have happened otherwise.”
  • We’re still in the infancy of this century’s triple digital revolution of the internet, smartphones, and the social web, and we find ourselves in a dangerous and unstable informational environment, powerless to resist forces of manipulation and exploitation that we know are exerted on us but remain mostly invisible
  • The Doomsday Machine offers a lesson: We should not accept this current arrangement. No single machine should be able to control so many people.
  • we need a new philosophical and moral framework for living with the social web—a new Enlightenment for the information age, and one that will carry us back to shared reality and empiricism.
  • localized approach is part of what made megascale possible. Early constraints around membership—the requirement at first that users attended Harvard, and then that they attended any Ivy League school, and then that they had an email address ending in .edu—offered a sense of cohesiveness and community. It made people feel more comfortable sharing more of themselves. And more sharing among clearly defined demographics was good for business.
  • we need to adopt a broader view of what it will take to fix the brokenness of the social web. That will require challenging the logic of today’s platforms—and first and foremost challenging the very concept of megascale as a way that humans gather.
  • The web’s existing logic tells us that social platforms are free in exchange for a feast of user data; that major networks are necessarily global and centralized; that moderators make the rules. None of that need be the case.
  • We need people who dismantle these notions by building alternatives. And we need enough people to care about these other alternatives to break the spell of venture capital and mass attention that fuels megascale and creates fatalism about the web as it is now.
  • We must also find ways to repair the aspects of our society and culture that the social web has badly damaged. This will require intellectual independence, respectful debate, and the same rebellious streak that helped establish Enlightenment values centuries ago.
  • Right now, too many people are allowing algorithms and tech giants to manipulate them, and reality is slipping from our grasp as a result. This century’s Doomsday Machine is here, and humming along.
Javier E

Facebook Papers: 'History Will Not Judge Us Kindly' - The Atlantic

  • Facebook’s hypocrisies, and its hunger for power and market domination, are not secret. Nor is the company’s conflation of free speech and algorithmic amplification
  • But the events of January 6 proved for many people—including many in Facebook’s workforce—to be a breaking point.
  • these documents leave little room for doubt about Facebook’s crucial role in advancing the cause of authoritarianism in America and around the world. Authoritarianism predates the rise of Facebook, of course. But Facebook makes it much easier for authoritarians to win.
  • Again and again, the Facebook Papers show staffers sounding alarms about the dangers posed by the platform—how Facebook amplifies extremism and misinformation, how it incites violence, how it encourages radicalization and political polarization. Again and again, staffers reckon with the ways in which Facebook’s decisions stoke these harms, and they plead with leadership to do more.
  • And again and again, staffers say, Facebook’s leaders ignore them.
  • Facebook has dismissed the concerns of its employees in manifold ways.
  • One of its cleverer tactics is to argue that staffers who have raised the alarm about the damage done by their employer are simply enjoying Facebook’s “very open culture,” in which people are encouraged to share their opinions, a spokesperson told me. This stance allows Facebook to claim transparency while ignoring the substance of the complaints, and the implication of the complaints: that many of Facebook’s employees believe their company operates without a moral compass.
  • When you stitch together the stories that spanned the period between Joe Biden’s election and his inauguration, it’s easy to see Facebook as instrumental to the attack on January 6. (A spokesperson told me that the notion that Facebook played an instrumental role in the insurrection is “absurd.”)
  • what emerges from a close reading of Facebook documents, and observation of the manner in which the company connects large groups of people quickly, is that Facebook isn’t a passive tool but a catalyst. Had the organizers tried to plan the rally using other technologies of earlier eras, such as telephones, they would have had to identify and reach out individually to each prospective participant, then persuade them to travel to Washington. Facebook made people’s efforts at coordination highly visible on a global scale.
  • The platform not only helped them recruit participants but offered people a sense of strength in numbers. Facebook proved to be the perfect hype machine for the coup-inclined.
  • In November 2019, Facebook staffers noticed they had a serious problem. Facebook offers a collection of one-tap emoji reactions. Today, they include “like,” “love,” “care,” “haha,” “wow,” “sad,” and “angry.” Company researchers had found that the posts dominated by “angry” reactions were substantially more likely to go against community standards, including prohibitions on various types of misinformation, according to internal documents.
  • In July 2020, researchers presented the findings of a series of experiments. At the time, Facebook was already weighting the reactions other than “like” more heavily in its algorithm—meaning posts that got an “angry” reaction were more likely to show up in users’ News Feeds than posts that simply got a “like.” Anger-inducing content didn’t spread just because people were more likely to share things that made them angry; the algorithm gave anger-inducing content an edge. Facebook’s Integrity workers—employees tasked with tackling problems such as misinformation and espionage on the platform—concluded that they had good reason to believe targeting posts that induced anger would help stop the spread of harmful content.
  • By dialing anger’s weight back to zero in the algorithm, the researchers found, they could keep posts to which people reacted angrily from being viewed by as many users. That, in turn, translated to a significant (up to 5 percent) reduction in the hate speech, civic misinformation, bullying, and violent posts—all of which are correlated with offline violence—to which users were exposed. (A minimal code sketch of this weighting mechanism, with invented numbers, follows at the end of this list.)
  • Facebook rolled out the change in early September 2020, documents show; a Facebook spokesperson confirmed that the change has remained in effect. It was a real victory for employees of the Integrity team.
  • But it doesn’t normally work out that way. In April 2020, according to Frances Haugen’s filings with the SEC, Facebook employees had recommended tweaking the algorithm so that the News Feed would deprioritize the surfacing of content for people based on their Facebook friends’ behavior. The idea was that a person’s News Feed should be shaped more by people and groups that a person had chosen to follow. Up until that point, if your Facebook friend saw a conspiracy theory and reacted to it, Facebook’s algorithm might show it to you, too. The algorithm treated any engagement in your network as a signal that something was worth sharing. But now Facebook workers wanted to build circuit breakers to slow this form of sharing.
  • Experiments showed that this change would impede the distribution of hateful, polarizing, and violence-inciting content in people’s News Feeds. But Zuckerberg “rejected this intervention that could have reduced the risk of violence in the 2020 election,” Haugen’s SEC filing says. An internal message characterizing Zuckerberg’s reasoning says he wanted to avoid new features that would get in the way of “meaningful social interactions.” But according to Facebook’s definition, its employees say, engagement is considered “meaningful” even when it entails bullying, hate speech, and reshares of harmful content.
  • This episode, like Facebook’s response to the incitement that proliferated between the election and January 6, reflects a fundamental problem with the platform
  • Facebook’s megascale allows the company to influence the speech and thought patterns of billions of people. What the world is seeing now, through the window provided by reams of internal documents, is that Facebook catalogs and studies the harm it inflicts on people. And then it keeps harming people anyway.
  • “I am worried that Mark’s continuing pattern of answering a different question than the question that was asked is a symptom of some larger problem,” wrote one Facebook employee in an internal post in June 2020, referring to Zuckerberg. “I sincerely hope that I am wrong, and I’m still hopeful for progress. But I also fully understand my colleagues who have given up on this company, and I can’t blame them for leaving. Facebook is not neutral, and working here isn’t either.”
  • It is quite a thing to see, the sheer number of Facebook employees—people who presumably understand their company as well as or better than outside observers—who believe their employer to be morally bankrupt.
  • I spoke with several former Facebook employees who described the company’s metrics-driven culture as extreme, even by Silicon Valley standards
  • Facebook workers are under tremendous pressure to quantitatively demonstrate their individual contributions to the company’s growth goals, they told me. New products and features aren’t approved unless the staffers pitching them demonstrate how they will drive engagement.
  • These worries have been exacerbated lately by fears about a decline in new posts on Facebook, two former employees who left the company in recent years told me. People are posting new material less frequently to Facebook, and its users are on average older than those of other social platforms.
  • One of Facebook’s Integrity staffers wrote at length about this dynamic in a goodbye note to colleagues in August 2020, describing how risks to Facebook users “fester” because of the “asymmetrical” burden placed on employees to “demonstrate legitimacy and user value” before launching any harm-mitigation tactics—a burden not shared by those developing new features or algorithm changes with growth and engagement in mind
  • The note said: "We were willing to act only after things had spiraled into a dire state … Personally, during the time that we hesitated, I've seen folks from my hometown go further and further down the rabbithole of QAnon and Covid anti-mask/anti-vax conspiracy on FB. It has been painful to observe."
  • Current and former Facebook employees describe the same fundamentally broken culture—one in which effective tactics for making Facebook safer are rolled back by leadership or never approved in the first place.
  • That broken culture has produced a broken platform: an algorithmic ecosystem in which users are pushed toward ever more extreme content, and where Facebook knowingly exposes its users to conspiracy theories, disinformation, and incitement to violence.
  • One example is a program that amounts to a whitelist for VIPs on Facebook, allowing some of the users most likely to spread misinformation to break Facebook’s rules without facing consequences. Under the program, internal documents show, millions of high-profile users—including politicians—are left alone by Facebook even when they incite violence
  • whitelisting influential users with massive followings on Facebook isn’t just a secret and uneven application of Facebook’s rules; it amounts to “protecting content that is especially likely to deceive, and hence to harm, people on our platforms.”
  • Facebook workers tried and failed to end the program. Only when its existence was reported in September by The Wall Street Journal did Facebook’s Oversight Board ask leadership for more information about the practice. Last week, the board publicly rebuked Facebook for not being “fully forthcoming” about the program.
  • That broken culture has also stoked an algorithm arms race within the company, pitting core product-and-engineering teams, such as the News Feed team, against their colleagues on Integrity teams, who are tasked with mitigating harm on the platform. These teams establish goals that are often in direct conflict with each other.
  • “We can’t pretend we don’t see information consumption patterns, and how deeply problematic they are for the longevity of democratic discourse,” a user-experience researcher wrote in an internal comment thread in 2019, in response to a now-infamous memo from Andrew “Boz” Bosworth, a longtime Facebook executive. “There is no neutral position at this stage, it would be powerfully immoral to commit to amorality.”
  • Zuckerberg has defined Facebook’s mission as making “social infrastructure to give people the power to build a global community that works for all of us,” but in internal research documents his employees point out that communities aren’t always good for society:
  • When part of a community, individuals typically act in a prosocial manner. They conform, they forge alliances, they cooperate, they organize, they display loyalty, they expect obedience, they share information, they influence others, and so on. Being in a group changes their behavior, their abilities, and, importantly, their capability to harm themselves or others
  • Thus, when people come together and form communities around harmful topics or identities, the potential for harm can be greater.
  • The infrastructure choices that Facebook is making to keep its platform relevant are driving down the quality of the site, and exposing its users to more dangers
  • Those dangers are also unevenly distributed, because of the manner in which certain subpopulations are algorithmically ushered toward like-minded groups
  • And the subpopulations of Facebook users who are most exposed to dangerous content are also most likely to be in groups where it won’t get reported.
  • Facebook also knows that 3 percent of its users in the United States are super-consumers of conspiracy theories, accounting for 37 percent of known consumption of misinformation on the platform.
  • Zuckerberg’s positioning of Facebook’s role in the insurrection is odd. He lumps his company in with traditional media organizations—something he’s ordinarily loath to do, lest the platform be expected to take more responsibility for the quality of the content that appears on it—and suggests that Facebook did more, and did better, than journalism outlets in its response to January 6. What he fails to say is that journalism outlets would never be in the position to help investigators this way, because insurrectionists don’t typically use newspapers and magazines to recruit people for coups.
  • Facebook wants people to believe that the public must choose between Facebook as it is, on the one hand, and free speech, on the other. This is a false choice. Facebook has a sophisticated understanding of measures it could take to make its platform safer without resorting to broad or ideologically driven censorship tactics.
  • Facebook knows that no two people see the same version of the platform, and that certain subpopulations experience far more dangerous versions than others do
  • Facebook knows that people who are isolated—recently widowed or divorced, say, or geographically distant from loved ones—are disproportionately at risk of being exposed to harmful content on the platform.
  • It knows that repeat offenders are disproportionately responsible for spreading misinformation.
  • All of this makes the platform rely more heavily on manipulating what its users see in order to reach its goals. This explains why Facebook is so dependent on the infrastructure of groups, and on making reshares highly visible, to keep people hooked.
  • It could consistently enforce its policies regardless of a user’s political power.
  • Facebook could ban reshares.
  • It could choose to optimize its platform for safety and quality rather than for growth.
  • It could tweak its algorithm to prevent widespread distribution of harmful content.
  • Facebook could create a transparent dashboard so that all of its users can see what’s going viral in real time.
  • It could make public its rules for how frequently groups can post and how quickly they can grow.
  • It could also automatically throttle groups when they're growing too fast, and cap the rate of virality for content that's spreading too quickly (a rough sketch of both ideas follows this list).
  • Facebook could shift the burden of proof toward people and communities to demonstrate that they’re good actors—and treat reach as a privilege, not a right
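Here is a minimal sketch of the throttling and virality-cap ideas from the list above, assuming simple sliding-window rate limits; the thresholds and class names are invented for illustration.

```python
# Illustrative sliding-window throttle for group joins and reshare velocity.
# Limits are invented; the point is the mechanism: past a rate cap, growth
# is paused or queued for review rather than amplified.
import time
from collections import deque

class RateThrottle:
    """Allow at most `limit` events per `window_seconds`; refuse the rest."""
    def __init__(self, limit: int, window_seconds: float):
        self.limit = limit
        self.window = window_seconds
        self._events = deque()

    def allow(self) -> bool:
        now = time.monotonic()
        while self._events and now - self._events[0] > self.window:
            self._events.popleft()  # drop events outside the window
        if len(self._events) < self.limit:
            self._events.append(now)
            return True
        return False  # over the cap: hold the join or reshare

group_joins = RateThrottle(limit=1_000, window_seconds=3_600)  # joins/hour
reshare_rate = RateThrottle(limit=500, window_seconds=3_600)   # reshares/hour
if not reshare_rate.allow():
    pass  # e.g. queue the reshare, add friction, or flag for human review
```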
  • It could do all of these things. But it doesn't.
  • The lesson for individuals is this: You must be vigilant about the informational streams you swim in, deliberate about how you spend your precious attention, unforgiving of those who weaponize your emotions and cognition for their own profit, and deeply untrusting of any scenario in which you're surrounded by a mob of people who agree with everything you're saying.
  • Lately, people have been debating just how nefarious Facebook really is. One argument goes something like this: Facebook’s algorithms aren’t magic, its ad targeting isn’t even that good, and most people aren’t that stupid.
  • All of this may be true, but that shouldn’t be reassuring. An algorithm may just be a big dumb means to an end, a clunky way of maneuvering a massive, dynamic network toward a desired outcome. But Facebook’s enormous size gives it tremendous, unstable power.
  • Facebook takes whole populations of people, pushes them toward radicalism, and then steers the radicalized toward one another.
  • When the most powerful company in the world possesses an instrument for manipulating billions of people—an instrument that only it can control, and that its own employees say is badly broken and dangerous—we should take notice.
  • Facebook could say that its platform is not for everyone. It could sound an alarm for those who wander into the most dangerous corners of Facebook, and those who encounter disproportionately high levels of harmful content
  • Without seeing how Facebook works at a finer resolution, in real time, we won’t be able to understand how to make the social web compatible with democracy.
Javier E

Facebook stock drop shows dream of connecting the whole world is dead. - The Washington... - 0 views

  • The social network remains massive, indispensable for many, and isn’t going away anytime soon. This is not Facebook’s “Myspace moment,” at least not yet.
  • it’s a harbinger of a shift already well underway in Menlo Park, one in which Facebook is no longer the center of Meta’s attention or the locus of its most important innovations, but a profitable legacy product to be maintained.
  • When they built Facebook, Zuckerberg and company didn’t just want to build the largest social network. They set out to build something truly ubiquitous, something that everyone would use, something that would become part of the fabric of global society — something that everyone had to use, if only because everyone else was. And they got further than almost anyone could have imagined. Just not all the way.
  • ...9 more annotations...
  • it underscores that Instagram, WhatsApp and, increasingly, Reality Labs — the division tasked with developing virtual and augmented reality hardware and software — are the company’s future.
  • To understand how integral growth was to Facebook’s identity, it’s worth revisiting a memo that executive Andrew “Boz” Bosworth, now Meta’s CTO, sent to the company in 2016.
  • “The natural state of the world is not connected,” Bosworth wrote in the memo, which was leaked and published by BuzzFeed in 2018. “It is not unified. It is fragmented by borders, languages, and increasingly by different products. The best products don’t win. The ones everyone use [sic] win.”
  • Facebook’s “imperative,” in Bosworth’s telling — its raison d’etre — was to be that product that everyone used, the tool that unified at last a fragmented human race in a single, vast network. And the company would pursue that imperative at any cost, even the cost of users’ lives, “because that’s what we do,” he wrote. “We connect people.”
  • Wednesday’s earnings report showed that Facebook’s ascent has stalled just about everywhere. The biggest decline in daily usage was not in the United States but in a category that it calls “rest of world,” including Latin America and Africa.
  • Zuckerberg knew before just about anyone else that social media was no longer enough to keep the company on top. Now he’s trying to will into existence a grand new vision of a digital world in which we all have second lives that play out through avatars inhabiting virtual spaces and realms.
  • Several years ago, the company realized that it had saturated among U.S. and Canadian users, and it overhauled its core news feed algorithm to prioritize engagement — getting existing users to spend more time on the network.
  • losing users does not necessarily mean losing money in the short term: Facebook’s revenue per user also continued to grow last quarter.
  • Yet the end of Facebook’s growth era marks a turning point in the history of social media and the Internet. If Zuckerberg couldn’t connect the whole world with Facebook, given all the resources and momentum and desire one could ask for, he may have to confront the possibility that no single network ever will.
Javier E

Facebook's hardware ambitions are undercut by its anti-China strategy - The Washington ... - 0 views

  • For more than a year, Meta CEO Mark Zuckerberg has made a point of stoking fears about China. He’s told U.S. lawmakers that China “steals” American technology and played up nationalist concerns about threats from Chinese-owned rival TikTok.
  • Meta has a growing problem: The social media service wants to transform itself into a powerhouse in hardware, and it makes virtually all of it in China. So the company is racing to get out.
  • Facebook has hit walls, say three people familiar with the discussions, who spoke on the condition of anonymity to describe internal conversations.
  • ...7 more annotations...
  • Until recently, the people said, Meta executives viewed the company’s reliance on China to make Oculus virtual reality headsets as a relatively minor concern because the company’s core focus was its social media and messaging apps.
  • All that has changed now that Meta has rebranded itself as a hardware company
  • “Meta is building a complicated hardware product. You can’t just turn on a dime and make it elsewhere,”
  • Facebook’s public criticism of China began in 2019 when Zuckerberg warned, in a speech at Georgetown University, that China was exporting a dangerous vision for the internet to the rest of the world — and noted that Facebook was abandoning its efforts to break into that country’s market.
  • The anti-China stance has since extended into a full-blown corporate strategy. Nick Clegg, the company’s president, wrote an op-ed attacking China in The Washington Post in 2020, the same year Zuckerberg attacked China in a congressional antitrust hearing.
  • At the antitrust hearing in Congress in 2020, Zuckerberg used his opening remarks to attack China in terms that went much further than his industry peers. He said it was “well-documented that the Chinese government steals technology from American companies,” and repeated that the country was “building its own version of the internet” that went against American values. He described Facebook as a “proudly American” company and noted that TikTok was the company’s fastest-growing rival.
  • “They were trying to find things that [Zuckerberg] could agree with Trump on, and it’s a pretty slim list,” said one of the people, describing how the company landed on its anti-China strategy. “If you’re not going to try to be in this country anyway, you might as well use it to your political advantage by contrasting yourself with Apple and TikTok.”
redavistinnell

Mark Zuckerberg speaks in support of Muslims after week of 'hate' | Technology | The Gu... - 0 views

  • Mark Zuckerberg speaks in support of Muslims after week of ‘hate’
  • “After the Paris attacks and hate this week, I can only imagine the fear Muslims feel that they will be persecuted for the actions of others,” he added.
  • The comments come after Trump was widely criticised for saying on Monday that Muslims should be banned from entering the US. He said in a speech following a mass shooting committed by a Muslim couple in San Bernardino, California, last weekend: “We need a total and complete shutdown of Muslims entering the United States while we figure out what the hell is going on. We are out of control.”
  • ...1 more annotation...
  • In the face of mounting criticism, Trump has said he will never leave the 2016 presidential race.
Javier E

What Mark Zuckerberg Didn't Say About What Facebook Knows About You - WSJ - 0 views

  • When testifying before the Senate Tuesday, Mr. Zuckerberg said, “I think everyone should have control over how their information is used.” He also said, “You have full access to understand all—every piece of information that Facebook might know about you—and you can get rid of all of it.”
  • Not exactly. There are important classes of information Facebook collects on us that we can’t control. We don’t get to “opt in” or remove every specific piece. Often, we aren’t even informed of their existence—except in the abstract—and we aren’t shown how the social network uses this harvested information.
  • The website log is a good example, in part because of its sheer mass. The browsing histories of hundreds of millions—possibly billions—of people are gathered by a variety of advertising trackers, which Facebook has been offering to web publishers ever since it introduced the “Like” button in 2009.
  • ...17 more annotations...
  • They’ve become, as predicted, a nearly web-wide system for tracking all users—even when you don’t click the button.
  • “If you downloaded this file [of sites Facebook knows you visited], it would look like a quarter to half your browsing history,” Mr. Garcia-Martinez adds.
  • Another reason Facebook doesn’t give you this data: The company claims recovering it from its databases is difficult.
  • In one case, it took Facebook 106 days to deliver to a Belgian mathematician, Paul-Olivier Dehaye, all the data the company had gathered on him through its most common tracking system. Facebook doesn’t say how long it stores this information.
  • When you opt out of interest-based ads, the system that uses your browsing history to target you, Facebook continues tracking you anyway. It just no longer uses the data to show you ads.
  • There is more data Facebook collects that it doesn’t explain. It encourages users to upload their phone contacts, including names, phone numbers and email addresses
  • Facebook never discloses if such personal information about you has been uploaded by other users from their contact lists, how many times that might have happened or who might have uploaded it.
  • This data enables Facebook not only to keep track of active users across its multiple products, but also to fill in the missing links. If three people named Smith all upload contact info for the same fourth Smith, chances are this person is related
  • Facebook now knows that person exists, even if he or she has never been on Facebook. And of course, people without Facebook accounts certainly can’t see what information the company has in these so-called shadow profiles.
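The contact-upload mechanism described above can be sketched in a few lines. This is a conceptual reconstruction, not Facebook's actual pipeline; all identifiers and logic are assumptions.

```python
# Conceptual sketch of "shadow profiles": contact lists uploaded by users
# let a network infer that a non-user exists and who likely knows them.
# Purely illustrative; names, data, and logic are assumptions.
from collections import defaultdict

registered_users = {"alice@example.com", "carol@example.com"}
shadow_profiles = defaultdict(set)  # non-user identifier -> who uploaded them

def ingest_contact_list(uploader: str, contacts: list) -> None:
    for contact in contacts:
        if contact not in registered_users:
            shadow_profiles[contact].add(uploader)

ingest_contact_list("alice@example.com", ["bob@example.com"])
ingest_contact_list("carol@example.com", ["bob@example.com"])

# Two independent uploads corroborate that bob exists and link the
# uploaders to him, even though bob never signed up.
print(shadow_profiles["bob@example.com"])
```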
  • “In general, we collect data on people who have not signed up for Facebook for security purposes,” Mr. Zuckerberg told Congress
  • There’s also a form of location data you can’t control unless you delete your whole account. This isn’t the app’s easy-to-turn-off GPS tracking. It’s the string of IP addresses, a form of device identification on the internet, that can show where your computer or phone is each time it connects to Facebook.
  • Facebook says it uses your IP address to target ads when you are near a specific place, but as you can see in your downloaded Facebook data, the log of stored IP addresses can go back years.
  • Location is a powerful signal for Facebook, allowing it to infer how you are connected to other people, even if you don’t identify them as family members, co-workers or lovers
  • All this data, plus the elements Facebook lets you control, can potentially reveal everything from your wealth to whether you are depressed.
  • That level of precision is at the heart of Facebook’s recent troubles: Just because Facebook uses it to accomplish a seemingly innocent task—in Mr. Zuckerberg’s words, making ad “experiences better, and more relevant”— doesn’t mean we shouldn’t be worried.
  • Regulators the world over are coming to similar conclusions: Our personal data has become too sensitive—and too lucrative—to be left without restraints in the hands of self-interested corporations.
  • Facebook, Alphabet Inc.’s Google and a host of smaller companies that compete with and support the giants in the digital ad space have become addicted to the kind of information that helps microtarget ads.
Javier E

Elon Musk, Jeff Bezos, Mark Zuckerberg, and Bill Gates profited the most during the pan... - 0 views

  • The wealth of nine of the country’s top titans has increased by more than $360 billion in the past year.
  • Tesla’s Elon Musk more than quadrupled his fortune and jockeyed with Amazon’s Jeff Bezos for the title of world’s wealthiest person. Facebook’s Mark Zuckerberg topped $100 billion. Google co-founders Larry Page and Sergey Brin gained a combined $65 billion.
  • the $360 billion increase in top billionaire wealth approaches the $410 billion the U.S. government is spending on the latest round of $1,400 stimulus checks, passed with the $1.9 trillion pandemic relief package this week.
  • ...1 more annotation...
  • “In my view, we can no longer tolerate billionaires like Jeff Bezos, Mark Zuckerberg and Elon Musk becoming obscenely rich at a time of unprecedented economic pain and suffering,” Sen. Bernie Sanders (I-Vt.) said
rerobinson03

Trump Is Banned on Facebook 'at Least' Until His Term is Over - The New York Times - 0 views

  • Facebook on Thursday said it will block President Trump on its platforms at least until the end of his term on Jan. 20, as the mainstream online world moved forcefully to limit the president after years of inaction.
  • Mark Zuckerberg, Facebook’s chief executive, said in a post that the social network decided to cut off Mr. Trump because a rampage by pro-Trump supporters in the nation’s capital a day earlier, which was urged on by the president, showed that he “intends to use his remaining time in office to undermine the peaceful and lawful transition of power to his elected successor, Joe Biden.”
  • The actions were a striking change for a social media industry that has long declined to interfere with Mr. Trump’s posts, which were often filled with falsehoods and threats.
  • ...6 more annotations...
  • “We believe the risks of allowing the president to continue to use our service during this period are simply too great,” Mr. Zuckerberg wrote.
  • Lawmakers and even employees of the companies said the platforms had waited too long to take serious action against Mr. Trump.
  • Jack Dorsey, Twitter’s chief executive, spent Thursday morning liking and retweeting commentary that urged caution against a permanent ban of Mr. Trump. That suggested he would not deviate from the plan to allow Mr. Trump back onto the service.
  • At Facebook, that unwillingness changed on Wednesday after Mr. Trump egged on his supporters using social media and a mob stormed the Capitol building.
  • After Twitter locked Mr. Trump’s account late Wednesday, Mr. Zuckerberg approved removing two posts from the president’s Facebook page, the two people said. By that evening, Mr. Zuckerberg had decided to restrict Mr. Trump’s Facebook account for the rest of his term — and perhaps indefinitely, they said.
  • The social media companies’ clampdown extended beyond Mr. Trump. Twitter overnight permanently suspended Lin Wood, a lawyer who had used his account to promote the conspiracy theory QAnon and to urge on Wednesday’s mob. The company also removed a post from Dan Bongino, a conservative podcast host, on Thursday.
Javier E

Facebook Executives Shut Down Efforts to Make the Site Less Divisive - WSJ - 0 views

  • A Facebook Inc. team had a blunt message for senior executives. The company’s algorithms weren’t bringing people together. They were driving people apart.
  • “Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide from a 2018 presentation. “If left unchecked,” it warned, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”
  • That presentation went to the heart of a question dogging Facebook almost since its founding: Does its platform aggravate polarization and tribal behavior? The answer it found, in some cases, was yes.
  • ...27 more annotations...
  • in the end, Facebook’s interest was fleeting. Mr. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products.
  • At Facebook, “There was this soul-searching period after 2016 that seemed to me this period of really sincere, ‘Oh man, what if we really did mess up the world?’
  • Another concern, they and others said, was that some proposed changes would have disproportionately affected conservative users and publishers, at a time when the company faced accusations from the right of political bias.
  • Americans were drifting apart on fundamental societal issues well before the creation of social media, decades of Pew Research Center surveys have shown. But 60% of Americans think the country’s biggest tech companies are helping further divide the country, while only 11% believe they are uniting it, according to a Gallup-Knight survey in March.
  • Facebook policy chief Joel Kaplan, who played a central role in vetting proposed changes, argued at the time that efforts to make conversations on the platform more civil were “paternalistic,” said people familiar with his comments.
  • The high number of extremist groups was concerning, the presentation says. Worse was Facebook’s realization that its algorithms were responsible for their growth. The 2016 presentation states that “64% of all extremist group joins are due to our recommendation tools” and that most of the activity came from the platform’s “Groups You Should Join” and “Discover” algorithms: “Our recommendation systems grow the problem.”
  • In a sign of how far the company has moved, Mr. Zuckerberg in January said he would stand up “against those who say that new types of communities forming on social media are dividing us.” People who have heard him speak privately said he argues social media bears little responsibility for polarization.
  • Fixing the polarization problem would be difficult, requiring Facebook to rethink some of its core products. Most notably, the project forced Facebook to consider how it prioritized “user engagement”—a metric involving time spent, likes, shares and comments that for years had been the lodestar of its system.
  • Even before the teams’ 2017 creation, Facebook researchers had found signs of trouble. A 2016 presentation that names as author a Facebook researcher and sociologist, Monica Lee, found extremist content thriving in more than one-third of large German political groups on the platform.
  • Swamped with racist, conspiracy-minded and pro-Russian content, the groups were disproportionately influenced by a subset of hyperactive users, the presentation notes. Most of them were private or secret.
  • One proposal Mr. Uribe’s team championed, called “Sparing Sharing,” would have reduced the spread of content disproportionately favored by hyperactive users, according to people familiar with it. Its effects would be heaviest on content favored by users on the far right and left. Middle-of-the-road users would gain influence.
  • The Common Ground team sought to tackle the polarization problem directly, said people familiar with the team. Data scientists involved with the effort found some interest groups—often hobby-based groups with no explicit ideological alignment—brought people from different backgrounds together constructively. Other groups appeared to incubate impulses to fight, spread falsehoods or demonize a population of outsiders.
  • Mr. Pariser said that started to change after March 2018, when Facebook got in hot water after disclosing that Cambridge Analytica, the political-analytics startup, improperly obtained Facebook data about tens of millions of people. The shift has gained momentum since, he said: “The internal pendulum swung really hard to ‘the media hates us no matter what we do, so let’s just batten down the hatches.’ ”
  • Building these features and combating polarization might come at a cost of lower engagement, the Common Ground team warned in a mid-2018 document, describing some of its own proposals as “antigrowth” and requiring Facebook to “take a moral stance.”
  • Taking action would require Facebook to form partnerships with academics and nonprofits to give credibility to changes affecting public conversation, the document says. This was becoming difficult as the company slogged through controversies after the 2016 presidential election.
  • Asked to combat fake news, spam, clickbait and inauthentic users, the employees looked for ways to diminish the reach of such ills. One early discovery: Bad behavior came disproportionately from a small pool of hyperpartisan users.
  • A second finding was that in the U.S., the infrastructure of accounts and publishers was larger on the far right than on the far left. Outside observers were documenting the same phenomenon. The gap meant even seemingly apolitical actions such as reducing the spread of clickbait headlines—along the lines of "You Won't Believe What Happened Next"—affected conservative speech more than liberal content in aggregate.
  • Every significant new integrity-ranking initiative had to seek the approval of not just engineering managers but also representatives of the public policy, legal, marketing and public-relations departments.
  • “Engineers that were used to having autonomy maybe over-rotated a bit” after the 2016 election to address Facebook’s perceived flaws, she said. The meetings helped keep that in check. “At the end of the day, if we didn’t reach consensus, we’d frame up the different points of view, and then they’d be raised up to Mark.”
  • Disapproval from Mr. Kaplan’s team or Facebook’s communications department could scuttle a project, said people familiar with the effort. Negative policy-team reviews killed efforts to build a classification system for hyperpolarized content. Likewise, the Eat Your Veggies process shut down efforts to suppress clickbait about politics more than on other topics.
  • Under Facebook’s engagement-based metrics, a user who likes, shares or comments on 1,500 pieces of content has more influence on the platform and its algorithms than one who interacts with just 15 posts, allowing “super-sharers” to drown out less-active users
  • Accounts with hyperactive engagement were far more partisan on average than normal Facebook users, and they were more likely to behave suspiciously, sometimes appearing on the platform as much as 20 hours a day and engaging in spam-like behavior. The behavior suggested some were either people working in shifts or bots.
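A hedged sketch of what a "Sparing Sharing"-style counterweight might look like, assuming a simple sub-linear cap on per-user influence; the cap and formula are invented, since the reporting says only that the proposal would reduce hyperactive users' sway.

```python
# Illustrative sub-linear weighting so a hyperactive account cannot have
# 100x the ranking influence of a typical one. Constants are assumptions.
import math

def influence_weight(actions_per_day: int, typical: int = 15) -> float:
    """Full weight up to a typical activity level, logarithmic beyond it."""
    if actions_per_day <= typical:
        return float(actions_per_day)
    return typical + math.log1p(actions_per_day - typical)

print(influence_weight(15))    # 15.0
print(influence_weight(1500))  # ~22.3: a super-sharer, heavily damped
```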
  • “We’re explicitly not going to build products that attempt to change people’s beliefs,” one 2018 document states. “We’re focused on products that increase empathy, understanding, and humanization of the ‘other side.’ ”
  • The debate got kicked up to Mr. Zuckerberg, who heard out both sides in a short meeting, said people briefed on it. His response: Do it, but cut the weighting by 80%. Mr. Zuckerberg also signaled he was losing interest in the effort to recalibrate the platform in the name of social good, they said, asking that they not bring him something like that again.
  • Mr. Uribe left Facebook and the tech industry within the year. He declined to discuss his work at Facebook in detail but confirmed his advocacy for the Sparing Sharing proposal. He said he left Facebook because of his frustration with company executives and their narrow focus on how integrity changes would affect American politics
  • While proposals like his did disproportionately affect conservatives in the U.S., he said, in other countries the opposite was true.
  • The tug of war was resolved in part by the growing furor over the Cambridge Analytica scandal. In a September 2018 reorganization of Facebook’s newsfeed team, managers told employees the company’s priorities were shifting “away from societal good to individual value,” said people present for the discussion. If users wanted to routinely view or post hostile content about groups they didn’t like, Facebook wouldn’t suppress it if the content didn’t specifically violate the company’s rules.