Facebook Is a Doomsday Machine - The Atlantic
www.theatlantic.com/...617384
Facebook twitter social media internet algorithm smartphone crisis culture
shared by Javier E on 17 Dec 20
-
megadeath is not the only thing that makes the Doomsday Machine petrifying. The real terror is in its autonomy, this idea that it would be programmed to detect a series of environmental inputs, then to act, without human interference. “There is no chance of human intervention, control, and final decision,” wrote the military strategist Herman Kahn in his 1960 book, On Thermonuclear War, which laid out the hypothetical for a Doomsday Machine. The concept was to render nuclear war unwinnable, and therefore unthinkable.
-
so far, somewhat miraculously, we have figured out how to live with the bomb. Now we need to learn how to survive the social web.
-
There’s a notion that the social web was once useful, or at least that it could have been good, if only we had pulled a few levers: some moderation and fact-checking here, a bit of regulation there, perhaps a federal antitrust lawsuit. But that’s far too sunny and shortsighted a view.
-
Today’s social networks, Facebook chief among them, were built to encourage the things that make them so harmful. It is in their very architecture.
-
Megascale is nearly the existential threat that megadeath is. No single machine should be able to control the fate of the world’s population—and that’s what both the Doomsday Machine and Facebook are built to do.
-
Facebook does not exist to seek truth and report it, or to improve civic health, or to hold the powerful to account, or to represent the interests of its users, though these phenomena may be occasional by-products of its existence.
-
The company’s early mission was to “give people the power to share and make the world more open and connected.” Instead, it took the concept of “community” and sapped it of all moral meaning.
-
Facebook—along with Google and YouTube—is perfect for amplifying and spreading disinformation at lightning speed to global audiences.
-
Facebook decided that it needed not just a very large user base, but a tremendous one, unprecedented in size. That decision set Facebook on a path to escape velocity, to a tipping point where it can harm society just by existing.
-
No one, not even Mark Zuckerberg, can control the product he made. I’ve come to realize that Facebook is not a media company. It’s a Doomsday Machine.
-
Scale and engagement are valuable to Facebook because they’re valuable to advertisers. These incentives lead to design choices such as reaction buttons that encourage users to engage easily and often, which in turn encourage users to share ideas that will provoke a strong response.
-
Every time you click a reaction button on Facebook, an algorithm records it, and sharpens its portrait of who you are.
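The mechanism described in this excerpt — each reaction incrementally sharpening a profile of the user — can be illustrated with a minimal sketch. Everything here (the class, the per-reaction weights, the topic labels) is a hypothetical toy model for illustration, not Facebook's actual system:

```python
from collections import Counter

class UserProfile:
    """Toy model: each recorded reaction nudges a per-topic interest score."""

    # Hypothetical weights: stronger emotional reactions shift the profile more.
    REACTION_WEIGHTS = {"like": 1, "love": 2, "angry": 3}

    def __init__(self):
        self.interests = Counter()

    def record_reaction(self, topic, reaction):
        # Every click is logged and folded into the running portrait.
        self.interests[topic] += self.REACTION_WEIGHTS.get(reaction, 1)

    def top_interests(self, n=3):
        # The sharpened portrait: topics ranked by accumulated signal.
        return [topic for topic, _ in self.interests.most_common(n)]

profile = UserProfile()
profile.record_reaction("politics", "angry")
profile.record_reaction("politics", "angry")
profile.record_reaction("cooking", "like")
print(profile.top_interests())
```

After only three clicks, the toy profile already ranks "politics" well above "cooking" — the point of the excerpt is that at billions of clicks per day, this portrait becomes extraordinarily detailed.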
-
The hyper-targeting of users, made possible by reams of their personal data, creates the perfect environment for manipulation—by advertisers, by political campaigns, by emissaries of disinformation, and of course by Facebook itself, which ultimately controls what you see and what you don’t see on the site.
-
there aren’t enough moderators speaking enough languages, working enough hours, to stop the biblical flood of shit that Facebook unleashes on the world, because 10 times out of 10, the algorithm is faster and more powerful than a person.
-
At megascale, this algorithmically warped personalized informational environment is extraordinarily difficult to moderate in a meaningful way, and extraordinarily dangerous as a result.
-
These dangers are not theoretical, and they’re exacerbated by megascale, which makes the platform a tantalizing place to experiment on people
-
Even after U.S. intelligence agencies identified Facebook as a main battleground for information warfare and foreign interference in the 2016 election, the company has failed to stop the spread of extremism, hate speech, propaganda, disinformation, and conspiracy theories on its site.
-
it wasn’t until October of this year, for instance, that Facebook announced it would remove groups, pages, and Instagram accounts devoted to QAnon, as well as any posts denying the Holocaust.
-
In the days after the 2020 presidential election, Zuckerberg authorized a tweak to the Facebook algorithm so that high-accuracy news sources such as NPR would receive preferential visibility in people’s feeds, and hyper-partisan pages such as Breitbart News’s and Occupy Democrats’ would be buried, according to The New York Times. The episode offered proof that Facebook could, if it wanted to, turn a dial to reduce disinformation, and a reminder that Facebook has the power to flip a switch and change what billions of people see online.
-
reducing the prevalence of content that Facebook calls “bad for the world” also reduces people’s engagement with the site. In its experiments with human intervention, the Times reported, Facebook calibrated the dial so that just enough harmful content stayed in users’ news feeds to keep them coming back for more.
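The Times's account describes a tunable trade-off: a ranking "dial" that can suppress harmful content, deliberately set low enough to preserve engagement. A minimal sketch of such a knob — the scoring formula, field names, and numbers are illustrative assumptions, not Facebook's actual ranking function:

```python
def rank_score(engagement, harm_estimate, dial=0.0):
    """Blend predicted engagement with a 'bad for the world' penalty.

    dial=0.0 ranks purely on engagement; dial=1.0 fully applies the
    harm penalty. (Hypothetical formula, for illustration only.)
    """
    return engagement - dial * harm_estimate

posts = [
    {"id": "high-accuracy", "engagement": 0.4, "harm": 0.0},
    {"id": "hyper-partisan", "engagement": 0.9, "harm": 0.8},
]

for dial in (0.0, 1.0):
    ranked = sorted(
        posts,
        key=lambda p: rank_score(p["engagement"], p["harm"], dial),
        reverse=True,
    )
    print(dial, [p["id"] for p in ranked])
```

With the dial at 0.0 the hyper-partisan post wins on raw engagement; at 1.0 the high-accuracy source rises to the top. The excerpt's point is that the dial was set somewhere in between — just enough harm left in the feed to keep users coming back.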
-
Facebook’s stated mission—to make the world more open and connected—has always seemed, to me, phony at best, and imperialist at worst.
-
Facebook is a borderless nation-state, with a population of users nearly as big as China and India combined, and it is governed largely by secret algorithms
-
How much real-world violence would never have happened if Facebook didn’t exist? One of the people I’ve asked is Joshua Geltzer, a former White House counterterrorism official who is now teaching at Georgetown Law. In counterterrorism circles, he told me, people are fond of pointing out how good the United States has been at keeping terrorists out since 9/11. That’s wrong, he said. In fact, “terrorists are entering every single day, every single hour, every single minute” through Facebook.
-
Evidence of real-world violence can be easily traced back to both Facebook and 8kun. But 8kun doesn’t manipulate its users or the informational environment they’re in. Both sites are harmful. But Facebook might actually be worse for humanity.
-
In previous eras, U.S. officials could at least study, say, Nazi propaganda during World War II, and fully grasp what the Nazis wanted people to believe. Today, “it’s not a filter bubble; it’s a filter shroud,” Geltzer said. “I don’t even know what others with personalized experiences are seeing.”
-
Mary McCord, the legal director at the Institute for Constitutional Advocacy and Protection at Georgetown Law, told me that she thinks 8kun may be more blatant in terms of promoting violence but that Facebook is “in some ways way worse” because of its reach. “There’s no barrier to entry with Facebook,” she said. “In every situation of extremist violence we’ve looked into, we’ve found Facebook postings. And that reaches tons of people. The broad reach is what brings people into the fold and normalizes extremism and makes it mainstream.” In other words, it’s the megascale that makes Facebook so dangerous.
-
Facebook’s megascale gives Zuckerberg an unprecedented degree of influence over the global population. If he isn’t the most powerful person on the planet, he’s very near the top.
-
“The thing he oversees has such an effect on cognition and people’s beliefs, which can change what they do with their nuclear weapons or their dollars.”
-
Facebook’s new oversight board, formed in response to backlash against the platform and tasked with making decisions concerning moderation and free expression, is an extension of that power. “The first 10 decisions they make will have more effect on speech in the country and the world than the next 10 decisions rendered by the U.S. Supreme Court,” Geltzer said. “That’s power. That’s real power.”
-
Facebook is also a business, and a place where people spend time with one another. Put it this way: If you owned a store and someone walked in and started shouting Nazi propaganda or recruiting terrorists near the cash register, would you, as the shop owner, tell all of the other customers you couldn’t possibly intervene?
-
In 2004, Zuckerberg said Facebook ran advertisements only to cover server costs. But over the next two years Facebook completely upended and redefined the entire advertising industry. The pre-social web destroyed classified ads, but the one-two punch of Facebook and Google decimated local news and most of the magazine industry—publications fought in earnest for digital pennies, which had replaced print dollars, and social giants scooped them all up anyway.
-
localized approach is part of what made megascale possible. Early constraints around membership—the requirement at first that users attended Harvard, and then that they attended any Ivy League school, and then that they had an email address ending in .edu—offered a sense of cohesiveness and community. It made people feel more comfortable sharing more of themselves. And more sharing among clearly defined demographics was good for business.
-
in 2007, Zuckerberg said something in an interview with the Los Angeles Times that now takes on a much darker meaning: “The things that are most powerful aren’t the things that people would have done otherwise if they didn’t do them on Facebook. Instead, it’s the things that would never have happened otherwise.”
-
We’re still in the infancy of this century’s triple digital revolution of the internet, smartphones, and the social web, and we find ourselves in a dangerous and unstable informational environment, powerless to resist forces of manipulation and exploitation that we know are exerted on us but remain mostly invisible
-
The Doomsday Machine offers a lesson: We should not accept this current arrangement. No single machine should be able to control so many people.
-
we need a new philosophical and moral framework for living with the social web—a new Enlightenment for the information age, and one that will carry us back to shared reality and empiricism.
-
In other words, if the Dunbar number for running a company or maintaining a cohesive social life is 150 people, the magic number for a functional social platform is maybe 20,000 people. Facebook now has 2.7 billion monthly users.
-
we need to adopt a broader view of what it will take to fix the brokenness of the social web. That will require challenging the logic of today’s platforms—and first and foremost challenging the very concept of megascale as a way that humans gather.
-
The web’s existing logic tells us that social platforms are free in exchange for a feast of user data; that major networks are necessarily global and centralized; that moderators make the rules. None of that need be the case.
-
We need people who dismantle these notions by building alternatives. And we need enough people to care about those alternatives to break the spell of venture capital and mass attention that fuels megascale and creates fatalism about the web as it is now.
-
We must also find ways to repair the aspects of our society and culture that the social web has badly damaged. This will require intellectual independence, respectful debate, and the same rebellious streak that helped establish Enlightenment values centuries ago.
-
Right now, too many people are allowing algorithms and tech giants to manipulate them, and reality is slipping from our grasp as a result. This century’s Doomsday Machine is here, and humming along.