Group items tagged facebook

Javier E

Facebook Whistleblower's Testimony Builds Momentum for Tougher Tech Laws - WSJ - 0 views

  • “I saw Facebook repeatedly encounter conflicts between its own profit and our safety. Facebook consistently resolved these conflicts in favor of its own profits,” Ms. Haugen told a Senate consumer protection subcommittee. “As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. Until the incentives change, Facebook will not change.”
  • “There is no one currently holding Mark accountable but himself,” she said. Facebook under Mr. Zuckerberg makes decisions based on how they will affect measurements of user engagement, rather than their potential downsides for the public, she said.
  • “Mark has built an organization that is very metrics-driven,” she said. “The metrics make the decision. Unfortunately that itself is a decision.”
  • ...13 more annotations...
  • Sen. Richard Blumenthal (D., Conn.), the chairman of the subcommittee conducting Tuesday’s hearing, called on Mr. Zuckerberg to appear before Congress to testify, terming the company “morally bankrupt.”
  • Facebook has said it plans to continue doing internal research and is working on ways to make that work available to others. The company has recently battled with some academic researchers over access to its data, but Facebook says that it works cooperatively with many others.
  • Republican and Democratic lawmakers at the hearing renewed their calls for regulation, such as strengthening privacy and competition laws and special online protections for children, as well as toughening of the platforms’ accountability. One idea that got a particular boost was requiring more visibility into social-media data as well as the algorithms that shape users’ experiences.
  • “The severity of this crisis demands that we break out of previous regulatory frames,” she said. “Tweaks to outdated privacy protections…will not be sufficient.”
  • A good starting point, she added, would be “full access to data for research not directed by Facebook. On this foundation, we can build sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more.”
  • Ms. Haugen also raised national-security concerns about Facebook, citing foreign surveillance on the platform—for example, Chinese monitoring of Uyghur populations—and what she termed Facebook’s “consistent understaffing” of its counterintelligence teams.
  • Ms. Haugen made the case for policy changes to address her perceived concerns. In products such as cars and cigarettes, she said, independent researchers can evaluate health effects, but “the public cannot do the same with Facebook.”
  • “This inability to see into Facebook’s actual systems and confirm that they work as communicated is like the Department of Transportation regulating cars by only watching them drive down the highway,” she said, arguing for an independent government agency that would employ experts to audit the impact of social media.
  • She said that if Congress moves to change Section 230, the federal law that protects Facebook and other companies from liability for user-generated content, it should distinguish between that kind of content and choices that companies make about what type of content to promote.
  • “Facebook should not get a pass on choices it makes to prioritize virality and growth and reactiveness over public safety,” she said.
  • Ms. Haugen was hired by Facebook two years ago to help protect against election interference on the platform. She said she acted because she was frustrated by what she viewed as Facebook’s lack of openness about the platform’s potential for harm and its unwillingness to address its flaws.
  • “I would simply say, let’s get to work,” said Sen. John Thune (R., S.D.), who has sponsored several measures on algorithm transparency. “We’ve got some things we can do here.”
  • “There’s always reason for skepticism” about Congress reaching consensus on legislation, Mr. Blumenthal said after the hearing. But he added that “there are times when the dynamic is so powerful that something actually is done…I have rarely, if ever, seen the kind of unanimity on display today.”
Javier E

Facebook, in Cross Hairs After Election, Is Said to Question Its Influence - The New Yo... - 0 views

  • Facebook has been in the eye of a postelection storm for the last few days, embroiled in accusations that it helped spread misinformation and fake news stories that influenced how the American electorate voted.
  • — many company executives and employees have been asking one another if, or how, they shaped the minds, opinions and votes of Americans.
  • Some employees are worried about the spread of racist and so-called alt-right memes across the network, according to interviews with 10 current and former Facebook employees. Others are asking whether they contributed to a “filter bubble” among users who largely interact with people who share the same beliefs.
  • ...6 more annotations...
  • “A fake story claiming Pope Francis — actually a refugee advocate — endorsed Mr. Trump was shared almost a million times, likely visible to tens of millions,” Zeynep Tufekci, an associate professor at the University of North Carolina who studies the social impact of technology, said of a recent post on Facebook. “Its correction was barely heard. Of course Facebook had significant influence in this last election’s outcome.”
  • Chris Cox, a senior vice president of product and one of Mr. Zuckerberg’s top lieutenants, has long described Facebook as an unbiased and blank canvas to give people a voice.
  • “Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes,” Mr. Zuckerberg wrote. “Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”
  • Almost half of American adults rely on Facebook as a source of news, according to a study by the Pew Research Center. And Facebook often emphasizes to advertisers its ability to sway its users, portraying itself as an effective mechanism to help promote their products.
  • More recently, issues with fake news on the site have mushroomed. Multiple Facebook employees were particularly disturbed last week when a fake news site called The Denver Guardian spread across the social network with negative and false messages about Mrs. Clinton, including a claim that an F.B.I. agent connected to Mrs. Clinton’s email disclosures had murdered his wife and shot himself.
  • Even in private, Mr. Zuckerberg has continued to resist the notion that Facebook can unduly affect how people think and behave. In a Facebook post circulated on Wednesday to a small group of his friends, which was obtained by The New York Times, Mr. Zuckerberg challenged the idea that Facebook had a direct effect on the way people voted.
Javier E

Opinion | George Soros: Mark Zuckerberg Should Not Be in Control of Facebook - The New ... - 0 views

  • I believe that Mr. Trump and Facebook’s chief executive, Mark Zuckerberg, realize that their interests are aligned — the president’s in winning elections, Mr. Zuckerberg’s in making money.
  • In 2016, Facebook provided the Trump campaign with embedded staff who helped to optimize its advertising program. (Hillary Clinton’s campaign was also approached, but it declined to embed a Facebook team in her campaign’s operations.)
  • Brad Parscale, the digital director of Mr. Trump’s 2016 campaign and now his campaign manager for 2020, said that Facebook helped Mr. Trump and gave him the edge. This seems to have marked the beginning of a special relationship.
  • ...8 more annotations...
  • Mr. Zuckerberg met with Mr. Trump in the Oval Office on Sept. 19, 2019. We don’t know what was said. But from an interview on the sidelines at the World Economic Forum on Jan. 22, we do know what Mr. Trump said about the meeting: Mr. Zuckerberg “told me that I’m No. 1 in the world in Facebook.”
  • Mr. Trump apparently had no problem with Facebook’s decision not to fact-check political ads. “I’d rather have him just do whatever he is going to do,” Mr. Trump said of Mr. Zuckerberg. “He’s done a hell of a job, when you think of it.”
  • Facebook’s decision not to require fact-checking for political candidates’ advertising in 2020 has flung open the door for false, manipulated, extreme and incendiary statements. Such content is rewarded with prime placement and promotion if it meets Facebook-designed algorithmic standards for popularity and engagement.
  • Facebook’s design tends to obscure the sources of inflammatory and false content, and fails to adequately punish those who spread false information. Nor does the company effectively warn those who are exposed to lies.
  • Facebook has been used to cause worse damage in other countries than the United States. In Myanmar, for example, military personnel used Facebook to help incite the public against the Rohingya, who were targeted in a military assault of incredible cruelty including murder, rape and the burning of entire villages: Around 700,000 Rohingya fled to Bangladesh.
  • within the last year, Facebook has introduced new features on its mobile app that actually intensify the fire of incendiary political attacks — making them easier and quicker to propagate. The system is cost-free to the poster and revenue-generating for Facebook.
  • Facebook is a publisher, not just a neutral moderator or “platform.” It should be held accountable for the content that appears on its site
  • I repeat and reaffirm my accusation against Facebook under the leadership of Mr. Zuckerberg and Ms. Sandberg. They follow only one guiding principle: maximize profits irrespective of the consequences
Javier E

Opinion | The Real Reason Facebook Won't Fact-Check Political Ads - The New York Times - 0 views

  • Facebook’s decision to refrain from policing the claims of political ads is not unreasonable. But the company’s officers have been incompetent at explaining and defending this decision.
  • If Facebook’s leaders were willing to level with us, they would stop defending themselves by appealing to lofty values like free speech
  • They would focus instead on more practical realities: Facebook is incapable of vetting political ads effectively and consistently at the global scale. And political ads are essential to maintaining the company’s presence in countries around the world.
  • ...18 more annotations...
  • The truth or falsity of most political ads is not so easy.
  • During Game 7 of the World Series on Wednesday, the Trump campaign ran a television ad claiming that he has created six million jobs and half a million manufacturing jobs. Is that statement true or false? Was there a net gain of 500,000 more manufacturing jobs in the United States since Jan. 20, 2017? Or is that a gross number, waiting to be reduced by some number of manufacturing jobs lost?
  • Is the ad’s use of the active voice, saying that President Trump is creating those jobs, honest? Is Mr. Trump directly responsible? Or did the momentum of the economic recovery since 2010 push manufacturers to add those positions? Should Facebook block the ad if one of seven claims is false? Vetting such claims takes time and effort, and might not be possible at all.
  • Facebook could also defend political ads by conceding that it must continue the practice to maintain its status and markets
  • Ad fact-checking can’t be done consistently in the United States. It definitely can’t be done at a global scale — 2.8 billion users of all Facebook-owned services posting in more than 100 languages
  • Given the task of policing for truth on Facebook, it’s unrealistic and simplistic to demand veracity from a system that is too big to govern.
  • Might Facebook ban political ads altogether, like Twitter has? Mr. Zuckerberg could concede that it’s not an easy task. What’s not political? If an ad calling for a carbon tax is political, is an ad promoting the reputation of an oil company political?
  • Those are the false positives we know of. We have no idea how many false negatives Facebook has let slip through.
  • imagine Facebook’s contracted fact checkers doing that sort of research and interrogation for millions of ads from 22 presidential candidates in the United States, from candidates for 35 Senate seats, 435 House of Representatives seats and thousands of state legislative races.
  • Over all, Facebook has no incentive to stop carrying political ads. Its revenue keeps growing despite a flurry of scandals and mistakes. So its leaders would lose little by being straight with the public about its limitations and motives. But they won’t. They will continue to defend their practices in disingenuous ways until we force them to change their ways.
  • We should know better than to demand of Facebook’s leaders that they do what is not in the best interests of the company. Instead, citizens around the world should demand effective legislation that can curb Facebook’s power.
  • The key is to limit data collection and the use of personal data to ferry ads and other content to discrete segments of Facebook users — the very core of the Facebook business model.
  • here’s something Congress could do: restrict the targeting of political ads in any medium to the level of the electoral district of the race. Tailoring messages for African-American voters, men or gun enthusiasts would still be legal, as this rule would not govern content. But people not in those groups would see those tailored messages as well and could learn more about their candidates.
  • Currently, two people in the same household can receive different ads from the same candidate running for state senate. That means a candidate can lie to one or both voters and they might never know about the other’s ads. This data-driven obscurity limits accountability and full deliberation.
  • A reason to be concerned about false claims in ads is that Facebook affords us so little opportunity to respond to ads not aimed at us personally. This proposal would limit that problem.
  • The overall regulatory goal should be to install friction into the system of targeted digital political ads
  • This process would not be easy, as political incumbents and powerful corporations that sell targeted ads (not just Facebook and Google, but also Verizon, AT&T, Comcast and The New York Times, for example) are invested in the status quo.
  • We can’t expect corporate leaders to do anything but lead their corporations. We can’t expect them to be honest with us, either. We must change their businesses for them so they stop undermining our democracies.
Javier E

Facebook Has 50 Minutes of Your Time Each Day. It Wants More. - The New York Times - 0 views

  • Fifty minutes. That’s the average amount of time, the company said, that users spend each day on its Facebook, Instagram and Messenger platforms
  • there are only 24 hours in a day, and the average person sleeps for 8.8 of them. That means more than one-sixteenth of the average user’s waking time is spent on Facebook.
  • That’s more than any other leisure activity surveyed by the Bureau of Labor Statistics, with the exception of watching television programs and movies (an average per day of 2.8 hours)
  • ...19 more annotations...
  • It’s more time than people spend reading (19 minutes); participating in sports or exercise (17 minutes); or social events (four minutes). It’s almost as much time as people spend eating and drinking (1.07 hours).
  • the average time people spend on Facebook has gone up — from around 40 minutes in 2014 — even as the number of monthly active users has surged. And that’s just the average. Some users must be spending many hours a day on the site,
  • Time is the best measure of engagement, and engagement correlates with advertising effectiveness. Time also increases the supply of impressions that Facebook can sell, which brings in more revenue (a 52 percent increase last quarter to $5.4 billion).
  • time has become the holy grail of digital media.
  • And time enables Facebook to learn more about its users — their habits and interests — and thus better target its ads. The result is a powerful network effect that competitors will be hard pressed to match.
  • the only one that comes close is Alphabet’s YouTube, where users spent an average of 17 minutes a day on the site. That’s less than half the 35 minutes a day users spent on Facebook
  • Users spent an average of nine minutes on all of Yahoo’s sites, two minutes on LinkedIn and just one minute on Twitter
  • People spending the most time on Facebook also tend to fall into the prized 18-to-34 demographic sought by advertisers.
  • “You hear a narrative that young people are fleeing Facebook. The data show that’s just not true. Younger users have a wider appetite for social media, and they spend a lot of time on multiple networks. But they spend more time on Facebook by a wide margin.”
  • What aren’t Facebook users doing during the 50 minutes they spend there? Is it possibly interfering with work (and productivity), or, in the case of young people, studying and reading?
  • While the Bureau of Labor Statistics surveys nearly every conceivable time-occupying activity (even fencing and spelunking), it doesn’t specifically tally the time spent on social media, both because the activity may have multiple purposes — both work and leisure — and because people often do it at the same time they are ostensibly engaged in other activities
  • The closest category would be “computer use for leisure,” which has grown from eight minutes in 2006, when the bureau began collecting the data, to 14 minutes in 2014, the most recent survey. Or perhaps it would be “socializing and communicating with others,” which slipped from 40 minutes to 38 minutes.
  • But time spent on most leisure activities hasn’t changed much in those eight years of the bureau’s surveys. Time spent reading dropped from an average of 22 minutes to 19 minutes. Watching television and movies increased from 2.57 hours to 2.8. Average time spent working declined from 3.4 hours to 3.25. (Those hours seem low because much of the population, which includes both young people and the elderly, does not work.)
  • The bureau’s numbers, since they cover the entire population, may be too broad to capture important shifts among key demographic groups
  • ComScore reported that television viewing (both live and recorded) dropped 2 percent last year, and it said younger viewers in particular are abandoning traditional live television. People ages 18-34 spent just 47 percent of their viewing time on television screens, and 40 percent on mobile devices.
  • Among those 55 and older, 70 percent of their viewing time was on television, according to comScore. So among young people, much social media time may be coming at the expense of traditional television.
  • comScore’s data suggests that people are spending on average just six to seven minutes a day using social media on their work computers. “I don’t think Facebook is displacing other activity,” he said. “People use it during downtime during the course of their day, in the elevator, or while commuting, or waiting.”
  • Facebook, naturally, is busy cooking up ways to get us to spend even more time on the platform
  • A crucial initiative is improving its News Feed, tailoring it more precisely to the needs and interests of its users, based on how long people spend reading particular posts. For people who demonstrate a preference for video, more video will appear near the top of their news feed. The more time people spend on Facebook, the more data they will generate about themselves, and the better the company will get at the task.
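
As a rough illustration of the dwell-time personalization described in the last excerpt, the sketch below boosts the content types a user lingers on longest. The scoring scheme, boost factor, and all names are assumptions made for illustration, not Facebook's actual ranking system.

```python
# Hypothetical sketch of dwell-time-based feed personalization.
# The boost formula and names are illustrative assumptions.

from collections import defaultdict
from statistics import mean

def dwell_preferences(view_log):
    """Average seconds spent per content type ('video', 'photo', 'link', ...)."""
    by_type = defaultdict(list)
    for content_type, seconds in view_log:
        by_type[content_type].append(seconds)
    return {t: mean(secs) for t, secs in by_type.items()}

def rank_feed(candidates, prefs):
    """Order candidate posts (post_id, content_type, base_score) so that
    types the user lingers on longest float toward the top."""
    def score(item):
        _, content_type, base = item
        return base * (1.0 + prefs.get(content_type, 0.0) / 60.0)  # assumed boost
    return [post_id for post_id, _, _ in sorted(candidates, key=score, reverse=True)]

# A user who watches videos far longer than they read links sees video ranked first.
prefs = dwell_preferences([("video", 45.0), ("video", 60.0), ("link", 5.0)])
print(rank_feed([("p1", "link", 1.2), ("p2", "video", 1.0)], prefs))
```
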
Javier E

Facebook Is Not the Town Square - The Bulwark - 0 views

  • everyone knows that Facebook is just our new, digital Town Square, right? You can’t blame Facebook if it’s just a distillation of all our worst and best impulses.
  • Except that it’s not.
  • Have you ever been to an actual town and visited its square?
  • ...17 more annotations...
  • On Facebook, you have to endure anonymous abuse of this nature with absolutely no recourse other than to hit the “report” button and hope that some community standards drone, somewhere, suspends the offending account for a couple days.
  • In a real town square, you can see people’s faces and usually you know them already.
  • On Facebook, you’re dumped into a group of “friends” you’ve never met, or interacted with—many of whom might not even be actual human beings. And the only help you get in determining social context is a combination of text, emojis, and gifs.
  • On Facebook, you may be arguing with hired Russian trolls who are actively employed by Vladimir Putin to sow discord in the world.
  • In a real town square, if someone claims they plan to assault you and your family you can punch their physical face with your physical fist.
  • In a real town square, you can fit, at most, a few hundred people. If you’re in the square of a giant, world-historic city in Russia or China, you might be able to squeeze in 600,000 people. On Facebook, you have . . . everyone on planet Earth!
  • In a real town square, people who insist that COVID vaccines are filled with mind control nanobots and that the Jews are enslaving children are relegated to the fringes.
  • Facebook follows you everywhere—like a psychotic ex. It’s always hiding in the bushes and you have no choice but to wonder what the heck it’s up to right now,
  • In a real town square, if the entire town became convinced that their mayor is the Christ risen and decided that they needed to stockpile AK-47s in preparation for the apocalypse then visitors to the town would quietly leave (and warn the authorities).
  • On Facebook, those folks get a guest pass to every other Town Square—again in the world—and are free to go around preaching their lunacy to others without being constrained by space, time, or economics.
  • In a real town square, the town doesn’t benefit financially by attracting the stupidest/craziest/most pernicious townfolk to the soapbox and then doing everything in their power to make sure the residents of the town are afraid to leave the square for fear of missing something truly terrible.
  • Facebook makes approximately all its money by getting you to rubber-neck through your day as you slow-roll past trainwreck after trainwreck.
  • In the real world, the Town Square stays (as the name suggests) in TOWN!
  • On Facebook, they are brought together into powerful collectives, afforded megaphones, and algorithmically ushered into everyone’s sphere of influence to corrupt otherwise rational and healthy discourse.
  • But in fairness, there is one aspect of the town square metaphor where Facebook is a pretty decent facsimile of the real thing. It’s the one where we used to drag innocent people to a gallows, accuse them of something completely insane, like “witchcraft,” and then either ruin or end their lives.
  • Facebook is pretty good at that because its fortunes are made by making sure that you keep coming back—it doesn’t matter if you’re sharing pics of kitties, contributing to a genocide, or part of an angry mob that’s destroying someone’s livelihood or reputation on some fanciful whim.
  • What’s another real-world concept where a private unregulated enterprise gets to make a fortune running psychological experiments on the population of the world, which leads to a slow collapse of civil and civic order and drives everyone insane? I’m actually drawing a blank—but it sure as hell isn’t a “town square.”
Javier E

The Mark Zuckerberg Manifesto: Great for Facebook, Bad for Journalism - The Atlantic - 0 views

  • 85 percent of all online advertising revenue is funneled to either Facebook or Google—leaving a paltry 15 percent for news organizations to fight over.
  • Now, Zuckerberg is making it clear that he wants Facebook to take over many of the actual functions—not just ad dollars—that traditional news organizations once had.
  • Zuckerberg uses abstract language in his memo—he wants Facebook to develop “the social infrastructure for community,” he writes—but what he’s really describing is building a media company with classic journalistic goals: The Facebook of the future, he writes, will be “for keeping us safe, for informing us, for civic engagement, and for inclusion of all.”
  • ...16 more annotations...
  • In the past, the deaths of news organizations have jeopardized the prospect of a safe, well-informed, civically-engaged community
  • One 2014 paper found a substantial drop-off in civic engagement in both Seattle and Denver from 2008 to 2009, after both cities saw the closure of longstanding daily newspapers
  • The problem is that Zuckerberg lays out concrete ideas about how to build community on Facebook, how to encourage civic engagement, and how to improve the quality and inclusiveness of discourse—but he bakes in an assumption that news, which has always been subsidized by the advertising dollars his company now commands, will continue to feed into Facebook’s system at little to no cost to Facebook
  • In some ways, Zuckerberg is building a news organization without journalists. The uncomfortable truth for journalists, though, is that Facebook is much better at community building in the digital age than news organizations are.
  • Facebook is asking its users to act as unpaid publishers and curators of content
  • for context: The Japanese newspaper Yomiuri Shimbun claims that its circulation of 9 million copies daily makes it the largest in the world
  • In the United States, the combined daily prime time average viewership for CNN, Fox News, and MSNBC was 3.1 million people in 2015,
  • The New York Times had about 1.6 million digital subscribers as of last fall.
  • you can see how Zuckerberg is continuing to push Facebook’s hands-off approach to editorial responsibility. Facebook is outsourcing its decision-making power about what’s in your News Feed. Instead of the way a newspaper editor decides what’s on the front page, the user will decide.
  • “For those who don’t make a decision, the default will be whatever the majority of people in your region selected, like a referendum,” Zuckerberg wrote. Which makes some sense. There are all kinds of issues with an American company imposing its cultural values uniformly on 1.9 billion individuals all over the world.
  • Last quarter, Facebook counted nearly 1.9 billion monthly active users.
  • and now also to act as unpaid editors, volunteering to teach Facebook’s algorithmic editors how and when to surface the content Facebook does not pay for.
  • In other words, Facebook is building a global newsroom run by robot editors and its own readers.
  • he must also realize that what he’s building is a grave threat to journalism
  • Lip service to the crucial function of the Fourth Estate is not enough to sustain it. All of this is the news industry’s problem; not Zuckerberg’s. But it’s also a problem for anyone who believes in and relies on quality journalism to make sense of the world.
  • Zuckerberg doesn’t want Facebook to kill journalism as we know it. He really, really doesn’t. But that doesn’t mean he won’t.
sidneybelleroche

Facebook whistleblower '60 Minutes' interview: Frances Haugen says the company prioriti... - 0 views

  • The identity of the Facebook whistleblower who released tens of thousands of pages of internal research and documents — leading to a firestorm for the social media company in recent weeks — was revealed on "60 Minutes" Sunday night as Frances Haugen.
  • The 37-year-old former Facebook product manager who worked on civic integrity issues at the company says the documents show that Facebook knows its platforms are used to spread hate, violence and misinformation
  • "Facebook over and over again chose to optimize for its own interests, like making more money," Haugen told "60 Minutes."
  • ...10 more annotations...
  • Haugen filed at least eight complaints with the Securities and Exchange Commission alleging that the company is hiding research about its shortcomings from investors and the public.
  • Haugen, who started at Facebook in 2019 after previously working for other tech giants like Google and Pinterest, is set to testify on Tuesday before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security.
  • Facebook has aggressively pushed back against the reports, calling many of the claims "misleading" and arguing that its apps do more good than harm.
  • Facebook spokesperson Lena Pietsch said in a statement to CNN Business immediately following the "60 Minutes" interview: "We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true."
  • Pietsch released a more than 700-word statement laying out what it called "missing facts" from the segment
  • Haugen said she believes Facebook Founder and CEO Mark Zuckerberg "never set out to make a hateful platform, but he has allowed choices to be made where the side effects of those choices are that hateful and polarizing content gets more distribution and more reach."
  • Haugen said she was recruited by Facebook in 2019 and took the job to work on addressing misinformation. But after the company decided to dissolve its civic integrity team shortly after the 2020 Presidential Election, her feelings about the company started to change.
  • The social media company's algorithm that's designed to show users content that they're most likely to engage with is responsible for many of its problems
  • Haugen said that while "no one at Facebook is malevolent ... the incentives are misaligned."
  • "the more anger that they get exposed to, the more they interact and the more they consume."
Javier E

Conservatives Accuse Facebook of Political Bias - The New York Times - 0 views

  • Facebook scrambled on Monday to respond to a new and startling line of attack: accusations of political bias.
  • Gizmodo, which said that Facebook’s team in charge of the site’s “trending” list had intentionally suppressed articles from conservative news sources. The social network uses the trending feature to indicate the most popular news articles of the day to users.
  • The journalist Glenn Greenwald, hardly a conservative ally, weighed in on Twitter: “Aside from fueling right-wing persecution, this is a key reminder of dangers of Silicon Valley controlling content.”
  • ...9 more annotations...
  • The back-and-forth highlights the extent to which Facebook has now muscled its way into America’s political conversation — and the risks that the company faces as it becomes a central force in news consumption and production.
  • 63 percent of Facebook’s users considered the service a news source.
  • In April, Facebook embraced this role openly, releasing a video to implore people to search Facebook to discover “the other side of the story.” Politicians have increasingly shared their messages through the social network.
  • Facebook’s data scientists analyzed how 10.1 million of the most partisan American users on the social network navigated the site over a six-month period. They found that people’s networks of friends and the articles they saw were skewed toward their ideological preferences — but that the effect was more limited than the worst case some theorists had predicted, in which people would see almost no information from the other side.
  • While Facebook has pledged to sponsor both the Democratic and Republican national conventions, the company’s top executives have not been shy about expressing where their political sympathies lie.
  • Facebook has long described its trending feature as largely automatic. “The topics you see are based on a number of factors including engagement, timeliness, pages you’ve liked and your location,” according to a description on Facebook’s site.
  • The trending feature is curated by a team of contract employees,
  • Any “suppression,” the former employees said, was based on perceived credibility — any articles judged by curators to be unreliable or poorly sourced, whether left-leaning or right-leaning, were avoided, though this was a personal judgment call.
  • According to a report last year by Pew, only 17 percent surveyed said that technology companies had a negative influence on the country. For the news media, that number was 65 percent — and rising.
katyshannon

All Flags Facebook profile picture converter - Tech Insider - 0 views

  • In light of Friday's attacks on Paris, Facebook activated a feature allowing people to superimpose the French flag over their profile picture.
  • Some railed against this idea since the feature was not provided after an attack by ISIS on Lebanon just one day earlier.
  • One site called LunaPics began offering users the option to convert their profile picture into a show of support for Lebanese victims instead.  Now a website called the "All Flags Profile Photo Converter" cropped up.
  • ...5 more annotations...
  • The seemingly tongue-in-cheek site states "Show your support to all countries attacked by ISIS, add all their flags to your Facebook profile photo."
  • Upload your photo, click "convert," and voila! A mash-up of 17 different countries' flags will automatically be pasted over your picture. 
  • Charlotte Farhan's refusal to change her picture went viral when she posted the following caption: “If I did this for only Paris this would be wrong,” she wrote. “If I did this for every attack on the world, I would have to change my profile every day several times a day.” Now folks only have to change it this one time, if they so choose.
  • Though initial backlash focused on the Lebanon attacks, "All Flags" has broadened the support to every country victimized by ISIS. That way no one can be offended about someone else's Facebook profile picture.
  • Here's what the site says: Syria, Iraq, Turkey, Lebanon, Pakistan, Yemen, Nigeria, Cameroon, Bahrain, Russia, France, Egypt, Algeria, Afghan, Libya, Chad, Kenya. The country list is based on our research. We are not experts, so please help us complete the list. Contact us and we'll add them.
  • A new Facebook filter was created after the controversy over the French flag filter.
knudsenlu

Donald Trump Was the 'Perfect Candidate' for Facebook - The Atlantic - 0 views

  • Here is the central tenet of Facebook’s business: If lots of people click on, comment on, or share an ad, Facebook charges that advertiser less money to reach people. The platform is a brawl for user attention, and Facebook sees a more engaging ad as a better ad, which should be shown to more users.
  • And yet, in the context of the 2016 Presidential Election, this way of auctioning advertising—originally developed by Google and normalized in the pre-Trump age—can seem strange, unfair, and possibly even against the rules that govern election advertising.
  • Trump, of course, was the canny marketer, while Clinton’s team was the unengaging competitor. While most everyone covering the digital portion of the election has known this, the logical conclusion that follows can still feel startling.
  • ...5 more annotations...
  • “During the run-up to the election, the Trump and Clinton campaigns bid ruthlessly for the same online real estate in front of the same swing-state voters. But because Trump used provocative content to stoke social-media buzz, and he was better able to drive likes, comments, and shares than Clinton, his bids received a boost from Facebook’s click model, effectively winning him more media for less money,” García Martínez continues. “In essence, Clinton was paying Manhattan prices for the square footage on your smartphone’s screen, while Trump was paying Detroit prices. Facebook users in swing states who felt Trump had taken over their news feeds may not have been hallucinating.”
  • Trump was a socialgenic candidate with a team that maximized—or exploited—his potential to create engagement: As dozens of stories have attested over the last two years, Trump was the “clickbait candidate.” Clinton’s posts and advertisements, for whatever basket of reasons, did not generate the same volume of likes, clicks, and shares. And in today’s electioneering, that has severe consequences.
  • From Facebook’s perspective, their platform is “neutral,” in the sense that it provides all advertisers with an equal opportunity to maximize their reach and minimize their costs. “The auction system works the same for everybody,” says Andy Stone, a Facebook spokesperson. “It affords equality of opportunity.”
  • Their personal politics mattered far less than the politics of the system that they half-wittingly created. While the clickbait candidate this last round was Donald Trump, future elections could just as easily feature a left-wing ideologue with an equally engaging style.
  • The University of Virginia media-studies professor Siva Vaidhyanathan, who has a book coming out on Facebook in September—Antisocial Media: How Facebook Disconnects Us and Undermines Democracy—had a stark response, especially with the midterms six months away. “There is no reform. The problem with Facebook is Facebook,” he told me. “When you marry a friction-free social network of 2 billion people to a powerful, precise, cheap ad system that runs on user profiling you get this mess. And no one can switch it off. So we are screwed.”
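
A minimal sketch of the engagement-weighted auction the excerpts above describe: an ad's effective bid is its monetary bid multiplied by a predicted engagement rate, so a more engaging ad can win the same impression for less money. The scoring formula, the second-price charging rule, and all names here are illustrative assumptions, not Facebook's actual system.

```python
# Hypothetical sketch of an engagement-weighted ad auction.
# The formula, charging rule, and names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Ad:
    advertiser: str
    bid: float                   # dollars offered per impression
    predicted_engagement: float  # estimated probability of a click/share/comment

def effective_bid(ad: Ad) -> float:
    """Rank ads by monetary bid weighted by predicted engagement."""
    return ad.bid * ad.predicted_engagement

def run_auction(ads):
    """Return the winning ad and the price it pays.

    Assumes a generalized second-price rule: the winner pays just enough
    to beat the runner-up's effective bid, scaled by its own engagement.
    """
    ranked = sorted(ads, key=effective_bid, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    price = effective_bid(runner_up) / winner.predicted_engagement
    return winner, price

ads = [
    Ad("provocative_campaign", bid=1.00, predicted_engagement=0.08),
    Ad("conventional_campaign", bid=1.50, predicted_engagement=0.03),
]
winner, price = run_auction(ads)
print(winner.advertiser, round(price, 3))  # the more engaging ad wins, and pays less
```

In this toy run the lower monetary bid wins the impression at roughly $0.56 because its higher predicted engagement inflates its effective bid, which is the "Manhattan prices versus Detroit prices" effect García Martínez describes.
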
Javier E

Students Protest Intro Humanities Course at Reed - The Atlantic - 0 views

  • Of the 25 demands issued by RAR that day, the largest section was devoted to reforming Humanities 110.
  • outrage has been increasingly common in the course, Humanities 110, over the past 13 months. On September 26, 2016, the newly formed RAR organized a boycott of all classes in response to a Facebook post from the actor Isaiah Washington
  • A required year-long course for freshmen, Hum 110 consists of lectures that everyone attends and small break-out classes “where students learn how to discuss, debate, and defend their readings.” It’s the heart of the academic experience at Reed, which ranks second for future Ph.D.s in the humanities and fourth in all subjects.
  • ...28 more annotations...
  • As Professor Peter Steinberger details in a 2011 piece for Reed magazine, “What Hum 110 Is All About,” the course is intended to train students whose “primary goal” is “to engage in original, open-ended, critical inquiry.”
  • But for RAR, Hum 110 is all about oppression. “We believe that the first lesson that freshmen should learn about Hum 110 is that it perpetuates white supremacy—by centering ‘whiteness’ as the only required class at Reed,” according to a RAR statement delivered to all new freshmen
  • The texts that make up the Hum 110 syllabus—from the ancient Mediterranean, Mesopotamia, Persia, and Egypt regions—are “Eurocentric,” “Caucasoid,” and thus “oppressive,” RAR leaders have stated. Hum 110 “feels like a cruel test for students of color,” one leader remarked on public radio. “It traumatized my peers.”
  • Reed is home to the most liberal student body of any college, according to The Princeton Review. It’s also ranked the second most-studious—a rigor inculcated in Hum 110.
  • A major crisis for Reed College started when RAR put those core qualities—social justice and academic study—on a collision course.
  • Beginning on boycott day, RAR protested every single Hum lecture that school year.
  • A Hum protest is visually striking: Up to several dozen RAR supporters position themselves alongside the professor and quietly hold signs reading “We demand space for students of color,” “We cannot be erased,” “Fuck Hum 110,” “Stop silencing black and brown voices; the rest of society is already standing on their necks,” and so on. The signs are often accompanied by photos of black Americans killed by police.
  • One of the first Hum professors to request that RAR not occupy the classroom was Lucía Martínez Valdivia, who said her preexisting PTSD would make it difficult to face protesters. In an open letter, RAR offered sympathy to Martínez Valdivia but then accused her of being anti-black, discriminating against those with disabilities, and engaging in gaslighting—without specifying those charges. When someone asked for specifics, a RAR leader replied, “Asking for people to display their trauma so that you feel sufficiently satisfied is a form of violence.”
  • But another RAR member did offer a specific via Facebook: “The appropriation of AAVE [African American Vernacular English] on her shirt during lecture: ‘Poetry is lit’ is a form of anti-blackness.”
  • During Martínez Valdivia’s lecture on Sappho, protesters sat together in the seats wearing all black; they confronted her after class, with at least one of them yelling at the professor about her past trauma, bringing her to tears. “I am intimidated by these students,” Martínez Valdivia later wrote, noting she is “scared to teach courses on race, gender, or sexuality, or even texts that bring these issues up in any way—and I am a gay mixed-race woman.” Such fear, she revealed in an op-ed for The Washington Post, prompted some of her colleagues— “including people of color, immigrants, and those without tenure”—to avoid lecturing altogether.
  • what about the majority of students not in RAR? I spoke with a few dozen of them to get an understanding of what campus was like last year, and a clear pattern emerged: intimidation, stigma, and silence when it came to discussing Hum 110, or racial politics in general.
  • Raphael, the founder of the Political Dissidents Club, warned incoming students over Facebook that “Reed’s culture can be stifling/suffocating and narrow minded.”
  • The most popular public forum at Reed is Facebook, where social tribes coalesce and where the most emotive and partisan views get the most attention. “Facebook conversations at Reed bring out the extreme aspects of political discourse on campus,” said Yuta, a sophomore who recently co-founded a student group, The Thinkery, “dedicated to critical and open discussion.”
  • In mid-April, when students were studying for finals, a RAR leader grew frustrated that more supporters weren’t showing up to protest Hum 110. In a post viewable only to Reed students, the leader let loose: To all the white & able(mentally/physically) who don’t come to sit-ins(ever, anymore, rarely): all i got is shade for you. [... If] you ain’t with me, then I will accept that you are against me. There’s 6 hums left, I best be seein all u phony ass white allies show-up. […] How you gonna be makin all ur white supremacy messes & not help clean-up your own community by coming and sitting for a frickin hour & still claim that you ain’t a laughin at a lynchin kinda white.
  • Nonwhite students weren’t spared; a group of them agreed to “like” Patrick’s comment in a show of support. A RAR member demanded those “non-black pocs [people of color]” explain themselves, calling them “anti-black pos [pieces of shit].”
  • As tensions continued to mount, one student decided to create an online forum to debate Hum 110. Laura, a U.S. Army veteran who served twice in Afghanistan, named the Facebook page “Reed Discusses Hum 110.” But it seemed like people didn’t want to engage publicly:
  • Another student wrote to Laura in a private message, “I'm coming into this as a ‘POC’ but I disagree with everything [RAR has been] saying for a long time [and] it feels as if it isn't safe for anyone to express anything that goes against what they're saying.”
  • Laura could relate—her father “immigrated from Syria and was brown”—so she stood in front of Hum 110 just before class to distribute an anonymous survey to gauge opinions about the protests, an implicit rebuke to RAR. Laura, who lives in the neighboring city of Beaverton, said she saw this move as risky. “I would’ve rethought what I did had I lived on campus,” she said.
  • If Facebook is no place to debate Hum 110, what about the printed page? Not so much: During the entire 2016–17 school year, not a single op-ed or even a quote critical of RAR’s methods—let alone goals—was published in the student newspaper, according to a review of archived issues. The only thing that comes close?
  • The student magazine, The Grail, did publish a fair amount of dissent over RAR—but almost all anonymously
  • This school year, students are ditching anonymity and standing up to RAR in public—and almost all of them are freshmen of color
  • The pushback from freshmen first came over Facebook. “To interrupt a lecture in a classroom setting is in serious violation of academic freedom and is just unthoughtful and wrong,” wrote a student from China named Sicheng, who distributed a letter of dissent against RAR. Another student, Isabel, ridiculed the group for its “unsolicited emotional theater.”
  • I met the student who shot the video. A sophomore from India, he serves as a mentor for international students. (He asked not to be identified by name.) “A lot of them told me how disappointed they were—that they traveled such a long distance to come to this school, and worked so hard to get to this school, and their first lecture was canceled,” he said. He also recalled the mood last year for many students of color like himself: “There was very much a standard opinion you had to have [about RAR], otherwise people would look at you funny, and some people would say stuff to you—a lot of people were called ‘race traitors.’”
  • Another student from India, Jagannath, responded to the canceled lecture by organizing a freshmen-only meeting on the quad. “For us to rise out of this culture of private concerns, hatred, and fear, we need to find a way to think, speak, and act together,” he wrote in a mass email. Jagannath told me that upperclassmen warned him he was “very crazy” to hold a public meeting, but it was a huge success; about 150 freshmen showed up, and by all accounts, their debate over Hum 110 was civil and constructive. In the absence of Facebook and protest signs, the freshmen were taking back their class.
  • In the intervening year, the Reed administration had met many of RAR’s demands, including new hires in the Office of Inclusive Community, fast-tracking the reevaluation of the Hum 110 syllabus that traditionally happens every 10 years, and arranging a long series of “6 by 6 meetings”—six RAR students and six Hum professors—to solicit ideas for that syllabus. (Those meetings ended when RAR members stopped coming; they complained of being “forced to sit in hours of fruitless meetings listening to full-grown adults cry about Aristotle.”)
  • the more accommodation that’s been made, the more disruptive the protests have become—and the more heightened the rhetoric. “Black lives matter” was the common chant at last year’s boycott. This year’s? “No cops, no KKK, no racist U.S.A.” RAR increasingly claims those cops will be unleashed on them—or, in their words, Hum professors are “entertaining threatening violence on our bodies.”
  • Rollo later told me that RAR “had a beautiful opportunity to address police violence” but squandered it with extreme rhetoric. “Identity politics is divisive,” he insisted. As far as Hum 110, “I like to do my own interpreting,” and he resents RAR “playing the race card on ancient Egyptian culture.”
  • Reed is just one college—and a small one at that. But the freshman revolt against RAR could be a blueprint for other campuses. If the “most liberal student body” in the country can reject divisive racial rhetoric and come together to debate a diversity of views, others could follow.
Javier E

'The end of Trump': how Facebook deepens millennials' confirmation bias | US news | The... - 0 views

  • “Among millennials, especially,” Douthat argues, “there’s a growing constituency for whom rightwing ideas are so alien or triggering, leftwing orthodoxy so pervasive and unquestioned, that supporting a candidate like Hillary Clinton looks like a needless form of compromise.”
  • Unlike Twitter – or real life – where interaction with those who disagree with you on political matters is an inevitability, Facebook users can block, mute and unfriend any outlet or person that will not further bolster their current worldview.
  • Even Facebook itself sees the segmentation of users along political lines on its site - and synchronizes it not only with the posts users see, but with the advertisements they’re shown.
  • ...4 more annotations...
  • Test it out yourself: Go to facebook.com/ads/preferences on your browser and click the “Lifestyle and Culture” tab under the “Interests” banner. You see the box titled “US Politics”? It’s followed with a parenthetical notation of your political alignment, from “Very Conservative” to “Very Liberal”.
  • Sites such as US Uncut, Occupy Democrats, Addicting Info, Make America Great and The Other 98% may barely have homepages, but their Facebook pages are rich with millions of followers and sky-high engagement – in many cases higher than many mainstream news outlets combined
  • “News sources” – largely aggregators of video clips and interviews from other sites – that barely exist beyond the sharing economy of Facebook have arisen as major players in the site’s political news sphere.
  • Occupy Democrats, a far-left page popular with supporters of onetime Democratic presidential candidate Bernie Sanders, has 3.8 million likes on its Facebook page. MSNBC, another left-leaning outlet with far wider reach outside of Facebook, has a mere 1.6 million.
Javier E

Facebook will start telling you when a story may be fake - The Washington Post - 0 views

  • The social network is going to partner with the Poynter International Fact-Checking Network, which includes groups such as Snopes and the Associated Press, to evaluate articles flagged by Facebook users. If those articles do not pass the fact-checkers' smell test, Facebook will attach that verdict as a label whenever they are posted or shared, along with a link to the organization that debunked the story.
  • Mosseri said the social network still wants to be a place where people with all kinds of opinions can express themselves but has no interest in being the arbiter of what’s true and what's not for its 1 billion users.
  • The new system will work like this: If a story on Facebook is patently false — saying that a celebrity is dead when they are still alive, for example — then users will see a notice that the story has been disputed or debunked. People who try to share stories that have been found false will also see an alert before they post. Flagged stories will appear lower in the news feed than unflagged stories.
  • ...9 more annotations...
  • Users will also be able to report potentially false stories to Facebook or send messages directly to the person posting a questionable article.
  • The company is focusing, for now, on what Mosseri called the “bottom of the barrel” websites that are purposefully set up to deceive and spread fake news, as well as those that are impersonating other news organizations. “We are not looking to flag legitimate organizations,” Mosseri said. “We’re looking for pages posing as legitimate organizations.” Articles from legitimate sites that are controversial or even wrong should not get flagged, he said.
  • The company will also prioritize checking stories that are getting lots of flags from users and are being shared widely, to go after the biggest targets possible.
  • "From a journalistic side, is it enough? It’s a little late.”
  • Facebook is willing to filter out other content, such as pornography, for which the definition is unclear. There's no clear explanation for why Facebook hasn't decided to apply similar filters to fake news. “I think that’s a little weak,” Tu said. “If you recognize that it’s bad and journalists at the AP say it’s bad, you shouldn’t have it on your site.”
  • Others said Facebook's careful approach may be warranted. "I think we'll have to wait and see early results to determine how effective the strategy is," said Alexios Mantzarlis, of Poynter's International Fact-Checking Network. "In my eyes, erring on the side of caution is not a bad idea with something so complicated," he said.
  • Facebook is also trying to crack down on people who have made a business in fake news by tweaking the social network's advertising practices. Any article that has been disputed, for example, cannot be used in an ad. Facebook is also playing around with ways to limit links from publishers with landing pages that are mostly ads — a common tactic for fake-news websites
  • With those measures in place, “we’re hoping financially motivated spammers might move away from fake news,” Mosseri said
  • Paul Horner, a fake news writer who makes a living writing viral hoaxes, said he wasn't immediately worried about Facebook's new crackdown on fake news sites. "It's really easy to start a new site. I have 50 domain names. I have a dedicated server. I can start up a new site within 48 hours," he said, shortly after Facebook announced its new anti-hoax programs.  If his sites, which he describes as "satire"-focused, do end up getting hit too hard, Horner says he has "backup plans."
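
A rough sketch of the dispute-flag workflow these excerpts describe: a story rated false by a fact-checking partner carries a label linking to the debunk, triggers a warning when someone tries to share it, ranks lower in the News Feed, and cannot be used in ads. The data model, penalty value, and function names are assumptions made for illustration, not Facebook's published implementation.

```python
# Illustrative sketch of the disputed-story workflow described above.
# The structures and values are assumptions, not Facebook's actual system.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Story:
    url: str
    base_rank_score: float
    disputed_by: list = field(default_factory=list)   # fact-checking partners
    debunk_links: list = field(default_factory=list)  # links to their debunks

DISPUTED_RANK_PENALTY = 0.5  # assumed multiplier that pushes flagged stories down

def record_fact_check(story: Story, org: str, debunk_url: str) -> None:
    """Record a partner fact-checker's 'false' rating on a story."""
    story.disputed_by.append(org)
    story.debunk_links.append(debunk_url)

def feed_score(story: Story) -> float:
    """Flagged stories appear lower in the News Feed than unflagged ones."""
    return story.base_rank_score * (DISPUTED_RANK_PENALTY if story.disputed_by else 1.0)

def share_warning(story: Story) -> Optional[str]:
    """Warning shown before a user posts a disputed story, or None."""
    if not story.disputed_by:
        return None
    return ("Disputed by " + ", ".join(story.disputed_by) +
            ". See: " + ", ".join(story.debunk_links))

def eligible_for_ads(story: Story) -> bool:
    """Disputed articles cannot be promoted in ads."""
    return not story.disputed_by

# Example: a debunked story is down-ranked, warns sharers, and is ad-ineligible.
hoax = Story(url="http://example.com/hoax", base_rank_score=1.0)
record_fact_check(hoax, "Snopes", "https://example.com/debunk")
print(feed_score(hoax), share_warning(hoax), eligible_for_ads(hoax))
```
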
Maria Delzi

BBC News - Facebook sued over alleged private message 'scanning' - 0 views

  • Facebook is facing a class action lawsuit over allegations that it monitors users' private messages.
  • The lawsuit claims that when users share a link to another website via a private message, Facebook scans it to profile the sender's web activity.
  • The lawsuit claims, for each user, the greater of $100 (£61) per day of alleged violations or $10,000.
  • ...7 more annotations...
  • The lawsuit, filed earlier this week, cites independent research that, it claims, found Facebook reviews the contents of its users' private messages "for purposes unrelated to the facilitation of message transmission".
  • "Representing to users that the content of Facebook messages is "private" creates an especially profitable opportunity for Facebook," it says.
  • "because users who believe they are communicating on a service free from surveillance are likely to reveal facts about themselves that they would not reveal had they known the content was being monitored.
  • Writing on his blog, security expert Graham Cluley said that if the site was not examining links shared privately, Facebook would be failing a "duty of care" to its users.
  • Facebook has come under attack over its privacy policies in the past.
  • In September last year, it faced criticism over a proposed change to its privacy policy which would have allowed ads to be created using the names and profile pictures of Facebook users.
  • Facebook undertook to change the wording in the wake of a legal action launched in 2011 which saw it pay $20m to compensate users who claimed it had used their data without explicit permission.
Javier E

Opinion | Zuckerberg's So-Called Shift Toward Privacy - The New York Times - 0 views

  • The platitudes were there, as I expected, but the evasions were worse than I anticipated: The plan, in effect, is to entrench Facebook’s interests while sidestepping all the important issues.
  • Here are four pressing questions about privacy that Mr. Zuckerberg conspicuously did not address: Will Facebook stop collecting data about people’s browsing behavior, which it does extensively? Will it stop purchasing information from data brokers who collect or “scrape” vast amounts of data about billions of people, often including information related to our health and finances? Will it stop creating “shadow profiles” — collections of data about people who aren’t even on Facebook? And most important: Will it change its fundamental business model, which is based on charging advertisers to take advantage of this widespread surveillance to “micro-target” consumers?
  • Mr. Zuckerberg said that the company would expand end-to-end encryption of messaging, which prevents Facebook — or anyone other than the participants in a conversation — from seeing the content of messages. I’m certainly in favor of messaging privacy: It is a cornerstone of the effort to push back against the cloud of surveillance that has descended over the globe.
  • ...7 more annotations...
  • But what we really need — and it is not clear what Facebook has in mind — is privacy for true person-to-person messaging apps, not messaging apps that also allow for secure mass messaging.
  • Once end-to-end encryption is put in place, Facebook can wash its hands of the content. We don’t want to end up with all the same problems we now have with viral content online — only with less visibility and nobody to hold responsible for it.
  • encrypted messaging, in addition to releasing Facebook from the obligation to moderate content, wouldn’t interfere with the surveillance that Facebook conducts for the benefit of advertisers. As Mr. Zuckerberg admitted in an interview after he posted his plan, Facebook isn’t “really using the content of messages to target ads today anyway.” In other words, he is happy to bolster privacy when doing so would decrease Facebook’s responsibilities, but not when doing so would decrease its advertising revenue.
  • Mr. Zuckerberg emphasized in his post his intention to make Facebook’s messaging platforms, Messenger, WhatsApp and Instagram, “interoperable.” He described this decision as part of his “privacy-focused vision,” though it is not clear how doing so — which would presumably involve sharing user data — would serve privacy interests.
  • Merging those apps just might, however, serve Facebook’s interest in avoiding antitrust remedies. Just as regulators are realizing that allowing Facebook to gobble up all its competitors (including WhatsApp and Instagram) may have been a mistake, Mr. Zuckerberg decides to scramble the eggs to make them harder to separate into independent entities. What a coincidence.
  • This supposed shift toward a “privacy-focused vision” looks more to me like shrewd competitive positioning, dressed up in privacy rhetoric.
  • Sheryl Sandberg, Facebook's chief operating officer, likes to say that the company’s problem is that it has been “way too idealistic.” I think the problem is the invasive way it makes its money and its lack of meaningful oversight.
Javier E

How key Republicans inside Facebook are shifting its politics to the right | Technology... - 0 views

  • David Brock, founder and chairman of Media Matters for America, a progressive media watchdog, said: “Mark Zuckerberg continues to kowtow to the right and rightwing criticism. It began when he met with a bunch of rightwingers in May 2016 and then Facebook changed its algorithm policies and we saw a lot of fake news as a result.
  • “I think there’s a consistent pattern of Zuckerberg and the Breitbart issue is the most recent one where the right is able to make false claims of conservative bias on Facebook and then he bends over backwards to accommodate that criticism.”
  • The Republican strain in Facebook was highlighted in a recent edition of the Popular Information newsletter, which stated that the top three leaders in the company’s Washington office are veteran party operatives. “Facebook’s DC office ensures that the company’s content policies meet the approval of Republicans in Congress,” Popular Information said.
  • ...5 more annotations...
  • Joel Kaplan, vice-president of global public policy at Facebook, manages the company’s relationships with policymakers around the world. A former law clerk to archconservative justice Antonin Scalia on the supreme court, he served as deputy chief of staff for policy under former president George W Bush from 2006 to 2009, joining Facebook two years later.
  • Warren noted on Twitter this week: “Since he was hired, Facebook spent over $71 million on lobbying—nearly 100 times what it had spent before Kaplan joined.”
  • Kaplan has reportedly advocated for rightwing sites such as Breitbart and the Daily Caller, which earlier this year became a partner in Facebook’s factchecking program. Founded by Fox News’s Tucker Carlson, the Daily Caller is pro-Trump, anti-immigrant and widely criticised for the way it reported on a fake nude photo of the Democratic congresswoman Alexandria Ocasio-Cortez.
  • Facebook’s Washington headquarters also includes Kevin Martin, vice-president of US public policy and former chairman, under Bush, of the Federal Communications Commission – where a congressional report said his “heavy-handed, opaque and non-collegial management style … created distrust, suspicion and turmoil”.
  • Katie Harbath, the company’s public policy director for global elections, led digital strategy for Rudy Giuliani’s 2008 presidential campaign and the Republican National Committee. She has been the principal defender of the company’s decision to allow political adverts.
Javier E

Opinion | Facebook Has Been a Disaster for the World - The New York Times - 0 views

  • Facebook has been incredibly lucrative for its founder, Mark Zuckerberg, who ranks among the wealthiest men in the world. But it’s been a disaster for the world itself, a powerful vector for paranoia, propaganda and conspiracy-theorizing as well as authoritarian crackdowns and vicious attacks on the free press. Wherever it goes, chaos and destabilization follow.
  • The most disturbing revelations from Zhang’s memo relate to the failure of Facebook to take swift action against coordinated activity in countries like Honduras and Azerbaijan, where political leaders used armies of fake accounts to attack opponents and undermine independent media. “We simply didn’t care enough to stop them,” Zhang wrote.
  • “In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions,” Zhang wrote. “I have personally made decisions that affected national presidents without oversight and taken action to enforce against so many prominent politicians globally that I’ve lost count.”
  • ...3 more annotations...
  • “There are five major ways that authoritarian regimes exploit Facebook and other social media services,” Siva Vaidhyanathan, a media scholar at the University of Virginia, writes in “Antisocial Media: How Facebook Disconnects Us and Undermines Democracy.” They can “organize countermovements to emerging civil society or protest movements,” “frame the public debate along their terms,” let citizens “voice complaints without direct appeal or protest” and “coordinate among elites to rally support.” They can also use social media to aid in the “surveillance and harassment of opposition activists and journalists.”
  • Facebook, according to the company’s own investigation, is home to thousands of QAnon groups and pages with millions of members and followers. Its recommendation algorithms push users to engage with QAnon content, spreading the conspiracy to people who may never have encountered it otherwise.
  • Similarly, a report from the German Marshall Fund pegs the recent spate of fire conspiracies — false claims of arson in Oregon by antifa or Black Lives Matter — to the uncontrolled spread of rumors and disinformation on Facebook.
Javier E

A Revealing Look At Zuckerberg | Talking Points Memo - 0 views

  • these tradeoffs get to the heart of Facebook’s problem and the heart of what the site is. The harm is inherent to Facebook’s business model. When you find ways to reduce harm, they’re almost always at the expense of the engagement metrics whose maximization is the goal of basically everything Facebook does. The comparison may be a loaded or contentious one. But it is a bit like the tobacco companies. The product is the problem, not how it’s used or abused. It’s the product. That’s a challenging place for a company to be.
  • Facebook now makes up a very big part of the whole global information ecosystem. In many countries around the world Facebook for all intents and purposes is the Internet. The weather patterns of information as we might call them are heavily shaped by Facebook’s algorithms and the various tweaks and adjustments it makes to them in different countries. Facebook may not create the misinformation or hate speech or hyper-nationalist frenzies but its algorithms help drive them.
  • the guiding light for those algorithms is first to maximize engagement.
  • ...9 more annotations...
  • That part we know. That’s the business model. But in a different way they are driven by goals and drives of this one guy, Mark Zuckerberg
  • my read is that it was more the ‘winning’ part than the money, though of course the two become somewhat indistinguishable. So Zuckerberg is a near free speech absolutist, as the story conveys. Except when it might mean going dark in a medium-to-large-sized country.
  • One interesting anecdote in the article comes out of Vietnam, where Facebook is estimated to make about $1 billion a year. A few years ago Vietnam demanded that Facebook start censoring anti-government posts or really any criticism of the government or be taken off line in the country. Essentially Vietnam insisted that Facebook delegate content moderation within Vietnam to the government of Vietnam. Zuckerberg personally made the decision to agree to the demands.
  • This article and much else makes pretty clear that it really is still Mark Zuckerberg who runs the show. And what drives him? This article and much else suggests that what shapes Zuckerberg’s goals are perhaps three things, in descending order: 1) to win (in all its dimensions), 2) to maximize profits and 3) to cater to the complaints of the right, which is most effective and aggressive about complaining about purported mistreatment.
  • He apparently justified this on the reasoning that Facebook disappearing in Vietnam would take away the speech rights of more people than the censorship would. If that sounds like self-justifying nonsense, thank you for reading closely.
  • this is just too much power for one person to have. But it’s more that the win, win, win!!! mentality, which certainly lots of CEOs and especially Founder-CEOs have in spades, is here harnessed to an engine that does a lot of damage.
  • Back in 2018 I wrote about a distinct but related issue. No big tech company has been worse at launching off on new ventures or ideas, having whole cottage industries grow up around those ventures, and then shifting gears and having countless partner businesses go belly up.
  • there is a related indifference or obliviousness to the impact or social costs of what Facebook does, if in many cases only because of its sheer scale.
  • This isn’t just corporate culture, or perhaps Zuckerberg himself. A lot of it is tied to Facebook’s relationship to the rest of the web. Google is structurally much more connected to and reliant on the open web. Facebook is much more a closed system which remains highly profitable regardless of the chaos it may create around it.
Javier E

Facebook's Dangerous Experiment on Teen Girls - The Atlantic - 0 views

  • Much more than for boys, adolescence typically heightens girls’ self-consciousness about their changing body and amplifies insecurities about where they fit in their social network. Social media—particularly Instagram, which displaces other forms of interaction among teens, puts the size of their friend group on public display, and subjects their physical appearance to the hard metrics of likes and comment counts—takes the worst parts of middle school and glossy women’s magazines and intensifies them.
  • The preponderance of the evidence now available is disturbing enough to warrant action.
  • The toxicity comes from the very nature of a platform that girls use to post photographs of themselves and await the public judgments of others.
  • ...35 more annotations...
  • Similar increases occurred at the same time for girls in Canada for mood disorders and for self-harm. Girls in the U.K. also experienced very large increases in anxiety, depression, and self-harm (with much smaller increases for boys).
  • Some have argued that these increases reflect nothing more than Gen Z’s increased willingness to disclose their mental-health problems. But researchers have found corresponding increases in measurable behaviors such as suicide (for both sexes), and emergency-department admissions for self-harm (for girls only). From 2010 to 2014, rates of hospital admission for self-harm did not increase at all for women in their early 20s, or for boys or young men, but they doubled for girls ages 10 to 14.
  • The available evidence suggests that Facebook’s products have probably harmed millions of girls. If public officials want to make that case, it could go like this:
  • from 2010 to 2014, high-school students moved much more of their lives onto social-media platforms.
  • National surveys of American high-school students show that only about 63 percent reported using a “social networking site” on a daily basis back in 2010.
  • But as smartphone ownership increased, access became easier and visits became more frequent. By 2014, 80 percent of high-school students said they used a social-media platform on a daily basis, and 24 percent said that they were online “almost constantly.”
  • 2. The timing points to social media.
  • Notably, girls became much heavier users of the new visually oriented platforms, primarily Instagram (which by 2013 had more than 100 million users), followed by Snapchat, Pinterest, and Tumblr.
  • Boys are glued to their screens as well, but they aren’t using social media as much; they spend far more time playing video games. When a boy steps away from the console, he does not spend the next few hours worrying about what other players are saying about him.
  • Instagram, in contrast, can loom in a girl’s mind even when the app is not open, driving hours of obsessive thought, worry, and shame.
  • 3. The victims point to Instagram.
  • In 2017, British researchers asked 1,500 teens to rate how each of the major social-media platforms affected them on certain well-being measures, including anxiety, loneliness, body image, and sleep. Instagram scored as the most harmful, followed by Snapchat and then Facebook.
  • Facebook’s own research, leaked by the whistleblower Frances Haugen, has a similar finding: “Teens blame Instagram for increases in the rate of anxiety and depression … This reaction was unprompted and consistent across all groups.” The researchers also noted that “social comparison is worse” on Instagram than on rival apps.
  • 4. No other suspect is equally plausible.
  • A recent experiment confirmed these observations: Young women were randomly assigned to use Instagram, use Facebook, or play a simple video game for seven minutes. The researchers found that “those who used Instagram, but not Facebook, showed decreased body satisfaction, decreased positive affect, and increased negative affect.”
  • Snapchat’s filters “keep the focus on the face,” whereas Instagram “focuses heavily on the body and lifestyle.”
  • (Boys lost less, and may even have gained, when they took up multiplayer fantasy games, especially those that put them into teams.)
  • The subset of studies that allow researchers to isolate social media, and Instagram in particular, show a much stronger relationship with poor mental health. The same goes for those that zoom in on girls rather than all teens.
  • In a 2019 internal essay, Andrew Bosworth, a longtime company executive, wrote: “While Facebook may not be nicotine I think it is probably like sugar. Sugar is delicious and for most of us there is a special place for it in our lives. But like all things it benefits from moderation.”
  • Bosworth was proposing what medical researchers call a “dose-response relationship.” Sugar, salt, alcohol, and many other substances that are dangerous in large doses are harmless in small ones.
  • his framing also implies that any health problems caused by social media result from the user’s lack of self-control. That’s exactly what Bosworth concluded: “Each of us must take responsibility for ourselves.” The dose-response frame also points to cheap solutions that pose no threat to Facebook’s business model. The company can simply offer more tools to help Instagram and Facebook users limit their consumption.
  • social-media platforms are not like sugar. They don’t just affect the individuals who overindulge. Rather, when teens went from texting their close friends on flip phones in 2010 to posting carefully curated photographs and awaiting comments and likes by 2014, the change rewired everyone’s social life.
  • Improvements in technology generally help friends connect, but the move onto social-media platforms also made it easier—indeed, almost obligatory—for users to perform for one another.
  • Public performance is risky. Private conversation is far more playful. A bad joke or poorly chosen word among friends elicits groans, or perhaps a rebuke and a chance to apologize. Getting repeated feedback in a low-stakes environment is one of the main ways that play builds social skills, physical skills, and the ability to properly judge risk. Play also strengthens friendships.
  • When girls started spending hours each day on Instagram, they lost many of the benefits of play.
  • First, Congress should pass legislation compelling Facebook, Instagram, and all other social-media platforms to allow academic researchers access to their data. One such bill is the Platform Transparency and Accountability Act, proposed by the Stanford University researcher Nate Persily.
  • The wrong photo can lead to school-wide or even national infamy, cyberbullying from strangers, and a permanent scarlet letter.
  • Performative social media also puts girls into a trap: Those who choose not to play the game are cut off from their classmates.
  • Instagram and, more recently, TikTok have become wired into the way teens interact, much as the telephone became essential to past generations.
  • Without a proper control group, we can’t be certain that the experiment has been a catastrophic failure, but it probably has been. Until someone comes up with a more plausible explanation for what has happened to Gen Z girls, the most prudent course of action for regulators, legislators, and parents is to take steps to mitigate the harm.
  • Correlation does not prove causation, but nobody has yet found an alternative explanation for the massive, sudden, gendered, multinational deterioration of teen mental health during the period in question.
  • Second, Congress should toughen the 1998 Children’s Online Privacy Protection Act. An early version of the legislation proposed 16 as the age at which children should legally be allowed to give away their data and their privacy.
  • Unfortunately, e-commerce companies lobbied successfully to have the age of “internet adulthood” set instead at 13. Now, more than two decades later, today’s 13-year-olds are not doing well. Federal law is outdated and inadequate. The age should be raised. More power should be given to parents, less to companies.
  • Third, while Americans wait for lawmakers to act, parents can work with local schools to establish a norm: Delay entry to Instagram and other social platforms until high school.
  • Right now, families are trapped. I have heard many parents say that they don’t want their children on Instagram, but they allow them to lie about their age and open accounts because, well, that’s what everyone else has done.