
Home/ TOK Friends/ Group items tagged Facebook


Javier E

Facebook will start telling you when a story may be fake - The Washington Post - 0 views

  • The social network is going to partner with the Poynter International Fact-Checking Network, which includes groups such as Snopes and the Associated Press, to evaluate articles flagged by Facebook users. If those articles do not pass the smell test for the fact-checkers, Facebook will label that evaluation whenever they are posted or shared, along with a link to the organization that debunked the story.
  • Mosseri said the social network still wants to be a place where people with all kinds of opinions can express themselves but has no interest in being the arbiter of what’s true and what's not for its 1 billion users.
  • The new system will work like this: If a story on Facebook is patently false — saying that a celebrity is dead when they are still alive, for example — then users will see a notice that the story has been disputed or debunked. People who try to share stories that have been found false will also see an alert before they post. Flagged stories will appear lower in the news feed than unflagged stories.
  • ...9 more annotations...
  • Users will also be able to report potentially false stories to Facebook or send messages directly to the person posting a questionable article.
  • The company is focusing, for now, on what Mosseri called the “bottom of the barrel” websites that are purposefully set up to deceive and spread fake news, as well as those that are impersonating other news organizations. “We are not looking to flag legitimate organizations,” Mosseri said. “We’re looking for pages posing as legitimate organizations.” Articles from legitimate sites that are controversial or even wrong should not get flagged, he said.
  • The company will also prioritize checking stories that are getting lots of flags from users and are being shared widely, to go after the biggest targets possible.
  • "From a journalistic side, is it enough? It’s a little late.”
  • Facebook is willing to filter out other content -- such as pornography -- whose definition is unclear, yet there is no clear explanation for why it hasn't applied similar filters to fake news. “I think that’s a little weak,” Tu said. “If you recognize that it’s bad and journalists at the AP say it’s bad, you shouldn’t have it on your site.”
  • Others said Facebook's careful approach may be warranted. "I think we'll have to wait and see early results to determine how effective the strategy is," said Alexios Mantzarlis, of Poynter's International Fact-Checking Network. "In my eyes, erring on the side of caution is not a bad idea with something so complicated," he said.
  • Facebook is also trying to crack down on people who have made a business of fake news by tweaking the social network's advertising practices. Any article that has been disputed, for example, cannot be used in an ad. Facebook is also playing around with ways to limit links from publishers with landing pages that are mostly ads — a common tactic for fake-news websites.
  • With those measures in place, “we’re hoping financially motivated spammers might move away from fake news,” Mosseri said.
  • Paul Horner, a fake news writer who makes a living writing viral hoaxes, said he wasn't immediately worried about Facebook's new crackdown on fake news sites. "It's really easy to start a new site. I have 50 domain names. I have a dedicated server. I can start up a new site within 48 hours," he said, shortly after Facebook announced its new anti-hoax programs.  If his sites, which he describes as "satire"-focused, do end up getting hit too hard, Horner says he has "backup plans."
Javier E

Early Facebook and Google Employees Form Coalition to Fight What They Built - The New Y... - 0 views

  • A group of Silicon Valley technologists who were early employees at Facebook and Google, alarmed over the ill effects of social networks and smartphones, are banding together to challenge the companies they helped build.
  • The campaign, titled The Truth About Tech, will be funded with $7 million from Common Sense and capital raised by the Center for Humane Technology. Common Sense also has $50 million in donated media and airtime. It will be aimed at educating students, parents and teachers about the dangers of technology, including the depression that can come from heavy use of social media.
  • ...9 more annotations...
  • Chamath Palihapitiya, a venture capitalist who was an early employee at Facebook, said in November that the social network was “ripping apart the social fabric of how society works.”
  • The new Center for Humane Technology includes an unprecedented alliance of former employees of some of today’s biggest tech companies. Apart from Mr. Harris, the center includes Sandy Parakilas, a former Facebook operations manager; Lynn Fox, a former Apple and Google communications executive; Dave Morin, a former Facebook executive; Justin Rosenstein, who created Facebook’s Like button and is a co-founder of Asana; Roger McNamee, an early investor in Facebook; and Renée DiResta, a technologist who studies bots.
  • Its first project to reform the industry will be to introduce a Ledger of Harms — a website aimed at guiding rank-and-file engineers who are concerned about what they are being asked to build. The site will include data on the health effects of different technologies and ways to make products that are healthier
  • The Truth About Tech campaign was modeled on antismoking drives and focused on children because of their vulnerability.
  • Apple’s chief executive, Timothy D. Cook, told The Guardian last month that he would not let his nephew on social media, while the Facebook investor Sean Parker also recently said of the social network that “God only knows what it’s doing to our children’s brains.” Mr. Steyer said, “You see a degree of hypocrisy with all these guys in Silicon Valley.”
  • The new group also plans to begin lobbying for laws to curtail the power of big tech companies. It will initially focus on two pieces of legislation: a bill being introduced by Senator Edward J. Markey, Democrat of Massachusetts, that would commission research on technology’s impact on children’s health, and a bill in California by State Senator Bob Hertzberg, a Democrat, which would prohibit the use of digital bots without identification.
  • Mr. McNamee said he had joined the Center for Humane Technology because he was horrified by what he had helped enable as an early Facebook investor.
  • “Facebook appeals to your lizard brain — primarily fear and anger,” he said. “And with smartphones, they’ve got you for every waking moment.”
  • He said the people who made these products could stop them before they did more harm.
Javier E

'The Power of One,' Facebook whistleblower Frances Haugen's memoir - The Washington Post - 0 views

  • When an internal group proposed the conditions under which Facebook should step in and take down speech from political actors, Zuckerberg discarded its work. He said he’d address the issue himself over a weekend. His “solution”? Facebook would not touch speech by any politician, under any circumstances — a fraught decision under the simplistic surface, as Haugen points out. After all, who gets to count as a politician? The municipal dogcatcher?
  • It was also Zuckerberg, she says, who refused to make a small change that would have made the content in people’s feeds less incendiary — possibly because doing so would have caused a key metric to decline.
  • When the Wall Street Journal’s Jeff Horwitz began to break the stories that Haugen helped him document, the most damning one concerned Facebook’s horrifyingly disingenuous response to a congressional inquiry asking if the company had any research showing that its products were dangerous to teens. Facebook said it wasn’t aware of any consensus indicating how much screen time was too much. What Facebook did have was a pile of research showing that kids were being harmed by its products. Allow a clever company a convenient deflection, and you get something awfully close to a lie.
  • ...5 more annotations...
  • after the military in Myanmar used Facebook to stoke the murder of the Rohingya people, Haugen began to worry that this was a playbook that could be infinitely repeated — and only because Facebook chose not to invest in safety measures, such as detecting hate speech in poorer, more vulnerable places. “The scale of the problems was so vast,” she writes. “I believed people were going to die (in certain countries, at least) and for no reason other than higher profit margins.”
  • After a trip to Cambodia, where neighbors killed neighbors in the 1970s because of a “story that labeled people who had lived next to each other for generations as existential threats,” she’d started to wonder about what caused people to turn on one another to such a horrifying degree. “How quickly could a story become the truth people perceived?”
  • What she points out is the false choice posited by most social media companies: free speech vs. censorship. She argues that lack of transparency is what contributed most to the problems at Facebook. No one on the outside can see inside the algorithms. Even many of those on the inside can’t. “You can’t take a single academic course, anywhere in the world, on the tradeoffs and choices that go into building a social media algorithm or, more importantly, the consequences of those choices,” she writes.
  • In that lack of accountability, social media is a very different ecosystem than the one that helped Ralph Nader take on the auto industry back in the 1960s. Then, there was a network of insurers and plaintiff’s lawyers who also wanted change — and the images of mangled bodies were a lot more visible than what happens inside the mind of a teenage girl. But what if the government forced companies to share their inner workings in the same way it mandates that food companies disclose the nutrition in what they make? What if the government forced social media companies to allow academics and other researchers access to the algorithms they use?
Javier E

How Calls for Privacy May Upend Business for Facebook and Google - The New York Times - 0 views

  • People detailed their interests and obsessions on Facebook and Google, generating a river of data that could be collected and harnessed for advertising. The companies became very rich. Users seemed happy. Privacy was deemed obsolete, like bloodletting and milkmen.
  • It has been many months of allegations and arguments that the internet in general and social media in particular are pulling society down instead of lifting it up.
  • That has inspired a good deal of debate about more restrictive futures for Facebook and Google. At the furthest extreme, some dream of the companies becoming public utilities.
  • ...20 more annotations...
  • There are other avenues still, said Jascha Kaykas-Wolff, the chief marketing officer of Mozilla, the nonprofit organization behind the popular Firefox browser, including advertisers and large tech platforms collecting vastly less user data and still effectively customizing ads to consumers.
  • The greatest likelihood is that the internet companies, frightened by the tumult, will accept a few more rules and work a little harder for transparency.
  • The Cambridge Analytica case, said Vera Jourova, the European Union commissioner for justice, consumers and gender equality, was not just a breach of private data. “This is much more serious, because here we witness the threat to democracy, to democratic plurality,” she said.
  • Although many people had a general understanding that free online services used their personal details to customize the ads they saw, the latest controversy starkly exposed the machinery.
  • Consumers’ seemingly benign activities — their likes — could be used to covertly categorize and influence their behavior. And not just by unknown third parties. Facebook itself has worked directly with presidential campaigns on ad targeting, describing its services in a company case study as “influencing voters.”
  • “If your personal information can help sway elections, which affects everyone’s life and societal well-being, maybe privacy does matter after all.”
  • some trade group executives also warned that any attempt to curb the use of consumer data would put the business model of the ad-supported internet at risk.
  • “You’re undermining a fundamental concept in advertising: reaching consumers who are interested in a particular product,”
  • If suspicion of Facebook and Google is a relatively new feeling in the United States, it has been embedded in Europe for historical and cultural reasons that date back to the Nazi Gestapo, the Soviet occupation of Eastern Europe and the Cold War.
  • “We’re at an inflection point, when the great wave of optimism about tech is giving way to growing alarm,” said Heather Grabbe, director of the Open Society European Policy Institute. “This is the moment when Europeans turn to the state for protection and answers, and are less likely than Americans to rely on the market to sort out imbalances.”
  • In May, the European Union is instituting a comprehensive new privacy law, called the General Data Protection Regulation. The new rules treat personal data as proprietary, owned by an individual, and any use of that data must be accompanied by permission — opting in rather than opting out — after receiving a request written in clear language, not legalese.
  • the protection rules will have more teeth than the current 1995 directive. For example, a company experiencing a data breach involving individuals must notify the data protection authority within 72 hours and would be subject to fines of up to 20 million euros or 4 percent of its annual revenue.
  • “With the new European law, regulators for the first time have real enforcement tools,” said Jeffrey Chester, the executive director of the Center for Digital Democracy, a nonprofit group in Washington. “We now have a way to hold these companies accountable.”
  • Privacy advocates and even some United States regulators have long been concerned about the ability of online services to track consumers and make inferences about their financial status, health concerns and other intimate details to show them behavior-based ads. They warned that such microtargeting could unfairly categorize or exclude certain people.
  • The Do Not Track effort and the privacy bill were both stymied. Industry groups successfully argued that collecting personal details posed no harm to consumers and that efforts to hinder data collection would chill innovation.
  • “If it can be shown that the current situation is actually a market failure and not an individual-company failure, then there’s a case to be made for federal regulation” under certain circumstances
  • The business practices of Facebook and Google were reinforced by the fact that no privacy flap lasted longer than a news cycle or two. Nor did people flee for other services. That convinced the companies that digital privacy was a dead issue.
  • If the current furor dies down without meaningful change, critics worry that the problems might become even more entrenched. When the tech industry follows its natural impulses, it becomes even less transparent.
  • “To know the real interaction between populism and Facebook, you need to give much more access to researchers, not less,” said Paul-Jasper Dittrich, a German research fellow
  • There’s another reason Silicon Valley tends to be reluctant to share information about what it is doing. It believes so deeply in itself that it does not even think there is a need for discussion. The technology world’s remedy for any problem is always more technology
Javier E

George Soros: Facebook and Google a menace to society | Business | The Guardian - 0 views

  • Facebook and Google have become “obstacles to innovation” and are a “menace” to society whose “days are numbered”
  • “Mining and oil companies exploit the physical environment; social media companies exploit the social environment,” said the Hungarian-American businessman, according to a transcript of his speech.
  • “This is particularly nefarious because social media companies influence how people think and behave without them even being aware of it. This has far-reaching adverse consequences on the functioning of democracy, particularly on the integrity of elections.”
  • ...8 more annotations...
  • In addition to skewing democracy, social media companies “deceive their users by manipulating their attention and directing it towards their own commercial purposes” and “deliberately engineer addiction to the services they provide”. The latter, he said, “can be very harmful, particularly for adolescents”
  • There is a possibility that once lost, people who grow up in the digital age will have difficulty in regaining it. This may have far-reaching political consequences.”
  • Soros warned of an “even more alarming prospect” on the horizon if data-rich internet companies such as Facebook and Google paired their corporate surveillance systems with state-sponsored surveillance – a trend that’s already emerging in places such as the Philippines.
  • “This may well result in a web of totalitarian control the likes of which not even Aldous Huxley or George Orwell could have imagined.”
  • “The internet monopolies have neither the will nor the inclination to protect society against the consequences of their actions. That turns them into a menace, and it falls to the regulatory authorities to protect society against them.”
  • He also echoed the words of world wide web inventor Sir Tim Berners-Lee when he said the tech giants had become “obstacles to innovation” that need to be regulated as public utilities “aimed at preserving competition, innovation and fair and open universal access”.
  • Earlier this week, Salesforce’s chief executive, Marc Benioff, said that Facebook should be regulated like a cigarette company because it’s addictive and harmful.
  • In November, Roger McNamee, who was an early investor in Facebook, described Facebook and Google as threats to public health.
runlai_jiang

In Some Countries, Facebook's Fiddling Has Magnified Fake News - The New York Times - 0 views

  • In Some Countries, Facebook’s Fiddling Has Magnified Fake News
  • SAN FRANCISCO — One morning in October, the editors of Página Siete, Bolivia’s third-largest news site, noticed that traffic to their outlet coming from Facebook was plummeting. The publication had recently been hit by cyberattacks, and editors feared it was being targeted by hackers loyal to the government of President Evo Morales.
  • But it wasn’t the government’s fault. It was Facebook’s. The Silicon Valley company was testing a new version of its hugely popular News Feed, peeling off professional news sites from what people normally see and relegating them to a new section of Facebook called Explore.
  • ...4 more annotations...
  • Facebook said these News Feed modifications were not identical to those introduced last fall in six countries through its Explore program, but both alterations favor posts from friends and family over professional news sites. And what happened in those countries illustrates the unintended consequences of such a change in an online service that now has a global reach of more than two billion people every month.
  • The fabricated story circulated so widely that the local police issued a statement saying it wasn’t true. But when the police went to issue the warning on Facebook, they found that the message — unlike the fake news story they meant to combat — could no longer appear on News Feed because it came from an official account. Facebook explained its goals for the Explore program in Slovakia, Sri Lanka, Cambodia, Bolivia, Guatemala and Serbia in a blog post in October. “The goal of this test is to understand if people prefer to have separate places for personal and public content,” wrote Adam Mosseri, head of Facebook’s News Feed. “There is no current plan to roll this out beyond these test countries.”
  • The loss of visitors from Facebook was readily apparent in October, and Mr. Huallpa could communicate with Facebook only through a customer service form letter. He received an automatic reply in return.
  • tech giant may play in her country. “It’s a private company — they have the right to do as they please, of course,” she said. “But the first question we asked is ‘Why Bolivia?’ And we don’t even have the possibility of asking why. Why us?”
Javier E

Understanding What's Wrong With Facebook | Talking Points Memo - 0 views

  • to really understand the problem with Facebook we need to understand the structural roots of that problem, how much of it is baked into the core architecture of the site and its very business model
  • much of it is inherent in the core strategies of the post-2000, second wave Internet tech companies that now dominate our information space and economy.
  • Facebook is an ingenious engine for information and ideational manipulation.
  • ...17 more annotations...
  • Good old fashioned advertising does that to a degree. But Facebook is much more powerful, adaptive and efficient.
  • Facebook is designed to do specific things. It’s an engine to understand people’s minds and then manipulate their thinking.
  • Those tools are refined for revenue making but can be used for many other purposes. That makes it ripe for misuse and bad acting.
  • The core of all second wave Internet commerce operations was finding network models where costs grow mathematically and revenues grow exponentially.
  • The network and its dominance is the product and once it takes hold the cost inputs remained constrained while the revenues grow almost without limit.
  • Facebook is best understood as a fantastically profitable nuclear energy company whose profitability is based on dumping the waste on the side of the road and accepting frequent accidents and explosions as inherent to the enterprise.
  • That’s why these companies employ so few people relative to scale and profitability.
  • managing or distinguishing between legitimate and bad-acting uses of the powerful Facebook engine is one that would require huge, huge investments of money and armies of workers to manage
  • The core economic model requires doing all of it on the cheap. Indeed, what Zuckerberg et al. have created with Facebook is so vast that the money required not to do it on the cheap almost defies imagination.
  • Facebook’s core model and concept requires not taking responsibility for what others do with the engine created to drive revenue.
  • It all amounts to a grand exercise in socializing the externalities and keeping all the revenues for the owners.
  • Here’s a way to think about it. Nuclear power is actually incredibly cheap. The fuel is fairly plentiful and easy to pull out of the ground. You set up a little engine and it generates energy almost without limit. What makes it ruinously expensive is managing the externalities – all the risks and dangers, the radiation, accidents, the constant production of radioactive waste.
  • That’s why there’s no phone support for Google or Facebook or Twitter. If half the people on the planet are ‘customers’ or users that’s not remotely possible.
  • But back to Facebook. The point is that they’ve created a hugely powerful and potentially very dangerous machine
  • The core business model is based on harvesting the profits from the commercial uses of the machine and using algorithms and very, very limited personnel (relative to scale) to try to get a handle on the most outrageous and shocking abuses which the engine makes possible.
  • Zuckerberg may be a jerk and there really is a culture of bad acting within the organization. But it’s not about him being a jerk. Replace him and his team with non-jerks and you’d still have a similar core problem.
  • To manage the potential negative externalities, to take some responsibility for all the dangerous uses the engine makes possible would require money the owners are totally unwilling and in some ways are unable to spend.
Javier E

Here is the news - but only if Facebook thinks you need to know | John Naughton | Opini... - 0 views

  • power essentially comes in three varieties: the ability to compel people to do what they don’t want to do; the capability to stop them doing what they want to do; and the power to shape the way they think
  • This last is the kind of power exercised by our mass media. They can shape the public (and therefore the political) agenda by choosing the news that people read, hear or watch; and they can shape the ways in which that news is presented.
  • For a long time, Google was the 800lb gorilla in this domain, because its dominance of search determined what people could find in the unimaginable wastelands of cyberspace
  • ...7 more annotations...
  • search could be – and was – personalised, because Google’s algorithms could figure out what each user was most likely to be interested in, and therefore what kinds of information would be most relevant for her or him. So, imperceptibly, but inexorably over time, we have come to live in what Eli Pariser christened a “filter bubble”.
  • Before the internet, our problem with information was its scarcity. Now our problem is unmanageable abundance. So now the scarce resources are attention and time, over which a vicious war has broken out between traditional media and the internet-based upstarts.
  • YouTube has a billion users, half of whom access it via mobile devices. The average time spent on the site is 40 minutes. Facebook now claims to have 1.65 billion monthly active users, who spend on average 50 minutes a day on its services. So if Google is an 800lb gorilla, Facebook is a megaton King Kong.
  • Competition for attention and time is a zero-sum game that traditional media are losing. In desperation, they are trying both to appease Facebook and to harness its hold on people’s attention
  • In doing so, they have entered into a truly Faustian bargain. Because while publishers can without difficulty ship their stuff to Instant Articles, they cannot control which ones Facebook users actually get to see. This is because users’ news feeds are determined by Facebook’s machine-learning algorithms that try to guess what each user would like to see (and what might dispose them to click on an advertisement).
  • when you ask – as Professor George Brock memorably did – whether Mark Zuckerberg and his satraps understand that they have acquired editorial responsibilities, they look blank. Facebook is not a publisher, they explain, merely a “platform”. And, besides, no humans are involved in curating users’ news feeds: it’s all done by algorithms and is therefore neutral. In other words: nothing to see here; move on.
  • Any algorithm that has to make choices has criteria that are specified by its designers. And those criteria are expressions of human values. Engineers may think they are “neutral”, but long experience has shown us they are babes in the woods of politics, economics and ideology.
lenaurick

6 degrees of separation is too much - Facebook says we're all 3.5 degrees apart - Vox - 0 views

  • A well-known theory holds that most people, at least in the US and perhaps in the world, are six degrees of separation away from each other. Pick a random stranger anywhere in the country, the theory goes, and chances are you can build a chain of acquaintances between the two of you in no more than six hops.
  • The idea of "six degrees of separation" rests on a scientific foundation that's dubious at best. But Facebook, because its users give it access to possibly the richest data set ever on how 1.6 billion people know and interact with each other, set out to prove it with a statistical algorithm.
  • The average Facebook user is three and a half degrees of separation away from every other user, and the social network's post tells you your own distance from everyone else on the site.
  • ...8 more annotations...
  • Mark Zuckerberg is 3.17 degrees of separation from all Facebook users.
  • The typical Facebook user has 155 friends, but only describes 50 of them as friends in real life, according to a 2014 study from the Pew Research Center. Thirty-five percent of people have Facebook friends they've never met in person.
  • The original "six degrees of separation" experiment required people to know each other fairly well: They had to be on a first-name basis, at a time when society was slightly more formal, in order for the connection to count.
  • About one-third of the documents eventually reached the stockbroker, after a chain of, on average, six people — the six degrees of separation. It's a small world after all, Milgram concluded.
  • She found instead that Milgram's conclusions rested on a shaky foundation, and that class and race divided Americans more than his original paper admitted. The majority of Milgram's letters didn't make it to the Boston stockbroker. And further experiments that factored in class and race suggested that making connections across those barriers was even more challenging
  • While middle- and high-income people were able to find their targets regardless of their household income, low-income people could only connect with other low-income families, Kleinfeld wrote in a 2002 article in the journal Society.
  • The correct interpretation of Milgram, she argued, was not the optimistic conclusion that we're all only a few degrees of separation away. Instead, it's that there are still barriers that are insurmountable.
  • But the real divide is between people who are on Facebook and those who aren't on the internet at all. Facebook users make up about 62 percent of American adults but 72 percent of all internet users. Americans without the internet are disproportionately older, rural, and less educated. As Facebook users get closer, they might be becoming ever more isolated.
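The "degrees of separation" figures in the annotations above are averages of shortest-path lengths in a friendship graph. A minimal sketch of that idea, using breadth-first search on a hypothetical five-person graph (Facebook's actual study could not exhaustively search 1.6 billion users and relied on statistical estimation instead, but the quantity being averaged is the same):

```python
from collections import deque

def bfs_distances(graph, start):
    """Breadth-first search: hop count from `start` to every reachable user."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

def average_separation(graph):
    """Mean shortest-path length over all ordered pairs of distinct users."""
    total, pairs = 0, 0
    for person in graph:
        for other, d in bfs_distances(graph, person).items():
            if other != person:
                total += d
                pairs += 1
    return total / pairs

# Hypothetical toy friendship graph (undirected, as adjacency sets).
friends = {
    "ana": {"bob", "cat"},
    "bob": {"ana", "dan"},
    "cat": {"ana", "dan"},
    "dan": {"bob", "cat", "eve"},
    "eve": {"dan"},
}
print(average_separation(friends))  # → 1.6
```

At Facebook's scale, exact all-pairs search like this is infeasible, which is why the published 3.5 figure came from a probabilistic estimate rather than a direct computation.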
Javier E

Google's new media apocalypse: How the search giant wants to accelerate the end of the ... - 0 views

  • Google is announcing that it wants to cut out the middleman—that is to say, other websites—and serve you content within its own lovely little walled garden. That sound you just heard was a bunch of media publishers rushing to book an extra appointment with their shrink.
  • Back when search, and not social media, ruled the internet, Google was the sun around which the news industry orbited. Getting to the top of Google’s results was the key that unlocked buckets of page views. Outlet after outlet spent countless hours trying to figure out how to game Google’s prized, secretive algorithm. Whole swaths of the industry were killed instantly if Google tweaked the algorithm.
  • Facebook is now the sun. Facebook is the company keeping everyone up at night. Facebook is the place shaping how stories get chosen, how they get written, how they are packaged and how they show up on its site. And Facebook does all of this with just as much secrecy and just as little accountability as Google did.
  • ...3 more annotations...
  • Facebook just opened up its Instant Articles feature to all publishers. The feature allows external outlets to publish their content directly onto Facebook’s platform, eliminating that pesky journey to their actual website. They can either place their own ads on the content or join a revenue-sharing program with Facebook. Facebook has touted this plan as one which provides a better user experience and has noted the ability for publishers to create ads on the platform as well.
  • The benefit to Facebook is obvious: It gets to keep people inside its house. They don’t have to leave for even a second. The publisher essentially has to accept this reality, sigh about the gradual death of websites and hope that everything works out on the financial side.
  • It’s all part of a much bigger story: that of how the internet, that supposed smasher of gates and leveler of playing fields, has coalesced around a mere handful of mega-giants in the space of just a couple of decades. The gates didn’t really come down. The identities of the gatekeepers just changed. Google, Facebook, Apple, Amazon
Javier E

Does Facebook Turn People Into Narcissists? - NYTimes.com - 0 views

  • Those who frequently updated their Facebook status, tagged themselves in photos and had large numbers of virtual friends, were more likely to exhibit narcissistic traits, the study found. Another study found that people with high levels of narcissism were more likely to spend more than an hour a day on Facebook, and they were also more likely to post digitally enhanced personal photos. But what the research doesn’t answer is whether Facebook attracts narcissists or turns us into them.
  • researchers found, to their surprise, that frequency of Facebook use, whether it was for personal status updates or to connect with friends, was not associated with narcissism. Narcissism per se was associated with only one type of Facebook user — those who amassed unrealistically large numbers of Facebook friends.
  • frequent Facebook users were more likely to score high on “openness” and were less concerned about privacy. So what seems like self-promoting behavior may just reflect a generation growing up in the digital age, where information — including details about personal lives — flows freely and connects us.
  • ...1 more annotation...
  • The social medium of choice for the self-absorbed appears to be Twitter. The researchers found an association between tweeting about oneself and high narcissism scores.
Javier E

New Foils for the Right: Google and Facebook - The New York Times - 0 views

  • In a sign of escalation, Peter Schweizer, a right-wing journalist known for his investigations into Hillary Clinton, plans to release a new film focusing on technology companies and their role in filtering the news.
  • The documentary, which has not been previously reported, dovetails with concerns raised in recent weeks by right-wing groups about censorship on digital media — a new front in a rapidly evolving culture war.
  • The critique from conservatives, in contrast, casts the big tech companies as censorious and oppressive, all too eager to stifle right-wing content in an effort to mollify liberal critics.
  • ...9 more annotations...
  • Big Tech is easily associated with West Coast liberalism and Democratic politics, making it a fertile target for the right. And operational opacity at Facebook, Google and Twitter, which are reluctant to reveal details about their algorithms and internal policies, can leave them vulnerable, too.
  • “There’s not even a real basis to establish objective research about what’s happening on Facebook, because it’s closed.”
  • And former President Barack Obama said at an off-the-record conference at the Massachusetts Institute of Technology last month that he worried Americans were living in “entirely different realities” and that large tech companies like Facebook were “not just an invisible platform, they’re shaping our culture in powerful ways.” The contents of the speech were published by Reason magazine.
  • The panelists accused social media platforms of delisting their videos or stripping them of advertising. Such charges have long been staples of far-right online discourse, especially among YouTubers, but Mr. Schweizer’s project is poised to bring such arguments to a new — and potentially larger — audience.
  • He is also the president of the Government Accountability Institute, a conservative nonprofit organization. He and Mr. Bannon founded it with funding from the family of Robert Mercer, the billionaire hedge fund manager and donor to Donald J. Trump’s presidential campaign.
  • Jeffrey A. Zucker, the president of CNN, derided Google and Facebook as “monopolies” and called for regulators to step in during a speech in Spain last month, saying the tech hegemony is “the biggest issue facing the growth of journalism in the years ahead.”
  • “There are political activists in all of these companies that want to actively push a liberal agenda,” he said. “Why does it matter? Because these companies are so ubiquitous and powerful that they are controlling all the means of mass communication.”
  • The Facebook adjustment has affected virtually every media organization that is partly dependent on the platform for audiences, but it appears to have hit some harder than others. They include right-wing sites like Gateway Pundit and the millennial-focused Independent Journal Review, which was forced to lay off staff members last month.
  • The social news giant BuzzFeed recently bought ads on Facebook with the message, “Facebook is taking the news out of your News Feed, but we’ve got you covered,” directing users to download its app. Away from the political scrum, the viral lifestyle site LittleThings, once a top publisher on the platform, announced last week that it would cease operations, blaming “a full-on catastrophic update” to Facebook’s revised algorithms.
Javier E

Facebook, the Company That Loves Misery - WSJ - 1 views

  • For more than a decade Mark Zuckerberg has been running an experiment in openness. We are the test subjects. So what does he think about the fact that being “open and connected,” Facebook-style, is making us miserable?
  • Several studies, most recently one out of San Diego State University analyzing the leisure activities of a million teens, have concluded that the more time spent on Facebook, the less happy we tend to be
  • In 2010 Mr. Zuckerberg announced that the old social norm of privacy had “evolved”—a fortuitous discovery for someone who had devoted his life to whittling others’ privacy away. Indeed, Facebook’s essential conceit is that privacy is outmoded—the corset we never wanted and are so much freer without
  • ...9 more annotations...
  • We’ve known for a while that Facebook enables online communication with friends and family—but also a sharply targeted form of bullying. That it wastes our time. That, whatever relationships it nurtures, it kills off others entirely.
  • We registered as Facebook users imagining ourselves vacationing at a new resort, but it turned out to be a nudist colony
  • The Cambridge Analytica scandal came on like a slap, the kind that breaks the spell and makes you wonder what on earth you’ve been doing. How had we given so much away? We squandered assets we may never regain—privacy, dignity—and for what?
  • But privacy is also a shield, and it protects subject and observer alike
  • Mr. Zuckerberg’s promise to protect our data is laughable because exploiting our data is precisely his business.
  • Over the years, Facebook pushed us to share more of ourselves. “What are you doing right now?” became “What’s on your mind, Abigail?” It jiggered the order of posts to keep our navels and our friends’ well-gazed, all the while rendering us more vulnerable to abuse
  • When we discovered our pockets had been picked, Facebook suddenly seemed more hustler than host; its endless party, one great confidence scheme.
  • And now, Congress is calling on Mr. Zuckerberg to fix the problem as if the problem weren’t Facebook itself
  • Is there any piece of data about us that, on principle, Mr. Zuckerberg wouldn’t monetize?
Javier E

Facebook will now ask users to rank news organizations they trust - The Washington Post - 0 views

  • Zuckerberg wrote that Facebook is not “comfortable” deciding which news sources are the most trustworthy in a “world with so much division.”
  • "We decided that having the community determine which sources are broadly trusted would be most objective," he wrote.
  • The new trust rankings will emerge from surveys the company is conducting. "Broadly trusted" outlets that are affirmed by a significant cross-section of users may see a boost in readership, while lesser-known organizations or start-ups receiving poor ratings could see their web traffic decline.
  • ...14 more annotations...
  • The company's changes include an effort to boost the content of local news outlets, which have suffered sizable subscription and readership declines
  • The changes follow another major News Feed redesign, announced last week, in which Facebook said users would begin to see less content from news organizations and brands in favor of "meaningful" posts from friends and family.
  • Currently, 5 percent of Facebook posts are generated by news organizations; that number is expected to drop to 4 percent after the redesign, Zuckerberg said.
  • On Friday, Google announced it would cancel a two-month-old experiment, called Knowledge Panel, that informed its users that a news article had been disputed by independent fact-checking organizations. Conservatives had complained the feature unfairly targeted a right-leaning outlet.
  • More than two-thirds of Americans now get some of their news from social media, according to Pew Research Center.
  • "The hard question we've struggled with is how to decide what news sources are broadly trusted," Zuckerberg wrote. "We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you -- the community -- and have your feedback determine the ranking."
  • "Just by putting things out to a vote in terms of what the community would find trustworthy undermines the role for any serious institutionalized process to determine what’s quality and what’s not,” he said.
  • further criticism that the social network had become vulnerable to bad actors seeking to spread disinformation.
  • Jay Rosen, a journalism professor at New York University, said that Facebook learned the wrong lesson from Trending Topics, which was to try to avoid politics at all costs
  • “One of the things that can happen if you are determined to avoid politics at all costs is you are driven to illusory solutions,” he said. “I don’t think there is any alternative to using your judgement. But Facebook is convinced that there is. This idea that they can avoid judgement is part of their problem.”
  • Facebook revealed few details about how it is conducting its trust surveys.
  • That shift has empowered Facebook and Google, putting them in an uncomfortable position of deciding what news they should distribute to their global audiences. But it also has led to questions about whether these corporations should be considered media companies.
  • Some experts wondered whether Facebook's latest effort could be gamed.
  • "This seems like a positive step toward improving the news environment on Facebook," Diresta said. "That said, the potential downside is that the survey approach unfairly penalizes emerging publications."
runlai_jiang

A New Antidote for Noisy Airports: Slower Planes - WSJ - 0 views

  • Urban airports like Boston’s Logan thought they had silenced noise issues with quieter planes. Now complaints pour in from suburbs 10 to 15 miles away because new navigation routes have created relentless noise for some homeowners. By Scott McCartney, The Wall Street Journal, March 7, 2018
  • It turns out engines aren’t the major culprit anymore. New airplanes are much quieter. It’s the “whoosh” that big airplanes make racing through the air.
  • Computer models suggest slowing departures by 30 knots—about 35 miles an hour—would reduce noise on the ground significantly.
  • ...9 more annotations...
  • The FAA says it’s impressed and is moving forward with recommendations Boston has made.
  • A working group is forming to evaluate the main recommendation to slow departing jets to a speed limit of 220 knots during the climb to 10,000 feet, down from 250 knots.
  • New routes put planes over quiet communities. Complaints soared. Phoenix neighborhoods sued the FAA; Chicago neighborhoods are pushing for rotating runway use. Neighborhoods from California to Washington, D.C., are fighting the new procedures that airlines and the FAA insist are vital to future travel.
  • “It’s a concentration problem. It’s a frequency problem. It’s not really a noise problem.”
  • “The flights wake you up. We get a lot of complaints from young families with children,” says Mr. Wright, a data analyst who works from home for a major health-care company.
  • In Boston, an analysis suggested only 54% of the complaints Massport received resulted from noise louder than 45 decibels—about the level of background noise. When it’s relentless, you notice it more.
  • With a 30-knot reduction, noise directly under the flight track would decrease by between 1.5 and 5 decibels and the footprint on the ground would get a lot skinnier, sharply reducing the number of people affected, Mr. Hansman says.
  • The industry trade association Airlines for America has offered cautious support of the Boston recommendations. In a statement, the group said the changes must be safe, work with a variety of aircraft and not reduce the airport’s capacity for takeoffs and landings.
  • Air-traffic controllers will need to delay a departure a bit to put more room between a slower plane and a faster one, or modify its course slightly.
Javier E

Opinion | Farhad Manjoo: I Was Wrong About Facebook - The New York Times - 0 views

  • I wasn’t just wrong about Facebook; I had the matter exactly backward. Had we all decided to leave Facebook then or at any time since, the internet and perhaps the world might now be a better place
  • my 2009 exhortation for people to go all in on Facebook still makes me cringe. My argument suffers from the same flaws I regularly climb up on my mainstream-media soapbox to denounce in tech bros:
  • why, at the dawn of 2009, was I foisting Facebook on the masses? I’ve got three answers.
  • ...12 more annotations...
  • a failure to seriously consider the implications of an invention as it becomes entrenched in society; a deep trust in networks, in the idea that allowing people to more freely associate would redound mainly to the good of society; and too much affection for the culture of Silicon Valley and the idea that the people who created a certain thing must have some clue about what to do with it.
  • I didn’t consider the far-reaching implications of Facebook’s ubiquity.
  • Social networks, I observed, got better as more people used them; it seemed reasonable that at some point one social network would gain widespread acceptance and become a comprehensive directory for connecting everyone.
  • As an immigrant, I’d also bought into the world-shrinking implications of such a network.
  • I got carried away by the excitement of new tech.
  • What I’d failed to consider was how all these various new things would interact with one another, especially as more people got online.
  • in calling for everyone to get on Facebook, I should have made a better stab at guessing what could go wrong if we all did. What would be the implications for privacy if we were all using Facebook on our phones — how much could this one service glean about you by being in your pocket all the time?
  • What would the implications for speech and media be if this single company became a central clearinghouse in the global discourse?
  • I trusted techies.
  • This was the vibe pervading media and politics in the late 2000s: Wall Street had ruined the world. Silicon Valley could put it right.
  • It does not seem in any way good for society — for the economy, for politics, for a basic sense of equality — that a handful of hundred-billion-dollar or even trillion-dollar companies should control such large swathes of the internet.
  • Obama’s regulators allowed Facebook to buy up its biggest competitors — first Instagram, then WhatsApp — and failed to crack down on its recklessness with users’ private data
summertyler

Why Facebook's News Experiment Matters to Readers - NYTimes.com - 1 views

  • Facebook’s new plan to host news publications’ stories directly is not only about page views, advertising revenue or the number of seconds it takes for an article to load. It is about who owns the relationship with readers.
  • Tech companies have always stepped on one another’s toes to try to become people’s gateway to the digital world — the only place people need to go to get what they want.
  • all kinds of companies are now becoming tech companies
  • ...3 more annotations...
  • Facebook’s experiment, called instant articles, is small to start — just a few articles from nine media companies, including The New York Times. But it signals a major shift in the relationship between publications and their readers. If you want to read the news, Facebook is saying, come to Facebook, not to NBC News or The Atlantic or The Times — and when you come, don’t leave. (For now, these articles can be viewed on an iPhone running the Facebook app.)
  • The front page of a newspaper and the cover of a magazine lost their dominance long ago. Web home pages are following suit. Increasingly, the articles, videos, photographs and graphics that media organizations publish are stand-alone fragments that readers happen upon one at a time, often on social media.
  • “In digital, every story becomes unbundled from each other, so if you’re not thinking of each story as living on its own, it’s tying yourself back to an analog era,”
  • Should we trust social media with the news?
Javier E

I worked at Facebook. I know how Cambridge Analytica could have happened. - The Washing... - 0 views

  • During my 16 months at Facebook, I called many developers and demanded compliance, but I don’t recall the company conducting a single audit of a developer where the company inspected the developer’s data storage. Lawsuits and outright bans were also very rare. I believe the reason for lax enforcement was simple: Facebook didn’t want to make the public aware of huge weaknesses in its data security.
  • Concerned about the lack of protection for users, in 2012 I created a PowerPoint presentation that outlined the ways that data vulnerabilities on Facebook Platform exposed people to harm, and the various ways the company was trying to protect that data. There were many gaps that left users exposed. I also called out potential bad actors, including data brokers and foreign state actors. I sent the document to senior executives at the company but got little to no response. I had no dedicated engineers assigned to help resolve known issues, and no budget for external vendors.
  • Facebook will argue that things have changed since 2012 and that the company has much better processes in place now. If that were true, Cambridge Analytica would be a small side note, a developer that Facebook shut down and sued out of existence in December 2015 when word first got out that it had violated Facebook’s policies to acquire the data of millions. Instead, it appears Facebook used the same playbook that I saw in 2012.
  • ...1 more annotation...
  • In the wake of this catastrophic violation, Mark Zuckerberg must be forced to testify before Congress and should be held accountable for the negligence of his company. Facebook has systematically failed to enforce its own policies. The only solution is external oversight.
Javier E

Fixation on Fake News Overshadows Waning Trust in Real Reporting - The New York Times - 2 views

  • It misunderstands a new media world in which every story, and source, is at risk of being discredited, not by argument but by sheer force.
  • During the months I spent talking to partisan Facebook page operators for a magazine article this year, it became clear that while the ecosystem contained easily identifiable and intentional fabrication, it contained much, much more of something else.
  • I recall a conversation with a fact checker about how to describe a story, posted on a pro-Trump website and promoted on a pro-Trump Facebook page — and, incidentally, copied from another pro-Trump site by overseas contractors. It tried to cast suspicion on Khizr Khan, the father of a slain American soldier, who had spoken out against Donald J. Trump.
  • ...14 more annotations...
  • The overarching claims of the story were disingenuous and horrifying; the facts it included had been removed from all useful context and placed in a new, sinister one; its insinuating mention of “Muslim martyrs,” in proximity to mentions of Mr. Khan’s son, and its misleading and strategic mention of Shariah law, amounted to a repulsive smear. It was a story that appealed to bigoted ideas and that would clearly appeal to those who held them.
  • This was a story the likes of which was an enormous force in this election, clearly designed to function well within Facebook’s economy of sharing. And it probably would not run afoul of the narrow definition of “fake news.”
  • Stories like that one get to the heart of the rhetorical and strategic risk of holding up “fake news” as a broad media offensive position, especially after an election cycle characterized by the euphoric inversion of rhetoric by some of Mr. Trump’s supporters, and by the candidate himself
  • This tactic was used on the language of social justice, which was appropriated by opponents and redeployed nihilistically, in an open effort to sap its power while simultaneously taking advantage of what power it retained
  • Anti-racists were cast as the real racists. Progressives were cast as secretly regressive on their own terms
  • This was not a new tactic, but it was newly effective. It didn’t matter that its targets knew that it was a bad-faith maneuver, a clear bid for power rather than an attempt to engage or reason. The referees called foul, but nobody could hear them over the roar of the crowds. Or maybe they could, but realized that nobody could make them listen.
  • This wide formulation of “fake news” will be applied back to the traditional news media, which does not yet understand how threatened its ability is to declare things true, even when they are.
  • the worst identified offenders make their money outside Facebook anyway.
  • Another narrow response from Facebook could be to assert editorial control over external forces
  • Facebook is a place where people construct and project identities to friends, family and peers. It is a marketplace in which news is valuable mainly to the extent that it serves those identities. It is a system built on ranking and vetting and voting.
  • Fake news operations are closely aligned with the experienced incentives of the Facebook economy
  • the outrage is at risk of being misdirected, and will be followed by the realization that the colloquial “fake news” — the newslike media, amateur and professional, for which truth is defined first in personal and political terms, and which must only meet the bar of not being obviously, inarguably, demonstrably false — will continue growing apace, gaining authority by sheer force
  • Media companies have spent years looking to Facebook, waiting for the company to present a solution to their mounting business concerns
  • Those who expect the operator of the dominant media ecosystem of our time, in response to getting caught promoting lies, to suddenly return authority to the companies it has superseded are in for a similar surprise.
Javier E

Technology Imperialism, the Californian Ideology, and the Future of Higher Education - 2 views

  • What I hope to make explicit today is how much California – the place, the concept, “the dream machine” – shapes (wants to shape) the future of technology and the future of education.
  • In an announcement on Facebook – of course – Zuckerberg argued that “connectivity is a human right.”
  • As Zuckerberg frames it at least, the “human right” in this case is participation in the global economy
  • ...34 more annotations...
  • This is a revealing definition of “human rights,” I’d argue, particularly as it’s one that never addresses things like liberty, equality, or justice. It never addresses freedom of expression or freedom of assembly or freedom of association.
  • in certain countries, a number of people say they do not use the Internet yet they talk about how much time they spend on Facebook. According to one survey, 11% of Indonesians who said they used Facebook also said they did not use the Internet. A survey in Nigeria had similar results:
  • Evgeny Morozov has described this belief as “Internet-centrism,” an ideology he argues permeates the tech industry, its PR wing the tech blogosphere, and increasingly government policy
  • “Internet-centrism” describes the tendency to see “the Internet” – Morozov uses quotations around the phrase – as a new yet unchanging, autonomous, benevolent, and inevitable socio-technological development. “The Internet” is a master framework for how all institutions will supposedly operate moving forward
  • “The opportunity to connect” as a human right assumes that “connectivity” will hasten the advent of these other rights, I suppose – that the Internet will topple dictatorships, for example, that it will extend participation in civic life to everyone and, for our purposes here at this conference, that it will “democratize education.”
  • “The Silicon Valley Narrative,” as I call it, is the story that the technology industry tells about the world – not only the world-as-is but the world-as-Silicon-Valley-wants-it-to-be.
  • Facebook is really just synecdochal here, I should add – just one example of the forces I think are at play, politically, economically, technologically, culturally.
  • it matters at the level of ideology. Infrastructure is ideological, of course. The new infrastructure – “the Internet” if you will – has a particular political, economic, and cultural bent to it. It is not neutral.
  • This infrastructure matters. In this case, this is a French satellite company (Eutelsat). This is an American social network (Facebook). Mark Zuckerberg’s altruistic rhetoric aside, this is their plan – an economic plan – to monetize the world’s poor.
  • The content and the form of “connectivity” perpetuate imperialism, and not only in Africa but in all of our lives. Imperialism at the level of infrastructure – not just cultural imperialism but technological imperialism
  • Empire is not simply an endeavor of the nation-state – we have empire through technology (that’s not new) and now, the technology industry as empire.
  • To better analyze and assess both technology and education technology requires our understanding of these as ideological, argues Neil Selwyn – “‘a site of social struggle’ through which hegemonic positions are developed, legitimated, reproduced and challenged.”
  • This narrative has several commonly used tropes
  • It often features a hero: the technology entrepreneur. Smart. Independent. Bold. Risk-taking. White. Male.
  • “The Silicon Valley narrative” invokes themes like “innovation” and “disruption.” It privileges the new; everything else that can be deemed “old” is viewed as obsolete.
  • Facebook is “the Internet” for a fairly sizable number of people. They know nothing else – conceptually, experientially. And, let’s be honest, Facebook wants to be “the Internet” for everyone.
  • “The Silicon Valley narrative” fosters a distrust of institutions – the government, the university. It is neoliberal. It hates paying taxes.
  • “The Silicon Valley narrative” draws from the work of Ayn Rand; it privileges the individual at all costs; it calls this “personalization.”
  • “The Silicon Valley narrative” does not neatly co-exist with public education. We forget this at our peril. This makes education technology, specifically, an incredibly fraught area.
  • Here’s the story I think we like to hear about ed-tech, about distance education, about “connectivity” and learning: Education technology is supportive, not exploitative. Education technology opens, not forecloses, opportunities. Education technology is driven by a rethinking of teaching and learning, not expanding markets or empire. Education technology meets individual and institutional and community goals.
  • That’s not really what the “Silicon Valley narrative” says about education
  • It is interested in data extraction and monetization and standardization and scale. It is interested in markets and return on investment. “Education is broken,” and technology will fix it
  • If “Silicon Valley” isn’t quite accurate, then I must admit that the word “narrative” is probably inadequate too
  • The better term here is “ideology.”
  • It contends that its workings are meritocratic: anyone who hustles can make it.
  • We tend to not see technology as ideological – its connections to libertarianism, neoliberalism, global capitalism, empire.
  • The California ideology ignores race and labor and the water supply; it is sustained by air and fantasy. It is built upon white supremacy and imperialism.
  • As is the technology sector, which has its own history, of course, in warfare and cryptography.
  • So far this year, some $3.76 billion of venture capital has been invested in education technology – a record-setting figure. That money will change the landscape – that’s its intention. That money carries with it a story about the future; it carries with it an ideology.
  • I want to show you this map, a proposal – a failed proposal, thankfully – by venture capitalist Tim Draper to split the state of California into six separate states: Jefferson, North California, Silicon Valley, Central California, West California, and South California. The proposal, which Draper tried to collect enough signatures to get on the ballot in California, would have created the richest state in the US – Silicon Valley would be first in per-capita income. It would also have created the nation’s poorest state, Central California, which would rank even below Mississippi.
  • We in education would be naive, I think, to think that the designs that venture capitalists and technology entrepreneurs have for us would be any less radical than creating a new state, like Draper’s proposed state of Silicon Valley, that would be enormously wealthy and politically powerful.
  • When I hear talk of “unbundling” in education – one of the latest gerunds you’ll hear venture capitalists and ed-tech entrepreneurs invoke, meaning the disassembling of institutions into products and services – I can’t help but think of the “unbundling” that Draper wished to do to my state: carving up land and resources, shifting tax revenue and tax burdens, creating new markets, privatizing public institutions, redistributing power and doing so explicitly not in the service of equity or justice.
  • When a venture capitalist says that “software is eating the world,” we can push back on the inevitability implied in that. We can resist – not in the name of clinging to “the old” as those in educational institutions are so often accused of doing – but we can resist in the name of freedom and justice and a future that isn’t dictated by the wealthiest white men in Hollywood or Silicon Valley.
  • that’s not all that Silicon Valley really does.