
Home/ TOK Friends/ Group items tagged privacy


Javier E

The Facebook Fallacy: Privacy Is Up to You - The New York Times - 0 views

  • As Facebook’s co-founder and chief executive parried questions from members of Congress about how the social network would protect its users’ privacy, he returned time and again to what probably sounded like an unimpeachable proposition.
  • By providing its users with greater and more transparent controls over the personal data they share and how it is used for targeted advertising, he insisted, Facebook could empower them to make their own call and decide how much privacy they were willing to put on the block.
  • providing a greater sense of control over their personal data won’t make Facebook users more cautious. It will instead encourage them to share more.
  • “Disingenuous is the adjective I had in my mind,”
  • “Fifteen years ago it would have been legitimate to propose this argument,” he added. “But it is no longer legitimate to ignore the behavioral problems and propose simply more transparency and controls.”
  • Professor Acquisti and two colleagues, Laura Brandimarte and the behavioral economist George Loewenstein, published research on this behavior nearly six years ago. “Providing users of modern information-sharing technologies with more granular privacy controls may lead them to share more sensitive information with larger, and possibly riskier, audiences,” they concluded.
  • the critical question is whether, given the tools, we can be trusted to manage the experience. The increasing body of research into how we behave online suggests not.
  • “Privacy control settings give people more rope to hang themselves,” Professor Loewenstein told me. “Facebook has figured this out, so they give you incredibly granular controls.”
  • This paradox is hardly the only psychological quirk for the social network to exploit. Consider default settings. Tons of research in behavioral economics has found that people tend to stick to the default setting of whatever is offered to them, even when they could change it easily.
  • “Facebook is acutely aware of this,” Professor Loewenstein told me. In 2005, its default settings shared most profile fields with, at most, friends of friends. Nothing was shared by default with the full internet.
  • By 2010, however, likes, name, gender, picture and a lot of other things were shared with everybody online. “Facebook changed the defaults because it appreciated their power,” Professor Loewenstein added.
  • The phenomenon even has a name: the “control paradox.”
  • people who profess concern about privacy will provide the emails of their friends in exchange for some pizza.
  • They also found that providing consumers reassuring though irrelevant information about their ability to protect their privacy will make them less likely to avoid surveillance.
  • Another experiment revealed that people are more willing to come clean about their engagement in illicit or questionable behavior when they believe others have done so, too
  • Those in the industry often argue that people don’t really care about their privacy — that they may seem concerned when they answer surveys, but still routinely accept cookies and consent to have their data harvested in exchange for cool online experiences
  • Professor Acquisti thinks this is a fallacy. The cognitive hurdles to manage our privacy online are simply too steep.
  • While we are good at handling our privacy in the offline world, lowering our voices or closing the curtains as the occasion may warrant, there are no cues online to alert us to a potential privacy invasion
  • Even if we were to know precisely what information companies like Facebook have about us and how it will be used, which we don’t, it would be hard for us to assess potential harms
  • Members of Congress have mostly let market forces prevail online, unfettered by government meddling. Privacy protection in the internet economy has relied on the belief that consumers will make rational choices
  • Europe’s stringent new privacy protection law, which Facebook has promised to apply in the United States, may do better than the American system of disclosure and consent
  • the European system also relies mostly on faith that consumers will make rational choices.
  • The more that psychologists and behavioral economists study psychological biases and quirks, the clearer it seems that rational choices alone won’t work. “I don’t think any kind of disclosure or opt in or opt out is going to protect us from our worst instincts,”
  • What to do? Professor Acquisti suggests flipping the burden of proof. The case for privacy regulation rests on consumers’ proving that data collection is harmful. Why not ask the big online platforms like Facebook to prove they can’t work without it? If reducing data collection imposes a cost, we could figure out who bears it — whether consumers, advertisers or Facebook’s bottom line.
sissij

The Future of Privacy - The New York Times - 0 views

  • Turning Point: Apple resists the F.B.I. in unlocking an iPhone in the San Bernardino terrorism case.
  • Privacy confuses me, beyond my simplest understanding, which is that individuals prefer, to different degrees, that information about them not be freely available to others.
  • What does it mean, in an ostensible democracy, for the state to keep secrets from its citizens? The idea of the secret state seems antithetical to democracy, since its citizens, the voters, can’t know what their government is doing.
  • If you have nothing to hide and you trust your government, what can you possibly have to fear? Except that one can just as readily ask: If you have nothing to hide, what do you really have, aside from the panoptic attention of a state, which itself keeps secrets?
  • Are individual privacy and state privacy the same thing?
  • In the short term, the span of a lifetime, many of us would argue for privacy, and therefore against transparency.
  • But history, the long term, is transparency; it is the absence of secrets.
  • The past, our own past, which our descendants will see us as having emerged from, will not be the past from which we now see ourselves emerging, but a reinterpretation of it, based on subsequently available information, greater transparency and fewer secrets.
  • our species is the poorer for every secret faithfully kept. Any permanently unbreakable encryption seems counter to that.
  • So perhaps that desire is as much a part of us, as a species, as our need to build these memory palaces.
  •  
    I found this article very interesting because it talked about the dilemma in the definition of privacy. I found that our idea of privacy is very complicated. While everybody wants to keep their own secrets, they are curious about others' secrets at the same time. This article also relates privacy to democratic society. I think it shows one of the weaknesses of a democratic society: it tries to fulfill everyone's desires, and sometimes those desires contradict each other. I think privacy is like freedom, something very theoretical that does not exist in reality. --Sissi (12/7/2016)
Javier E

How Calls for Privacy May Upend Business for Facebook and Google - The New York Times - 0 views

  • People detailed their interests and obsessions on Facebook and Google, generating a river of data that could be collected and harnessed for advertising. The companies became very rich. Users seemed happy. Privacy was deemed obsolete, like bloodletting and milkmen
  • It has been many months of allegations and arguments that the internet in general and social media in particular are pulling society down instead of lifting it up.
  • That has inspired a good deal of debate about more restrictive futures for Facebook and Google. At the furthest extreme, some dream of the companies becoming public utilities.
  • If suspicion of Facebook and Google is a relatively new feeling in the United States, it has been embedded in Europe for historical and cultural reasons that date back to the Nazi Gestapo, the Soviet occupation of Eastern Europe and the Cold War.
  • The greatest likelihood is that the internet companies, frightened by the tumult, will accept a few more rules and work a little harder for transparency.
  • The Cambridge Analytica case, said Vera Jourova, the European Union commissioner for justice, consumers and gender equality, was not just a breach of private data. “This is much more serious, because here we witness the threat to democracy, to democratic plurality,” she said.
  • Although many people had a general understanding that free online services used their personal details to customize the ads they saw, the latest controversy starkly exposed the machinery.
  • Consumers’ seemingly benign activities — their likes — could be used to covertly categorize and influence their behavior. And not just by unknown third parties. Facebook itself has worked directly with presidential campaigns on ad targeting, describing its services in a company case study as “influencing voters.”
  • “If your personal information can help sway elections, which affects everyone’s life and societal well-being, maybe privacy does matter after all.”
  • some trade group executives also warned that any attempt to curb the use of consumer data would put the business model of the ad-supported internet at risk.
  • “You’re undermining a fundamental concept in advertising: reaching consumers who are interested in a particular product,”
  • There are other avenues still, said Jascha Kaykas-Wolff, the chief marketing officer of Mozilla, the nonprofit organization behind the popular Firefox browser, including advertisers and large tech platforms collecting vastly less user data and still effectively customizing ads to consumers.
  • “We’re at an inflection point, when the great wave of optimism about tech is giving way to growing alarm,” said Heather Grabbe, director of the Open Society European Policy Institute. “This is the moment when Europeans turn to the state for protection and answers, and are less likely than Americans to rely on the market to sort out imbalances.”
  • In May, the European Union is instituting a comprehensive new privacy law, called the General Data Protection Regulation. The new rules treat personal data as proprietary, owned by an individual, and any use of that data must be accompanied by permission — opting in rather than opting out — after receiving a request written in clear language, not legalese.
  • the protection rules will have more teeth than the current 1995 directive. For example, a company experiencing a data breach involving individuals must notify the data protection authority within 72 hours and would be subject to fines of up to 20 million euros or 4 percent of its annual revenue.
  • “With the new European law, regulators for the first time have real enforcement tools,” said Jeffrey Chester, the executive director of the Center for Digital Democracy, a nonprofit group in Washington. “We now have a way to hold these companies accountable.”
  • Privacy advocates and even some United States regulators have long been concerned about the ability of online services to track consumers and make inferences about their financial status, health concerns and other intimate details to show them behavior-based ads. They warned that such microtargeting could unfairly categorize or exclude certain people.
  • the Do Not Track effort and the privacy bill were both stymied. Industry groups successfully argued that collecting personal details posed no harm to consumers and that efforts to hinder data collection would chill innovation.
  • “If it can be shown that the current situation is actually a market failure and not an individual-company failure, then there’s a case to be made for federal regulation” under certain circumstances
  • The business practices of Facebook and Google were reinforced by the fact that no privacy flap lasted longer than a news cycle or two. Nor did people flee for other services. That convinced the companies that digital privacy was a dead issue.
  • If the current furor dies down without meaningful change, critics worry that the problems might become even more entrenched. When the tech industry follows its natural impulses, it becomes even less transparent.
  • “To know the real interaction between populism and Facebook, you need to give much more access to researchers, not less,” said Paul-Jasper Dittrich, a German research fellow
  • There’s another reason Silicon Valley tends to be reluctant to share information about what it is doing. It believes so deeply in itself that it does not even think there is a need for discussion. The technology world’s remedy for any problem is always more technology
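The penalty ceiling described above, fines of up to 20 million euros or 4 percent of annual revenue, can be sketched as a one-line calculation. This is an illustration only; the function name is invented here, and the "whichever is greater" reading follows the regulation's own wording rather than anything stated in the excerpt:

```python
def gdpr_fine_cap(annual_revenue_eur: float) -> float:
    """Maximum GDPR administrative fine: 20 million euros or 4 percent
    of worldwide annual revenue, whichever is greater."""
    return max(20_000_000.0, 0.04 * annual_revenue_eur)

# A company with 40 billion euros in revenue faces a cap of 1.6 billion euros,
# while a firm with 100 million in revenue still faces the 20 million floor.
print(gdpr_fine_cap(40_000_000_000))  # 1600000000.0
print(gdpr_fine_cap(100_000_000))     # 20000000.0
```

The point of the two-part cap is that the fixed floor bites for small companies while the percentage bites for large platforms, which is why the excerpt's sources describe it as having "real enforcement tools."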
Javier E

The Philosopher Whose Fingerprints Are All Over the FTC's New Approach to Privacy - Ale... - 0 views

  • The standard explanation for privacy freakouts is that people get upset because they've "lost control" of data about themselves or there is simply too much data available. Nissenbaum argues that the real problem "is the inappropriateness of the flow of information due to the mediation of technology." In her scheme, there are senders and receivers of messages, who communicate different types of information with very specific expectations of how it will be used. Privacy violations occur not when too much data accumulates or people can't direct it, but when one of the receivers or transmission principles change. The key academic term is "context-relative informational norms." Bust a norm and people get upset.
  • Nissenbaum gets us past thinking about privacy as a binary: either something is private or something is public. Nissenbaum puts the context -- or social situation -- back into the equation. What you tell your bank, you might not tell your doctor.
  • Furthermore, these differences in information sharing are not bad or good; they are just the norms.
  • any privacy regulation that's going to make it through Congress has to provide clear ways for companies to continue profiting from data tracking. The key is coming up with an ethical framework in which they can do so, and Nissenbaum may have done just that. 
  • The traditional model of how this works says that your information is something like a currency and when you visit a website that collects data on you for one reason or another, you enter into a contract with that site. As long as the site gives you "notice" that data collection occurs -- usually via a privacy policy located through a link at the bottom of the page -- and you give "consent" by continuing to use the site, then no harm has been done. No matter how much data a site collects, if all they do is use it to show you advertising they hope is more relevant to you, then they've done nothing wrong.
  • Never mind that if you actually read all the privacy policies you encounter in a year, it would take 76 work days. And that calculation doesn't even account for all the third parties that drain data from your visits to other websites. Even more to the point: there is no obvious way to discriminate between two separate webpages on the basis of their data collection policies. While tools have emerged to tell you how many data trackers are being deployed at any site at a given moment, the dynamic nature of Internet advertising means that it is nearly impossible to know the story through time.
  • How can anyone make a reasonable determination of how their information might be used when there are more than 50 or 100 or 200 tools in play on a single website in a single month?
  • Nissenbaum doesn't think it's possible to explain the current online advertising ecosystem in a useful way without resorting to a lot of detail. She calls this the "transparency paradox," and considers it insoluble.
  • she wants to import the norms from the offline world into the online world. When you go to a bank, she says, you have expectations of what might happen to your communications with that bank. That should be true whether you're online, on the phone, or at the teller.  Companies can use your data to do bank stuff, but they can't sell your data to car dealers looking for people with a lot of cash on hand.
  • let companies do standard data collection but require them to tell people when they are doing things with data that are inconsistent with the "context of the interaction" between a company and a person.
  • here's the big downside: it rests on the "norms" that people expect. While that may be socially optimal, it's actually quite difficult to figure out what the norms for a given situation might be. After all, there is someone else who depends on norms for his thinking about privacy.
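The 76-work-day figure above is a simple rate calculation. A minimal sketch of that arithmetic, with illustrative inputs (the policy count and minutes-per-policy used here are assumptions for demonstration, not the study's own numbers):

```python
def policy_reading_workdays(policies_per_year: int,
                            minutes_per_policy: float,
                            hours_per_workday: float = 8.0) -> float:
    # Convert total annual reading time from minutes into workdays.
    return policies_per_year * minutes_per_policy / 60.0 / hours_per_workday

# With roughly 1,500 policies a year at roughly 25 minutes each
# (hypothetical inputs), reading them all takes about 78 workdays,
# in the same range as the 76-day estimate cited above.
print(round(policy_reading_workdays(1500, 25), 1))  # 78.1
```

Whatever the exact inputs, the order of magnitude is the point: the "notice and consent" model assumes a reading burden measured in months of full-time work per year.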
Javier E

Web Privacy, and How Consumers Let Down Their Guard - NYTimes.com - 0 views

  • We are hurried and distracted and don’t pay close attention to what we are doing. Often, we turn over our data in exchange for a deal we can’t refuse.
  • his research argues that when it comes to privacy, policy makers should carefully consider how people actually behave. We don’t always act in our own best interest, his research suggests. We can be easily manipulated by how we are asked for information. Even something as simple as a playfully designed site can nudge us to reveal more of ourselves than a serious-looking one.
  • “His work has gone a long way in trying to help us figure out how irrational we are in privacy related decisions,” says Woodrow Hartzog, an assistant professor of law who studies digital privacy at Samford University in Birmingham, Ala. “We have too much confidence in our ability to make decisions.”
  • Solutions to our leaky privacy system tend to focus on transparency and control — that our best hope is knowing what our data is being used for and choosing whether to participate. But a challenge to that conventional wisdom emerges in his research. Giving users control may be an essential step, but it may also be a bit of an illusion.
  • personal data is what fuels the barons of the Internet age. Mr. Acquisti investigates the trade-offs that users make when they give up that data, and who gains and loses in those transactions. Often there are immediate rewards (cheap sandals) and sometimes intangible risks downstream (identity theft). “
  • “The technologist in me loves the amazing things the Internet is allowing us to do,” he said. “The individual who cares about freedom is concerned about the technology being hijacked, from a technology of freedom into a technology of surveillance.”
  • EARLY in his sojourn in this country, Mr. Acquisti asked himself a question that would become the guiding force of his career: Do Americans value their privacy?
  • If we have something — in this case, ownership of our purchase data — we are more likely to value it. If we don’t have it at the outset, we aren’t likely to pay extra to acquire it. Context matters.
  • “What worries me,” he said, “is that transparency and control are empty words that are used to push responsibility to the user for problems that are being created by others.”
  • We are constantly asked to make decisions about personal data amid a host of distractions, like an e-mail, a Twitter notification or a text message. If Mr. Acquisti is correct, those distractions may hinder our sense of self-protection when it comes to privacy.
  • His latest weapon against distraction is an iPad application, which lets him create a to-do list every morning and set timers for each task: 30 minutes for e-mail, 60 minutes to grade student papers, and so on.
  • it is not surprising that he is cautious in revealing himself online. He says he doesn’t feel compelled to post a picture of his meals on Instagram. He uses different browsers for different activities. He sometimes uses tools that show which ad networks are tracking him. But he knows he cannot hide entirely, which is why some people, he says, follow a policy of “rational ignorance.”
  • The online advertising industry insists that the data is scrambled to make it impossible to identify individuals.
  • Mr. Acquisti offers a sobering counterpoint. In 2011, he took snapshots with a webcam of nearly 100 students on campus. Within minutes, he had identified about one-third of them using facial recognition software. In addition, for about a fourth of the subjects whom he could identify, he found out enough about them on Facebook to guess at least a portion of their Social Security numbers.
  • The point of the experiment was to show how easy it is to identify people from the rich trail of data they scatter around the Web, including seemingly harmless pictures. Facebook can be especially valuable for identity thieves, particularly when a user’s birth date is visible to the public.
  • Does that mean Facebook users should lie about their birthdays (and break Facebook’s terms of service)? Mr. Acquisti demurred. He would say only that there are “complex trade-offs” to be made. “I reveal my date of birth and hometown on my Facebook profile and an identity thief can reconstruct my Social Security number and steal my identity,” he said, “or someone can send me ‘happy birthday’ messages on the day of my birthday, which makes me feel very good.”
Javier E

Opinion | The Apps on My Phone Are Stalking Me - The New York Times - 0 views

  • There is much about the future that keeps me up at night — A.I. weaponry, undetectable viral deepfakes
  • but in the last few years, one technological threat has blipped my fear radar much faster than others. That fear? Ubiquitous surveillance.
  • I am no longer sure that human civilization can undo or evade living under constant, extravagantly detailed physical and even psychic surveillance
  • as a species, we are not doing nearly enough to avoid always being watched or otherwise digitally recorded.
  • your location, your purchases, video and audio from within your home and office, your online searches and every digital wandering, biometric tracking of your face and other body parts, your heart rate and other vital signs, your every communication, recording, and perhaps your deepest thoughts or idlest dreams
  • in the future, if not already, much of this data and more will be collected and analyzed by some combination of governments and corporations, among them a handful of megacompanies whose powers nearly match those of governments
  • Over the last year, as part of Times Opinion’s Privacy Project, I’ve participated in experiments in which my devices were closely monitored in order to determine the kind of data that was being collected about me.
  • I’ve realized how blind we are to the kinds of insights tech companies are gaining about us through our gadgets. Our blindness not only keeps us glued to privacy-invading tech
  • it also means that we’ve failed to create a political culture that is in any way up to the task of limiting surveillance.
  • few of our cultural or political institutions are even much trying to tamp down the surveillance state.
  • Yet the United States and other supposedly liberty-loving Western democracies have not ruled out such a future
  • like Barack Obama before him, Trump and the Justice Department are pushing Apple to create a backdoor into the data on encrypted iPhones — they want the untrustworthy F.B.I. and any local cop to be able to see everything inside anyone’s phone.
  • the fact that both Obama and Trump agreed on the need for breaking iPhone encryption suggests how thoroughly political leaders across a wide spectrum have neglected privacy as a fundamental value worthy of protection.
  • Americans are sleepwalking into a future nearly as frightening as the one the Chinese are constructing. I choose the word “sleepwalking” deliberately, because when it comes to digital privacy, a lot of us prefer the comfortable bliss of ignorance.
  • Among other revelations: Advertising companies and data brokers are keeping insanely close tabs on smartphones’ location data, tracking users so precisely that their databases could arguably compromise national security or political liberty.
  • Tracking technologies have become cheap and widely available — for less than $100, my colleagues were able to identify people walking by surveillance cameras in Bryant Park in Manhattan.
  • The Clearview AI story suggests another reason to worry that our march into surveillance has become inexorable: Each new privacy-invading technology builds on a previous one, allowing for scary outcomes from new integrations and collections of data that few users might have anticipated.
  • The upshot: As the location-tracking apps followed me, I was able to capture the pings they sent to online servers — essentially recording their spying
  • On the map, you can see the apps are essentially stalking me. They see me drive out one morning to the gas station, then to the produce store, then to Safeway; later on I passed by a music school, stopped at a restaurant, then Whole Foods.
  • But location was only one part of the data the companies had about me; because geographic data is often combined with other personal information — including a mobile advertising ID that can help merge what you see and do online with where you go in the real world — the story these companies can tell about me is actually far more detailed than I can tell about myself.
  • I can no longer pretend I’ve got nothing to worry about. Sure, I’m not a criminal — but do I want anyone to learn everything about me?
  • more to the point: Is it wise for us to let any entity learn everything about everyone?
  • The remaining uncertainty about the surveillance state is not whether we will submit to it — only how readily and completely, and how thoroughly it will warp our society.
  • Will we allow the government and corporations unrestricted access to every bit of data we ever generate, or will we decide that some kinds of collections, like the encrypted data on your phone, should be forever off limits, even when a judge has issued a warrant for it?
  • In the future, will there be room for any true secret — will society allow any unrecorded thought or communication to evade detection and commercial analysis?
  • How completely will living under surveillance numb creativity and silence radical thought?
  • Can human agency survive the possibility that some companies will know more about all of us than any of us can ever know about ourselves?
Javier E

Opinion | Privacy Is Too Big to Understand - The New York Times - 1 views

  • There is “no single rhetorical approach likely to work on a given audience and none too dangerous to try. Any story that sticks is a good one,”
  • This newsletter is about finding ways to make this stuff stick in your mind and to arm you with the information you need to take control of your digital life.
  • how to start? The definition of privacy itself. I think it’s time to radically expand it.
  • “Privacy” is an impoverished word — far too small a word to describe what we talk about when we talk about the mining, transmission, storing, buying, selling, use and misuse of our personal information.
  • “hyperobjects,” a concept so all-encompassing that it is impossible to adequately describe
  • invite skepticism because their scale is so vast and sometimes abstract.
  • When technology governs so many aspects of our lives — and when that technology is powered by the exploitation of our data — privacy isn’t just about knowing your secrets, it’s about autonomy
  • “Privacy is really about being able to define for ourselves who we are for the world and on our own terms,”
  • not a choice that belongs to an algorithm or data broker (entities that collect, aggregate and sell individuals’ personal data, derivatives and inferences from disparate public and private sources), and definitely not to Facebook.”
  • privacy is about how that data is used to take away our control
  • real-time data, once assumed to be protected by phone companies, was available for sale to bounty hunters for a $300 fee
  • ICE officials partnered with a private data firm to track license plate data.
  • It means reckoning with private surveillance databases armed with dossiers on regular citizens and outsourced to the highest bidder
  • “Years ago we worried about the N.S.A. building huge server farms, but now it’s much cheaper to go to a private-service vendor and outsource this to a company who can cloak their activity in trade secrets,
  • “It’s comparable to asking people to stop using air conditioning because of the ozone layer. It’s not likely to happen because the immediate comfort is more valuable than the long-term fear.
Javier E

Opinion | The Only Answer Is Less Internet - The New York Times - 0 views

  • In our age of digital connection and constantly online life, you might say that two political regimes are evolving, one Chinese and one Western
  • The first regime is one in which your every transaction can be fed into a system of ratings and rankings
  • in which what seem like merely personal mistakes can cost you your livelihood and reputation, even your ability to hail a car or book a reservation
  • It’s one in which notionally private companies cooperate with the government to track dissidents and radicals and censor speech
  • one in which your fellow citizens act as enforcers of the ideological consensus, making an example of you for comments you intended only for your friends
  • one in which even the wealth and power of your overlords can't buy privacy.
  • The second regime is the one they’re building in the People’s Republic of China.
  • Beijing has treated the darkest episodes of “Black Mirror” as a how-to guide for social control and subjugation
  • Unlike China’s system, our emerging post-privacy order is not (for now) totalitarian; its impositions are more decentralized and haphazard, more circumscribed and civilized, less designed and more evolved, more random in the punishments inflicted and the rules enforced.
  • our system cannot help recreating features of the Chinese order, because the way that we live on the internet leaves us naked before power in a radical new way.
  • the Western order in the internet age might be usefully described as a “liberalism with some police-state characteristics.” Those characteristics are shaped and limited by our political heritage of rights and individualism. But there is still plainly an authoritarian edge, a gentle “pink police state” aspect, to the new world that online life creates.
  • apart from the high-minded and the paranoid, privacy per se is not a major issue in our politics
  • for those who object inherently to our new nakedness, regard the earthquakes as too high a price for Amazon’s low prices, or fear what an Augustus or a Robespierre might someday do with all this architecture, the best hope for a partial restoration of privacy has to involve more than just an anxiety about privacy alone.
  • It requires a more general turn against the virtual, in which fears of digital nakedness are just one motivator among many — the political piece of a cause that’s also psychological, intellectual, aesthetic and religious.
  • This is the hard truth suggested by our online experience so far: That a movement to restore privacy must be, at some level, a movement against the internet
  • Not a pure Luddism, but a movement for limits, for internet-free spaces, for zones of enforced pre-virtual reality (childhood and education above all), for social conventions that discourage career-destroying tweets and crotch shots by encouraging us to put away our iPhones.
Javier E

Opinion | You Are the Object of Facebook's Secret Extraction Operation - The New York T... - 0 views

  • Facebook is not just any corporation. It reached trillion-dollar status in a single decade by applying the logic of what I call surveillance capitalism — an economic system built on the secret extraction and manipulation of human data
  • Facebook and other leading surveillance capitalist corporations now control information flows and communication infrastructures across the world.
  • These infrastructures are critical to the possibility of a democratic society, yet our democracies have allowed these companies to own, operate and mediate our information spaces unconstrained by public law.
  • The result has been a hidden revolution in how information is produced, circulated and acted upon
  • The world’s liberal democracies now confront a tragedy of the “un-commons.” Information spaces that people assume to be public are strictly ruled by private commercial interests for maximum profit.
  • The internet as a self-regulating market has been revealed as a failed experiment. Surveillance capitalism leaves a trail of social wreckage in its wake: the wholesale destruction of privacy, the intensification of social inequality, the poisoning of social discourse with defactualized information, the demolition of social norms and the weakening of democratic institutions.
  • These social harms are not random. They are tightly coupled effects of evolving economic operations. Each harm paves the way for the next and is dependent on what went before.
  • There is no way to escape the machine systems that surveil us.
  • All roads to economic and social participation now lead through surveillance capitalism’s profit-maximizing institutional terrain, a condition that has intensified during nearly two years of global plague.
  • Will Facebook’s digital violence finally trigger our commitment to take back the “un-commons”?
  • Will we confront the fundamental but long ignored questions of an information civilization: How should we organize and govern the information and communication spaces of the digital century in ways that sustain and advance democratic values and principles?
  • Mark Zuckerberg’s start-up did not invent surveillance capitalism. Google did that. In 2000, when only 25 percent of the world’s information was stored digitally, Google was a tiny start-up with a great search product but little revenue.
  • By 2001, in the teeth of the dot-com bust, Google’s leaders found their breakthrough in a series of inventions that would transform advertising. Their team learned how to combine massive data flows of personal information with advanced computational analyses to predict where an ad should be placed for maximum “click through.”
  • Google’s scientists learned how to extract predictive metadata from this “data exhaust” and use it to analyze likely patterns of future behavior.
  • Prediction was the first imperative that determined the second imperative: extraction.
  • Lucrative predictions required flows of human data at unimaginable scale. Users did not suspect that their data was secretly hunted and captured from every corner of the internet and, later, from apps, smartphones, devices, cameras and sensors
  • User ignorance was understood as crucial to success. Each new product was a means to more “engagement,” a euphemism used to conceal illicit extraction operations.
  • When asked "What is Google?" the co-founder Larry Page laid it out in 2001:
  • “Storage is cheap. Cameras are cheap. People will generate enormous amounts of data,” Mr. Page said. “Everything you’ve ever heard or seen or experienced will become searchable. Your whole life will be searchable.”
  • Instead of selling search to users, Google survived by turning its search engine into a sophisticated surveillance medium for seizing human data
  • Company executives worked to keep these economic operations secret, hidden from users, lawmakers, and competitors. Mr. Page opposed anything that might "stir the privacy pot and endanger our ability to gather data," the former Google executive Douglas Edwards wrote.
  • As recently as 2017, Eric Schmidt, the executive chairman of Google’s parent company, Alphabet, acknowledged the role of Google’s algorithmic ranking operations in spreading corrupt information. “There is a line that we can’t really get across,” he said. “It is very difficult for us to understand truth.” A company with a mission to organize and make accessible all the world’s information using the most sophisticated machine systems cannot discern corrupt information.
  • This is the economic context in which disinformation wins
  • In March 2008, Mr. Zuckerberg hired Google’s head of global online advertising, Sheryl Sandberg, as his second in command. Ms. Sandberg had joined Google in 2001 and was a key player in the surveillance capitalism revolution. She led the build-out of Google’s advertising engine, AdWords, and its AdSense program, which together accounted for most of the company’s $16.6 billion in revenue in 2007.
  • A Google multimillionaire by the time she met Mr. Zuckerberg, Ms. Sandberg had a canny appreciation of Facebook’s immense opportunities for extraction of rich predictive data. “We have better information than anyone else. We know gender, age, location, and it’s real data as opposed to the stuff other people infer,” Ms. Sandberg explained
  • The company had “better data” and “real data” because it had a front-row seat to what Mr. Page had called “your whole life.”
  • Facebook paved the way for surveillance economics with new privacy policies in late 2009. The Electronic Frontier Foundation warned that new “Everyone” settings eliminated options to restrict the visibility of personal data, instead treating it as publicly available information.
  • Mr. Zuckerberg “just went for it” because there were no laws to stop him from joining Google in the wholesale destruction of privacy. If lawmakers wanted to sanction him as a ruthless profit-maximizer willing to use his social network against society, then 2009 to 2010 would have been a good opportunity.
  • Facebook was the first follower, but not the last. Google, Facebook, Amazon, Microsoft and Apple are private surveillance empires, each with distinct business models.
  • In 2021 these five U.S. tech giants represent five of the six largest publicly traded companies by market capitalization in the world.
  • As we move into the third decade of the 21st century, surveillance capitalism is the dominant economic institution of our time. In the absence of countervailing law, this system successfully mediates nearly every aspect of human engagement with digital information
  • Today all apps and software, no matter how benign they appear, are designed to maximize data collection.
  • Historically, great concentrations of corporate power were associated with economic harms. But when human data are the raw material and predictions of human behavior are the product, then the harms are social rather than economic
  • The difficulty is that these novel harms are typically understood as separate, even unrelated, problems, which makes them impossible to solve. Instead, each new stage of harm creates the conditions for the next stage.
  • Fifty years ago the conservative economist Milton Friedman exhorted American executives, “There is one and only one social responsibility of business — to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game.” Even this radical doctrine did not reckon with the possibility of no rules.
  • With privacy out of the way, ill-gotten human data are concentrated within private corporations, where they are claimed as corporate assets to be deployed at will.
  • The sheer size of this knowledge gap is conveyed in a leaked 2018 Facebook document, which described its artificial intelligence hub, ingesting trillions of behavioral data points every day and producing six million behavioral predictions each second.
  • Next, these human data are weaponized as targeting algorithms, engineered to maximize extraction and aimed back at their unsuspecting human sources to increase engagement
  • Targeting mechanisms change real life, sometimes with grave consequences. For example, the Facebook Files depict Mr. Zuckerberg using his algorithms to reinforce or disrupt the behavior of billions of people. Anger is rewarded or ignored. News stories become more trustworthy or unhinged. Publishers prosper or wither. Political discourse turns uglier or more moderate. People live or die.
  • Occasionally the fog clears to reveal the ultimate harm: the growing power of tech giants willing to use their control over critical information infrastructure to compete with democratically elected lawmakers for societal dominance.
  • when it comes to the triumph of surveillance capitalism’s revolution, it is the lawmakers of every liberal democracy, especially in the United States, who bear the greatest burden of responsibility. They allowed private capital to rule our information spaces during two decades of spectacular growth, with no laws to stop it.
  • All of it begins with extraction. An economic order founded on the secret massive-scale extraction of human data assumes the destruction of privacy as a nonnegotiable condition of its business operations.
  • We can’t fix all our problems at once, but we won’t fix any of them, ever, unless we reclaim the sanctity of information integrity and trustworthy communications
  • The abdication of our information and communication spaces to surveillance capitalism has become the meta-crisis of every republic, because it obstructs solutions to all other crises.
  • Neither Google, nor Facebook, nor any other corporate actor in this new economic order set out to destroy society, any more than the fossil fuel industry set out to destroy the earth.
  • like global warming, the tech giants and their fellow travelers have been willing to treat their destructive effects on people and society as collateral damage — the unfortunate but unavoidable byproduct of perfectly legal economic operations that have produced some of the wealthiest and most powerful corporations in the history of capitalism.
  • Where does that leave us?
  • Democracy is the only countervailing institutional order with the legitimate authority and power to change our course. If the ideal of human self-governance is to survive the digital century, then all solutions point to one solution: a democratic counterrevolution.
  • instead of the usual laundry lists of remedies, lawmakers need to proceed with a clear grasp of the adversary: a single hierarchy of economic causes and their social harms.
  • We can’t rid ourselves of later-stage social harms unless we outlaw their foundational economic causes
  • This means we move beyond the current focus on downstream issues such as content moderation and policing illegal content. Such “remedies” only treat the symptoms without challenging the illegitimacy of the human data extraction that funds private control over society’s information spaces
  • Similarly, structural solutions like “breaking up” the tech giants may be valuable in some cases, but they will not affect the underlying economic operations of surveillance capitalism.
  • Instead, discussions about regulating big tech should focus on the bedrock of surveillance economics: the secret extraction of human data from realms of life once called "private."
  • No secret extraction means no illegitimate concentrations of knowledge about people. No concentrations of knowledge means no targeting algorithms. No targeting means that corporations can no longer control and curate information flows and social speech or shape human behavior to favor their interests
  • the sober truth is that we need lawmakers ready to engage in a once-a-century exploration of far more basic questions:
  • How should we structure and govern information, connection and communication in a democratic digital century?
  • What new charters of rights, legislative frameworks and institutions are required to ensure that data collection and use serve the genuine needs of individuals and society?
  • What measures will protect citizens from unaccountable power over information, whether it is wielded by private companies or governments?
  • The corporation that is Facebook may change its name or its leaders, but it will not voluntarily change its economics.
Javier E

Obscurity: A Better Way to Think About Your Data Than 'Privacy' - Woodrow Hartzog and E... - 1 views

  • Obscurity is the idea that when information is hard to obtain or understand, it is, to some degree, safe. Safety, here, doesn't mean inaccessible. Competent and determined data hunters armed with the right tools can always find a way to get it. Less committed folks, however, experience great effort as a deterrent.
  • Online, obscurity is created through a combination of factors. Being invisible to search engines increases obscurity. So does using privacy settings and pseudonyms. Disclosing information in coded ways that only a limited audience will grasp enhances obscurity, too
  • What obscurity draws our attention to, is that while the records were accessible to any member of the public prior to the rise of big data, more effort was required to obtain, aggregate, and publish them. In that prior context, technological constraints implicitly protected privacy interests.
  • the "you choose who to let in" narrative is powerful because it trades on traditional notions of space and boundary regulation, and further appeals to our heightened sense of individual responsibility, and, possibly even vanity. The basic message is that so long as we exercise good judgment when selecting our friends, no privacy problems will arise
  • What this appeal to status quo relations and existing privacy settings conceals is the transformative potential of Graph: new types of searching can emerge that, due to enhanced frequency and newly created associations between data points, weaken, and possibly obliterate, obscurity.
  • The other dominant narrative emerging is that the Graph will simplify "stalking."
  • the stalker frame muddies the concept, implying that the problem is people with bad intentions getting our information. Determined stalkers certainly pose a threat to the obscurity of information because they represent an increased likelihood that obscure information will be found and understood.
  • Well-intentioned searches can be problematic, too.
  • It is not a stretch to assume Graph could enable searching through the content of posts a user has liked or commented on and generating categories of interests from it. For example, users could search which of their friends are interested in politics, or, perhaps, specifically, in left-wing politics.
  • In this scenario, a user who wasn't a fan of political groups or causes, didn't list political groups or causes as interests, and didn't post political stories, could still be identified as political.
  • In a system that purportedly relies upon user control, it is still unclear how and if users will be able to detect when their personal information is no longer obscure. How will they be able to anticipate the numerous different queries that might expose previously obscure information? Will users even be aware of all the composite results including their information?
  • Obscurity is a protective state that can further a number of goals, such as autonomy, self-fulfillment, socialization, and relative freedom from the abuse of power. A major task ahead is for society to determine how much obscurity citizens need to thrive.
Javier E

Quitters Never Win: The Costs of Leaving Social Media - Woodrow Hartzog and Evan Seling... - 2 views

  • Manjoo offers this security-centric path for folks who are anxious about the service being "one the most intrusive technologies ever built," and believe that "the very idea of making Facebook a more private place borders on the oxymoronic, a bit like expecting modesty at a strip club". Bottom line: stop tuning in and start dropping out if you suspect that the culture of oversharing, digital narcissism, and, above all, big-data-hungry, corporate profiteering will trump privacy settings.
  • Angwin plans on keeping a bare-bones profile. She'll maintain just enough presence to send private messages, review tagged photos, and be easy for readers to find. Others might try similar experiments, perhaps keeping friends, but reducing their communication to banal and innocuous expressions. But, would such disclosures be compelling or sincere enough to retain the technology's utility?
  • The other unattractive option is for social web users to willingly pay for connectivity with extreme publicity.
  • go this route if you believe privacy is dead, but find social networking too good to miss out on.
  • While we should be attuned to constraints and their consequences, there are at least four problems with conceptualizing the social media user's dilemma as a version of "if you can't stand the heat, get out of the kitchen".
  • The efficacy of abandoning social media can be questioned when others are free to share information about you on a platform long after you've left.
  • Second, while abandoning a single social technology might seem easy, this "love it or leave it" strategy -- which demands extreme caution and foresight from users and punishes them for their naivete -- isn't sustainable without great cost in the aggregate. If we look past the consequences of opting out of a specific service (like Facebook), we find a disconcerting and more far-reaching possibility: behavior that justifies a never-ending strategy of abandoning every social technology that threatens privacy -- a can being kicked down the road in perpetuity without us resolving the hard question of whether a satisfying balance between protection and publicity can be found online
  • if your current social network has no obligation to respect the obscurity of your information, what justifies believing other companies will continue to be trustworthy over time?
  • Sticking with the opt-out procedure turns digital life into a paranoid game of whack-a-mole where the goal is to stay ahead of the crushing mallet. Unfortunately, this path of perilously transferring risk from one medium to another is the direction we're headed if social media users can't make reasonable decisions based on the current context of obscurity, but instead are asked to assume all online social interaction can or will eventually lose its obscurity protection.
  • The fourth problem with the "leave if you're unhappy" ethos is that it is overly individualistic. If a critical mass participates in the "Opt-Out Revolution," what would happen to the struggling, the lonely, the curious, the caring, and the collaborative if the social web went dark?
  • Our point is that there is a middle ground between reclusion and widespread publicity, and the reduction of user options to quitting or coping, which are both problematic, need not be inevitable, especially when we can continue exploring ways to alleviate the user burden of retreat and the societal cost of a dark social web.
  • it is easy to presume that "even if you unfriend everybody on Facebook, and you never join Twitter, and you don't have a LinkedIn profile or an About.me page or much else in the way of online presence, you're still going to end up being mapped and charted and slotted in to your rightful place in the global social network that is life." But so long as it remains possible to create obscurity through privacy enhancing technology, effective regulation, contextually appropriate privacy settings, circumspect behavior, and a clear understanding of how our data can be accessed and processed, that fatalism isn't justified.
Javier E

Just Don't Call It Privacy - The New York Times - 2 views

  • In a surveillance economy where companies track, analyze and capitalize on our clicks, the issue at hand isn’t privacy. The problem is unfettered data exploitation and its potential deleterious consequences — among them, unequal consumer treatment, financial fraud, identity theft, manipulative marketing and discrimination.
  • In other words, asking companies whose business models revolve around exploiting data-based consumer-influence techniques to explain their privacy policies seems about as useful as asking sharks to hold forth on veganism.
  • They should be examining business practices. They should be examining how these firms collect and use the personal data of customers, of internet users.”
  • ...7 more annotations...
  • Companies are sending their “policy and law folks to Washington to make the government go away — not the engineering folks who actually understand these systems in depth and can talk through alternatives,” Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, told me.
  • revelations about Russian election interference and Cambridge Analytica, the voter-profiling company that obtained information on millions of Facebook users, have made it clear that data-driven influence campaigns can scale quickly and cause societal harm.
  • Do we want a future in which companies can freely parse the photos we posted last year, or the location data from the fitness apps we used last week, to infer whether we are stressed or depressed or financially strapped or emotionally vulnerable — and take advantage of that?
  • AT&T’s privacy policy says the mobile phone and cable TV provider may use third-party data to categorize subscribers, without using their real names, into interest segments and show them ads accordingly. That sounds reasonable enough
  • AT&T can find out which subscribers have indigestion — or at least which ones bought over-the-counter drugs to treat it.
  • In a case study for advertisers, AT&T describes segmenting DirecTV subscribers who bought antacids and then targeting them with ads for the medication. The firm was also able to track those subscribers’ spending. Households who saw the antacid ads spent 725 percent more on the drugs than a national audience.
  • Michael Balmoris, a spokesman for AT&T, said the company’s privacy policy was “transparent and precise, and describes in plain language how we use information and the choices we give customers.”
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call "data exhaust" — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we're not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company's whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government's new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
sanderk

As Technology Advances, What Will Happen With Online Privacy? - 0 views

  • The idea that individuals should be able to control when their personal data is collected by third parties and how it is used is nearly impossible to implement in a world where personal data is collected, created, used, processed, analyzed, shared, transferred, copied, and stored in unprecedented ways and at an extraordinary speed and volume.
  • There will be no opting out of this data-intensive world.
  • That said, we’re making progress. A series of high profile scandals in 2018 have ignited the privacy debate and put a spotlight on irresponsible business practices
  • Although Congress is now debating new Federal privacy legislation, I’m not optimistic that policymakers will be able to craft a law that will address these issues in a meaningful way — if they can create any law at all. There are too many stakeholders involved in the debate, each protecting their own economic self interest and competitive advantage in the marketplace
  • We need to engage more, make better informed decisions, use privacy settings and tools, and hold companies accountable when they misuse our data and violate our trust
  • As we debate privacy, we also shouldn’t forget that all of this new tech produces enormous benefits for our society - from curing diseases to easing traffic and reducing pollution
Javier E

Disruptions: Internet's Sad Legacy: No More Secrets - NYTimes.com - 0 views

  • many services that claim to offer that rarest of digital commodities — privacy — don’t really deliver. Read the fine print.
  • Snapchat’s privacy page explains that private images are stored on someone’s phone — and on its own servers. “Forensically, even after they are deleted,” Snapchat says, those images can be retrieved. Whisper’s privacy page says the company owns the intellectual property, both images and text, that people post; Whisper reserves the right to sell that stuff to third parties. And Telegram, while seemingly less innocuous with its claims, nonetheless leaves out something you might want to know: someone can just take a screenshot or picture of that “private” conversation.
  • Don’t have a smartphone yet? They still know where you are and where you’ve been. The American Civil Liberties Union released a report this year that found that technologies that let governments scan license plates are being used to build databases of vehicle locations across the United States.
  • A new book by Harvey Silverglate, a lawyer in Massachusetts, titled “Three Felonies a Day,” claims the average professional in the United States commits at least three crimes every day. How? While academics, lawyers and even government officials don’t actually know how many laws exist in today’s judicial system, it’s estimated that there are from 10,000 to 300,000 federal regulations that could be enforced criminally.
Javier E

Zachary Stockill: The Want for Privacy: Facebook's Assault on Friendship - 1 views

  • privacy is in turn the basis of a person's capacity for friendship and intimacy. [People] who lose the guarantee of privacy also eventually lose the capacity for making friends.
  • What is unsettling is that so many of us are voluntarily declining this right to privacy, and opening up our lives to a vast consortium of various, and often spurious, acquaintances: "Facebook friends."
  • Aside from the basics -- relationship status (whether listed or unlisted, have a look at the photo albums -- you'll know), age, school and other categories such as employment, by reading between the lines you will discover a wealth of information about poor Joe's hapless existence: his income, the details of his social life, if he got fat(ter), if his Grandma/dog/dealer died, what he's eating, the movies he likes, the movies he doesn't like, if he got dumb(er), if he's getting any, if he's a drunkard, if he drives a Camaro, if he voted for Obama (he didn't), if he watches Glenn Beck (he does), etc. etc. etc. It is likely that you will be able to determine, in a very real sense, the nature of Joe's current existence, warts and all.
  • what does it say about our society when we pass about freely the details of our personal lives with an audience of several hundred -- in some cases, thousands -- of onlookers, many of whom we barely like or even know? Indeed, many of these Facebook "friends" are genuine friends, lovers, family. Surely worthy of our trust. But how many of your Facebook "friends" are opportunistic voyeurs who remain your "friend" only to retain access to your world, far removed from any direct, meaningful, personal interaction?
  • I fear for the day when your dissociation from the physical exposes the fact that your online "community" is no substitute for genuine, human companionship and intimacy.
sissij

U.S. Will Ban Smoking in Public Housing Nationwide - The New York Times - 0 views

  • The final rule followed a period of public comment during which some opponents took exception to the government’s telling people what to do in the privacy of their own homes.
  • In New York, there was also concern about whether police officers would be involved in enforcing the rule; that will not be the case, Housing Authority officials said.
  • “Public housing will go smoke-free and remain smoke-free, and, because of that,” he said, “so many folks are going to live healthier lives and have a better shot at reaching their dreams because they have good health.”
  •  
    Banning smoking in public housing is still a controversial issue because it touches on whether we have the freedom to decide what we do at home. I think this issue shows how our freedom is limited in this society. To what extent we can have our own privacy is still a debatable question, because we all have different understandings of how we are related to other people in this society. Or do we even really have privacy? As social animals, we are meant to stay in groups. Also, the last statement of this article sounds like a logical fallacy, because having good health and having a better shot at reaching one's dreams don't really have that direct a relation. I think it is just a small factor. --Sissi (12/1/2016)
Javier E

Facebook, the Company That Loves Misery - WSJ - 1 views

  • For more than a decade Mark Zuckerberg has been running an experiment in openness. We are the test subjects. So what does he think about the fact that being “open and connected,” Facebook -style, is making us miserable?
  • Several studies, most recently one out of San Diego State University analyzing the leisure activities of a million teens, have concluded that the more time spent on Facebook, the less happy we tend to be
  • In 2010 Mr. Zuckerberg announced that the old social norm of privacy had “evolved”—a fortuitous discovery for someone who had devoted his life to whittling others’ privacy away. Indeed, Facebook’s essential conceit is that privacy is outmoded—the corset we never wanted and are so much freer without
  • We’ve known for a while that Facebook enables online communication with friends and family—but also a sharply targeted form of bullying. That it wastes our time. That, whatever relationships it nurtures, it kills off others entirely.
  • But privacy is also a shield, and it protects subject and observer alike
  • The Cambridge Analytica scandal came on like a slap, the kind that breaks the spell and makes you wonder what on earth you’ve been doing. How had we given so much away? We squandered assets we may never regain—privacy, dignity—and for what?
  • We registered as Facebook users imagining ourselves vacationing at a new resort, but it turned out to be a nudist colony
  • Mr. Zuckerberg’s promise to protect our data is laughable because exploiting our data is precisely his business.
  • Over the years, Facebook pushed us to share more of ourselves. “What are you doing right now?” became “What’s on your mind, Abigail?” It jiggered the order of posts to keep our navels and our friends’ well-gazed, all the while rendering us more vulnerable to abuse
  • When we discovered our pockets had been picked, Facebook suddenly seemed more hustler than host; its endless party, one great confidence scheme.
  • And now, Congress is calling on Mr. Zuckerberg to fix the problem as if the problem weren’t Facebook itself
  • Is there any piece of data about us that, on principle, Mr. Zuckerberg wouldn’t monetize
sissij

How the Republicans Sold Your Privacy to Internet Providers - The New York Times - 0 views

  • the House quietly voted to undo rules that keep internet service providers — the companies like Comcast, Verizon and Charter that you pay for online access — from selling your personal information.
  • President Trump will be able to sign legislation that will strike a significant blow against online privacy protection.
  • The bill is an effort by the F.C.C.’s new Republican majority and congressional Republicans to overturn a simple but vitally important concept
  • Reversing those protections is a dream for cable and telephone companies, which want to capitalize on the value of such personal information.
  • Apparently, the Trump administration and its allies in Congress value privacy for themselves over the privacy of the Americans who put them in office. What is good business for powerful cable and phone companies is just tough luck for the rest of us.
  •  
    In today's class, we discussed how the parties are now only a shell and don't really serve as a safeguard of democracy. The parties have become more like an entrance ticket to the election. Many people join a party not because its ideology appeals to them, but because membership raises their chances of winning an election. Now money and power are played as stakes in politics, and the general population is left in ignorance. Sometimes I feel like the companies are the real citizens whom the government is serving, not the people. --Sissi (3/29/2017)
peterconnelly

Meta Will Give Researchers More Information on Political Ad Targeting - The New York Times - 0 views

  • Meta, which owns Facebook and Instagram, said it planned to give outside researchers more detailed information on how political ads were targeted across its platform, providing insight into the ways that politicians, campaign operatives and political strategists buy and use ads ahead of the midterm elections.
  • The information includes which interest categories — such as “people who like dogs” or “people who enjoy the outdoors” — were chosen to aim an ad at someone.
  • While Meta has given outsiders some access into how its political ads were used in the past, it has restricted the amount of information that could be seen, citing privacy reasons.
  • “By making advertiser targeting criteria available for analysis and reporting on ads run about social issues, elections and politics, we hope to help people better understand the practices used to reach potential voters on our technologies,” the company said in a statement.
  • Meta said it had been bound by privacy rules and regulations on what types of data it could share with outsiders. In an interview, Jeff King, a vice president in Meta’s business integrity unit, said the company had hired thousands of workers over the past few years to review those privacy issues.
  • “Every single thing we release goes through a privacy review now,” he said. “We want to make sure we give people the right amount of data, but still remain privacy conscious while we do it.”