
TOK Friends: Group items tagged "digital media"


adonahue011

Twitter is Showing That People Are Anxious and Depressed - The New York Times - 1 views

  • the lab offers this answer: Sunday, May 31. That day was not only the saddest day of 2020 so far, it was also the saddest day recorded by the lab in the last 13 years. Or at least, the saddest day on Twitter.
    • adonahue011
       
      The lab is offering the idea that May 31st was the saddest day of 2020, and the saddest day it has recorded in the last 13 years. The toll 2020 has taken on all of us mentally is probably something we cannot, at times, even recognize.
  • measuring word choices across millions of tweets, every day, the world over, to come up with a moving measure of well-being.
    • adonahue011
       
      They use a machine to track the words people use on Twitter, specifically to measure people's well-being (a rough sketch of this word-averaging approach appears after this article's highlights).
  • the main finding to emerge was our tendency toward relentless positivity on social media.
  • “Happiness is hard to know. It’s hard to measure,”
  • “We don’t have a lot of great data about how people are doing.”
    • adonahue011
       
      This is an interesting statement because it is so true. Yet it is so important to know how people are doing. Oftentimes, I think, we miss some of our own feelings, which is something we talked about in TOK: we cut out certain memories or feelings to make the narrative we want.
  • to parse our national mental health through the prism of our online life.
  • that stockpile of information towered as high as it does now, in the summer of 2020
  • Twitter reported a 34 percent increase in daily average user growth.
    • adonahue011
       
      Important statistic because we all took part in this
  • has gathered a random 10 percent of all public tweets, every day, across a dozen languages.
  • Twitter included “terrorist,” “violence” and “racist.” This was about a week after George Floyd was killed, near the start of the protests that would last all summer.
  • the pandemic, the Hedonometer’s sadness readings have set multiple records. This year, “there was a full month — and we never see this — there was a full month of days that the Hedonometer was reading sadder than the Boston Marathon day,”
    • adonahue011
       
      This is saddening because it is the reality we have all had to learn how to deal with.
  • “These digital traces are markers that we’re not aware of, but they leave marks that tell us the degree to which you are avoiding things, the degree to which you are connected to people,”
    • adonahue011
       
      I agree with this statement because it is so similar to what we discussed in TOK with the idea that our brain lets us avoid things when we don't feel like we can deal with them.
  • one of the challenges of this line of research is that language itself is always evolving — and algorithms are notoriously bad at discerning context.
  • they were able to help predict which ones might develop postpartum depression, based on their posts before the birth of their babies.
    • adonahue011
       
      This type of research seems like a positive way to utilize social media. Not that the saddening posts themselves are good, but the way we can use this information is important.
  • Using data from social media for the study of mental health also helps address the WEIRD problem:
  • psychology research is often exclusively composed of subjects who are Western, Educated, and from Industrialized, Rich, and Democratic countries.
    • adonahue011
       
      I never thought of this but it is so true! Using social media means that the stats are global.
  • We’re now able to look at a much more diverse variety of mental health experiences.”
  • but also anxiety, depression, stress and suicidal thoughts. Unsurprisingly, she found that all these levels were significantly higher than during the same months of 2019.
  • is really a representative place to check the state of the general population’s mental health.
  • argues that in the rush to embrace data, many researchers ignore the distorting effects of the platforms themselves.
    • adonahue011
       
      Contrasting opinion from the rest of the article
  • emotionally invested in the content we are presented with, coaxed toward remaining in a certain mental state.
    • adonahue011
       
      Interesting idea, though I tend to think the opposite: that social media is a pretty solid reflection.
  • The closest we get to looking at national mental health otherwise is through surveys like the one Gallup performs
  • the lowest rates of life satisfaction this year in over a decade, including during the 2008 recession
  • I have never been more exhausted at the end of the day than I am now,” said Michael Garfinkle, a psychoanalyst in New York.
  • There are so many contenders to consider: was it Thursday, March 12, the day after Tom Hanks announced he was sick and the N.B.A. announced it was canceled? Was it Monday, June 1, the day peaceful protesters were tear gassed so that President Trump could comfortably stroll to his Bible-wielding photo op?
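The Hedonometer method highlighted above comes down to averaging crowd-rated happiness scores for the words that show up in a large daily sample of tweets, then watching how that average moves from day to day. Below is a minimal sketch of that word-averaging idea; the word list and scores are invented placeholders (the real instrument uses human ratings on a roughly 1-to-9 scale over a ~10 percent sample of public tweets, as the article notes).

```python
from collections import Counter

# Hypothetical happiness ratings per word (1 = saddest, 9 = happiest).
# Placeholder values for illustration, not the lab's published ratings.
WORD_SCORES = {
    "love": 8.4, "happy": 8.3, "virus": 2.9,
    "terrorist": 1.6, "violence": 2.1, "racist": 1.9,
}

def daily_happiness(tweets):
    """Average the rated words across one day's sample of tweets."""
    counts = Counter()
    for tweet in tweets:
        for word in tweet.lower().split():
            if word in WORD_SCORES:
                counts[word] += 1
    total = sum(counts.values())
    if total == 0:
        return None  # no rated words seen that day
    return sum(WORD_SCORES[w] * n for w, n in counts.items()) / total

sample = [
    "so much violence and racist abuse in the news today",
    "sending love to everyone",
]
print(daily_happiness(sample))  # lower daily averages flag sadder days
```

A day like May 31, 2020 stands out on such a measure simply because words rated as sad ("terrorist," "violence," "racist") dominate that day's sample.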
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute or enhance some of our motivational powers.
  • beyond the practical issues lie a constellation of central ethical concerns.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • it is antithetical to the ideal of "resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that can only come when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more "staple of folk psychology" than real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

Opinion | Richard Powers on What We Can Learn From Trees - The New York Times - 0 views

  • Theo and Robin have a nightly ritual where they say a prayer that Alyssa, the deceased wife and mother, taught them: May all sentient beings be free from needless suffering. That prayer itself comes from the four immeasurables in the Buddhist tradition.
  • When we enter into or recover this sense of kinship that was absolutely fundamental to so many indigenous cultures everywhere around the world at many, many different points in history, that there is no radical break between us and our kin, that even consciousness is shared, to some degree and to a large degree, with a lot of other creatures, then death stops seeming like the enemy and it starts seeming like one of the most ingenious kinds of design for keeping evolution circulating and keeping the experiment running and recombining.
  • Look, I’m 64 years old. I can remember sitting in psychology class as an undergraduate and having my professor declare that no, of course animals don’t have emotions because they don’t have an internal life. They don’t have conscious awareness. And so what looks to you like your dog being extremely happy or being extremely guilty, which dogs do so beautifully, is just your projection, your anthropomorphizing of those other creatures. And this prohibition against anthropomorphism created an artificial gulf between even those animals that are ridiculously near of kin to us, genetically.
  • I don’t know if that sounds too complicated. But the point is, it’s not just giving up domination. It’s giving up this sense of separateness in favor of a sense of kinship. And those people who do often wonder how they failed to see how much continuity there is in the more-than-human world with the human world.
  • to go from terror into being and into that sense that the experiment is sacred, not this one outcome of the experiment, is to immediately transform the way that you think even about very fundamental social and economic and cultural things. If the experiment is sacred, how can we possibly justify our food systems, for instance?
  • when I first went to the Smokies and hiked up into the old growth in the Southern Appalachians, it was like somebody threw a switch. There was some odd filter that had just been removed, and the world sounded different and smelled different.
  • richard powers: Yeah. In human exceptionalism, we may be completely aware of evolutionary continuity. We may understand that we have a literal kinship with the rest of creation, that all life on Earth employs the same genetic code, that there is a very small core of core genes and core proteins that is shared across all the kingdoms and phyla of life. But conceptually, we still have this demented idea that somehow consciousness creates a sanctity and a separation that almost nullifies the continuous elements of evolution and biology that we’ve come to understand.
  • if we want to begin this process of rehabilitation and transformation of consciousness that we are going to need in order to become part of the living Earth, it is going to be other kinds of minds that give us that clarity and strength and diversity and alternative way of thinking that could free us from this stranglehold of thought that looks only to the maximizing return on investment in very leverageable ways.
  • richard powers: It amazed me to get to the end of the first draft of “Bewilderment” and to realize how much Buddhism was in the book, from the simplest things.
  • I think there is nothing more science inflected than being out in the living world and the more-than-human world and trying to understand what’s happening.
  • And of course, we can combine this with what we were talking about earlier with death. If we see all of evolution as somehow leading up to us, all of human, cultural evolution leading up to neoliberalism and here we are just busily trying to accumulate and make meaning for ourselves, death becomes the enemy.
  • And you’re making the point in different ways throughout the book that it is the minds we think of as unusual, that we would diagnose as having some kind of problem or dysfunction, that are, in some cases, the only ones responding to the moment in the most common-sense way it deserves. It is almost everybody else’s brain that has been broken.
  • it isn’t surprising. If you think of the characteristics of this dominant culture that we’ve been talking about — the fixation on control, the fixation on mastery, the fixation on management and accumulation and the resistance of decay — it isn’t surprising that that culture is also threatened by difference and divergence. It seeks out old, stable hierarchies — clear hierarchies — of control, and anything that’s not quite exploitable or leverageable in the way that the normal is terrifying and threatening.
  • And the more I looked for it, the more it pervaded the book.
  • ezra klein: I’ve heard you say that it has changed the way you measure a good day. Can you tell me about that? richard powers: That’s true. I suppose when I was still enthralled to commodity-mediated individualist market-driven human exceptionalism — we need a single word for this
  • And since moving to the Smokies and since publishing “The Overstory,” my days have been entirely inverted. I wake up, I go to the window, and I look outside. Or I step out onto the deck — if I haven’t been sleeping on the deck, which I try to do as much as I can in the course of the year — and see what’s in the air, gauge the temperature and the humidity and the wind and see what season it is and ask myself, you know, what’s happening out there now at 1,700 feet or 4,000 feet or 5,000 feet.
  • let me talk specifically about the work of a scientist who has herself just recently published a book. It’s Dr. Suzanne Simard, and the book is “Finding the Mother Tree.” Simard has been instrumental in a revolution in our way of thinking about what’s happening underground at the root level in a forest.
  • it was a moving moment for me, as an easterner, to stand up there and to say, this is what an eastern forest looks like. This is what a healthy, fully-functioning forest looks like. And I’m 56 years old, and I’d never seen it.
  • the other topics of that culture tend to circle back around these sorts of trends, human fascinations, ways of magnifying our throw weight and our ability and removing the last constraints to our desires and, in particular, to eliminate the single greatest enemy of meaning in the culture of the technological sublime that is, itself, such a strong instance of the culture of human separatism and commodity-mediated individualist capitalism — that is to say, the removal of death.
  • Why is it that we have known about the crisis of species extinction for at least half a century and longer? And I mean the lay public, not just scientists. But why has this been general knowledge for a long time without public will demanding some kind of action or change
  • And when you make kinship beyond yourself, your sense of meaning gravitates outwards into that reciprocal relationship, into that interdependence. And you know, it’s a little bit like scales falling off your eyes. When you do turn that corner, all of the sources of anxiety that are so present and so deeply internalized become much more identifiable. And my own sense of hope and fear gets a much larger frame of reference to operate in.
  • I think, for most of my life, until I did kind of wake up to forests and to trees, I shared — without really understanding this as a kind of concession or a kind of subscription — I did share this cultural consensus that meaning is a private thing that we do for ourselves and by ourselves and that our kind of general sense of the discoveries of the 19th and 20th century have left us feeling a bit unsponsored and adrift beyond the accident of human existence.
  • The largest single influence on any human being’s mode of thought is other human beings. So if you are surrounded by lots of terrified but wishful-thinking people who want to believe that somehow the cavalry is going to come at the last minute and that we don’t really have to look inwards and change our belief in where meaning comes from, that we will somehow be able to get over the finish line with all our stuff and that we’ll avert this disaster, as we have other kinds of disasters in the past.
  • I think what was happening to me at that time, as I was turning outward and starting to take the non-human world seriously, is my sense of meaning was shifting from something that was entirely about me and authored by me outward into this more collaborative, reciprocal, interdependent, exterior place that involved not just me but all of these other ways of being that I could make kinship with.
  • And I think I was right along with that sense that somehow we are a thing apart. We can make purpose and make meaning completely arbitrarily. It consists mostly of trying to be more in yourself, of accumulating in one form or another.
  • I can’t really be out for more than two or three miles before my head just fills with associations and ideas and scenes and character sketches. And I usually have to rush back home to keep it all in my head long enough to get it down on paper.
  • for my journey, the way to characterize this transition is from being fascinated with technologies of mastery and control and what they’re doing to us as human beings, how they’re changing what the capacities and affordances of humanity are and how we narrate ourselves, to being fascinated with technologies and sciences of interdependence and cooperation, of those sciences that increase our sense of kinship and being one of many, many neighbors.
  • And that’s an almost impossible persuasion to rouse yourself from if you don’t have allies. And I think the one hopeful thing about the present is the number of people trying to challenge that consensual understanding and break away into a new way of looking at human standing is growing.
  • And when you do subscribe to a culture like that and you are confronted with the reality of your own mortality, as I was when I was living in Stanford, that sense of stockpiling personal meaning starts to feel a little bit pointless.
  • And I just head out. I head out based on what the day has to offer. And to have that come first has really changed not only how I write, but what I’ve been writing. And I think it really shows in “Bewilderment.” It’s a totally different kind of book from my previous 12.
  • the marvelous thing about the work, which continues to get more sophisticated and continues to turn up newer and newer astonishments, is that there was odd kind of reciprocal interdependence and cooperation across the species barrier, that Douglas firs and birches were actually involved in these sharing back and forth of essential nutrients. And that’s a whole new way of looking at forest.
  • she began to see that the forests were actually wired up in very complex and identifiable ways and that there was an enormous system of resource sharing going on underground, that trees were sharing not only sugars and the hydrocarbons necessary for survival, but also secondary metabolites. And these were being passed back and forth, both symbiotically between the trees and the fungi, but also across the network to other trees so that there were actually trees in wired up, fungally-connected forests where large, dominant, healthy trees were subsidizing, as it were, trees that were injured or not in favorable positions or damaged in some way or just failing to thrive.
  • so when I was still pretty much a card-carrying member of that culture, I had this sense that to become a better person and to get ahead and to really make more of myself, I had to be as productive as possible. And that meant waking up every morning and getting 1,000 words that I was proud of. And it’s interesting that I would even settle on a quantitative target. That’s very typical for that kind of mindset that I’m talking about — 1,000 words and then you’re free, and then you can do what you want with the day.
  • there will be a threshold, as there have been for these other great social transformations that we’ve witnessed in the last couple of decades where somehow it goes from an outsider position to absolutely mainstream and common sense.
  • I am persuaded by those scholars who have showed the degree to which the concept of nature is itself an artificial construction that’s born of cultures of human separatism. I believe that everything that life does is part of the living enterprise, and that includes the construction of cities. And there is no question at all the warning that you just gave about nostalgia creating a false binary between the built world and the true natural world is itself a form of cultural isolation.
  • Religion is a technology to discipline, to discipline certain parts of the human impulse. A lot of the book revolves around the decoded neurofeedback machine, which is a very real literalization of a technology, of changing the way we think
  • one of the things I think that we have to take seriously is that we have created technologies to supercharge some parts of our natural impulse, the capitalism I think should be understood as a technology to supercharge the growth impulse, and it creates some wonders out of that and some horrors out of that.
  • richard powers: Sure. I base my machine on existing technology. Decoded neurofeedback is a kind of nascent field of exploration. You can read about it; it’s been publishing results for a decade. I first came across it in 2013. It involves using fMRI to record the brain activity of a human being who is learning a process, interacting with an object or engaged in a certain emotional state. That neural activity is recorded and stored as a data structure. A second subsequent human being is then also scanned in real time and fed kinds of feedback based on their own internal neural activity as determined by a kind of software analysis of their fMRI data structures.
  • And they are queued little by little to approximate, to learn how to approximate, the recorded states of the original subject. When I first read about this, I did get a little bit of a revelation. I did feel my skin pucker and think, if pushed far enough, this would be something like a telepathy conduit. It would be a first big step in answering that age-old question of what does it feel like to be something other than we are
  • in the book I simply take that basic concept and extend it, juke it up a little bit, blur the line between what the reader might think is possible right now and what they might wonder about, and maybe even introduce possibilities for this empathetic transference
  • ezra klein: One thing I loved about the role this played in the book is that it’s highlighting its inverse. So a reader might look at this and say, wow, wouldn’t that be cool if we had a machine that could in real time change how we think and change our neural pathways and change our mental state in a particular direction? But of course, all of society is that machine,
  • Robin and Theo are in an airport. And you’ve got TVs everywhere playing the news which is to say playing a constant loop of outrage, and disaster, and calamity. And Robbie, who’s going through these neural feedback sessions during this period, turns to his dad and says, “Dad, you know how the training’s rewiring my brain? This is what is rewiring everybody else.”
  • ezra klein: I think Marshall McLuhan knew it all. I really do. Not exactly what it would look like, but his view and Postman’s view that we are creating a digital global nervous system is a way they put it, it was exactly right. A nervous system, it was such the exact right metaphor.
  • the great insight of McLuhan, to me, what now gets called the medium is the message is this idea that the way media acts upon us is not in the content it delivers. The point of Twitter is not the link that you click or even the tweet that you read; it is that the nature and structure of the Twitter system itself begins to act on your system, and you become more like it.If you watch a lot of TV, you become more like TV. If you watch a lot of Twitter, you become more like Twitter, Facebook more like Facebook. Your identities become more important to you — that the content is distraction from the medium, and the medium changes you
  • it is happening to all of us in ways that at least we are not engaging in intentionally, not at that level of how do we want to be transformed.
  • richard powers: I believe that the digital neural system is now so comprehensive that the idea that you could escape it somewhere, certainly not in the Smokies, even more remotely, I think, becomes more and more laughable. Yeah, and to build on this idea of the medium being the message, not the way in which we become more like the forms and affordances of the medium is that we begin to expect that those affordances, the method in which those media are used, the physiological dependencies and castes of behavior and thought that are required to operate them and interact with them are actual — that they’re real somehow, and that we just take them into human nature and say no, this is what we’ve always wanted and we’ve simply been able to become more like our true selves.
  • Well, the warpage in our sense of time, the warpage in our sense of place, are profound. The ways in which digital feedback and the affordances of social media and all the rest have changed our expectations with regard to what we need to concentrate on, what we need to learn for ourselves, are changing profoundly.
  • If you look far enough back, you can find Socrates expressing great anxiety and suspicion about the ways in which writing is going to transform the human brain and human expectation. He was worried that somehow it was going to ruin our memories. Well, it did up to a point — nothing like the way the digital technologies have ruined our memories.
  • my tradition is Jewish, the Sabbath is a technology, is a technology to create a different relationship between the human being, and time, and growth, and productive society than you would have without the Sabbath which is framed in terms of godliness but is also a way of creating separation from the other impulses of the week.
  • Governments are a technology, monogamy is a technology, a religiously driven technology, but now one that is culturally driven. And these things do good and they do bad. I’m not making an argument for any one of them in particular. But the idea that we would need to invent something wholly new to come up with a way to change the way human beings act is ridiculous
  • My view of the story of this era is that capitalism was one of many forces, and it has become, in many societies, functionally the only one that it was in relationship with religion, it was in relationship with more rooted communities.
  • it has become not just an economic system but a belief system, and it’s a little bit untrammeled. I’m not an anti-capitalist person, but I believe it needs countervailing forces. And my basic view is that it doesn’t have them anymore.
  • the book does introduce this kind of fable, this kind of thought experiment about the way the affordances that a new and slightly stronger technology of empathy might deflect. First of all, the story of a little boy and then the story of his father who’s scrambling to be a responsible single parent. And then, beyond that, the community of people who hear about this boy and become fascinated with him as a narrative, which again ripples outward through these digital technologies in ways that can’t be controlled or whose consequences can’t be foreseen.
  • Something I’ve said before is that I think a push against, functionally, materialism and want is an important weight in our society that we need. And when people say it is the way we’ll deal with climate change in the three-to-five-year time frame, I become much more skeptical, because, to the point of things like the technology you have in the book with neural feedback, I do think one of the questions you have to ask is, socially and culturally, how do you move people’s minds so you can then move their politics?
  • You’re going to need something, it seems to me, outside of politics, that changes humans’ sense of themselves more fundamentally. And that takes a minute at the scale of billions.
  • richard powers: Well, you are correct. And I don’t think it’s giving away any great reveal in the book to say that a reader who gets far enough into the story probably has this moment of recursive awareness where they, he or she comes to understand that what Robin is doing in this gradual training on the cast of mind of some other person is precisely what they’re doing in the act of reading the novel “Bewilderment” — by living this act of active empathy for these two characters, they are undergoing their own kind of neurofeedback.
  • The more we understand about the complexities of living systems, of organisms and the evolution of organisms, the more capable it is to feel a kind of spiritual awe. And that certainly makes it easier to have reverence for the experiment beyond me and beyond my species. I don’t think those are incommensurable or incompatible ways of knowing the world. In fact, I think to invoke one last time that Buddhist precept of interbeing, I think there is a kind of interbeing between the desire, the true selfless desire to understand the world out there through presence, care, measurement, attention, reproduction of experiment and the desire to have a spiritual affinity and shared fate with the world out there. They’re really the same project.
  • richard powers: Well, sure. If we turn back to the new forestry again and researchers like Suzanne Simard who were showing the literal interconnectivity across species boundaries and the cooperation of resource sharing between different species in a forest, that is rigorous science, rigorous reproducible science. And it does participate in that central principle of practice, or collection of practices, which always requires the renunciation of personal wish and ego and prior belief in favor of empirical reproduction.
  • I’ve begun to see people beginning to build out of the humbling sciences a worldview that seems quite spiritual. And as you’re somebody who seems to me to have done that and it has changed your life, would you reflect on that a bit?
  • So much of the book is about the possibility of life beyond Earth. Tell me a bit about the role that’s playing. Why did you make the possibility of alien life in the way it might look and feel and evolve and act so central in a book about protecting and cherishing life here?
  • richard powers: I’m glad that we’re slipping this in at the end because yes, this framing of the book around this question of are we alone or does the universe want life is really important. Theo, Robin’s father, is an astrobiologist.
  • Imagine that everything happens just right so that every square inch of this place is colonized by new forms of experiments, new kinds of life. And the father trying to entertain his son with the story of this remarkable place in the sun just stopping him and saying, Dad, come on, that’s asking too much. Get real, that’s science fiction. That’s the vision that I had when I finished the book, an absolutely limitless sense of just how lucky we’ve had it here.
  • one thing I kept thinking about that didn’t make it into the final book but exists as a kind of parallel story in my own head is the father and son on some very distant planet in some very distant star, many light years from here, playing that same game. And the father saying, OK, now imagine a world that’s just the right size, and it has plate tectonics, and it has water, and it has a nearby moon to stabilize its rotation, and it has incredible security and safety from asteroids because of other large planets in the solar system.
  • they make this journey across the universe through all kinds of incubators, all kinds of petri dishes for life and the possibilities of life. And rather than answer the question — so where is everybody? — it keeps deferring the question, it keeps making that question more subtle and stranger
  • For the purposes of the book, Robin, who desperately believes in the sanctity of life beyond himself, begs his father for these nighttime, bedtime stories, and Theo gives him easy travel to other planets. Father and son going to a new planet based on the kinds of planets that Theo’s science is turning up and asking this question, what would life look like if it was able to get started here?
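The decoded-neurofeedback setup Powers describes a few highlights back is, at its core, a closed loop: record a target pattern of neural activity from one subject, then repeatedly compare a second subject's live activity to that target and show them a feedback cue until the two converge. The sketch below is only a toy abstraction of that loop, assuming made-up state vectors and a hypothetical read_brain_state() stand-in; it is not an interface to any real fMRI system.

```python
import random

def cosine_similarity(a, b):
    """How closely the live pattern matches the recorded target (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = lambda v: sum(x * x for x in v) ** 0.5
    return dot / (norm(a) * norm(b))

# Step 1: a stored "target" activity pattern recorded from the first subject (invented values).
target_state = [0.9, 0.1, 0.4, 0.7]

def read_brain_state():
    """Hypothetical stand-in for a real-time readout of the second subject's activity."""
    return [random.random() for _ in target_state]

# Step 2: the closed loop -- read the live state, score it against the target,
# and present that score as the feedback cue the subject tries to push upward.
for trial in range(10):
    current = read_brain_state()
    feedback = cosine_similarity(current, target_state)
    print(f"trial {trial:2d}: feedback cue = {feedback:.2f}")
    if feedback > 0.95:
        break  # the subject has learned to approximate the recorded state
```

In the real protocol the adjusting is done by the subject's own brain in response to the cue; here random vectors merely stand in for successive readouts.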
Javier E

TikTok Brain Explained: Why Some Kids Seem Hooked on Social Video Feeds - WSJ - 0 views

  • Remember the good old days when kids just watched YouTube all day? Now that they binge on 15-second TikToks, those YouTube clips seem like PBS documentaries.
  • Many parents tell me their kids can’t sit through feature-length films anymore because to them the movies feel painfully slow. Others have observed their kids struggling to focus on homework. And reading a book? Forget about it.
  • What is happening to kids’ brains?
  • “It is hard to look at increasing trends in media consumption of all types, media multitasking and rates of ADHD in young people and not conclude that there is a decrease in their attention span,
  • Emerging research suggests that watching short, fast-paced videos makes it harder for kids to sustain activities that don’t offer instant—and constant—gratification.
  • One of the few studies specifically examining TikTok-related effects on the brain focused on Douyin, the TikTok equivalent in China, made by the same Chinese parent company, ByteDance Ltd. It found that the personalized videos the app’s recommendation engine shows users activate the reward centers of the brain, as compared with the general-interest videos shown to new users.
  • Brain scans of Chinese college students showed that areas involved in addiction were highly activated in those who watched personalized videos.
  • It also found some people have trouble controlling when to stop watching.
  • attention. “If kids’ brains become accustomed to constant changes, the brain finds it difficult to adapt to a nondigital activity where things don’t move quite as fast,”
  • A TikTok spokeswoman said the company wants younger teens to develop positive digital habits early on, and that it recently made some changes aimed at curbing extensive app usage. For example, TikTok won’t allow users ages 13 to 15 to receive push notifications after 9 p.m. TikTok also periodically reminds users to take a break to go outside or grab a snack.
  • Kids have a hard time pulling away from videos on YouTube, too, and Google has made several changes to help limit its use, including turning off autoplay by default on accounts of people under 18.
  • When kids do things that require prolonged focus, such as reading or solving math problems, they’re using directed attention
  • This function starts in the prefrontal cortex, the part of the brain responsible for decision making and impulse control.
  • “Directed attention is the ability to inhibit distractions and sustain attention and to shift attention appropriately. It requires higher-order skills like planning and prioritizing,”
  • Kids generally have a harder time doing this—and putting down their videogame controllers—because the prefrontal cortex isn’t fully developed until age 25.
  • “We speculate that individuals with lower self-control ability have more difficulty shifting attention away from favorite video stimulation,
  • “In the short-form snackable world, you’re getting quick hit after quick hit, and as soon as it’s over, you have to make a choice,” said Mass General’s Dr. Marci, who wrote the new book “Rewired: Protecting Your Brain in the Digital Age.” The more developed the prefrontal cortex, the better the choices.
  • Dopamine is a neurotransmitter that gets released in the brain when it’s expecting a reward. A flood of dopamine reinforces cravings for something enjoyable, whether it’s a tasty meal, a drug or a funny TikTok video.
  • “TikTok is a dopamine machine,” said John Hutton, a pediatrician and director of the Reading & Literacy Discovery Center at Cincinnati Children’s Hospital. “If you want kids to pay attention, they need to practice paying attention.”
  • Researchers are just beginning to conduct long-term studies on digital media’s effects on kids’ brains. The National Institutes of Health is funding a study of nearly 12,000 adolescents as they grow into adulthood to examine the impact that many childhood experiences—from social media to smoking—have on cognitive development.
  • she predicts they will find that when brains repeatedly process rapid, rewarding content, their ability to process less-rapid, less-rewarding things “may change or be harmed.”
  • “It’s like we’ve made kids live in a candy store and then we tell them to ignore all that candy and eat a plate of vegetables,”
  • “We have an endless flow of immediate pleasures that’s unprecedented in human history.”
  • Parents and kids can take steps to boost attention, but it takes effort
  • Swap screen time for real time. Exercise and free play are among the best ways to build attention during childhood,
  • “Depriving kids of tech doesn’t work, but simultaneously reducing it and building up other things, like playing outside, does,”
  • Practice restraint.
  • “When you practice stopping, it strengthens those connections in the brain to allow you to stop again next time.”
  • Use tech’s own tools. TikTok has a screen-time management setting that allows users to cap their app usage.
  • Ensure good sleep. Teens are suffering from a sleep deficit.
Javier E

Quitters Never Win: The Costs of Leaving Social Media - Woodrow Hartzog and Evan Seling... - 2 views

  • Manjoo offers this security-centric path for folks who are anxious about the service being "one the most intrusive technologies ever built," and believe that "the very idea of making Facebook a more private place borders on the oxymoronic, a bit like expecting modesty at a strip club". Bottom line: stop tuning in and start dropping out if you suspect that the culture of oversharing, digital narcissism, and, above all, big-data-hungry, corporate profiteering will trump privacy settings.
  • Angwin plans on keeping a bare-bones profile. She'll maintain just enough presence to send private messages, review tagged photos, and be easy for readers to find. Others might try similar experiments, perhaps keeping friends, but reducing their communication to banal and innocuous expressions. But, would such disclosures be compelling or sincere enough to retain the technology's utility?
  • The other unattractive option is for social web users to willingly pay for connectivity with extreme publicity.
  • go this route if you believe privacy is dead, but find social networking too good to miss out on.
  • While we should be attuned to constraints and their consequences, there are at least four problems with conceptualizing the social media user's dilemma as a version of "if you can't stand the heat, get out of the kitchen".
  • The efficacy of abandoning social media can be questioned when others are free to share information about you on a platform long after you've left.
  • Second, while abandoning a single social technology might seem easy, this "love it or leave it" strategy -- which demands extreme caution and foresight from users and punishes them for their naivete -- isn't sustainable without great cost in the aggregate. If we look past the consequences of opting out of a specific service (like Facebook), we find a disconcerting and more far-reaching possibility: behavior that justifies a never-ending strategy of abandoning every social technology that threatens privacy -- a can being kicked down the road in perpetuity without us resolving the hard question of whether a satisfying balance between protection and publicity can be found online
  • if your current social network has no obligation to respect the obscurity of your information, what justifies believing other companies will continue to be trustworthy over time?
  • Sticking with the opt-out procedure turns digital life into a paranoid game of whack-a-mole where the goal is to stay ahead of the crushing mallet. Unfortunately, this path of perilously transferring risk from one medium to another is the direction we're headed if social media users can't make reasonable decisions based on the current context of obscurity, but instead are asked to assume all online social interaction can or will eventually lose its obscurity protection.
  • The fourth problem with the "leave if you're unhappy" ethos is that it is overly individualistic. If a critical mass participates in the "Opt-Out Revolution," what would happen to the struggling, the lonely, the curious, the caring, and the collaborative if the social web went dark?
  • Our point is that there is a middle ground between reclusion and widespread publicity, and the reduction of user options to quitting or coping, which are both problematic, need not be inevitable, especially when we can continue exploring ways to alleviate the user burden of retreat and the societal cost of a dark social web.
  • it is easy to presume that "even if you unfriend everybody on Facebook, and you never join Twitter, and you don't have a LinkedIn profile or an About.me page or much else in the way of online presence, you're still going to end up being mapped and charted and slotted in to your rightful place in the global social network that is life." But so long as it remains possible to create obscurity through privacy enhancing technology, effective regulation, contextually appropriate privacy settings, circumspect behavior, and a clear understanding of how our data can be accessed and processed, that fatalism isn't justified.
Javier E

In This Snapchat Campaign, Election News Is Big and Then It's Gone - The New York Times - 1 views

  • Every modern presidential election is at least in part defined by the cool new media breakthrough of its moment.
  • In 2000, there was email, and by golly was that a big change from the fax. The campaigns could get their messages in front of print and cable news reporters — who could still dominate the campaign narrative — at will,
  • Then 2008: Facebook made it that much easier for campaigns to reach millions of people directly,
  • The 2004 campaign was the year of the “Web log,” or blog, when mainstream reporters and campaigns officially began losing any control they may have had over political news
  • The question this year has been whether 2016 will be the “Snapchat election,
  • Snapchat represents a change to something else: the longevity of news, how durably it keeps in our brain cells and our servers.
  • Snapchat is recording the here and the now, playing for today. Tomorrow will bring something new that renders today obsolete. It’s a digital Tibetan sand painting made in the image of the millennial mind.
  • Snapchat executives say they set up the app this way because this is what their tens of millions of younger users want; it’s how they live.
  • They can’t possibly have enough bandwidth to process all the incoming information and still dwell on what already was, can they?
  • Experienced strategists and their candidates, who could always work through their election plans methodically — promoting their candidacies one foot in front of the other, adjusting here and there for the unexpected — suddenly found that they couldn’t operate the way they always did.
  • Marco Rubio’s campaign marched into the election season ready to fight the usual news-cycle-by-news-cycle skirmishes. It was surprised to learn that, lo and behold, “There was no news cycle — everything was one big fire hose,” Alex Conant, a senior Rubio strategist, told me. “News was constantly breaking and at the end of the day hardly anything mattered. Things would happen; 24 hours later, everyone was talking about something else.”
  • Then there was Jeb Bush, expecting to press ahead by presenting what he saw as leading-edge policy proposals that would set off a prolonged back-and-forth. When Mr. Bush rolled out a fairly sweeping plan to upend the college loan system, the poor guy thought this was going to become a big thing.
  • It drew only modest coverage and was quickly buried by the latest bit from Donald Trump.
  • In this “hit refresh” political culture, damaging news does not have to stick around for long, either. The next development, good or bad, replaces it almost immediately.
  • Mr. Miller pointed to a recent episode in which Mr. Trump said a protester at a rally had “ties to ISIS,” after that protester charged the stage. No such ties existed. “He says ‘ISIS is attacking me’; this was debunked in eight minutes by Twitter,” Mr. Miller said. “Cable talked about it for three hours and it went away.”
  • “Hillary Clinton said that she was under sniper fire in Bosnia” — she wasn’t — “and that has stuck with her for 20 years,”
  • Mr. Trump has mastered this era of short attention spans in politics by realizing that if you’re the one regularly feeding the stream, you can forever move past your latest trouble, and hasten the mass amnesia.
  • It was with this in mind that The Washington Post ran an editorial late last week reminding its readers of some of Mr. Trump’s more outlandish statements and policy positions
  • The Post urged its readers to “remember” more than two dozen items from Mr. Trump’s record, including that he promised “to round up 11 million undocumented immigrants and deport them,” and “lied about President Obama’s birth certificate.”
  • as the media habits of the young drive everybody else’s, I’m reminded of that old saw about those who forget history. Now, what was I saying?
Javier E

Why it's as hard to escape an echo chamber as it is to flee a cult | Aeon Essays - 0 views

  • there are two very different phenomena at play here, each of which subvert the flow of information in very distinct ways. Let’s call them echo chambers and epistemic bubbles. Both are social structures that systematically exclude sources of information. Both exaggerate their members’ confidence in their beliefs.
  • they work in entirely different ways, and they require very different modes of intervention
  • An epistemic bubble is when you don’t hear people from the other side. An echo chamber is what happens when you don’t trust people from the other side.
  • start with epistemic bubbles
  • That omission might be purposeful
  • But that omission can also be entirely inadvertent. Even if we’re not actively trying to avoid disagreement, our Facebook friends tend to share our views and interests
  • An ‘echo chamber’ is a social structure from which other relevant voices have been actively discredited. Where an epistemic bubble merely omits contrary views, an echo chamber brings its members to actively distrust outsiders.
  • an echo chamber is something like a cult. A cult isolates its members by actively alienating them from any outside sources. Those outside are actively labelled as malignant and untrustworthy.
  • In epistemic bubbles, other voices are not heard; in echo chambers, other voices are actively undermined.
  • The way to break an echo chamber is not to wave “the facts” in the faces of its members. It is to attack the echo chamber at its root and repair that broken trust.
  • Looking to others for corroboration is a basic method for checking whether one has reasoned well or badly
  • They have been in the limelight lately, most famously in Eli Pariser’s The Filter Bubble (2011) and Cass Sunstein’s #Republic: Divided Democracy in the Age of Social Media (2017).
  • The general gist: we get much of our news from Facebook feeds and similar sorts of social media. Our Facebook feed consists mostly of our friends and colleagues, the majority of whom share our own political and cultural views
  • various algorithms behind the scenes, such as those inside Google search, invisibly personalise our searches, making it more likely that we’ll see only what we want to see. These processes all impose filters on information.
  • Such filters aren’t necessarily bad. The world is overstuffed with information, and one can’t sort through it all by oneself: filters need to be outsourced.
  • That’s why we all depend on extended social networks to deliver us knowledge
  • any such informational network needs the right sort of broadness and variety to work
  • Each individual person in my network might be superbly reliable about her particular informational patch but, as an aggregate structure, my network lacks what Sanford Goldberg in his book Relying on Others (2010) calls ‘coverage-reliability’. It doesn’t deliver to me a sufficiently broad and representative coverage of all the relevant information.
  • Epistemic bubbles also threaten us with a second danger: excessive self-confidence.
  • An ‘epistemic bubble’ is an informational network from which relevant voices have been excluded by omission
  • Suppose that I believe that the Paleo diet is the greatest diet of all time. I assemble a Facebook group called ‘Great Health Facts!’ and fill it only with people who already believe that Paleo is the best diet. The fact that everybody in that group agrees with me about Paleo shouldn’t increase my confidence level one bit. They’re not mere copies – they actually might have reached their conclusions independently – but their agreement can be entirely explained by my method of selection.
  • Luckily, though, epistemic bubbles are easily shattered. We can pop an epistemic bubble simply by exposing its members to the information and arguments that they’ve missed.
  • echo chambers are a far more pernicious and robust phenomenon.
  • Jamieson and Cappella’s book is the first empirical study into how echo chambers function
  • echo chambers work by systematically alienating their members from all outside epistemic sources.
  • Their research centres on Rush Limbaugh, a wildly successful conservative firebrand in the United States, along with Fox News and related media
  • His constant attacks on the ‘mainstream media’ are attempts to discredit all other sources of knowledge. He systematically undermines the integrity of anybody who expresses any kind of contrary view.
  • outsiders are not simply mistaken – they are malicious, manipulative and actively working to destroy Limbaugh and his followers. The resulting worldview is one of deeply opposed force, an all-or-nothing war between good and evil
  • The result is a rather striking parallel to the techniques of emotional isolation typically practised in cult indoctrination
  • cult indoctrination involves new cult members being brought to distrust all non-cult members. This provides a social buffer against any attempts to extract the indoctrinated person from the cult.
  • The echo chamber doesn’t need any bad connectivity to function. Limbaugh’s followers have full access to outside sources of information
  • As Elijah Millgram argues in The Great Endarkenment (2015), modern knowledge depends on trusting long chains of experts. And no single person is in the position to check up on the reliability of every member of that chain
  • Their worldview can survive exposure to those outside voices because their belief system has prepared them for such intellectual onslaught.
  • exposure to contrary views could actually reinforce their views. Limbaugh might offer his followers a conspiracy theory: anybody who criticises him is doing it at the behest of a secret cabal of evil elites, which has already seized control of the mainstream media.
  • Perversely, exposure to outsiders with contrary views can thus increase echo-chamber members’ confidence in their insider sources, and hence their attachment to their worldview.
  • ‘evidential pre-emption’. What’s happening is a kind of intellectual judo, in which the power and enthusiasm of contrary voices are turned against those contrary voices through a carefully rigged internal structure of belief.
  • One might be tempted to think that the solution is just more intellectual autonomy. Echo chambers arise because we trust others too much, so the solution is to start thinking for ourselves.
  • that kind of radical intellectual autonomy is a pipe dream. If the philosophical study of knowledge has taught us anything in the past half-century, it is that we are irredeemably dependent on each other in almost every domain of knowledge
  • Limbaugh’s followers regularly read – but do not accept – mainstream and liberal news sources. They are isolated, not by selective exposure, but by changes in who they accept as authorities, experts and trusted sources.
  • we depend on a vastly complicated social structure of trust. We must trust each other, but, as the philosopher Annette Baier says, that trust makes us vulnerable. Echo chambers operate as a kind of social parasite on that vulnerability, taking advantage of our epistemic condition and social dependency.
  • I am quite confident that there are plenty of echo chambers on the political Left. More importantly, nothing about echo chambers restricts them to the arena of politics
  • The world of anti-vaccination is clearly an echo chamber, and it is one that crosses political lines. I’ve also encountered echo chambers on topics as broad as diet (Paleo!), exercise technique (CrossFit!), breastfeeding, some academic intellectual traditions, and many, many more
  • Here’s a basic check: does a community’s belief system actively undermine the trustworthiness of any outsiders who don’t subscribe to its central dogmas? Then it’s probably an echo chamber.
  • much of the recent analysis has lumped epistemic bubbles together with echo chambers into a single, unified phenomenon. But it is absolutely crucial to distinguish between the two.
  • Epistemic bubbles are rather ramshackle; they go up easily, and they collapse easily
  • Echo chambers are far more pernicious and far more robust. They can start to seem almost like living things. Their belief systems provide structural integrity, resilience and active responses to outside attacks
  • the two phenomena can also exist independently. And of the events we’re most worried about, it’s the echo-chamber effects that are really causing most of the trouble.
  • new data does, in fact, seem to show that people on Facebook actually do see posts from the other side, or that people often visit websites with opposite political affiliation.
  • their basis for evaluation – their background beliefs about whom to trust – is radically different. They are not irrational, but systematically misinformed about where to place their trust.
  • Many people have claimed that we have entered an era of ‘post-truth’.
  • Not only do some political figures seem to speak with a blatant disregard for the facts, but their supporters seem utterly unswayed by evidence. It seems, to some, that truth no longer matters.
  • This is an explanation in terms of total irrationality. To accept it, you must believe that a great number of people have lost all interest in evidence or investigation, and have fallen away from the ways of reason.
  • echo chambers offers a less damning and far more modest explanation. The apparent ‘post-truth’ attitude can be explained as the result of the manipulations of trust wrought by echo chambers.
  • We don’t have to attribute a complete disinterest in facts, evidence or reason to explain the post-truth attitude. We simply have to attribute to certain communities a vastly divergent set of trusted authorities.
  • An echo chamber doesn’t destroy their members’ interest in the truth; it merely manipulates whom they trust and changes whom they accept as trustworthy sources and institutions.
  • in many ways, echo-chamber members are following reasonable and rational procedures of enquiry. They’re engaging in critical reasoning. They’re questioning, they’re evaluating sources for themselves, they’re assessing different pathways to information. They are critically examining those who claim expertise and trustworthiness, using what they already know about the world
  • none of this weighs against the existence of echo chambers. We should not dismiss the threat of echo chambers based only on evidence about connectivity and exposure.
  • Notice how different what’s going on here is from, say, Orwellian doublespeak, a deliberately ambiguous, euphemism-filled language designed to hide the intent of the speaker.
  • echo chambers don’t trade in vague, ambiguous pseudo-speech. We should expect that echo chambers would deliver crisp, clear, unambiguous claims about who is trustworthy and who is not
  • clearly articulated conspiracy theories, and crisply worded accusations of an outside world rife with untrustworthiness and corruption.
  • Once an echo chamber starts to grip a person, its mechanisms will reinforce themselves.
  • In an epistemically healthy life, the variety of our informational sources will put an upper limit to how much we’re willing to trust any single person. Everybody’s fallible; a healthy informational network tends to discover people’s mistakes and point them out. This puts an upper ceiling on how much you can trust even your most beloved leader
  • Inside an echo chamber, that upper ceiling disappears.
  • Being caught in an echo chamber is not always the result of laziness or bad faith. Imagine, for instance, that somebody has been raised and educated entirely inside an echo chamber
  • when the child finally comes into contact with the larger world – say, as a teenager – the echo chamber’s worldview is firmly in place. That teenager will distrust all sources outside her echo chamber, and she will have gotten there by following normal procedures for trust and learning.
  • It certainly seems like our teenager is behaving reasonably. She could be going about her intellectual life in perfectly good faith. She might be intellectually voracious, seeking out new sources, investigating them, and evaluating them using what she already knows.
  • The worry is that she’s intellectually trapped. Her earnest attempts at intellectual investigation are led astray by her upbringing and the social structure in which she is embedded.
  • Echo chambers might function like addiction, under certain accounts. It might be irrational to become addicted, but all it takes is a momentary lapse – once you’re addicted, your internal landscape is sufficiently rearranged such that it’s rational to continue with your addiction
  • Similarly, all it takes to enter an echo chamber is a momentary lapse of intellectual vigilance. Once you’re in, the echo chamber’s belief systems function as a trap, making future acts of intellectual vigilance only reinforce the echo chamber’s worldview.
  • There is at least one possible escape route, however. Notice that the logic of the echo chamber depends on the order in which we encounter the evidence. An echo chamber can bring our teenager to discredit outside beliefs precisely because she encountered the echo chamber’s claims first. Imagine a counterpart to our teenager who was raised outside of the echo chamber and exposed to a wide range of beliefs. Our free-range counterpart would, when she encounters that same echo chamber, likely see its many flaws
  • Those caught in an echo chamber are giving far too much weight to the evidence they encounter first, just because it’s first. Rationally, they should reconsider their beliefs without that arbitrary preference. But how does one enforce such informational a-historicity?
  • The escape route is a modified version of René Descartes’s infamous method.
  • Meditations on First Philosophy (1641). He had come to realise that many of the beliefs he had acquired in his early life were false. But early beliefs lead to all sorts of other beliefs, and any early falsehoods he’d accepted had surely infected the rest of his belief system.
  • The only solution, thought Descartes, was to throw all his beliefs away and start over again from scratch.
  • He could start over, trusting nothing and no one except those things that he could be entirely certain of, and stamping out those sneaky falsehoods once and for all. Let’s call this the Cartesian epistemic reboot.
  • Notice how close Descartes’s problem is to our hapless teenager’s, and how useful the solution might be. Our teenager, like Descartes, has problematic beliefs acquired in early childhood. These beliefs have infected outwards, infesting that teenager’s whole belief system. Our teenager, too, needs to throw everything away, and start over again.
  • Let’s call the modernised version of Descartes’s methodology the social-epistemic reboot.
  • when she starts from scratch, we won’t demand that she trust only what she’s absolutely certain of, nor will we demand that she go it alone
  • For the social reboot, she can proceed, after throwing everything away, in an utterly mundane way – trusting her senses, trusting others. But she must begin afresh socially – she must reconsider all possible sources of information with a presumptively equanimous eye. She must take the posture of a cognitive newborn, open and equally trusting to all outside sources
  • we’re not asking people to change their basic methods for learning about the world. They are permitted to trust, and trust freely. But after the social reboot, that trust will not be narrowly confined and deeply conditioned by the particular people they happened to be raised by.
  • Such a profound deep-cleanse of one’s whole belief system seems to be what’s actually required to escape. Look at the many stories of people leaving cults and echo chambers
  • Take, for example, the story of Derek Black in Florida – raised by a neo-Nazi father, and groomed from childhood to be a neo-Nazi leader. Black left the movement by, basically, performing a social reboot. He completely abandoned everything he’d believed in, and spent years building a new belief system from scratch. He immersed himself broadly and open-mindedly in everything he’d missed – pop culture, Arabic literature, the mainstream media, rap – all with an overall attitude of generosity and trust.
  • It was the project of years and a major act of self-reconstruction, but those extraordinary lengths might just be what’s actually required to undo the effects of an echo-chambered upbringing.
  • we need to attack the root, the systems of discredit themselves, and restore trust in some outside voices.
  • Stories of actual escapes from echo chambers often turn on particular encounters – moments when the echo-chambered individual starts to trust somebody on the outside.
  • Black’s is a case in point. By high school, he was already something of a star on neo-Nazi media, with his own radio talk-show. He went on to college, openly neo-Nazi, and was shunned by almost every other student in his community college. But then Matthew Stevenson, a Jewish fellow undergraduate, started inviting Black to Stevenson’s Shabbat dinners. In Black’s telling, Stevenson was unfailingly kind, open and generous, and slowly earned Black’s trust. This was the seed, says Black, that led to a massive intellectual upheaval – a slow-dawning realisation of the depths to which he had been misled
  • Similarly, accounts of people leaving echo-chambered homophobia rarely involve them encountering some institutionally reported fact. Rather, they tend to revolve around personal encounters – a child, a family member, a close friend coming out.
  • These encounters matter because a personal connection comes with a substantial store of trust.
  • We don’t simply trust people as educated experts in a field – we rely on their goodwill. And this is why trust, rather than mere reliability, is the key concept
  • goodwill is a general feature of a person’s character. If I demonstrate goodwill in action, then you have some reason to think that I also have goodwill in matters of thought and knowledge.
  • If one can demonstrate goodwill to an echo-chambered member – as Stevenson did with Black – then perhaps one can start to pierce that echo chamber.
  • the path I’m describing is a winding, narrow and fragile one. There is no guarantee that such trust can be established, and no clear path to its being established systematically.
  • what we’ve found here isn’t an escape route at all. It depends on the intervention of another. This path is not even one an echo-chamber member can trigger on her own; it is only a whisper-thin hope for rescue from the outside.
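The ‘Great Health Facts!’ example above turns on a selection effect, and a tiny simulation makes it concrete. This sketch is not from the article; the population size, agreement rates and group size are arbitrary assumptions chosen only for illustration. It shows that a group assembled by admitting only people who already agree ends up unanimous whether the underlying claim is popular or fringe in the wider population, so the group’s unanimity carries no evidential weight.

    import random

    random.seed(0)

    def population(share_who_agree, size=100_000):
        # Each person independently agrees with the claim
        # with probability `share_who_agree`.
        return [random.random() < share_who_agree for _ in range(size)]

    def assemble_group(people, group_size=500):
        # Build the 'Great Health Facts!' group by admitting only believers.
        believers = [person for person in people if person]
        return believers[:group_size]

    for share in (0.9, 0.6, 0.1):
        group = assemble_group(population(share))
        agreement = sum(group) / len(group)
        print(f"population agreement {share:.0%} -> group agreement {agreement:.0%}")

    # The group reads as 100% agreed in every case. Because the admission rule
    # guarantees unanimity, observing that unanimity tells you nothing about
    # how widely shared (or how well supported) the belief actually is.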
Javier E

'Meta-Content' Is Taking Over the Internet - The Atlantic - 0 views

  • Jenn, however, has complicated things by adding an unexpected topic to her repertoire: the dangers of social media. She recently spoke about disengaging from it for her well-being; she also posted an Instagram Story about the risks of ChatGPT
  • and, in none other than a YouTube video, recommended Neil Postman’s Amusing Ourselves to Death, a seminal piece of media critique from 1985 that denounces television’s reduction of life to entertainment.
  • (Her other book recommendations included Stolen Focus, by Johann Hari, and Recapture the Rapture, by Jamie Wheal.)
  • Social-media platforms are “preying on your insecurities; they’re preying on your temptations,” Jenn explained to me in an interview that shifted our parasocial connection, at least for an hour, to a mere relationship. “And, you know, I do play a role in this.” Jenn makes money through aspirational advertising, after all—a familiar part of any influencer’s job.
  • She’s pro–parasocial relationships, she explains to the camera, but only if we remain aware that we’re in one. “This relationship does not replace existing friendships, existing relationships,” she emphasizes. “This is all supplementary. Like, it should be in addition to your life, not a replacement.” I sat there watching her talk about parasocial relationships while absorbing the irony of being in one with her.
  • The open acknowledgment of social media’s inner workings, with content creators exposing the foundations of their content within the content itself, is what Alice Marwick, an associate communications professor at the University of North Carolina at Chapel Hill, described to me as “meta-content.”
  • Meta-content can be overt, such as the vlogger Casey Neistat wondering, in a vlog, if vlogging your life prevents you from being fully present in it;
  • But meta-content can also be subtle: a vlogger walking across the frame before running back to get the camera. Or influencers vlogging themselves editing the very video you’re watching, in a moment of space-time distortion.
  • Viewers don’t seem to care. We keep watching, fully accepting the performance. Perhaps that’s because the rise of meta-content promises a way to grasp authenticity by acknowledging artifice; especially in a moment when artifice is easier to create than ever before, audiences want to know what’s “real” and what isn’t.
  • “The idea of a space where you can trust no sources, there’s no place to sort of land, everything is put into question, is a very unsettling, unsatisfying way to live.
  • So we continue to search for, as Murray observes, the “agreed-upon things, our basic understandings of what’s real, what’s true.” But when the content we watch becomes self-aware and even self-critical, it raises the question of whether we can truly escape the machinations of social media. Maybe when we stare directly into the abyss, we begin to enjoy its company.
  • “The difference between BeReal and the social-media giants isn’t the former’s relationship to truth but the size and scale of its deceptions.” BeReal users still angle their camera and wait to take their daily photo at an aesthetic time of day. The snapshots merely remind us how impossible it is to stop performing online.
  • Jenn’s concern over the future of the internet stems, in part, from motherhood. She recently had a son, Lennon (whose first birthday party I watched on YouTube), and worries about the digital world he’s going to inherit.
  • Back in the age of MySpace, she had her own internet friends and would sneak out to parking lots at 1 a.m. to meet them in real life: “I think this was when technology was really used as a tool to connect us.” Now, she explained, it’s beginning to ensnare us. Posting content online is no longer a means to an end so much as the end itself.
  • We used to view influencers’ lives as aspirational, a reality that we could reach toward. Now both sides acknowledge that they’re part of a perfect product that the viewer understands is unattainable and the influencer acknowledges is not fully real.
  • “I forgot to say this to her in the interview, but I truly think that my videos are less about me and more of a reflection of where you are currently … You are kind of reflecting on your own life and seeing what resonates [with] you, and you’re discarding what doesn’t. And I think that’s what’s beautiful about it.”
  • meta-content is fundamentally a compromise. Recognizing the delusion of the internet doesn’t alter our course within it so much as remind us how trapped we truly are—and how we wouldn’t have it any other way.
Javier E

The Age of 'Infopolitics' - NYTimes.com - 0 views

  • we need a new way of thinking about our informational milieu. What we need is a concept of infopolitics that would help us understand the increasingly dense ties between politics and information
  • Infopolitics encompasses not only traditional state surveillance and data surveillance, but also “data analytics” (the techniques that enable marketers at companies like Target to detect, for instance, if you are pregnant), digital rights movements (promoted by organizations like the Electronic Frontier Foundation), online-only crypto-currencies (like Bitcoin or Litecoin), algorithmic finance (like automated micro-trading) and digital property disputes (from peer-to-peer file sharing to property claims in the virtual world of Second Life)
  • Surveying this iceberg is crucial because atop it sits a new kind of person: the informational person. Politically and culturally, we are increasingly defined through an array of information architectures: highly designed environments of data, like our social media profiles, into which we often have to squeeze ourselves
  • We have become what the privacy theorist Daniel Solove calls “digital persons.” As such we are subject to infopolitics (or what the philosopher Grégoire Chamayou calls “datapower,” the political theorist Davide Panagia “datapolitik” and the pioneering thinker Donna Haraway “informatics of domination”).
  • Once fingerprints, biometrics, birth certificates and standardized names were operational, it became possible to implement an international passport system, a social security number and all other manner of paperwork that tells us who someone is. When all that paper ultimately went digital, the reams of data about us became radically more assessable and subject to manipulation,
  • We like to think of ourselves as somehow apart from all this information. We are real — the information is merely about us.
  • But what is it that is real? What would be left of you if someone took away all your numbers, cards, accounts, dossiers and other informational prostheses? Information is not just about you — it also constitutes who you are.
  • We understandably do not want to see ourselves as bits and bytes. But unless we begin conceptualizing ourselves in this way, we leave it to others to do it for us
  • agencies and corporations will continue producing new visions of you and me, and they will do so without our input if we remain stubbornly attached to antiquated conceptions of selfhood that keep us from admitting how informational we already are.
  • What should we do about our Internet and phone patterns’ being fastidiously harvested and stored away in remote databanks where they await inspection by future algorithms developed at the National Security Agency, Facebook, credit reporting firms like Experian and other new institutions of information and control that will come into existence in future decades?
  • What bits of the informational you will fall under scrutiny? The political you? The sexual you? What next-generation McCarthyisms await your informational self? And will those excesses of oversight be found in some Senate subcommittee against which we democratic citizens might hope to rise up in revolt — or will they lurk among algorithmic automatons that silently seal our fates in digital filing systems?
  • Despite their decidedly different political sensibilities, what links together the likes of Senator Wyden and the international hacker network known as Anonymous is that they respect the severity of what is at stake in our information.
  • information is a site for the call of justice today, alongside more quintessential battlefields like liberty of thought and equality of opportunity.
  • we lack the intellectual framework to grasp the new kinds of political injustices characteristic of today’s information society.
  • though nearly all of us have a vague sense that something is wrong with the new regimes of data surveillance, it is difficult for us to specify exactly what is happening and why it raises serious concern
Javier E

New Foils for the Right: Google and Facebook - The New York Times - 0 views

  • In a sign of escalation, Peter Schweizer, a right-wing journalist known for his investigations into Hillary Clinton, plans to release a new film focusing on technology companies and their role in filtering the news.
  • The documentary, which has not been previously reported, dovetails with concerns raised in recent weeks by right-wing groups about censorship on digital media — a new front in a rapidly evolving culture war.
  • The critique from conservatives, in contrast, casts the big tech companies as censorious and oppressive, all too eager to stifle right-wing content in an effort to mollify liberal critics.
  • Big Tech is easily associated with West Coast liberalism and Democratic politics, making it a fertile target for the right. And operational opacity at Facebook, Google and Twitter, which are reluctant to reveal details about their algorithms and internal policies, can leave them vulnerable, too.
  • “There’s not even a real basis to establish objective research about what’s happening on Facebook, because it’s closed.”
  • And former President Barack Obama said at an off-the-record conference at the Massachusetts Institute of Technology last month that he worried Americans were living in “entirely different realities” and that large tech companies like Facebook were “not just an invisible platform, they’re shaping our culture in powerful ways.” The contents of the speech were published by Reason magazine.
  • “There are political activists in all of these companies that want to actively push a liberal agenda,” he said. “Why does it matter? Because these companies are so ubiquitous and powerful that they are controlling all the means of mass communication.”
  • He is also the president of the Government Accountability Institute, a conservative nonprofit organization. He and Mr. Bannon founded it with funding from the family of Robert Mercer, the billionaire hedge fund manager and donor to Donald J. Trump’s presidential campaign.
  • Jeffrey A. Zucker, the president of CNN, derided Google and Facebook as “monopolies” and called for regulators to step in during a speech in Spain last month, saying the tech hegemony is “the biggest issue facing the growth of journalism in the years ahead.”
  • The panelists accused social media platforms of delisting their videos or stripping them of advertising. Such charges have long been staples of far-right online discourse, especially among YouTubers, but Mr. Schweizer’s project is poised to bring such arguments to a new — and potentially larger — audience.
  • The Facebook adjustment has affected virtually every media organization that is partly dependent on the platform for audiences, but it appears to have hit some harder than others. They include right-wing sites like Gateway Pundit and the millennial-focused Independent Journal Review, which was forced to lay off staff members last month.
  • The social news giant BuzzFeed recently bought ads on Facebook with the message, “Facebook is taking the news out of your News Feed, but we’ve got you covered,” directing users to download its app. Away from the political scrum, the viral lifestyle site LittleThings, once a top publisher on the platform, announced last week that it would cease operations, blaming “a full-on catastrophic update” to Facebook’s revised algorithms.
Javier E

Facebook Has 50 Minutes of Your Time Each Day. It Wants More. - The New York Times - 0 views

  • Fifty minutes. That’s the average amount of time, the company said, that users spend each day on its Facebook, Instagram and Messenger platforms
  • there are only 24 hours in a day, and the average person sleeps for 8.8 of them. That leaves 15.2 waking hours, so roughly one-eighteenth (about 5.5 percent) of the average user’s waking time is spent on Facebook.
  • That’s more than any other leisure activity surveyed by the Bureau of Labor Statistics, with the exception of watching television programs and movies (an average per day of 2.8 hours)
  • It’s more time than people spend reading (19 minutes); participating in sports or exercise (17 minutes); or social events (four minutes). It’s almost as much time as people spend eating and drinking (1.07 hours).
  • the average time people spend on Facebook has gone up — from around 40 minutes in 2014 — even as the number of monthly active users has surged. And that’s just the average. Some users must be spending many hours a day on the site,
  • time has become the holy grail of digital media.
  • Time is the best measure of engagement, and engagement correlates with advertising effectiveness. Time also increases the supply of impressions that Facebook can sell, which brings in more revenue (a 52 percent increase last quarter to $5.4 billion).
  • And time enables Facebook to learn more about its users — their habits and interests — and thus better target its ads. The result is a powerful network effect that competitors will be hard pressed to match.
  • the only one that comes close is Alphabet’s YouTube, where users spent an average of 17 minutes a day on the site. That’s less than half the 35 minutes a day users spent on Facebook
  • ComScore reported that television viewing (both live and recorded) dropped 2 percent last year, and it said younger viewers in particular are abandoning traditional live television. People ages 18-34 spent just 47 percent of their viewing time on television screens, and 40 percent on mobile devices.
  • People spending the most time on Facebook also tend to fall into the prized 18-to-34 demographic sought by advertisers.
  • “You hear a narrative that young people are fleeing Facebook. The data show that’s just not true. Younger users have a wider appetite for social media, and they spend a lot of time on multiple networks. But they spend more time on Facebook by a wide margin.”
  • What aren’t Facebook users doing during the 50 minutes they spend there? Is it possibly interfering with work (and productivity), or, in the case of young people, studying and reading?
  • While the Bureau of Labor Statistics surveys nearly every conceivable time-occupying activity (even fencing and spelunking), it doesn’t specifically tally the time spent on social media, both because the activity may have multiple purposes — both work and leisure — and because people often do it at the same time they are ostensibly engaged in other activities
  • The closest category would be “computer use for leisure,” which has grown from eight minutes in 2006, when the bureau began collecting the data, to 14 minutes in 2014, the most recent survey. Or perhaps it would be “socializing and communicating with others,” which slipped from 40 minutes to 38 minutes.
  • But time spent on most leisure activities hasn’t changed much in those eight years of the bureau’s surveys. Time spent reading dropped from an average of 22 minutes to 19 minutes. Watching television and movies increased from 2.57 hours to 2.8. Average time spent working declined from 3.4 hours to 3.25. (Those hours seem low because much of the population, which includes both young people and the elderly, does not work.)
  • The bureau’s numbers, since they cover the entire population, may be too broad to capture important shifts among important demographic groups
  • Users spent an average of nine minutes on all of Yahoo’s sites, two minutes on LinkedIn and just one minute on Twitter
  • Among those 55 and older, 70 percent of their viewing time was on television, according to comScore. So among young people, much social media time may be coming at the expense of traditional television.
  • comScore’s data suggests that people are spending on average just six to seven minutes a day using social media on their work computers. “I don’t think Facebook is displacing other activity,” he said. “People use it during downtime during the course of their day, in the elevator, or while commuting, or waiting.
  • Facebook, naturally, is busy cooking up ways to get us to spend even more time on the platform
  • A crucial initiative is improving its News Feed, tailoring it more precisely to the needs and interests of its users, based on how long people spend reading particular posts. For people who demonstrate a preference for video, more video will appear near the top of their news feed. The more time people spend on Facebook, the more data they will generate about themselves, and the better the company will get at the task.
jlessner

Why Facebook's News Experiment Matters to Readers - NYTimes.com - 0 views

  • Facebook’s new plan to host news publications’ stories directly is not only about page views, advertising revenue or the number of seconds it takes for an article to load. It is about who owns the relationship with readers.
  • It’s why Google, a search engine, started a social network and why Facebook, a social network, started a search engine. It’s why Amazon, a shopping site, made a phone and why Apple, a phone maker, got into shopping.
  • Facebook’s experiment, called instant articles, is small to start — just a few articles from nine media companies, including The New York Times. But it signals a major shift in the relationship between publications and their readers. If you want to read the news, Facebook is saying, come to Facebook, not to NBC News or The Atlantic or The Times — and when you come, don’t leave. (For now, these articles can be viewed on an iPhone running the Facebook app.)
  • The front page of a newspaper and the cover of a magazine lost their dominance long ago.
  • Facebook executives have insisted that they intend to exert no editorial control because they leave the makeup of the news feed to the algorithm. But an algorithm is not autonomous. It is written by humans and tweaked all the time.
  • “In digital, every story becomes unbundled from each other, so if you’re not thinking of each story as living on its own, it’s tying yourself back to an analog era,” Mr. Kim said.
  • But news reports, like albums before them, have not been created that way. One of the services that editors bring to readers has been to use their news judgment, considering a huge range of factors, when they decide how articles fit together and where they show up. The news judgment of The New York Times is distinct from that of The New York Post, and for generations readers appreciated that distinction.
  • That raises some journalistic questions. The news feed algorithm works, in part, by showing people more of what they have liked in the past. Some studies have suggested that means they might not see as wide a variety of news or points of view, though others, including one by Facebook researchers, have found they still do.
  • Tech companies, Facebook included, are notoriously fickle with their algorithms. Publications became so dependent on Facebook in the first place because of a change in its algorithm that sent more traffic their way. Later, another change demoted articles from sites that Facebook deemed to run click-bait headlines. Then last month, Facebook decided to prioritize some posts from friends over those from publications.
anonymous

VHS Tapes Are Worth Money - The New York Times - 0 views

  • Who Is Still Buying VHS Tapes?
  • Despite the rise of streaming, there is still a vast library of moving images that are categorically unavailable anywhere else. Also a big nostalgia factor.
  • The last VCR, according to Dave Rodriguez, 33, a digital repository librarian at Florida State University in Tallahassee, Fla., was produced in 2016
  • But the VHS tape itself may be immortal.
  • Today, a robust marketplace exists, both virtually and in real life, for this ephemera.
  • “Hold steady. Price seems fair. It is a Classic.”
  • Driving the passionate collection of this form of media is the belief that VHS offers something that other types of media cannot.
  • “The general perception that people can essentially order whatever movie they want from home is flat-out wrong,”
  • “promised as a giant video store on the internet, where a customer was only one click away from the exact film they were looking for.”
  • “Anything that you can think of is on VHS tape, because, you’ve got to think, it was a revolutionary piece of the media,”
  • “It was a way for everyone to capture something and then put it out there.”
  • preservation
  • “just so much culture packed into VHS,”
  • a movie studio, an independent filmmaker, a parent shooting their kid’s first steps, etc.
  • finds the medium inspirational
  • “some weird, obscure movie on VHS I would have seen at my friend’s house, late at night, after his parents were asleep.
  • “The quality feels raw but warm and full of flavor,” he said of VHS.
  • views them as a byway connecting her with the past
  • from reels depicting family gatherings to movies that just never made the jump to DVD
  • “I think we were the last to grow up without the internet, cellphones or social media,” and clinging to the “old analog ways,” she said, feels “very natural.”
  • “I think that people are nostalgic for the aura of the VHS era,”
  • “So many cultural touch points are rooted there,” Mr. Harris said of the 1980s.
  • It was, he believes, “a time when, in some ways, Americans knew who we were.”
  • Not only could film connoisseurs peruse the aisles of video stores on Friday nights, but they could also compose home movies, from the artful to the inane
  • “In its heyday, it was mass-produced and widely adopted,”
  • She inherited some of them from her grandmother, a children’s librarian with a vast collection.
  • Historical Journal of Film, Radio and Television
  • the first technology that allowed mass, large-scale home media access to films.”
  • Mr. Arrow said that home videos captured on VHS, or taped television programs that contain old commercials and snippets from the news, are particularly insightful in diving into cultural history.
  • “There’ll be a news break, and you’ll see, like: Oh my god, O.J.’s still in the Bronco, and it’s on the news, and then it’ll cut back to ‘Mission Impossible’ or something.”
  • Marginalized communities, Mr. Harris said, who were not well represented in media in the 1980s, benefited from VHS technology, which allowed them to create an archival system that now brings to life people and communities that were otherwise absent from the screen.
  • The nature of VHS, Mr. Harris said, made self-documentation “readily available,
  • people who lacked representation could “begin to build a library, an archive, to affirm their existence and that of their community.”
  • VHS enthusiasts agree that these tapes occupy an irreplaceable place in culture.
  • “It’s like a time capsule,”
  • “The medium is like no other.”
Javier E

If Twitter is a Work Necessity - NYTimes.com - 0 views

  • For midcareer executives, particularly in the media and related industries, knowing how to use Twitter, update your timeline on Facebook, pin on Pinterest, check in on Foursquare and upload images on Instagram are among the digital skills that some employers expect people to have to land a job or to flourish in a current role.
  • digital literacy, including understanding social networking, is now a required skill. “They are essential skills that are needed to operate in the world and in the workplace,” she said. “And people will either need to learn through formal training or through their networks or they will feel increasingly left out.”
  • “If you don’t have a LinkedIn or Facebook account, then employers often don’t have a way to find out about you,” she said.
  • “We have to think about social media in a new strategic way,” he said. “It is no longer something that we can ignore. It is not a place to just wish your friends happy birthday. It is a place of business. It is a place where your career will be enhanced or degraded, depending on your use of these tools and services.”
Javier E

Why Silicon Valley can't fix itself | News | The Guardian - 1 views

  • After decades of rarely apologising for anything, Silicon Valley suddenly seems to be apologising for everything. They are sorry about the trolls. They are sorry about the bots. They are sorry about the fake news and the Russians, and the cartoons that are terrifying your kids on YouTube. But they are especially sorry about our brains.
  • Sean Parker, the former president of Facebook – who was played by Justin Timberlake in The Social Network – has publicly lamented the “unintended consequences” of the platform he helped create: “God only knows what it’s doing to our children’s brains.”
  • Parker, Rosenstein and the other insiders now talking about the harms of smartphones and social media belong to an informal yet influential current of tech critics emerging within Silicon Valley. You could call them the “tech humanists”. Amid rising public concern about the power of the industry, they argue that the primary problem with its products is that they threaten our health and our humanity.
  • It is clear that these products are designed to be maximally addictive, in order to harvest as much of our attention as they can. Tech humanists say this business model is both unhealthy and inhumane – that it damages our psychological well-being and conditions us to behave in ways that diminish our humanity
  • The main solution that they propose is better design. By redesigning technology to be less addictive and less manipulative, they believe we can make it healthier – we can realign technology with our humanity and build products that don’t “hijack” our minds.
  • its most prominent spokesman is executive director Tristan Harris, a former “design ethicist” at Google who has been hailed by the Atlantic magazine as “the closest thing Silicon Valley has to a conscience”. Harris has spent years trying to persuade the industry of the dangers of tech addiction.
  • In February, Pierre Omidyar, the billionaire founder of eBay, launched a related initiative: the Tech and Society Solutions Lab, which aims to “maximise the tech industry’s contributions to a healthy society”.
  • the tech humanists are making a bid to become tech’s loyal opposition. They are using their insider credentials to promote a particular diagnosis of where tech went wrong and of how to get it back on track
  • The real reason tech humanism matters is because some of the most powerful people in the industry are starting to speak its idiom. Snap CEO Evan Spiegel has warned about social media’s role in encouraging “mindless scrambles for friends or unworthy distractions”,
  • In short, the effort to humanise computing produced the very situation that the tech humanists now consider dehumanising: a wilderness of screens where digital devices chase every last instant of our attention.
  • After years of ignoring their critics, industry leaders are finally acknowledging that problems exist. Tech humanists deserve credit for drawing attention to one of those problems – the manipulative design decisions made by Silicon Valley.
  • these decisions are only symptoms of a larger issue: the fact that the digital infrastructures that increasingly shape our personal, social and civic lives are owned and controlled by a few billionaires
  • Because it ignores the question of power, the tech-humanist diagnosis is incomplete – and could even help the industry evade meaningful reform
  • Taken up by leaders such as Zuckerberg, tech humanism is likely to result in only superficial changes
  • they will not address the origin of that anger. If anything, they will make Silicon Valley even more powerful.
  • To the litany of problems caused by “technology that extracts attention and erodes society”, the text asserts that “humane design is the solution”. Drawing on the rhetoric of the “design thinking” philosophy that has long suffused Silicon Valley, the website explains that humane design “starts by understanding our most vulnerable human instincts so we can design compassionately”
  • this language is not foreign to Silicon Valley. On the contrary, “humanising” technology has long been its central ambition and the source of its power. It was precisely by developing a “humanised” form of computing that entrepreneurs such as Steve Jobs brought computing into millions of users’ everyday lives
  • Facebook had a new priority: maximising “time well spent” on the platform, rather than total time spent. By “time well spent”, Zuckerberg means time spent interacting with “friends” rather than businesses, brands or media sources. He said the News Feed algorithm was already prioritising these “more meaningful” activities.
  • Tech humanists say they want to align humanity and technology. But this project is based on a deep misunderstanding of the relationship between humanity and technology: namely, the fantasy that these two entities could ever exist in separation.
  • They believe we can use better design to make technology serve human nature rather than exploit and corrupt it. But this idea is drawn from the same tradition that created the world that tech humanists believe is distracting and damaging us.
  • The story of our species began when we began to make tools
  • All of which is to say: humanity and technology are not only entangled, they constantly change together.
  • This is not just a metaphor. Recent research suggests that the human hand evolved to manipulate the stone tools that our ancestors used
  • The ways our bodies and brains change in conjunction with the tools we make have long inspired anxieties that “we” are losing some essential qualities
  • Yet as we lose certain capacities, we gain new ones.
  • The nature of human nature is that it changes. It can not, therefore, serve as a stable basis for evaluating the impact of technology
  • Yet the assumption that it doesn’t change serves a useful purpose. Treating human nature as something static, pure and essential elevates the speaker into a position of power. Claiming to tell us who we are, they tell us how we should be.
  • Messaging, for instance, is considered the strongest signal. It’s reasonable to assume that you’re closer to somebody you exchange messages with than somebody whose post you once liked.
  • Harris and his fellow tech humanists also frequently invoke the language of public health. The Center for Humane Technology’s Roger McNamee has gone so far as to call public health “the root of the whole thing”, and Harris has compared using Snapchat to smoking cigarettes
  • The public-health framing casts the tech humanists in a paternalistic role. Resolving a public health crisis requires public health expertise. It also precludes the possibility of democratic debate. You don’t put the question of how to treat a disease up for a vote – you call a doctor.
  • They also remain confined to the personal level, aiming to redesign how the individual user interacts with technology rather than tackling the industry’s structural failures. Tech humanism fails to address the root cause of the tech backlash: the fact that a small handful of corporations own our digital lives and strip-mine them for profit.
  • This is a fundamentally political and collective issue. But by framing the problem in terms of health and humanity, and the solution in terms of design, the tech humanists personalise and depoliticise it.
  • Far from challenging Silicon Valley, tech humanism offers Silicon Valley a useful way to pacify public concerns without surrendering any of its enormous wealth and power.
  • these principles could make Facebook even more profitable and powerful, by opening up new business opportunities. That seems to be exactly what Facebook has planned.
  • reported that total time spent on the platform had dropped by around 5%, or about 50m hours per day. But, Zuckerberg said, this was by design: in particular, it was in response to tweaks to the News Feed that prioritised “meaningful” interactions with “friends” rather than consuming “public content” like video and news. This would ensure that “Facebook isn’t just fun, but also good for people’s well-being”
  • Zuckerberg said he expected those changes would continue to decrease total time spent – but “the time you do spend on Facebook will be more valuable”. This may describe what users find valuable – but it also refers to what Facebook finds valuable
  • not all data is created equal. One of the most valuable sources of data to Facebook is used to inform a metric called “coefficient”. This measures the strength of a connection between two users – Zuckerberg once called it “an index for each relationship”
  • Facebook records every interaction you have with another user – from liking a friend’s post or viewing their profile, to sending them a message. These activities provide Facebook with a sense of how close you are to another person, and different activities are weighted differently.
  • Holding humanity and technology separate clears the way for a small group of humans to determine the proper alignment between them
  • Why is coefficient so valuable? Because Facebook uses it to create a Facebook they think you will like: it guides algorithmic decisions about what content you see and the order in which you see it. It also helps improve ad targeting, by showing you ads for things liked by friends with whom you often interact (a rough sketch of such a weighted score follows after this list of highlights)
  • emphasising time well spent means creating a Facebook that prioritises data-rich personal interactions that Facebook can use to make a more engaging platform.
  • “time well spent” means Facebook can monetise more efficiently. It can prioritise the intensity of data extraction over its extensiveness. This is a wise business move, disguised as a concession to critics
  • industrialists had to find ways to make the time of the worker more valuable – to extract more money from each moment rather than adding more moments. They did this by making industrial production more efficient: developing new technologies and techniques that squeezed more value out of the worker and stretched that value further than ever before.
  • there is another way of thinking about how to live with technology – one that is both truer to the history of our species and useful for building a more democratic future. This tradition does not address “humanity” in the abstract, but as distinct human beings, whose capacities are shaped by the tools they use.
  • It sees us as hybrids of animal and machine – as “cyborgs”, to quote the biologist and philosopher of science Donna Haraway.
  • The cyborg way of thinking, by contrast, tells us that our species is essentially technological. We change as we change our tools, and our tools change us. But even though our continuous co-evolution with our machines is inevitable, the way it unfolds is not. Rather, it is determined by who owns and runs those machines. It is a question of power
  • The various scandals that have stoked the tech backlash all share a single source. Surveillance, fake news and the miserable working conditions in Amazon’s warehouses are profitable. If they were not, they would not exist. They are symptoms of a profound democratic deficit inflicted by a system that prioritises the wealth of the few over the needs and desires of the many.
  • If being technological is a feature of being human, then the power to shape how we live with technology should be a fundamental human right
  • The decisions that most affect our technological lives are far too important to be left to Mark Zuckerberg, rich investors or a handful of “humane designers”. They should be made by everyone, together.
  • Rather than trying to humanise technology, then, we should be trying to democratise it. We should be demanding that society as a whole gets to decide how we live with technology
  • What does this mean in practice? First, it requires limiting and eroding Silicon Valley’s power.
  • Antitrust laws and tax policy offer useful ways to claw back the fortunes Big Tech has built on common resources
  • democratic governments should be making rules about how those firms are allowed to behave – rules that restrict how they can collect and use our personal data, for instance, like the General Data Protection Regulation
  • This means developing publicly and co-operatively owned alternatives that empower workers, users and citizens to determine how they are run.
  • we might demand that tech firms pay for the privilege of extracting our data, so that we can collectively benefit from a resource we collectively create.
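The ‘coefficient’ highlights above describe, in outline, a weighted tally of interactions between two users, with messaging treated as the strongest signal, that then feeds ranking and ad targeting. Facebook has not published the actual formula, so the Python sketch below is only a guess at the general shape of such a score: every interaction type, weight and function name here is invented for illustration and none of it is drawn from Facebook’s real system.

    from collections import Counter

    # Hypothetical weights: messaging counts most, a passing like or profile
    # view counts least. The real categories and values are unknown.
    INTERACTION_WEIGHTS = {
        "message_sent": 5.0,
        "comment": 3.0,
        "photo_tag": 2.5,
        "like": 1.0,
        "profile_view": 0.5,
    }

    def coefficient(interactions):
        # Score the strength of a tie from a log of interaction-type strings
        # recorded between two users, e.g. ["like", "message_sent", ...].
        counts = Counter(interactions)
        return sum(INTERACTION_WEIGHTS.get(kind, 0.0) * n
                   for kind, n in counts.items())

    def rank_feed(posts, coefficients):
        # Order posts so that content from stronger ties appears first.
        return sorted(posts,
                      key=lambda post: coefficients.get(post["author"], 0.0),
                      reverse=True)

    # Example: frequent messaging with one friend outranks occasional likes
    # from another, so her posts rise to the top of the feed.
    scores = {
        "amy": coefficient(["message_sent"] * 4 + ["comment"]),
        "bob": coefficient(["like"] * 3 + ["profile_view"]),
    }
    feed = rank_feed([{"author": "bob", "text": "…"},
                      {"author": "amy", "text": "…"}], scores)
    print(scores, [post["author"] for post in feed])

The design point the sketch tries to capture is the article’s: the more data-rich the interaction, the more it moves the score, which is why ‘time well spent’ on intensive, personal interactions is also commercially attractive.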
Javier E

FaceApp helped a middle-aged man become a popular younger woman. His fan base has never... - 1 views

  • Soya’s fame illustrated a simple truth: that social media is less a reflection of who we are, and more a performance of who we want to be.
  • It also seemed to herald a darker future where our fundamental senses of reality are under siege: The AI that allows anyone to fabricate a face can also be used to harass women with “deepfake” pornography, invent fraudulent LinkedIn personas and digitally impersonate political enemies.
  • As the photos began receiving hundreds of likes, Soya’s personality and style began to come through. She was relentlessly upbeat. She never sneered or bickered or trolled. She explored small towns, savored scenic vistas, celebrated roadside restaurants’ simple meals.
  • She took pride in the basic things, like cleaning engine parts. And she only hinted at the truth: When one fan told her in October, “It’s great to be young,” Soya replied, “Youth does not mean a certain period of life, but how to hold your heart.”
  • She seemed, well, happy, and FaceApp had made her that way. Creating the lifelike impostor had taken only a few taps: He changed the “Gender” setting to “Female,” the “Age” setting to “Teen,” and the “Impression” setting — a mix of makeup filters — to a glamorous look the app calls “Hollywood.”
  • Soya pouted and scowled on rare occasions when Nakajima himself felt frustrated. But her baseline expression was an extra-wide smile, activated with a single tap.
  • Nakajima grew his shimmering hair below his shoulders and raided his local convenience store for beauty supplies he thought would make the FaceApp images more convincing: blushes, eyeliners, concealers, shampoos.
  • “When I compare how I feel when I started to tweet as a woman and now, I do feel that I’m gradually gravitating toward this persona … this fantasy world that I created,” Nakajima said. “When I see photos of what I tweeted, I feel like, ‘Oh. That’s me.’ ”
  • The sensation Nakajima was feeling is so common that there’s a term for it: the Proteus effect, named for the shape-shifting Greek god. Stanford University researchers first coined it in 2007 to describe how people inhabiting the body of a digital avatar began to act the part
  • People made to appear taller in virtual-reality simulations acted more assertively, even after the experience ended. Prettier characters began to flirt.
  • What is it about online disguises? Why are they so good at bending people’s sense of self-perception?
  • they tap into this “very human impulse to play with identity and pretend to be someone you’re not.”
  • Users in the Internet’s early days rarely had any presumptions of authenticity, said Melanie C. Green, a University of Buffalo professor who studies technology and social trust. Most people assumed everyone else was playing a character clearly distinguished from their real life.
  • “This identity play was considered one of the huge advantages of being online,” Green said. “You could switch your gender and try on all of these different personas. It was a playground for people to explore.”
  • It wasn’t until the rise of giant social networks like Facebook — which used real identities to, among other things, supercharge targeted advertising — that this big game of pretend gained an air of duplicity. Spaces for playful performance shrank, and the biggest Internet watering holes began demanding proof of authenticity as a way to block out malicious intent.
  • The Web’s big shift from text to visuals — the rise of photo-sharing apps, live streams and video calls — seemed at first to make that unspoken rule of real identities concrete. It seemed too difficult to fake one’s appearance when everyone’s face was on constant display.
  • Now, researchers argue, advances in image-editing artificial intelligence have done for the modern Internet what online pseudonyms did for the world’s first chat rooms. Facial filters have allowed anyone to mold themselves into the character they want to play.
  • researchers fear these augmented reality tools could end up distorting the beauty standards and expectations of actual reality.
  • Some political and tech theorists worry this new world of synthetic media threatens to detonate our concept of truth, eroding our shared experiences and infusing every online relationship with suspicion and self-doubt.
  • Deceptive political memes, conspiracy theories, anti-vaccine hoaxes and other scams have torn the fabric of our democracy, culture and public health.
  • But she also thinks about her kids, who assume “that everything online is fabricated,” and wonders whether the rules of online identity require a bit more nuance — and whether that generational shift is already underway.
  • “Bots pretending to be people, automated representations of humanity — that, they perceive as exploitative,” she said. “But if it’s just someone engaging in identity experimentation, they’re like: ‘Yeah, that’s what we’re all doing.'
  • To their generation, “authenticity is not about: ‘Does your profile picture match your real face?’ Authenticity is: ‘Is your voice your voice?’
  • “Their feeling is: ‘The ideas are mine. The voice is mine. The content is mine. I’m just looking for you to receive it without all the assumptions and baggage that comes with it.’ That’s the essence of a person’s identity. That’s who they really are.”
  • But wasn’t this all just a big con? Nakajima had tricked people with a “cool girl” stereotype to boost his Twitter numbers. He hadn’t elevated the role of women in motorcycling; if anything, he’d supplanted them. And the character he’d created was paper thin: Soya had no internal complexity outside of what Nakajima had projected, just that eternally superimposed smile.
  • Perhaps he should have accepted his irrelevance and faded into the digital sunset, sharing his life for few to see. But some of Soya’s followers have said they never felt deceived: It was Nakajima — his enthusiasm, his attitude about life — they’d been charmed by all along. “His personality,” as one Twitter follower said, “shined through.”
  • In Nakajima’s mind, he’d used the tools of a superficial medium to craft genuine connections. He had not felt real until he had become noticed for being fake.
  • Nakajima said he doesn’t know how long he’ll keep Soya alive. But he said he’s grateful for the way she helped him feel: carefree, adventurous, seen.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • ...43 more annotations...
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day.”
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

It's Not Just the Discord Leak. Group Chats Are the Internet's New Chaos Machine. - The... - 0 views

  • Digital bulletin-board systems—proto–group chats, you could say—date back to the 1970s, and SMS-style group chats popped up in WhatsApp and iMessage in 2011.
  • As New York magazine put it in 2019, group chats became “an outright replacement for the defining mode of social organization of the past decade: the platform-centric, feed-based social network.”
  • unlike the Facebook feed or Twitter, where posts can be linked to wherever, group chats are a closed system—a safe and (ideally) private space. What happens in the group chat ought to stay there.
  • ...11 more annotations...
  • In every group chat, no matter the size, participants fall into informal roles. There is usually a leader—a person whose posting frequency drives the group or sets the agenda. Often, there are lurkers who rarely chime in
  • Larger group chats are not immune to the more toxic dynamics of social media, where competition for attention and herd behavior cause infighting, splintering, and back-channeling.
  • It’s enough to make one think, as the writer Max Read argued, that “venture-capitalist group chats are a threat to the global economy.” Now you might also say they are a threat to national security.
  • thanks to the private nature of the group chats, this information largely stayed out of the public eye. As Bloomberg reported, “By the time most people figured out that a bank run was a possibility … it was already well underway.”
  • The investor panic that led to the swift collapse of Silicon Valley Bank in March was effectively caused by runaway group-chat dynamics. “It wasn’t phone calls; it wasn’t social media,” a start-up founder told Bloomberg in March. “It was private chat rooms and message groups.”
  • Unlike traditional social media or even forums and message boards, group chats are nearly impossible to monitor.
  • as our digital social lives start to splinter off from feeds and large audiences and into siloed areas, a different kind of unpredictability and chaos awaits. Where social networks create a context collapse—a process by which information meant for one group moves into unfamiliar networks and is interpreted by outsiders—group chats seem to be context amplifiers
  • group chats provide strong relationship dynamics, and create in-jokes and lore. For decades, researchers have warned of the polarizing effects of echo chambers across social networks; group chats realize this dynamic fully.
  • Weird things happen in echo chambers. Constant reinforcement of beliefs or ideas might lead to group polarization or radicalization. It may trigger irrational herd behavior such as, say, attempting to purchase a copy of the Constitution through a decentralized autonomous organization
  • Obsession with in-group dynamics might cause people to lose touch with the reality outside the walls of a particular community; the private-seeming nature of a closed group might also lull participants into a false sense of security, as it did with Teixeira.
  • the age of the group chat appears to be at least as unpredictable, swapping a very public form of volatility for a more siloed, incalculable version