
TOK Friends: Group items tagged "structural"


Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • ...97 more annotations...
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism.
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free, and therefore I’m responsible for everything I do, a dizzying fact which causes an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating.
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics.
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor.
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is.
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their views to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality.
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers.
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations.
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two.
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
johnsonle1

A Brand New Organ Has Been Found In The Human Body, Say Scientists | The Huffington Post - 1 views

  •  
    "The anatomic description that had been laid down over 100 years of anatomy was incorrect. This organ is far from fragmented and complex. It is simply one continuous structure," Professor Coffey explained.
sissij

The Increasing Significance of the Decline of Men - The New York Times - 0 views

  • At one end of the scale, men continue to dominate.
  • But at the other end of the scale, men of all races and ethnicities are dropping out of the work force, abusing opioids and falling behind women in both college attendance and graduation rates.
  • From 1979 to 2007, 7 percent of men and 16 percent of women with middle-skill jobs lost their positions, according to the Dallas Fed study. Four percent of these men moved to low-skill work, and 3 percent moved to high-skill jobs. Almost all the women, 15 percent, moved into high-skill jobs, with only 1 percent moving to low-skill work.
  • ...9 more annotations...
  • For boys and girls raised in two-parent households, there were only modest differences between the sexes in terms of success at school, and boys tended to earn more than their sisters in early adulthood.
  • At the same time, the divorce rate for college graduates has declined from 34.8 percent among those born between 1950 and 1955 to 29.9 percent among those born between 1957 and 1964. In contrast, the divorce rate for those without college degrees increased over the same period from 44.3 percent to 50.6 percent.
  • First, there are irreversible changes in the workplace, particularly the rise of jobs requiring social skills (even STEM jobs) that will continue to make it hard for men who lack those skills.
  • Men are really going to have to change their act or have big problems. I think of big guys from the cave days, guys who were good at lifting stuff and hunting and the things we got genetically selected out for. During the industrial revolution that wasn’t so bad, but it’s not going to be there anymore.
  • This vulnerability, in turn, makes boys more susceptible to attention deficit hyperactivity disorder and conduct disorders, as well as to the epigenetic mechanisms that can account for the recent widespread increase of these disorders in U.S. culture.
  • Schore argues that a major factor in rising dysfunction among boys and men in this country is the failure of the United States to provide longer periods of paid parental leave, with the result that many infants are placed in day care when they are six weeks old.
  • Females consistently score higher on tests of emotional and social intelligence. Sex differences in sociability and social perceptiveness have been shown to have biological origins, with differences appearing in infancy and higher levels of fetal testosterone associated with lower scores on tests of social intelligence.
  • Second, male children suffer more from restricted or nonexistent parental leave policies and contemporary child care arrangements, as well as from growing up in single-parent households.
  • It has been a longstanding objective of right-wing regimes to push women back into traditional gender roles. Is that what’s going on here? Or could it be something less pernicious and more important?
  •  
    I think this research is very interesting. It takes a different perspective on gender issues, noticing that there is actually a decline of men in society. Although wage inequality and other gender problems still usually put women at a disadvantage, men face more and more disadvantages as society shifts from physical work to mental work. As society evolves, the social structure evolves with it. Gender equality means we should pay equal attention to all genders (there are more than two). --Sissi (3/16/2017)
Javier E

America Is Flunking Math - Persuasion - 1 views

  • One can argue that the preeminence of each civilization was, in part, due to its sophisticated understanding and use of mathematics. This is particularly clear in the case of the West, which forged ahead in the 17th century with the discovery of calculus, one of the greatest scientific breakthroughs of all time.
  • The United States became the dominant force in the mathematical sciences in the wake of World War II, largely due to the disastrous genocidal policies of the Third Reich. The Nazis’ obsession with purging German science of what it viewed as nefarious Jewish influence led to a massive exodus of Jewish mathematicians and scientists to America
  • Indeed, academic institutions in the United States have thrived largely because of their ability to attract talented individuals from around the world.
  • ...28 more annotations...
  • The quality of mathematics research in the United States today is the envy of the scientific world. This is a direct result of the openness and inclusivity of the profession.
  • Can Americans maintain this unmatched excellence in the future? There are worrisome signs that suggest not.
  • The Organization for Economic Cooperation and Development compares mathematical proficiency among 15-year-olds by country worldwide. According to its 2018 report, America ranked 37th while China, America’s main competitor for world leadership, came in first.
  • This is despite the fact that the United States is the fifth-highest spender per pupil among the 37 developed OECD nations
  • This massive failure of our K-12 education system trickles through the STEM pipeline.
  • At the undergraduate level, too few American students are prepared for higher-level mathematics courses. These students are then unprepared for rigorous graduate-level work
  • According to our own experiences at the universities where we teach, an overwhelming majority of American students with strong math backgrounds are either foreign-born or first-generation students who have additional support from their education-conscious families. At all levels, STEM disciplines are more and more dependent on a constant flow of foreign talent.
  • There are many reasons for this failure, but the way that we educate and prepare teachers is particularly influential. The vast majority of K-12 math teachers are graduates of teacher-preparation programs that teach very little substantive mathematics
  • This has led to a constant stream of ill-advised and dumbed-down reforms. One of the latest fads is anti-racist mathematics. Promoted in several states, the bizarre doctrine threatens to further degrade the teaching of mathematics.
  • Another major concern is the twisted interpretation of diversity, equity, and inclusion (DEI).
  • Under the banner of DEI, universities are abandoning the use of standardized tests like the SAT and GRE in admissions, and cities are considering scrapping academic tracking and various gifted programs in schools, which they deem “inequitable.”
  • such programs are particularly effective, when properly implemented, at discovering and encouraging talented children from disadvantaged backgrounds.
  • The new 2021 Mathematics Framework, currently under consideration by California’s Department of Education, does away “with all tracking, acceleration, gifted programs, or any instruction that involves clustering by individual differences, without expressing any awareness of the impact these drastic alterations would have in preparing STEM-ready candidates.”
  • These measures will not only hinder the progress of the generations of our future STEM workforce but also contribute to structural inequalities, as they are uniquely detrimental to students whose parents cannot send them to private schools or effective enrichment programs.
  • These are just a few examples of an unprecedented fervor for revolutionary change in the name of Critical Race Theory (CRT), a doctrine that views the world as a fierce battleground for the narratives of various identity groups.
  • This will only lead to a further widening of racial disparities in educational outcomes while lowering American children’s rankings in education internationally.
  • Ill-conceived DEI policies, often informed by CRT, and the declining standards of K-12 math education feed each other in a vicious circle
  • Regarding minorities, in particular, public K-12 education all too often produces students unprepared to compete, thus leading to large disparities in admissions at universities, graduate programs, and faculty positions. This disparity is then condemned as a manifestation of structural racism, resulting in administrative measures to lower the evaluation criteria. Lowering standards at all levels leads eventually to even worse outcomes and larger disparities, and so on in a downward spiral.
  • A case in point is the recent report by the American Mathematical Society that accuses the entire mathematics community, with the thinnest of specific evidence, of systemic racial discrimination. A major justification put forward for such a grave accusation is the lack of sufficient representation of Black mathematicians in the profession.
  • the report, while raising awareness of several ugly facts from the long-ago past, makes little effort to address the real reasons for this, mainly the catastrophic failure of the K-12 mathematical educational system.
  • The National Science Foundation, a federal institution meant to support fundamental research, is now diverting some of its limited funding to various DEI initiatives of questionable benefit.
  • Meanwhile, other countries, especially China, are doing precisely the opposite, following the model of our past dedication to objective measures of excellence. How long before we will see a reverse exodus, away from the United States?
  • The present crisis can still be reversed by focusing on a few concrete actions:
  • Improve schools in urban areas and inner-city neighborhoods by following the most promising education programs. These programs demonstrate that inner-city children benefit if they are challenged by high standards and a nurturing environment.
  • Follow the lead of other highly successful rigorous programs such as BASIS schools and Math for America, which focus on rigorous STEM curricula, combined with 21st-century teaching methods, and recruit talented teachers to help them build on their STEM knowledge and teaching methods.
  • Increase, rather than eliminate, tailored instruction, both for accelerated and remedial math courses.
  • Reject the soft bigotry of low expectations, that Black children cannot do well in competitive mathematics programs and need dumbed-down ethnocentric versions of mathematics.
  • Uphold the objective selection process based on merit at all levels of education and research.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived. [A toy sketch of this click-prediction incentive appears at the end of this list.]
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists.
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, whether by investors or by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way.
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight.”
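A hedged illustration of the click-prediction incentive described in the auction bullet above: when a platform is paid per click, it maximizes revenue by ranking ads on bid times predicted click-through rate, so better behavioral predictions translate directly into income. This is a minimal sketch, not Google's actual auction (real systems use more elaborate mechanisms such as generalized second-price auctions); all advertiser names, bids, and rates are hypothetical.

```python
# Toy pay-per-click ranking: the platform earns bid * predicted CTR per
# impression in expectation, so it sorts ads by that product.
# All numbers below are hypothetical illustrations.

ads = [
    {"advertiser": "A", "bid": 2.00, "predicted_ctr": 0.010},
    {"advertiser": "B", "bid": 0.50, "predicted_ctr": 0.060},
    {"advertiser": "C", "bid": 1.20, "predicted_ctr": 0.020},
]

def expected_revenue(ad):
    """Expected payment to the platform per impression shown."""
    return ad["bid"] * ad["predicted_ctr"]

# Note that the highest bidder does not win; the best bid/prediction
# combination does, which is why click prediction is the core asset.
for ad in sorted(ads, key=expected_revenue, reverse=True):
    print(ad["advertiser"], round(expected_revenue(ad), 4))
```

Under this model, even a small improvement in prediction accuracy compounds across billions of impressions, which is the financial logic behind the "extraction imperative" described above.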
Javier E

Why Study History? (1985) | AHA - 0 views

  • Isn't there quite enough to learn about the world today? Why add to the burden by looking at the past?
  • Historical knowledge is no more and no less than carefully and critically constructed collective memory. As such it can both make us wiser in our public choices and more richly human in our private lives.
  • Without individual memory, a person literally loses his or her identity, and would not know how to act in encounters with others. Imagine waking up one morning unable to tell total strangers from family and friends!
  • ...37 more annotations...
  • Collective memory is similar, though its loss does not immediately paralyze everyday private activity. But ignorance of history (that is, absent or defective collective memory) does deprive us of the best available guide for public action, especially in encounters with outsiders.
  • Often it is enough for experts to know about outsiders, if their advice is listened to. But democratic citizenship and effective participation in the determination of public policy require citizens to share a collective memory, organized into historical knowledge and belief
  • This value of historical knowledge obviously justifies teaching and learning about what happened in recent times, for the way things are descends from the way they were yesterday and the day before that
  • in fact, institutions that govern a great deal of our everyday behavior took shape hundreds or even thousands of years ago
  • Only an acquaintance with the entire human adventure on earth allows us to understand these dimensions of contemporary reality.
  • it follows that study of history is essential for every young person.
  • Collective memory is quite the same. Historians are always at work reinterpreting the past, asking new questions, searching new sources and finding new meanings in old documents in order to bring the perspective of new knowledge and experience to bear on the task of understanding the past.
  • what we know and believe about history is always changing. In other words, our collective, codified memory alters with time just as personal memories do, and for the same reasons.
  • skeptics are likely to conclude that history has no right to take student time from other subjects. If what is taught today is not really true, how can it claim space in a crowded school curriculum?
  • what if the world is more complicated and diverse than words can ever tell? What if human minds are incapable of finding neat pigeon holes into which everything that happens will fit?
  • What if we have to learn to live with uncertainty and probabilities, and act on the basis of the best guesswork we are capable of?
  • Then, surely, the changing perspectives of historical understanding are the very best introduction we can have to the practical problems of real life. Then, surely, a serious effort to understand the interplay of change and continuity in human affairs is the only adequate introduction human beings can have to the confusing flow of events that constitutes the actual, adult world.
  • Memory is not something fixed and forever. As time passes, remembered personal experiences take on new meanings.
  • Systematic sciences are not enough. They discount time, and therefore oversimplify reality, especially human reality.
  • Memory, indeed, makes us human. History, our collective memory, carefully codified and critically revised, makes us social, sharing ideas and ideals with others so as to form all sorts of different human groups
  • The varieties of history are enormous; facts and probabilities about the past are far too numerous for anyone to comprehend them all. Every sort of human group has its own history.
  • Where to start? How bring some sort of order to the enormous variety of things known and believed about the past?
  • Early in this century, teachers and academic administrators pretty well agreed that two sorts of history courses were needed: a survey of the national history of the United States and a survey of European history.
  • This second course was often broadened into a survey of Western civilization in the 1930s and 1940s
  • But by the 1960s and 1970s these courses were becoming outdated, left behind by the rise of new kinds of social and quantitative history, especially the history of women, of Blacks, and of other formerly overlooked groups within the borders of the United States, and of peoples emerging from colonial status in the world beyond our borders.
  • much harder to combine old with new to make an inclusive, judiciously balanced (and far less novel) introductory course for high school or college students.
  • But abandoning the effort to present a meaningful portrait of the entire national and civilizational past destroyed the original justification for requiring students to study history
  • Competing subjects abounded, and no one could or would decide what mattered most and should take precedence. As this happened, studying history became only one among many possible ways of spending time in school.
  • The costs of this change are now becoming apparent, and many concerned persons agree that returning to a more structured curriculum, in which history ought to play a prominent part, is imperative.
  • three levels of generality seem likely to have the greatest importance for ordinary people.
  • First is family, local, neighborhood history
  • Second is national history, because that is where political power is concentrated in our time.
  • Last is global history, because intensified communications make encounters with all the other peoples of the earth increasingly important.
  • Other pasts are certainly worth attention, but are better studied in the context of a prior acquaintance with personal-local, national, and global history. That is because these three levels are the ones that affect most powerfully what all other groups and segments of society actually do.
  • National history that leaves out Blacks and women and other minorities is no longer acceptable; but American history that leaves out the Founding Fathers and the Constitution is not acceptable either. What is needed is a vision of the whole, warts and all.
  • the study of history does not lead to exact prediction of future events. Though it fosters practical wisdom, knowledge of the past does not permit anyone to know exactly what is going to happen
  • Consequently, the lessons of history, though supremely valuable when wisely formulated, become grossly misleading when oversimplifiers try to transfer them mechanically from one age to another, or from one place to another.
  • Predictable fixity is simply not the human way of behaving. Probabilities and possibilities, together with a few complete surprises, are what we live with and must learn to expect.
  • Second, as acquaintance with the past expands, delight in knowing more and more can and often does become an end in itself.
  • On the other hand, studying alien religious beliefs, strange customs, diverse family patterns and vanished social structures shows how differently various human groups have tried to cope.
  • Broadening our humanity and extending our sensibilities by recognizing sameness and difference throughout the recorded past is therefore an important reason for studying history, and especially the history of peoples far away and long ago
  • For we can only know ourselves by knowing how we resemble and how we differ from others. Acquaintance with the human past is the only way to such self knowledge.
margogramiak

Scientists show what loneliness looks like in the brain: Neural 'signature' may reflect... - 0 views

  • This holiday season will be a lonely one for many people as social distancing due to COVID-19 continues, and it is important to understand how isolation affects our health.
    • margogramiak
       
      This is a very current topic, and something I've been wondering about.
  • based on variations in the volume of different brain regions as well as based on how those regions communicate with one another across brain networks.
    • margogramiak
       
      That's interesting. I'm excited to hear about what those differences are.
  • ...7 more annotations...
  • They then compared the MRI data of participants who reported often feeling lonely with those who did not.
    • margogramiak
       
      Makes sense
  • Researchers found the default networks of lonely people were more strongly wired together and, surprisingly, their grey matter volume in regions of the default network was greater.
    • margogramiak
       
      Why?
  • The fact the structure and function of this network is positively associated with loneliness may be because lonely people are more likely to use imagination, memories of the past or hopes for the future to overcome their social isolation.
    • margogramiak
       
      This makes sense, but it's always really sad to think about.
  • In lonely people, the structure of this fibre tract was better preserved.
    • margogramiak
       
      Again, why?
  • "In the absence of desired social experiences, lonely individuals may be biased towards internally-directed thoughts such as reminiscing or imagining social experiences.
    • margogramiak
       
      That makes sense. It also makes sense that it would affect the brain.
  • Loneliness is increasingly being recognized as a major health problem, and previous studies have shown older people who experience loneliness have a higher risk of cognitive decline and dementia.
    • margogramiak
       
      This is obviously a very current issue because of quarantine
  • the urgency of reducing loneliness in today's society,
    • margogramiak
       
      but how is that possible in a climate like this?
mshilling1

The Importance of Logic and Critical Thinking | WIRED - 0 views

  • The rationality of the world is what is at risk. Too many people are taken advantage of because of their lack of critical thinking, logic and deductive reasoning
  • These same people are raising children without these same skills, creating a whole new generation of clueless people.
  • However, valid logic does not always guarantee truth or a sound argument.
  • ...3 more annotations...
  • Valid logic means that the structure of an argument is correct in terms of syntax and semantics, rather than that its premises are true. [A small worked example follows this list.]
  • The basic lesson here is that, while the logic above might seem valid because of the structure of the statement, it takes a further understanding to figure out why it's not necessarily true
  • The underlying lesson here is not to immediately assume everything you read or are told is true, something all children need to and should learn.
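A small worked example of the validity-versus-truth distinction above, sketched in Python: a brute-forced truth table confirms that the modus ponens form is valid under every assignment, while the closing comment shows how a false premise still wrecks a structurally valid argument. The propositions are illustrative only.

```python
# Validity is structural: ((p -> q) and p) -> q holds under every truth
# assignment, so the inference form (modus ponens) is valid.
from itertools import product

def implies(a, b):
    """Material implication: a -> b."""
    return (not a) or b

valid = all(
    implies(implies(p, q) and p, q)
    for p, q in product([True, False], repeat=2)
)
print("Modus ponens is valid:", valid)  # True

# Soundness needs true premises as well. "All birds can fly; a penguin
# is a bird; therefore a penguin can fly" has this valid form, yet the
# first premise is false, so the conclusion fails despite valid logic.
```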
Javier E

Pfizer and Moderna Vaccines Likely to Produce Lasting Immunity, Study Finds - The New Y... - 0 views

  • in people who survived Covid-19, immune cells that recognize the virus lie quiescent in the bone marrow for at least eight months after infection. A study by another team indicated that so-called memory B cells continue to mature and strengthen for at least a year after infection.
  • Based on those findings, researchers suggested that immunity might last for years, possibly a lifetime, in people who were infected with the coronavirus and later vaccinated.
  • But it was unclear whether vaccination alone might have a similarly long-lasting effect.
  • ...13 more annotations...
  • “Usually by four to six weeks, there’s not much left,” said Deepta Bhattacharya, an immunologist at the University of Arizona. But germinal centers stimulated by the mRNA vaccines are “still going, months into it, and not a lot of decline in most people.”
  • The broader the range and the longer these cells have to practice, the more likely they are to be able to thwart variants of the virus that may emerge.
  • “Everyone always focuses on the virus evolving — this is showing that the B cells are doing the same thing,” said Marion Pepper, an immunologist at the University of Washington in Seattle. “And it’s going to be protective against ongoing evolution of the virus, which is really encouraging.”
  • Dr. Ellebedy’s team found that 15 weeks after the first dose of vaccine, the germinal center was still highly active in all 14 of the participants, and that the number of memory cells that recognized the coronavirus had not declined.
  • “The fact that the reactions continued for almost four months after vaccination — that’s a very, very good sign,” Dr. Ellebedy said. Germinal centers typically peak one to two weeks after immunization, and then wane.
  • After an infection or a vaccination, a specialized structure called the germinal center forms in lymph nodes. This structure is an elite school of sorts for B cells — a boot camp where they become increasingly sophisticated and learn to recognize a diverse set of viral genetic sequences.
  • The results suggest that a vast majority of vaccinated people will be protected over the long term — at least, against the existing coronavirus variants
  • But older adults, people with weak immune systems and those who take drugs that suppress immunity may need boosters; people who survived Covid-19 and were later immunized may never need them at all.
  • In the absence of variants that sidestep immunity, in theory immunity could last a lifetime, experts said. But the virus is clearly evolving.
  • “Anything that would actually require a booster would be variant-based, not based on waning of immunity,” Dr. Bhattacharya said. “I just don’t see that happening.”
  • The good news: A booster vaccine will probably have the same effect as prior infection in immunized people, Dr. Ellebedy said. “If you give them another chance to engage, they will have a massive response,” he said, referring to memory B cells.
  • Dr. Ellebedy said the results also suggested that these signs of persistent immune reaction might be caused by mRNA vaccines alone, as opposed to those made by more traditional means, like Johnson & Johnson’s
  • But that is an unfair comparison, because the Johnson & Johnson vaccine is given as a single dose, Dr. Iwasaki said: “If the J & J had a booster, maybe it will induce this same kind of response.”
caelengrubb

5 key facts about language and the brain - 0 views

  • Language is a complex topic, interwoven with issues of identity, rhetoric, and art.
  • While other animals do have their own codes for communication — to indicate, for instance, the presence of danger, a willingness to mate, or the presence of food — such communications are typically “repetitive instrumental acts” that lack a formal structure of the kind that humans use when they utter sentences
  • As Homo sapiens, we have the necessary biological tools to utter the complex constructions that constitute language, the vocal apparatus, and a brain structure complex and well-developed enough to create a varied vocabulary and strict sets of rules on how to use it.
  • ...7 more annotations...
  • Though it remains unclear at what point the ancestors of modern humans first started to develop spoken language, we know that our Homo sapiens predecessors emerged around 150,000–200,000 years ago. So, Prof. Pagel explains, complex speech is likely at least as old as that
  • A study led by researchers from Lund University in Sweden found that committed language students experienced growth in the hippocampus, a brain region associated with learning and spatial navigation, as well as in parts of the cerebral cortex, or the outmost layer of the brain.
  • In fact, researchers have drawn many connections between bilingualism or multilingualism and the maintenance of brain health
  • Multiple studies, for instance, have found that bilingualism can protect the brain against Alzheimer’s disease and other forms of dementia.
  • Being bilingual has other benefits, too, such as training the brain to process information efficiently while expending only the necessary resources on the tasks at hand.
  • Research now shows that her assessment was absolutely correct — the language that we use does change not only the way we think and express ourselves, but also how we perceive and interact with the world.
  • Language holds such power over our minds, decision-making processes, and lives, so Boroditsky concludes by encouraging us to consider how we might use it to shape the way we think about ourselves and the world.
lucieperloff

Don't Waste Your Emotions on Plants, They Have No Feelings | Live Science - 0 views

  • Trees — and all plants, for that matter — feel nothing at all, because consciousness, emotions and cognition are hallmarks of animals alone, scientists recently reported in an opinion article.
  • Though plants lack brains, the firing of electrical signals in their stems and leaves nonetheless triggered responses that hinted at consciousness, researchers previously reported.
  • Beginning in 2006, some scientists have argued that plants possess neuron-like cells that interact with hormones and neurotransmitters, forming "a plant nervous system, analogous to that in animals
    • lucieperloff
       
      so what changed?
  • ...2 more annotations...
  • "If the lower animals — which have nervous systems — lack consciousness, the chances that plants without nervous systems have consciousness are effectively nil," Taiz said.
  • "Being unconscious is in all likelihood an advantage to plants and contributes to their evolutionary fitness," he added.
anonymous

Human Brain: facts and information - 0 views

  • The human brain is more complex than any other known structure in the universe.
  • Weighing in at three pounds, on average, this spongy mass of fat and protein is made up of two overarching types of cells—called glia and neurons—and it contains many billions of each.
  • The cerebrum is the largest part of the brain, accounting for 85 percent of the organ's weight. The distinctive, deeply wrinkled outer surface is the cerebral cortex. It's the cerebrum that makes the human brain—and therefore humans—so formidable. Animals such as elephants, dolphins, and whales actually have larger brains, but humans have the most developed cerebrum. It's packed to capacity inside our skulls, with deep folds that cleverly maximize the total surface area of the cortex.
  • ...18 more annotations...
  • The cerebrum has two halves, or hemispheres, that are further divided into four regions, or lobes. The frontal lobes, located behind the forehead, are involved with speech, thought, learning, emotion, and movement.
  • Behind them are the parietal lobes, which process sensory information such as touch, temperature, and pain.
  • At the rear of the brain are the occipital lobes, dealing with vision
  • Lastly, there are the temporal lobes, near the temples, which are involved with hearing and memory.
  • The second-largest part of the brain is the cerebellum, which sits beneath the back of the cerebrum.
  • diencephalon, located in the core of the brain. A complex of structures roughly the size of an apricot, its two major sections are the thalamus and hypothalamus
  • The brain is extremely sensitive and delicate, and so it requires maximum protection, which is provided by the hard bone of the skull and three tough membranes called meninges.
  • Want more proof that the brain is extraordinary? Look no further than the blood-brain barrier.
  • This led scientists to learn that the brain has an ingenious, protective layer. Called the blood-brain barrier, it’s made up of special, tightly bound cells that together function as a kind of semi-permeable gate throughout most of the organ. It keeps the brain environment safe and stable by preventing some toxins, pathogens, and other harmful substances from entering the brain through the bloodstream, while simultaneously allowing oxygen and vital nutrients to pass through.
  • One in five Americans suffers from some form of neurological damage, a wide-ranging list that includes stroke, epilepsy, and cerebral palsy, as well as dementia.
  • Alzheimer’s disease, which is characterized in part by a gradual progression of short-term memory loss, disorientation, and mood swings, is the most common cause of dementia. It is the sixth leading cause of death in the United States
  • 50 million people suffer from Alzheimer’s or some form of dementia. While there are a handful of drugs available to mitigate Alzheimer’s symptoms, there is no cure.
  • Unfortunately, negative attitudes toward people who suffer from mental illness are widespread. The stigma attached to mental illness can create feelings of shame, embarrassment, and rejection, causing many people to suffer in silence.
  • In the United States, where anxiety disorders are the most common forms of mental illness, only about 40 percent of sufferers receive treatment. Anxiety disorders often stem from abnormalities in the brain’s hippocampus and prefrontal cortex.
  • Attention-deficit/hyperactivity disorder, or ADHD, is a mental health condition that also affects adults but is far more often diagnosed in children.
  • ADHD is characterized by hyperactivity and an inability to stay focused.
  • Depression is another common mental health condition. It is the leading cause of disability worldwide and is often accompanied by anxiety. Depression can be marked by an array of symptoms, including persistent sadness, irritability, and changes in appetite.
  • The good news is that in general, anxiety and depression are highly treatable through various medications—which help the brain use certain chemicals more efficiently—and through forms of therapy
  •  
    Here is some anatomy of the brain and descriptions of diseases like Alzheimer's and conditions like ADHD, depression, anxiety.
johnsonel7

Hand gestures point towards the origins of language - 0 views

  • Communication gestures used by humans and our primate relatives are providing clues about how our species' ability to use spoken language evolved.
  • But human language is one such oddity. Our ability to use subtle combinations of sounds produced by our vocal cords lets us create words and sentences which, when combined with grammatical rules, convey complex ideas.
  • "The idea is to look at language, not just as speech, but seeing it as a constellation of many cognitive properties,"
  • ...3 more annotations...
  • In human babies, which learn to gesture at objects before they can speak, the left side of their brain seems to be engaged when they do so. Certain regions on the left side of our brain, such as Broca's area, are especially important when we speak.
  • Prof. Sandler is exploring the relationship between physical communication and the composition of human language. She believes sign languages can provide some clues to the structure of human language and how language may have emerged in our ancestors.
Javier E

Understanding What's Wrong With Facebook | Talking Points Memo - 0 views

  • to really understand the problem with Facebook we need to understand the structural roots of that problem, how much of it is baked into the core architecture of the site and its very business model
  • much of it is inherent in the core strategies of the post-2000, second wave Internet tech companies that now dominate our information space and economy.
  • Facebook is an ingenious engine for information and ideational manipulation.
  • ...17 more annotations...
  • Good old fashioned advertising does that to a degree. But Facebook is much more powerful, adaptive and efficient.
  • Facebook is designed to do specific things. It’s an engine to understand people’s minds and then manipulate their thinking.
  • Those tools are refined for revenue making but can be used for many other purposes. That makes it ripe for misuse and bad acting.
  • The core of all second wave Internet commerce operations was finding network models where costs grow arithmetically while revenues grow exponentially. [A toy model of this asymmetry appears at the end of this list.]
  • The network and its dominance is the product and once it takes hold the cost inputs remained constrained while the revenues grow almost without limit.
  • Facebook is best understood as a fantastically profitable nuclear energy company whose profitability is based on dumping the waste on the side of the road and accepting frequent accidents and explosions as inherent to the enterprise.
  • That’s why these companies employ so few people relative to scale and profitability.
  • That’s why there’s no phone support for Google or Facebook or Twitter. If half the people on the planet are ‘customers’ or users that’s not remotely possible.
  • The core economic model requires doing all of it on the cheap. Indeed, what Zuckerberg et al. have created with Facebook is so vast that the money required not to do it on the cheap almost defies imagination.
  • Facebook’s core model and concept requires not taking responsibility for what others do with the engine created to drive revenue.
  • It all amounts to a grand exercise in socializing the externalities and keeping all the revenues for the owners.
  • Here’s a way to think about it. Nuclear power is actually incredibly cheap. The fuel is fairly plentiful and easy to pull out of the ground. You set up a little engine and it generates energy almost without limit. What makes it ruinously expensive is managing the externalities – all the risks and dangers, the radiation, accidents, the constant production of radioactive waste.
  • managing or distinguishing between legitimate and bad-acting uses of the powerful Facebook engine is a task that would require huge, huge investments of money and armies of workers
  • But back to Facebook. The point is that they’ve created a hugely powerful and potentially very dangerous machine
  • The core business model is based on harvesting the profits from the commercial uses of the machine and using algorithms and very, very limited personnel (relative to scale) to try to get a handle on the most outrageous and shocking abuses which the engine makes possible.
  • Zuckerberg may be a jerk and there really is a culture of bad acting within the organization. But it’s not about him being a jerk. Replace him and his team with non-jerks and you’d still have a similar core problem.
  • To manage the potential negative externalities, to take some responsibility for all the dangerous uses the engine makes possible would require money the owners are totally unwilling and in some ways are unable to spend.
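A toy numeric model of the cost/revenue asymmetry described in this entry. It assumes, purely for illustration, Metcalfe-style revenue proportional to the square of the user count as a stand-in for the superlinear growth the text describes, linear infrastructure costs, and a hypothetical per-user cost of serious moderation; every constant is invented.

```python
# Toy network-economics model: revenue scales superlinearly with users
# (n**2 here, a Metcalfe-style assumption) while running costs scale
# roughly linearly, so margins explode past a crossover point.
# All constants are hypothetical.

COST_PER_USER = 1.0        # servers, bandwidth, minimal support
REVENUE_COEFF = 0.00001    # value extracted per potential connection
MODERATION_PER_USER = 5.0  # guess at real externality management

for users in (1_000, 100_000, 10_000_000):
    revenue = REVENUE_COEFF * users**2
    lean_costs = COST_PER_USER * users
    responsible_costs = (COST_PER_USER + MODERATION_PER_USER) * users
    print(f"{users:>10,} users: revenue {revenue:>13,.0f}, "
          f"lean costs {lean_costs:>11,.0f}, "
          f"with moderation {responsible_costs:>11,.0f}")
```

In this sketch the network loses money when small, then quadratic revenue dwarfs even the moderation-inclusive linear costs at scale; the entry's point is that the owners pocket that gap rather than spend it on managing externalities.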
Javier E

Merck CEO Ken Frazier Discusses a COVID Cure, Racism, and Why Leaders Need to Walk the ... - 0 views

  • Frazier: It means that no matter where you are in the world, you should have access to this vaccine because it is a global pandemic. And my view is unless all of us are safe, none of us are safe.
  • when you think about the world that we live in with climate change, with ecosystem disruption, with populations moving around the way they do with human mobility the way it is, this pandemic is just the first of many that we could experience as a species because those conditions are only going to get worse going forward.
  • Neeley: The EU has barred Americans from traveling to Europe. Frazier: Yes, because they see the spikes in this country, which goes back to the fact that we aren't doing the things that we could do to suppress the epidemic. We Americans, we value liberty. I know this is not a political science conversation, but the fact of the matter is if you think about the United States of America and its history, liberty has been a very strong theme in our politics. And I've always believed it's because historically, we've had these two big, beautiful oceans protecting us from the rest of the world. And so we could say it's all about my liberty. It's not about security or group security.
  • ...9 more annotations...
  • Harvard Business School, I think, put out a study a few years ago showing that something like 30% of all hiring for what's called sort of bachelor's level jobs is for skill sets that don't require a bachelor's. So that alone excludes something like 70% of African Americans for no reason.
  • This whole pandemic, what it's done, it's unmasked the huge disparities that exist in our society already. I mean, the fact of the matter is this educational one we just talked about in terms of access to broadband and hardware. But you look at the disparities. I mean, an African American, according to a study at Yale, is 3.5 times more likely to die from COVID than a white person. Somebody who's Latinx is three times more likely to die. So this has unmasked these huge structural elements of racism that existed in this country for a long time. And we need to step up to those structural elements that determine the lives of so many people.
  • Well, this virus doesn't really care about that. And if you're going to do it, if you're going to exercise your liberty at my personal expense, then we can't control the pandemic. And the Europeans are looking at that and they're saying, "We don't want you bringing that into our shores."
  • We have to have the psychological armor to defend ourselves against the racism that's all around us, that's the first piece of advice.
  • The second piece of advice I give is that you really can't plan your career. You have to take advantage of all the opportunities that you have before you. And I believe that, at least in my own instance, what helped me a lot was that I wanted a certain level of autonomy and accountability. And when you do that, you get more responsibility because you are willing to go outside the lane of what most people do.
  • it's sort of humorous to me when people say to me, "I don't see color. I don't even notice that you're a Black man." Every minute of my life, I realize I'm a Black man. How they don't realize it is beyond me. But I really think it's important for young African Americans to have their own communities, to reinforce one another so that they can deal with that incoming.
  • My father Otis Frazier's father, Richard Frazier, was born in 1861. And so I have only one generation between me and slavery, which is quite unusual for someone at this stage. And my father only had a third grade education, and what passed for third grade education for an African American child in South Carolina between 1906 and 1909. But he was self-taught. He had immaculate habits of speech and dress and behavior, and he was his own man. And he gave me the single most important piece of advice I've ever had when I was growing up in the inner city. And here it is: he would say to me, Kenny, what other people think about you is none of your damn business. And the sooner you learn that, the better off you'll be.
  • now I can see when you're running a company like Merck and Wall Street is criticizing you because you don't do what they want you to do, I can hear my father saying, you know what they think about you is none of your damn business.
  • And that is what it meant to be a man to me, was to get up every morning, go to work, take care of your family, take your family to church on Sunday and to make sure that your children understood the importance of education and opportunity. And so, while I was born in a really tough inner city neighborhood, I always tell people I had the good fortune to be born in my mother and my father's house. More my father, because my mother died when I was really young and I was raised by a father who was not sentimental about his children, but had high standards. And it helped me a lot to have to live up to my father's standards, which I'm still living up to.
tongoscar

Ridgecrest earthquakes show how small faults can trigger big quakes - Los Angeles Times - 0 views

  • When an earthquake strikes, the instinct of many Californians is to ask: Which fault ruptured — the Newport-Inglewood, the Hayward, the mighty San Andreas?
  • But scientists are increasingly saying it’s not that simple.
  • New research shows that the Ridgecrest earthquakes that began in July ruptured at least two dozen faults.
  • ...9 more annotations...
  • The findings are important in helping understand how earthquakes can grow in the seconds after a fault ruptures, when two blocks of earth move away from each other.
  • The results provide even more evidence to support the idea that California faults once thought to be limited by their individual length can actually link together in a much more massive earthquake.
  • “The point is that the Landers earthquake and this earthquake are daisy-chaining up faults that previously were thought to rupture only by themselves, and that’s an important observation,”
  • The study raises the possibility that past earthquakes actually may have been bigger than previously thought.
  • In New Zealand, scientists were stunned at the bizarre map of the faults ruptured in the magnitude 7.8 Kaikoura earthquake of 2016, resembling an upside-down trident aimed at the silhouette of an eagle.
  • On a practical level, the research underscores the potential limitations of state earthquake zones designated to prevent new construction directly on top of faults,
  • Further analysis needs to be done to determine whether the 20 cross faults identified in the Ridgecrest study using computer analysis of shaking records actually broke the ground at the surface, according to Tim Dawson, a senior engineering geologist with the California Geological Survey.
  • A significant achievement of this study, Dolan said, was being able to image what faults look like deep underground, at a depth where earthquakes begin.
  • what this study proves is that the structural complexity continues deep underground where earthquakes begin, Dolan said. That's important because it may help scientists determine where future earthquakes are likely to stop, which tends to happen where faults become structurally complicated.
Javier E

UK mathematician wins richest prize in academia | Mathematics | The Guardian - 0 views

  • Martin Hairer, an Austrian-British researcher at Imperial College London, is the winner of the 2021 Breakthrough prize for mathematics, an annual $3m (£2.3m) award that has come to rival the Nobels in terms of kudos and prestige.
  • Hairer landed the prize for his work on stochastic analysis, a field that describes how random effects turn the maths of things like stirring a cup of tea, the growth of a forest fire, or the spread of a water droplet that has fallen on a tissue into a fiendishly complex problem.
  • His major work, a 180-page treatise that introduced the world to “regularity structures”, so stunned his colleagues that one suggested it must have been transmitted to Hairer by a more intelligent alien civilisation.
  • ...3 more annotations...
  • After dallying with physics at university, Hairer moved into mathematics. The realisation that ideas in theoretical physics can be overturned and swiftly consigned to the dustbin did not appeal. “I wouldn’t really want to put my name to a result that could be superseded by something else three years later,” he said. “In mathematics, if you obtain a result then that is it. It’s the universality of mathematics, you discover absolute truths.”
  • Hairer’s expertise lies in stochastic partial differential equations, a branch of mathematics that describes how randomness throws disorder into processes such as the movement of wind in a wind tunnel or the creeping boundary of a water droplet landing on a tissue. When the randomness is strong enough, solutions to the equations get out of control. “In some cases, the solutions fluctuate so wildly that it is not even clear what the equation meant in the first place,” he said.
  • With the invention of regularity structures, Hairer showed how the infinitely jagged noise that threw his equations into chaos could be reframed and tamed.
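For readers who want a concrete instance of the "wildly fluctuating" equations mentioned above: the benchmark example tamed by Hairer's regularity structures is the KPZ equation for random interface growth. A sketch in standard notation, with the technical details elided:

```latex
\[
  \partial_t h \;=\; \partial_x^2 h \;+\; (\partial_x h)^2 \;+\; \xi ,
\]
% h(t,x): height of a growing interface; \xi: space-time white noise.
% The noise is so rough that h is nowhere differentiable, so the
% nonlinearity (\partial_x h)^2 has no classical meaning. Regularity
% structures recover the equation as a limit of renormalized smooth
% approximations,
\[
  \partial_t h_\varepsilon \;=\; \partial_x^2 h_\varepsilon
    \;+\; (\partial_x h_\varepsilon)^2 \;-\; C_\varepsilon
    \;+\; \xi_\varepsilon ,
  \qquad C_\varepsilon \to \infty \ \text{as}\ \varepsilon \to 0,
\]
% whose solutions converge to a well-defined limit h.
```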
Javier E

He Wants to Save Classics From Whiteness. Can the Field Survive? - The New York Times - 0 views

  • Padilla laid out an indictment of his field. “If one were intentionally to design a discipline whose institutional organs and gatekeeping protocols were explicitly aimed at disavowing the legitimate status of scholars of color,” he said, “one could not do better than what classics has done.”
  • Padilla believes that classics is so entangled with white supremacy as to be inseparable from it. “Far from being extrinsic to the study of Greco-Roman antiquity,” he has written, “the production of whiteness turns on closer examination to reside in the very marrows of classics.”
  • Rather than kowtowing to criticism, Williams said, “maybe we should start defending our discipline.” She protested that it was imperative to stand up for the classics as the political, literary and philosophical foundation of European and American culture: “It’s Western civilization. It matters because it’s the West.” Hadn’t classics given us the concepts of liberty, equality and democracy?
  • ...46 more annotations...
  • Williams ceded the microphone, and Padilla was able to speak. “Here’s what I have to say about the vision of classics that you outlined,” he said. “I want nothing to do with it. I hope the field dies that you’ve outlined, and that it dies as swiftly as possible.”
  • “I believe in merit. I don’t look at the color of the author.” She pointed a finger in Padilla’s direction. “You may have got your job because you’re Black,” Williams said, “but I would prefer to think you got your job because of merit.”
  • What he did find was a slim blue-and-white textbook titled “How People Lived in Ancient Greece and Rome.” “Western civilization was formed from the union of early Greek wisdom and the highly organized legal minds of early Rome,” the book began. “The Greek belief in a person’s ability to use his powers of reason, coupled with Roman faith in military strength, produced a result that has come to us as a legacy, or gift from the past.” Thirty years later, Padilla can still recite those opening lines.
  • In 2017, he published a paper in the journal Classical Antiquity that compared evidence from antiquity and the Black Atlantic to draw a more coherent picture of the religious life of the Roman enslaved. “It will not do merely to adopt a pose of ‘righteous indignation’ at the distortions and gaps in the archive,” he wrote. “There are tools available for the effective recovery of the religious experiences of the enslaved, provided we work with these tools carefully and honestly.”
  • Padilla sensed that his pursuit of classics had displaced other parts of his identity, just as classics and “Western civilization” had displaced other cultures and forms of knowledge. Recovering them would be essential to dismantling the white-supremacist framework in which both he and classics had become trapped. “I had to actively engage in the decolonization of my mind,” he told me.
  • He also gravitated toward contemporary scholars like José Esteban Muñoz, Lorgia García Peña and Saidiya Hartman, who speak of race not as a physical fact but as a ghostly system.
  • In response to rising anti-immigrant sentiment in Europe and the United States, Mary Beard, perhaps the most famous classicist alive, wrote in The Wall Street Journal that the Romans “would have been puzzled by our modern problems with migration and asylum,” because the empire was founded on the “principles of incorporation and of the free movement of people.”
  • In November 2015, he wrote an essay for Eidolon, an online classics journal, clarifying that in Rome, as in the United States, paeans to multiculturalism coexisted with hatred of foreigners. Defending a client in court, Cicero argued that “denying foreigners access to our city is patently inhumane,” but ancient authors also recount the expulsions of whole “suspect” populations, including a roundup of Jews in 139 B.C., who were not considered “suitable enough to live alongside Romans.”
  • The job of classicists is not to “point out the howlers,” he said on a 2017 panel. “To simply take the position of the teacher, the qualified classicist who knows things and can point to these mistakes, is not sufficient.”
  • Dismantling structures of power that have been shored up by the classical tradition will require more than fact-checking; it will require writing an entirely new story about antiquity, and about who we are today
  • To find that story, Padilla is advocating reforms that would “explode the canon” and “overhaul the discipline from nuts to bolts,” including doing away with the label “classics” altogether.
  • “What I want to be thinking about in the next few weeks,” he told them, “is how we can be telling the story of the early Roman Empire not just through a variety of sources but through a variety of persons.” He asked the students to consider the lives behind the identities he had assigned them, and the way those lives had been shaped by the machinery of empire, which, through military conquest, enslavement and trade, creates the conditions for the large-scale movement of human beings.
  • ultimately, he decided that leaving enslaved characters out of the role play was an act of care. “I’m not yet ready to turn to a student and say, ‘You are going to be a slave.’”
  • Privately, even some sympathetic classicists worry that Padilla’s approach will only hasten the field’s decline. “I’ve spoken to undergrad majors who say that they feel ashamed to tell their friends they’re studying classics,”
  • “I very much admire Dan-el’s work, and like him, I deplore the lack of diversity in the classical profession,” Mary Beard told me via email. But “to ‘condemn’ classical culture would be as simplistic as to offer it unconditional admiration.”
  • In a 2019 talk, Beard argued that “although classics may become politicized, it doesn’t actually have a politics,” meaning that, like the Bible, the classical tradition is a language of authority — a vocabulary that can be used for good or ill by would-be emancipators and oppressors alike.
  • Over the centuries, classical civilization has acted as a model for people of many backgrounds, who turned it into a matrix through which they formed and debated ideas about beauty, ethics, power, nature, selfhood, citizenship and, of course, race
  • Anthony Grafton, the great Renaissance scholar, put it this way in his preface to “The Classical Tradition”: “An exhaustive exposition of the ways in which the world has defined itself with regard to Greco-Roman antiquity would be nothing less than a comprehensive history of the world.”
  • Classics as we know it today is a creation of the 18th and 19th centuries. During that period, as European universities emancipated themselves from the control of the church, the study of Greece and Rome gave the Continent its new, secular origin story. Greek and Latin writings emerged as a competitor to the Bible’s moral authority, which lent them a liberatory power
  • Historians stress that such ideas cannot be separated from the discourses of nationalism, colorism and progress that were taking shape during the modern colonial period, as Europeans came into contact with other peoples and their traditions. “The whiter the body is, the more beautiful it is,” Winckelmann wrote.
  • While Renaissance scholars were fascinated by the multiplicity of cultures in the ancient world, Enlightenment thinkers created a hierarchy with Greece and Rome, coded as white, on top, and everything else below.
  • Jefferson, along with most wealthy young men of his time, studied classics at college, where students often spent half their time reading and translating Greek and Roman texts. “Next to Christianity,” writes Caroline Winterer, a historian at Stanford, “the central intellectual project in America before the late 19th century was classicism.”
  • Of the 2.5 million people living in America in 1776, perhaps only 3,000 had gone to college, but that number included many of the founders
  • They saw classical civilization as uniquely educative — a “lamp of experience,” in the words of Patrick Henry, that could light the path to a more perfect union. However true it was, subsequent generations would come to believe, as Hannah Arendt wrote in “On Revolution,” that “without the classical example … none of the men of the Revolution on either side of the Atlantic would have possessed the courage for what then turned out to be unprecedented action.”
  • Comparisons between the United States and the Roman Empire became popular as the country emerged as a global power. Even after Latin and Greek were struck from college-entrance exams, the proliferation of courses on “great books” and Western civilization, in which classical texts were read in translation, helped create a coherent national story after the shocks of industrialization and global warfare.
  • even as the classics were pulled apart, laughed at and transformed, they continued to form the raw material with which many artists shaped their visions of modernity.
  • Over the centuries, thinkers as disparate as John Adams and Simone Weil have likened classical antiquity to a mirror. Generations of intellectuals, among them feminist, queer and Black scholars, have seen something of themselves in classical texts, flashes of recognition that held a kind of liberatory promise
  • The language that is used to describe the presence of classical antiquity in the world today — the classical tradition, legacy or heritage — contains within it the idea of a special, quasi-genetic relationship. In his lecture “There Is No Such Thing as Western Civilization,” Kwame Anthony Appiah (this magazine’s Ethicist columnist) mockingly describes the belief in such a kinship as the belief in a “golden nugget” of insight — a precious birthright and shimmering sign of greatness — that white Americans and Europeans imagine has been passed down to them from the ancients.
  • To see classics the way Padilla sees it means breaking the mirror; it means condemning the classical legacy as one of the most harmful stories we’ve told ourselves
  • Padilla is wary of colleagues who cite the radical uses of classics as a way to forestall change; he believes that such examples have been outmatched by the field’s long alliance with the forces of dominance and oppression.
  • Classics and whiteness are the bones and sinew of the same body; they grew strong together, and they may have to die together. Classics deserves to survive only if it can become “a site of contestation” for the communities who have been denigrated by it in the past.
  • if classics fails his test, Padilla and others are ready to give it up. “I would get rid of classics altogether,” Walter Scheidel, another of Padilla’s former advisers at Stanford, told me. “I don’t think it should exist as an academic field.”
  • One way to get rid of classics would be to dissolve its faculties and reassign their members to history, archaeology and language departments.
  • many classicists are advocating softer approaches to reforming the discipline, placing the emphasis on expanding its borders. Schools including Howard and Emory have integrated classics with Ancient Mediterranean studies, turning to look across the sea at Egypt, Anatolia, the Levant and North Africa. The change is a declaration of purpose: to leave behind the hierarchies of the Enlightenment and to move back toward the Renaissance model of the ancient world as a place of diversity and mixture.
  • Ian Morris put it more bluntly. “Classics is a Euro-American foundation myth,” Morris said to me. “Do we really want that sort of thing?”
  • “There’s a more interesting story to be told about the history of what we call the West, the history of humanity, without valorizing particular cultures in it,” said Josephine Quinn, a professor of ancient history at Oxford. “It seems to me the really crucial mover in history is always the relationship between people, between cultures.”
  • “In some moods, I feel that this is just a moment of despair, and people are trying to find significance even if it only comes from self-accusation,” he told me. “I’m not sure that there is a discipline that is exempt from the fact that it is part of the history of this country. How distinctly wicked is classics? I don’t know that it is.”
  • “One of the dubious successes of my generation is that it did break the canon,” Richlin told me. “I don’t think we could believe at the time that we would be putting ourselves out of business, but we did.” She added: “If they blew up the classics departments, that would really be the end.”
  • Padilla, like Douglass, now sees the moment of absorption into the classical, literary tradition as simultaneous with his apprehension of racial difference; he can no longer find pride or comfort in having used it to bring himself out of poverty.
  • “Claiming dignity within this system of structural oppression,” Padilla has said, “requires full buy-in into its logic of valuation.” He refuses to “praise the architects of that trauma as having done right by you at the end.”
  • Last June, as racial-justice protests unfolded across the nation, Padilla turned his attention to arenas beyond classics. He and his co-authors — the astrophysicist Jenny Greene, the literary theorist Andrew Cole and the poet Tracy K. Smith — began writing their open letter to Princeton with 48 proposals for reform. “Anti-Blackness is foundational to America,” the letter began. “Indifference to the effects of racism on this campus has allowed legitimate demands for institutional support and redress in the face of microaggression and outright racist incidents to go long unmet.”
  • Padilla believes that the uproar over free speech is misguided. “I don’t see things like free speech or the exchange of ideas as ends in themselves,” he told me. “I have to be honest about that. I see them as a means to the end of human flourishing.”
  • “There is a certain kind of classicist who will look on what transpired and say, ‘Oh, that’s not us,’” Padilla said when we spoke recently. “What is of interest to me is why is it so imperative for classicists of a certain stripe to make this discursive move? ‘This is not us.’”
  • Joel Christensen, the Brandeis professor, now feels that it is his “moral and ethical and intellectual responsibility” to teach classics in a way that exposes its racist history. “Otherwise we’re just participating in propaganda,”
  • Christensen, who is 42, was in graduate school before he had his “crisis of faith,” and he understands the fear that many classicists may experience at being asked to rewrite the narrative of their life’s work. But, he warned, “that future is coming, with or without Dan-el.”
  • On Jan. 6, Padilla turned on the television minutes after the windows of the Capitol were broken. In the crowd, he saw a man in a Greek helmet with TRUMP 2020 painted in white. He saw a man in a T-shirt bearing a golden eagle on a fasces — symbols of Roman law and governance — below the logo 6MWE, which stands for “Six Million Wasn’t Enough.”
Javier E

Opinion | You Are the Object of Facebook's Secret Extraction Operation - The New York T... - 0 views

  • Facebook is not just any corporation. It reached trillion-dollar status in a single decade by applying the logic of what I call surveillance capitalism — an economic system built on the secret extraction and manipulation of human data
  • Facebook and other leading surveillance capitalist corporations now control information flows and communication infrastructures across the world.
  • These infrastructures are critical to the possibility of a democratic society, yet our democracies have allowed these companies to own, operate and mediate our information spaces unconstrained by public law.
  • The result has been a hidden revolution in how information is produced, circulated and acted upon
  • The world’s liberal democracies now confront a tragedy of the “un-commons.” Information spaces that people assume to be public are strictly ruled by private commercial interests for maximum profit.
  • The internet as a self-regulating market has been revealed as a failed experiment. Surveillance capitalism leaves a trail of social wreckage in its wake: the wholesale destruction of privacy, the intensification of social inequality, the poisoning of social discourse with defactualized information, the demolition of social norms and the weakening of democratic institutions.
  • These social harms are not random. They are tightly coupled effects of evolving economic operations. Each harm paves the way for the next and is dependent on what went before.
  • There is no way to escape the machine systems that surveil us.
  • All roads to economic and social participation now lead through surveillance capitalism’s profit-maximizing institutional terrain, a condition that has intensified during nearly two years of global plague.
  • Will Facebook’s digital violence finally trigger our commitment to take back the “un-commons”?
  • Will we confront the fundamental but long ignored questions of an information civilization: How should we organize and govern the information and communication spaces of the digital century in ways that sustain and advance democratic values and principles?
  • Mark Zuckerberg’s start-up did not invent surveillance capitalism. Google did that. In 2000, when only 25 percent of the world’s information was stored digitally, Google was a tiny start-up with a great search product but little revenue.
  • By 2001, in the teeth of the dot-com bust, Google’s leaders found their breakthrough in a series of inventions that would transform advertising. Their team learned how to combine massive data flows of personal information with advanced computational analyses to predict where an ad should be placed for maximum “click through.”
  • Google’s scientists learned how to extract predictive metadata from this “data exhaust” and use it to analyze likely patterns of future behavior.
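To make the mechanism concrete: below is a minimal sketch, entirely my own invention rather than anything from Google's systems, of what "extracting predictive metadata from data exhaust" amounts to in practice. The feature names, weights and data are all assumptions; the sketch fits a small logistic-regression model to synthetic behavioral traces and outputs the product Zuboff describes: a click-through prediction.

```python
# Illustrative toy only: not Google's code. Feature names, weights and
# all numbers here are invented for the sketch.
import math
import random

random.seed(0)

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic "data exhaust": incidental behavioral traces per user,
# scaled to [0, 1] (e.g. share of the day active, query intensity,
# fraction of pages bounced).
def synthetic_user():
    x = [random.random(), random.random(), random.random()]
    # Hidden "true" behavior that the model will try to recover.
    p_click = sigmoid(2.0 * x[0] + 1.5 * x[1] - 1.0 * x[2] - 1.0)
    return x, 1 if random.random() < p_click else 0

data = [synthetic_user() for _ in range(5000)]

# Logistic regression fit by stochastic gradient descent on log loss.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.1
for _ in range(30):
    for x, y in data:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        grad = p - y  # gradient of the log loss with respect to the logit
        w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
        b -= lr * grad

# The "prediction product": a click probability for a fresh user,
# used to decide which ad placement is worth the most.
x_new = [0.8, 0.6, 0.2]
print("predicted click-through:",
      round(sigmoid(sum(wi * xi for wi, xi in zip(w, x_new)) + b), 3))
```

The structural point survives the toy's crudeness: the inputs are traces users never meant to disclose, and the output is a prediction sold into an ad market rather than a service rendered to the user.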
  • Prediction was the first imperative that determined the second imperative: extraction.
  • Lucrative predictions required flows of human data at unimaginable scale. Users did not suspect that their data was secretly hunted and captured from every corner of the internet and, later, from apps, smartphones, devices, cameras and sensors
  • User ignorance was understood as crucial to success. Each new product was a means to more “engagement,” a euphemism used to conceal illicit extraction operations.
  • When asked “What is Google?” the co-founder Larry Page laid it out in 2001,
  • “Storage is cheap. Cameras are cheap. People will generate enormous amounts of data,” Mr. Page said. “Everything you’ve ever heard or seen or experienced will become searchable. Your whole life will be searchable.”
  • Instead of selling search to users, Google survived by turning its search engine into a sophisticated surveillance medium for seizing human data
  • Company executives worked to keep these economic operations secret, hidden from users, lawmakers, and competitors. Mr. Page opposed anything that might “stir the privacy pot and endanger our ability to gather data,” wrote Douglas Edwards, an early Google employee, in his account of the company’s first years.
  • As recently as 2017, Eric Schmidt, the executive chairman of Google’s parent company, Alphabet, acknowledged the role of Google’s algorithmic ranking operations in spreading corrupt information. “There is a line that we can’t really get across,” he said. “It is very difficult for us to understand truth.” A company with a mission to organize and make accessible all the world’s information using the most sophisticated machine systems cannot discern corrupt information.
  • This is the economic context in which disinformation wins
  • In March 2008, Mr. Zuckerberg hired Google’s head of global online advertising, Sheryl Sandberg, as his second in command. Ms. Sandberg had joined Google in 2001 and was a key player in the surveillance capitalism revolution. She led the build-out of Google’s advertising engine, AdWords, and its AdSense program, which together accounted for most of the company’s $16.6 billion in revenue in 2007.
  • A Google multimillionaire by the time she met Mr. Zuckerberg, Ms. Sandberg had a canny appreciation of Facebook’s immense opportunities for extraction of rich predictive data. “We have better information than anyone else. We know gender, age, location, and it’s real data as opposed to the stuff other people infer,” Ms. Sandberg explained
  • The company had “better data” and “real data” because it had a front-row seat to what Mr. Page had called “your whole life.”
  • Facebook paved the way for surveillance economics with new privacy policies in late 2009. The Electronic Frontier Foundation warned that new “Everyone” settings eliminated options to restrict the visibility of personal data, instead treating it as publicly available information.
  • Mr. Zuckerberg “just went for it” because there were no laws to stop him from joining Google in the wholesale destruction of privacy. If lawmakers wanted to sanction him as a ruthless profit-maximizer willing to use his social network against society, then 2009 to 2010 would have been a good opportunity.
  • Facebook was the first follower, but not the last. Google, Facebook, Amazon, Microsoft and Apple are private surveillance empires, each with distinct business models.
  • In 2021 these five U.S. tech giants represent five of the six largest publicly traded companies by market capitalization in the world.
  • As we move into the third decade of the 21st century, surveillance capitalism is the dominant economic institution of our time. In the absence of countervailing law, this system successfully mediates nearly every aspect of human engagement with digital information
  • Today all apps and software, no matter how benign they appear, are designed to maximize data collection.
  • Historically, great concentrations of corporate power were associated with economic harms. But when human data are the raw material and predictions of human behavior are the product, then the harms are social rather than economic
  • The difficulty is that these novel harms are typically understood as separate, even unrelated, problems, which makes them impossible to solve. Instead, each new stage of harm creates the conditions for the next stage.
  • Fifty years ago the conservative economist Milton Friedman exhorted American executives, “There is one and only one social responsibility of business — to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game.” Even this radical doctrine did not reckon with the possibility of no rules.
  • With privacy out of the way, ill-gotten human data are concentrated within private corporations, where they are claimed as corporate assets to be deployed at will.
  • The sheer size of this knowledge gap is conveyed in a leaked 2018 Facebook document, which described the company’s artificial intelligence hub ingesting trillions of behavioral data points every day and producing six million behavioral predictions each second.
  • Next, these human data are weaponized as targeting algorithms, engineered to maximize extraction and aimed back at their unsuspecting human sources to increase engagement
  • Targeting mechanisms change real life, sometimes with grave consequences. For example, the Facebook Files depict Mr. Zuckerberg using his algorithms to reinforce or disrupt the behavior of billions of people. Anger is rewarded or ignored. News stories become more trustworthy or unhinged. Publishers prosper or wither. Political discourse turns uglier or more moderate. People live or die.
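The targeting step can be made just as mechanical. The toy below is my own illustration, not Facebook's code; the Post fields, weights and scores are all invented. It shows why an objective of pure predicted engagement, with no term for accuracy or well-being, pushes outrage to the top of a feed.

```python
# Illustrative toy only: not Facebook's ranking system. Fields,
# weights and scores are invented for the sketch.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float    # stand-ins for upstream model outputs
    predicted_comments: float
    predicted_angry_reacts: float

def engagement_score(p: Post) -> float:
    # A pure engagement objective. Note what is absent: no term for
    # truthfulness, civility or the reader's well-being.
    return (1.0 * p.predicted_clicks
            + 2.0 * p.predicted_comments
            + 1.5 * p.predicted_angry_reacts)

feed = [
    Post("calm local news item", 0.10, 0.02, 0.01),
    Post("outrage-bait rumor", 0.25, 0.30, 0.40),
    Post("friend's vacation photos", 0.20, 0.10, 0.00),
]

# Rank the feed by predicted engagement; the rumor wins the sort.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post.text}")
```

On this reading, "anger is rewarded" is not malice written into the code; it is an externality the objective never prices.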
  • Occasionally the fog clears to reveal the ultimate harm: the growing power of tech giants willing to use their control over critical information infrastructure to compete with democratically elected lawmakers for societal dominance.
  • when it comes to the triumph of surveillance capitalism’s revolution, it is the lawmakers of every liberal democracy, especially in the United States, who bear the greatest burden of responsibility. They allowed private capital to rule our information spaces during two decades of spectacular growth, with no laws to stop it.
  • All of it begins with extraction. An economic order founded on the secret massive-scale extraction of human data assumes the destruction of privacy as a nonnegotiable condition of its business operations.
  • We can’t fix all our problems at once, but we won’t fix any of them, ever, unless we reclaim the sanctity of information integrity and trustworthy communications
  • The abdication of our information and communication spaces to surveillance capitalism has become the meta-crisis of every republic, because it obstructs solutions to all other crises.
  • Neither Google, nor Facebook, nor any other corporate actor in this new economic order set out to destroy society, any more than the fossil fuel industry set out to destroy the earth.
  • like global warming, the tech giants and their fellow travelers have been willing to treat their destructive effects on people and society as collateral damage — the unfortunate but unavoidable byproduct of perfectly legal economic operations that have produced some of the wealthiest and most powerful corporations in the history of capitalism.
  • Where does that leave us?
  • Democracy is the only countervailing institutional order with the legitimate authority and power to change our course. If the ideal of human self-governance is to survive the digital century, then all solutions point to one solution: a democratic counterrevolution.
  • instead of the usual laundry lists of remedies, lawmakers need to proceed with a clear grasp of the adversary: a single hierarchy of economic causes and their social harms.
  • We can’t rid ourselves of later-stage social harms unless we outlaw their foundational economic causes
  • This means we move beyond the current focus on downstream issues such as content moderation and policing illegal content. Such “remedies” only treat the symptoms without challenging the illegitimacy of the human data extraction that funds private control over society’s information spaces
  • Similarly, structural solutions like “breaking up” the tech giants may be valuable in some cases, but they will not affect the underlying economic operations of surveillance capitalism.
  • Instead, discussions about regulating big tech should focus on the bedrock of surveillance economics: the secret extraction of human data from realms of life once called “private.”
  • No secret extraction means no illegitimate concentrations of knowledge about people. No concentrations of knowledge means no targeting algorithms. No targeting means that corporations can no longer control and curate information flows and social speech or shape human behavior to favor their interests
  • the sober truth is that we need lawmakers ready to engage in a once-a-century exploration of far more basic questions:
  • How should we structure and govern information, connection and communication in a democratic digital century?
  • What new charters of rights, legislative frameworks and institutions are required to ensure that data collection and use serve the genuine needs of individuals and society?
  • What measures will protect citizens from unaccountable power over information, whether it is wielded by private companies or governments?
  • The corporation that is Facebook may change its name or its leaders, but it will not voluntarily change its economics.