
TOK Friends: Group items tagged Logical


Javier E

Opinion | Who Will Teach Us How to Feel? - The New York Times - 0 views

  • T Magazine had a very good idea. They gathered some artists and museum curators and asked them to name the artworks that define the contemporary age — pieces created anywhere in the world since 1970.
  • of the 25 works they chose, very few are paintings or sculptures.
  • Most of the pieces selected are intellectual concepts or political attitudes expressed through video, photographs, installations or words.
  • ...12 more annotations...
  • Of the 27 artists recognized, 20 were born in the U.S.
  • most of these artists haven’t captured or maybe even appealed to a mass audience
  • Most of the artists have adopted a similar pose: political provocateur. The works are less beautiful creations to be experienced and more often political statements to be decoded
  • What you see when all these works are brought together is how the aesthetic has given way to the political, how the inner life has given way to the protest gesture.
  • Among these 25 pieces, 20 are impersonal and only five allow you to see what life is like for another human being
  • Only a few explore relationships and emotional connection.
  • There almost seems to be a taboo now against capturing states like joy, temptation, gratitude, exaltation, betrayal, forgiveness and longing.
  • one of the things art has traditionally done is educate the emotions
  • Lisa Feldman Barrett and other neuroscientists argue that emotions aren’t baked into our nature as things all humans share. They are constructed by culture — art and music, and relationships.
  • When we see the depth of psychological expression in a Rembrandt portrait, or experience the intimacy of a mother and daughter in a Mary Cassatt, we’re not gaining a new fact, but we’re experiencing a new emotion. We’re widening the repertoire of ways we can feel and can communicate feelings to others.
  • Barrett uses the phrase “emotional granularity” to capture the reality that some people — and some eras — experience a wider range and specificity of emotions than others.
  • People with highly educated emotions can be astonished by the complexity of other people without feeling the need to judge them immediately as good or bad according to some political logic.
Javier E

The science of influencing people: six ways to win an argument | Science | The Guardian - 1 views

  • we have all come across people who appear to have next to no understanding of world events – but who talk with the utmost confidence and conviction
  • the latest psychological research can now help us to understand why
  • the “illusion of explanatory depth”
  • ...28 more annotations...
  • The problem is that we confuse a shallow familiarity with general concepts for real, in-depth knowledge.
  • our knowledge is also highly selective: we conveniently remember facts that support our beliefs and forget others
  • Psychological studies show that people fail to notice the logical fallacies in an argument if the conclusion supports their viewpoint
  • “motivated reasoning”
  • A high standard of education doesn’t necessarily protect us from these flaws
  • That false sense of expertise can, in turn, lead them to feel that they have the licence to be more closed-minded in their political views – an attitude known as “earned dogmatism”.
  • “People confuse their current level of understanding with their peak knowledge,”
  • Graduates, for instance, often overestimate their understanding of their degree subject:
  • recent psychological research also offers evidence-based ways towards achieving more fruitful discussions.
  • a simple but powerful way of deflating someone’s argument is to ask for more detail. “You need to get the ‘other side’ focusing on how something would play itself out, in a step by step fashion”
  • By revealing the shallowness of their existing knowledge, this prompts a more moderate and humble attitude.
  • You need to ask how something works to get the effect
  • If you are trying to debunk a particular falsehood – like a conspiracy theory or fake news – you should make sure that your explanation offers a convincing, coherent narrative that fills all the gaps left in the other person’s understanding
  • The persuasive power of well-constructed narratives means that it’s often useful to discuss the sources of misinformation, so that the person can understand why they were being misled in the first place
  • Each of our beliefs is deeply rooted in a much broader and more complex political ideology. Climate crisis denial, for instance, is now inextricably linked to beliefs in free trade, capitalism and the dangers of environmental regulation.
  • Attacking one issue may therefore threaten to unravel someone’s whole worldview – a feeling that triggers emotionally charged motivated reasoning. It is for this reason that highly educated Republicans in the US deny the overwhelming evidence.
  • disentangle the issue at hand from their broader beliefs, or to explain how the facts can still be accommodated into their worldview.
  • “All people have multiple identities,” says Prof Jay Van Bavel at New York University, who studies the neuroscience of the “partisan brain”. “These identities can become active at any given time, depending on the circumstances.”
  • you might have more success by appealing to another part of the person’s identity entirely.
  • when people are asked to first reflect on their other, nonpolitical values, they tend to become more objective in discussion on highly partisan issues, as they stop viewing facts through their ideological lens.
  • Another simple strategy to encourage a more detached and rational mindset is to ask your conversation partner to imagine the argument from the viewpoint of someone from another country
  • The aim is to help them recognise that they can change their mind on certain issues while staying true to other important elements of their personality.
  • this strategy increases “psychological distance” from the issue at hand and cools emotionally charged reasoning so that you can see things more objectively.
  • If you are considering policies with potentially long-term consequences, you could ask them to imagine viewing the situation through the eyes of someone in the future
  • people are generally much more rational in their arguments, and more willing to own up to the limits of their knowledge and understanding, if they are treated with respect and compassion.
  • Aggression, by contrast, leads them to feel that their identity is threatened, which in turn can make them closed-minded
  • Assuming that the purpose of your argument is to change minds, rather than to signal your own superiority, you are much more likely to achieve your aims by arguing gently and kindly rather than belligerently, and affirming your respect for the person, even if you are telling them some hard truths
  • As a bonus, you will also come across better to onlookers. “There’s a lot of work showing that third-party observers always attribute high levels of competence when the person is conducting themselves with more civility,”
douglasn89

Champs or chumps?: China and currency manipulation | The Economist - 0 views

  • They do not include, for example, the domestic purchasing power of a currency.
    • douglasn89
       
      In Economics, a lot of values are taken for granted and/or ignored.
  • This illustrates one of the method’s flaws: in terms of the goods and services that it can actually buy, the Swiss franc is in fact among the world’s most overvalued currencies.
    • douglasn89
       
      Countries can raise or lower their currency value from an international standpoint without much change within the country itself.
  • As for China itself, it has been fighting to prop up the yuan in the face of capital outflows, and its score is in fact negative
    • douglasn89
       
      This reminds us that in economics, many ideas that seem logical at first can actually be counterintuitive.
Javier E

Farewell to Kenneth Arrow, a Gentle Genius of Economics - WSJ - 0 views

  • Is there a voting system that can be relied on to distill the will of a group of people? Arrow’s impossibility theorem regarding voting and combining preferences put him in the rarefied group of economists with theorems named after them.
  • Drawing upon mathematical logic, it shows that there is no possible voting scheme that can consistently and sensibly reflect the preferences of a set of individuals with diverse views
  • Any scheme that could ever be invented will be at risk of perverse outcomes, where, for example, the choice between options A and B is affected by the presence or absence of option C; or where a vote switch by one person toward option A makes it less likely to prevail. (A small worked example of this follows the list below.)
  • ...3 more annotations...
  • it also explained why committees have so much trouble coming to consistent conclusions and why, with an increasingly polarized electorate, democracy can become increasingly dysfunctional.
  • until Kenneth drew on the techniques of topology (that is, the study of geometric properties and spatial relations), no one had ever been able to establish precise conditions under which there would be prices that would clear all markets, or under which one could assume that the market outcome was optimal
  • in the early 1950s, he clarified the very specific conditions under which market outcomes were for the best and, of equal importance, the far more general conditions under which public interventions in markets had the potential to make things better.
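
The "perverse outcome" in which the group's choice between A and B depends on whether C is on the ballot has a standard textbook illustration. Below is a minimal sketch in Python with hypothetical ballots (not taken from the article): under a Borda count, the same five voters collectively rank B above A when option C is present, and A above B when C is removed, even though no individual voter's ranking of A versus B changes.

def borda_winner(ballots):
    # Score each option by its position on each ballot (last place = 0 points)
    # and return the winner together with the full score table.
    options = ballots[0]
    scores = {opt: 0 for opt in options}
    for ranking in ballots:
        for position, opt in enumerate(ranking):
            scores[opt] += len(ranking) - 1 - position
    return max(scores, key=scores.get), scores

# Five hypothetical voters: three rank A > B > C, two rank B > C > A.
with_c = [["A", "B", "C"]] * 3 + [["B", "C", "A"]] * 2
# The same voters with option C removed; each one's A-vs-B preference is unchanged.
without_c = [["A", "B"]] * 3 + [["B", "A"]] * 2

print(borda_winner(with_c))     # ('B', {'A': 6, 'B': 7, 'C': 2}) -- B beats A
print(borda_winner(without_c))  # ('A', {'A': 3, 'B': 2})         -- A beats B

Arrow's theorem says this kind of reversal cannot simply be engineered away: any rule for combining diverse rankings over three or more options must give up at least one of his fairness conditions somewhere.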
sissij

Why Silicon Valley Titans Train Their Brains with Philosophy | Big Think - 0 views

  • To alleviate the stresses and open their minds, the execs have been known to experiment with microdosing on psychedelics, taking brain-stimulating nootropics, and sleeping in phases. What’s their latest greatest brain hack? Philosophy.
  • The guidance focuses on using reason and logic to unmask illusions about your life or work.
  • He thinks that approach can obscure the true understanding of human life. In an interview with Quartz, he says that rather than ask “How can I be more successful?” it’s actually more important to ask - "Why be successful?”
  • ...2 more annotations...
  • introduces thought and balance to people’s lives.
  • Thomas S. Kuhn
  •  
    I found it very interesting that philosophy can be linked to modern Silicon Valley. Silicon Valley always gives me a modern impression, as if it is the leader of human technology and a believer in science. I am surprised that many people in Silicon Valley are actually interested in philosophy, something I had considered not practical at all. I think this shows the importance of being cross-disciplinary. --Sissi (5/23/2017)
Javier E

The Science of Snobbery: How We're Duped Into Thinking Fancy Things Are Better - The At... - 0 views

  • Expert judges and amateurs alike claim to judge classical musicians based on sound. But Tsay’s research suggests that the original judges, despite their experience and expertise, judged the competition (which they heard and watched live) based on visual information, just as amateurs do.
  • just like with classical music, we do not appraise wine in the way that we expect. 
  • Priceonomics revisited this seemingly damning research: the lack of correlation between wine enjoyment and price in blind tastings, the oenology students tricked by red food dye into describing a white wine like a red, a distribution of medals at tastings equivalent to what one would expect from pure chance, the grand crus described like cheap wines and vice-versa when the bottles are switched.
  • ...26 more annotations...
  • Taste does not simply equal your taste buds. It draws on information from all our senses as well as context. As a result, food is susceptible to the same trickery as wine. Adding yellow food dye to vanilla pudding leads people to experience a lemony taste. Diners eating in the dark at a chic concept restaurant confuse veal for tuna. Branding, packaging, and price tags are equally important to enjoyment. Cheap fish is routinely passed off as its pricier cousins at seafood and sushi restaurants. 
  • Just like with wine and classical music, we often judge food based on very different criteria than what we claim. The result is that our perceptions are easily skewed in ways we don’t anticipate. 
  • What does it mean for wine that presentation so easily trumps the quality imbued by being grown on premium Napa land or years of fruitful aging? Is it comforting that the same phenomenon is found in food and classical music, or is it a strike against the authenticity of our enjoyment of them as well? How common must these manipulations be until we concede that the influence of the price tag of a bottle of wine or the visual appearance of a pianist is not a trick but actually part of the quality?
  • To answer these questions, we need to investigate the underlying mechanism that leads us to judge wine, food, and music by criteria other than what we claim to value. And that mechanism seems to be the quick, intuitive judgments our minds unconsciously make
  • this unknowability also makes it easy to be led astray when our intuition makes a mistake. We may often be able to count on the price tag or packaging of food and wine for accurate information about quality. But as we believe that we’re judging based on just the product, we fail to recognize when presentation manipulates our snap judgments.
  • Participants were just as effective when watching 6 second video clips and when comparing their ratings to ratings of teacher effectiveness as measured by actual student test performance. 
  • The power of intuitive first impressions has been demonstrated in a variety of other contexts. One experiment found that people predicted the outcome of political elections remarkably well based on silent 10 second video clips of debates - significantly outperforming political pundits and predictions made based on economic indicators.
  • In a real world case, a number of art experts successfully identified a 6th century Greek statue as a fraud. Although the statue had survived a 14 month investigation by a respected museum that included the probings of a geologist, they instantly recognized something was off. They just couldn’t explain how they knew.
  • Cases like this represent the canon behind the idea of the “adaptive unconscious,” a concept made famous by journalist Malcolm Gladwell in his book Blink. The basic idea is that we constantly, quickly, and unconsciously do the equivalent of judging a book by its cover. After all, a cover provides a lot of relevant information in a world in which we don’t have time to read every page.
  • Gladwell describes the adaptive unconscious as “a kind of giant computer that quickly and quietly processes a lot of the data we need in order to keep functioning as human beings.”
  • In a famous experiment, psychologist Nalini Ambady provided participants in an academic study with 30 second silent video clips of a college professor teaching a class and asked them to rate the effectiveness of the professor.
  • In follow up experiments, Chia-Jung Tsay found that those judging musicians’ auditions based on visual cues were not giving preference to attractive performers. Rather, they seemed to look for visual signs of relevant characteristics like passion, creativity, and uniqueness. Seeing signs of passion is valuable information. But in differentiating between elite performers, it gives an edge to someone who looks passionate over someone whose play is passionate
  • Outside of these more eccentric examples, it’s our reliance on quick judgments, and ignorance of their workings, that cause people to act on ugly, unconscious biases
  • It’s also why - from a business perspective - packaging and presentation is just as important as the good or service on offer. Why marketing is just as important as product. 
  • Gladwell ends Blink optimistically. By paying closer attention to our powers of rapid cognition, he argues, we can avoid its pitfalls and harness its powers. We can blindly audition musicians behind a screen, look at a piece of art devoid of other context, and pay particular attention to possible unconscious bias in our performance reports.
  • But Gladwell’s success in demonstrating how many calculations our adaptive unconscious performs without our awareness undermines his hopeful message of consciously harnessing its power.
  • As a former world-class tennis player and coach of over 50 years, Braden is a perfect example of the ideas behind thin slicing. But if he can’t figure out what his unconscious is up to when he recognizes double faults, why should anyone else expect to be up to the task?
  • flawed judgment in fields like medicine and investing has more serious consequences. The fact that expertise is so tricky leads psychologist Daniel Kahneman to assert that most experts should seek the assistance of statistics and algorithms in making decisions.
  • In his book Thinking, Fast and Slow, he describes our two modes of thought: System 1, like the adaptive unconscious, is our “fast, instinctive, and emotional” intuition. System 2 is our “slower, more deliberative, and more logical” conscious thought. Kahneman believes that we often leave decisions up to System 1 and generally place far “too much confidence in human judgment” due to the pitfalls of our intuition described above.
  • Not every judgment will be made in a field that is stable and regular enough for an algorithm to help us make judgments or predictions. But in those cases, he notes, “Hundreds of studies have shown that wherever we have sufficient information to build a model, it will perform better than most people.”
  • Experts can avoid the pitfalls of intuition more easily than laypeople. But they need help too, especially as our collective confidence in expertise leads us to overconfidence in their judgments. 
  • This article has referred to the influence of price tags and context on products and experiences like wine and classical music concerts as tricks that skew our perception. But maybe we should consider them a real, actual part of the quality.
  • Losing ourselves in a universe of relativism, however, will lead us to miss out on anything new or unique. Take the example of the song “Hey Ya!” by Outkast. When the music industry heard it, they felt sure it would be a hit. When it premiered on the radio, however, listeners changed the channel. The song sounded too dissimilar from songs people liked, so they responded negatively. 
  • It took time for people to get familiar with the song and realize that they enjoyed it. Eventually “Hey Ya!” became the hit of the summer.
  • Many boorish people talking about the ethereal qualities of great wine probably can't even identify cork taint because their impressions are dominated by the price tag and the wine label. But the classic defense of wine - that you need to study it to appreciate it - is also vindicated. The open question - which is both editorial and empiric - is what it means for the industry that constant vigilance and substantial study are needed to dependably appreciate wine for the product quality alone. But the question is relevant to the enjoyment of many other products and experiences that we enjoy in life.
  • Maybe the most important conclusion is to not only recognize the fallibility of our judgments and impressions, but to recognize when it matters, and when it doesn’t
anonymous

Are Mass Murderers Insane? Usually Not, Researchers Say - The New York Times - 0 views

  • Ditto for Dylann Roof, the racist who murdered nine African-American churchgoers in South Carolina in 2015, and Christopher Harper-Mercer, the angry young man who killed nine people at a community college in Oregon the same year.
  • Most mass murderers instead belong to a rogue’s gallery of the disgruntled and aggrieved, whose anger and intentions wax and wane over time, eventually curdling into violence in the wake of some perceived humiliation.
  • This evolution proceeds rationally and logically, at least in the murderer’s mind. The unthinkable becomes thinkable, then inevitable.
  • ...1 more annotation...
  • Analyzing his database, Dr. Stone has concluded that about 65 percent of mass killers exhibited no evidence of a severe mental disorder; 22 percent likely had psychosis, the delusional thinking and hallucinations that characterize schizophrenia, or sometimes accompany mania and severe depression.
Javier E

The trouble with atheists: a defence of faith | Books | The Guardian - 1 views

  • My daughter has just turned six. Some time over the next year or so, she will discover that her parents are weird. We're weird because we go to church.
  • This means as she gets older there'll be voices telling her what it means, getting louder and louder until by the time she's a teenager they'll be shouting right in her ear. It means that we believe in a load of bronze-age absurdities. That we fetishise pain and suffering. That we advocate wishy-washy niceness. That we're too stupid to understand the irrationality of our creeds. That we build absurdly complex intellectual structures on the marshmallow foundations of a fantasy. That we're savagely judgmental.
  • that's not the bad news. Those are the objections of people who care enough about religion to object to it. Or to rent a set of recreational objections from Richard Dawkins or Christopher Hitchens. As accusations, they may be a hodge-podge, but at least they assume there's a thing called religion which looms with enough definition and significance to be detested.
  • ...25 more annotations...
  • the really painful message our daughter will receive is that we're embarrassing. For most people who aren't New Atheists, or old atheists, and have no passion invested in the subject, either negative or positive, believers aren't weird because we're wicked. We're weird because we're inexplicable; because, when there's no necessity for it that anyone sensible can see, we've committed ourselves to a set of awkward and absurd attitudes that obtrude, that stick out against the background of modern life, and not in some important or respectworthy or principled way, either.
  • Believers are people who try to insert Jee-zus into conversations at parties; who put themselves down, with writhings of unease, for perfectly normal human behaviour; who are constantly trying to create a solemn hush that invites a fart, a hiccup, a bit of subversion. Believers are people who, on the rare occasions when you have to listen to them, like at a funeral or a wedding, seize the opportunity to pour the liquidised content of a primary-school nativity play into your earhole, apparently not noticing that childhood is over.
  • What goes on inside believers is mysterious. So far as it can be guessed at it appears to be a kind of anxious pretending, a kind of continual, nervous resistance to reality.
  • to me, it's belief that involves the most uncompromising attention to the nature of things of which you are capable. Belief demands that you dispense with illusion after illusion, while contemporary common sense requires continual, fluffy pretending – pretending that might as well be systematic, it's so thoroughly incentivised by our culture.
  • The atheist bus says: "There's probably no God. So stop worrying and enjoy your life."
  • the word that offends against realism here is "enjoy". I'm sorry – enjoy your life?
  • If you based your knowledge of the human species exclusively on adverts, you'd think that the normal condition of humanity was to be a good-looking single person between 20 and 35, with excellent muscle-definition and/or an excellent figure, and a large disposable income. And you'd think the same thing if you got your information exclusively from the atheist bus
  • The implication of the bus slogan is that enjoyment would be your natural state if you weren't being "worried" by us believers and our hellfire preaching. Take away the malignant threat of God-talk, and you would revert to continuous pleasure
  • What's so wrong with this, apart from it being total bollocks? Well, in the first place, that it buys a bill of goods, sight unseen, from modern marketing. Given that human life isn't and can't be made up of enjoyment, it is in effect accepting a picture of human life in which those pieces of living where easy enjoyment is more likely become the only pieces that are visible.
  • But then, like every human being, I am not in the habit of entertaining only those emotions I can prove. I'd be an unrecognisable oddity if I did. Emotions can certainly be misleading: they can fool you into believing stuff that is definitely, demonstrably untrue. Yet emotions are also our indispensable tool for navigating, for feeling our way through, the much larger domain of stuff that isn't susceptible to proof or disproof, that isn't checkable against the physical universe. We dream, hope, wonder, sorrow, rage, grieve, delight, surmise, joke, detest; we form such unprovable conjectures as novels or clarinet concertos; we imagine. And religion is just a part of that, in one sense. It's just one form of imagining, absolutely functional, absolutely human-normal. It would seem perverse, on the face of it, to propose that this one particular manifestation of imagining should be treated as outrageous, should be excised if (which is doubtful) we can manage it.
  • suppose, as the atheist bus goes by, you are poverty-stricken, or desperate for a job, or a drug addict, or social services have just taken away your child. The bus tells you that there's probably no God so you should stop worrying and enjoy your life, and now the slogan is not just bitterly inappropriate in mood. What it means, if it's true, is that anyone who isn't enjoying themselves is entirely on their own. What the bus says is: there's no help coming.
  • Enjoyment is great. The more enjoyment the better. But enjoyment is one emotion. To say that life is to be enjoyed (just enjoyed) is like saying that mountains should only have summits, or that all colours should be purple, or that all plays should be by Shakespeare. This really is a bizarre category error.
  • A consolation you could believe in would be one that wasn't in danger of popping like a soap bubble on contact with the ordinary truths about us. A consolation you could trust would be one that acknowledged the difficult stuff rather than being in flight from it, and then found you grounds for hope in spite of it, or even because of it
  • The novelist Richard Powers has written that the Clarinet Concerto sounds the way mercy would sound, and that's exactly how I experienced it in 1997. Mercy, though, is one of those words that now requires definition. It does not only mean some tyrant's capacity to suspend a punishment he has himself inflicted. It can mean – and does mean in this case – getting something kind instead of the sensible consequences of an action, or as well as the sensible consequences of an action.
  • from outside, belief looks like a series of ideas about the nature of the universe for which a truth-claim is being made, a set of propositions that you sign up to; and when actual believers don't talk about their belief in this way, it looks like slipperiness, like a maddening evasion of the issue.
  • I am a fairly orthodox Christian. Every Sunday I say and do my best to mean the whole of the Creed, which is a series of propositions. But it is still a mistake to suppose that it is assent to the propositions that makes you a believer. It is the feelings that are primary. I assent to the ideas because I have the feelings; I don't have the feelings because I've assented to the ideas.
  • what I felt listening to Mozart in 1997 is not some wishy-washy metaphor for an idea I believe in, and it's not a front behind which the real business of belief is going on: it's the thing itself. My belief is made of, built up from, sustained by, emotions like that. That's what makes it real.
  • I think that Mozart, two centuries earlier, had succeeded in creating a beautiful and accurate report of an aspect of reality. I think that the reason reality is that way – that it is in some ultimate sense merciful as well as being a set of physical processes all running along on their own without hope of appeal, all the way up from quantum mechanics to the relative velocity of galaxies by way of "blundering, low and horridly cruel" biology (Darwin) – is that the universe is sustained by a continual and infinitely patient act of love. I think that love keeps it in being.
  • That's what I think. But it's all secondary. It all comes limping along behind my emotional assurance that there was mercy, and I felt it. And so the argument about whether the ideas are true or not, which is the argument that people mostly expect to have about religion, is also secondary for me.
  • No, I can't prove it. I don't know that any of it is true. I don't know if there's a God. (And neither do you, and neither does Professor Dawkins, and neither does anybody. It isn't the kind of thing you can know. It isn't a knowable item.)
  • let's be clear about the emotional logic of the bus's message. It amounts to a denial of hope or consolation on any but the most chirpy, squeaky, bubble-gummy reading of the human situation
  • It's got itself established in our culture, relatively recently, that the emotions involved in religious belief must be different from the ones involved in all the other kinds of continuous imagining, hoping, dreaming, and so on, that humans do. These emotions must be alien, freakish, sad, embarrassing, humiliating, immature, pathetic. These emotions must be quite separate from commonsensical us. But they aren't
  • The emotions that sustain religious belief are all, in fact, deeply ordinary and deeply recognisable to anybody who has ever made their way across the common ground of human experience as an adult.
  • It's just that the emotions in question are rarely talked about apart from their rationalisation into ideas. This is what I have tried to do in my new book, Unapologetic.
  • You can easily look up what Christians believe in. You can read any number of defences of Christian ideas. This, however, is a defence of Christian emotions – of their intelligibility, of their grown-up dignity.
Javier E

In Defense of Facts - The Atlantic - 1 views

  • over 13 years, he has published a series of anthologies—of the contemporary American essay, of the world essay, and now of the historical American essay—that misrepresents what the essay is and does, that falsifies its history, and that contains, among its numerous selections, very little one would reasonably classify within the genre. And all of this to wide attention and substantial acclaim
  • D’Agata’s rationale for his “new history,” to the extent that one can piece it together from the headnotes that preface each selection, goes something like this. The conventional essay, nonfiction as it is, is nothing more than a delivery system for facts. The genre, as a consequence, has suffered from a chronic lack of critical esteem, and thus of popular attention. The true essay, however, deals not in knowing but in “unknowing”: in uncertainty, imagination, rumination; in wandering and wondering; in openness and inconclusion
  • Every piece of this is false in one way or another.
  • ...31 more annotations...
  • There are genres whose principal business is fact—journalism, history, popular science—but the essay has never been one of them. If the form possesses a defining characteristic, it is that the essay makes an argument
  • That argument can rest on fact, but it can also rest on anecdote, or introspection, or cultural interpretation, or some combination of all these and more
  • what makes a personal essay an essay and not just an autobiographical narrative is precisely that it uses personal material to develop, however speculatively or intuitively, a larger conclusion.
  • Nonfiction is the source of the narcissistic injury that seems to drive him. “Nonfiction,” he suggests, is like saying “not art,” and if D’Agata, who has himself published several volumes of what he refers to as essays, desires a single thing above all, it is to be known as a maker of art.
  • D’Agata tells us that the term has been in use since about 1950. In fact, it was coined in 1867 by the staff of the Boston Public Library and entered widespread circulation after the turn of the 20th century. The concept’s birth and growth, in other words, did coincide with the rise of the novel to literary preeminence, and nonfiction did long carry an odor of disesteem. But that began to change at least as long ago as the 1960s, with the New Journalism and the “nonfiction novel.”
  • What we really seem to get in D’Agata’s trilogy, in other words, is a compendium of writing that the man himself just happens to like, or that he wants to appropriate as a lineage for his own work.
  • What it’s like is abysmal: partial to trivial formal experimentation, hackneyed artistic rebellion, opaque expressions of private meaning, and modish political posturing
  • If I bought a bag of chickpeas and opened it to find that it contained some chickpeas, some green peas, some pebbles, and some bits of goat poop, I would take it back to the store. And if the shopkeeper said, “Well, they’re ‘lyric’ chickpeas,” I would be entitled to say, “You should’ve told me that before I bought them.”
  • when he isn’t cooking quotes or otherwise fudging the record, he is simply indifferent to issues of factual accuracy, content to rely on a mixture of guesswork, hearsay, and his own rather faulty memory.
  • His rejoinders are more commonly a lot more hostile—not to mention juvenile (“Wow, Jim, your penis must be so much bigger than mine”), defensive, and in their overarching logic, deeply specious. He’s not a journalist, he insists; he’s an essayist. He isn’t dealing in anything as mundane as the facts; he’s dealing in “art, dickhead,” in “poetry,” and there are no rules in art.
  • D’Agata replies that there is something between history and fiction. “We all believe in emotional truths that could never hold water, but we still cling to them and insist on their relevance.” The “emotional truths” here, of course, are D’Agata’s, not Presley’s. If it feels right to say that tae kwon do was invented in ancient India (not modern Korea, as Fingal discovers it was), then that is when it was invented. The term for this is truthiness.
  • D’Agata clearly wants to have it both ways. He wants the imaginative freedom of fiction without relinquishing the credibility (and for some readers, the significance) of nonfiction. He has his fingers crossed, and he’s holding them behind his back. “John’s a different kind of writer,” an editor explains to Fingal early in the book. Indeed he is. But the word for such a writer isn’t essayist. It’s liar.
  • The point of all this nonsense, and a great deal more just like it, is to advance an argument about the essay and its history. The form, D’Agata’s story seems to go, was neglected during the long ages that worshiped “information” but slowly emerged during the 19th and 20th centuries as artists learned to defy convention and untrammel their imaginations, coming fully into its own over the past several decades with the dawning recognition of the illusory nature of knowledge.
  • Most delectable is when he speaks about “the essay’s traditional ‘five-paragraph’ form.” I almost fell off my chair when I got to that one. The five-paragraph essay—introduction, three body paragraphs, conclusion; stultifying, formulaic, repetitive—is the province of high-school English teachers. I have never met one outside of a classroom, and like any decent college writing instructor, I never failed to try to wean my students away from them. The five-paragraph essay isn’t an essay; it’s a paper.
  • What he fails to understand is that facts and the essay are not antagonists but siblings, offspring of the same historical moment
  • —by ignoring the actual contexts of his selections, and thus their actual intentions—D’Agata makes the familiar contemporary move of imposing his own conceits and concerns upon the past. That is how ethnography turns into “song,” Socrates into an essayist, and the whole of literary history into a single man’s “emotional truth.”
  • The history of the essay is indeed intertwined with “facts,” but in a very different way than D’Agata imagines. D’Agata’s mind is Manichaean. Facts bad, imagination good
  • When he refers to his selections as essays, he does more than falsify the essay as a genre. He also effaces all the genres that they do belong to: not only poetry, fiction, journalism, and travel, but, among his older choices, history, parable, satire, the sermon, and more—genres that possess their own particular traditions, conventions, and expectations
  • one needs to recognize that facts themselves have a history.
  • Facts are not just any sort of knowledge, such as also existed in the ancient and medieval worlds. A fact is a unit of information that has been established through uniquely modern methods
  • Fact, etymologically, means “something done”—that is, an act or deed
  • It was only in the 16th century—an age that saw the dawning of a new empirical spirit, one that would issue not only in modern science, but also in modern historiography, journalism, and scholarship—that the word began to signify our current sense of “real state of things.”
  • It was at this exact time, and in this exact spirit, that the essay was born. What distinguished Montaigne’s new form—his “essays” or attempts to discover and publish the truth about himself—was not that it was personal (precursors like Seneca also wrote personally), but that it was scrupulously investigative. Montaigne was conducting research into his soul, and he was determined to get it right.
  • His famous motto, Que sais-je?—“What do I know?”—was an expression not of radical doubt but of the kind of skepticism that fueled the modern revolution in knowledge.
  • It is no coincidence that the first English essayist, Galileo’s contemporary Francis Bacon, was also the first great theorist of science.
  • That knowledge is problematic—difficult to establish, labile once created, often imprecise and always subject to the limitations of the human mind—is not the discovery of postmodernism. It is a foundational insight of the age of science, of fact and information, itself.
  • The point is not that facts do not exist, but that they are unstable (and are becoming more so as the pace of science quickens). Knowledge is always an attempt. Every fact was established by an argument—by observation and interpretation—and is susceptible to being overturned by a different one
  • A fact, you might say, is nothing more than a frozen argument, the place where a given line of investigation has come temporarily to rest.
  • Sometimes those arguments are scientific papers. Sometimes they are news reports, which are arguments with everything except the conclusions left out (the legwork, the notes, the triangulation of sources—the research and the reasoning).
  • When it comes to essays, though, we don’t refer to those conclusions as facts. We refer to them as wisdom, or ideas
  • the essay draws its strength not from separating reason and imagination but from putting them in conversation. A good essay moves fluidly between thought and feeling. It subjects the personal to the rigors of the intellect and the discipline of external reality. The truths it finds are more than just emotional.
Javier E

You Think With the World, Not Just Your Brain - The Atlantic - 2 views

  • embodied or extended cognition: broadly, the theory that what we think of as brain processes can take place outside of the brain.
  • The octopus, for instance, has a bizarre and miraculous mind, sometimes inside its brain, sometimes extending beyond it in sucker-tipped trails. Neurons are spread throughout its body; the creature has more of them in its arms than in its brain itself. It’s possible that each arm might be, to some extent, an independently thinking creature, all of which are collapsed into an octopean superconsciousness in times of danger
  • Embodied cognition, though, tells us that we’re all more octopus-like than we realize. Our minds are not like the floating conceptual “I” imagined by Descartes. We’re always thinking with, and inseparable from, our bodies.
  • ...8 more annotations...
  • The body codes how the brain works, more than the brain controls the body. When we walk—whether taking a pleasant afternoon stroll, or storming off in tears, or trying to sneak into a stranger’s house late at night, with intentions that seem to have exploded into our minds from some distant elsewhere—the brain might be choosing where each foot lands, but the way in which it does so is always constrained by the shape of our legs
  • The way in which the brain approaches the task of walking is already coded by the physical layout of the body—and as such, wouldn’t it make sense to think of the body as being part of our decision-making apparatus? The mind is not simply the brain, as a generation of biological reductionists, clearing out the old wreckage of what had once been the soul, once insisted. It’s not a kind of software being run on the logical-processing unit of the brain. It’s bigger, and richer, and grosser, in every sense. It has joints and sinews. The rarefied rational mind sweats and shits; this body, this mound of eventually rotting flesh, is really you.
  • That’s embodied cognition.
  • Extended cognition is stranger.
  • The mind, they argue, has no reason to stop at the edges of the body, hemmed in by skin, flapping open and closed with mouths and anuses.
  • When we jot something down—a shopping list, maybe—on a piece of paper, aren’t we in effect remembering it outside our heads? Most of all, isn’t language itself something that’s always external to the individual mind?
  • Language sits hazy in the world, a symbolic and intersubjective ether, but at the same time it forms the substance of our thought and the structure of our understanding. Isn’t language thinking for us?
  • Writing, for Plato, is a pharmakon, a “remedy” for forgetfulness, but if taken in too strong a dose it becomes a poison: A person no longer remembers things for themselves; it’s the text that remembers, with an unholy autonomy. The same criticisms are now commonly made of smartphones. Not much changes.
katedriscoll

History | TOKTalk.net - 0 views

  • Linking the different Areas of Knowledge (AOK) with different Ways of Knowing (WOK) can be quite challenging at times. I have now attempted to link History with Language, Logic, Emotion and Sense Perception.
  • Does the way (the language) that certain historical events are presented in history books influence the way that the reader understands these events? What role does loaded language play when talking about historical events? What role do connotation and denotation play when talking about historical events? How can language introduce bias into historical accounts? How does language help or hinder the interpretation of historical facts?
  • I recently read an interesting poem by the German poet and playwright Bertolt Brecht – a poem which got me thinking. You see, this is one of the TOK illnesses: you start to see TOK everywhere, and also in poetry.
cvanderloo

The U.S. wants Costa Rica to host refugees before they cross the border. Here's why - 0 views

  • In July, the U.S. government announced a plan for Costa Rica to temporarily host up to 200 refugees from Central America while they are processed for placement in the U.S. or elsewhere.
  • The new scale and diversity of refugees is challenging tiny Costa Rica’s capacity to manage these populations and ensure protection of their human rights. The U.S. plan to send more refugees their way will only add to this challenge.
  • The plan for Costa Rica to temporarily house refugees is in addition to an existing program that helps Central American minors gain refugee status in the U.S.
  • ...7 more annotations...
  • While the plan offers a short-term solution for protecting those most vulnerable to violence, it does not address the magnitude of the migration. In the first six months of the current fiscal year, the U.S. border patrol apprehended 120,700 people from the Northern Triangle countries attempting to enter the U.S. Some of those who cross the border will apply for asylum, but the majority will be sent back to their countries of origin and the violence they were fleeing.
  • Costa Rica is a major destination for migrants and refugees in the region, and immigrants account for 9 percent of the country’s population of 4.8 million. Like the United States, Costa Rica has seen a dramatic increase in arrivals of refugees from Northern Triangle countries, particularly El Salvador, since 2012
  • Central Americans moving to Costa Rica today often already have established social networks in Costa Rica –
  • Immigration officials expect to continue to see around 500 Colombian refugees arriving each year, despite the newly signed peace accord. Costa Rica has also seen a large increase in Venezuelans fleeing economic crisis.
  • Costa Rica has become a popular destination and transit country because of its relatively open borders and policies, its reputation as a champion of human rights and its relatively low levels of crime, violence and poverty.
  • Over the past 10 years, the country has increased restrictions on immigration, hoping to discourage low-income economic migrants from Nicaragua from entering. These restrictions echo the national security logic of U.S. policies.
  • It neither addresses the underlying conditions of violence that refugees seek to escape nor strengthens regional governments’ abilities to deal with the arrival of these vulnerable populations.
katedriscoll

Translating Amanda Gorman - It Bears Mentioning - 0 views

  • The logic is supposed to be that only someone of Gorman’s race, and optimally gender, can effectively translate her expression into another language. But is that true? And are we not denying Gorman and black people basic humanity in – if I may jump the gun – pretending that it is?
  • Notice I didn’t mention Shakespeare translated into other languages. According to the Critical Race Theory paradigm that informs this performative take on translating Gorman, Shakespeare being a white man means that white translators of his work are akin to him, while non-white ones, minted in a world where they must always grapple with whiteness “centered,” are perfect bilinguals of a sort.
katedriscoll

Justified True Belief - TOK RESOURCE.ORG - 0 views

  • This traditional unpacking of the idea of knowledge follows naturally after the Student knowledge claims. The Wittgenstein and the polysemy of language unit will also inform the class activities presented below, especially for differentiating between opinion and belief.
  • TRUE: The knowledge claim is True rather than False. It corresponds to the real world. It is a fact. It is “what is the case.”
  • JUSTIFIED: The knowledge claim is justified with adequate evidence. Justification requires Coherence with previous data and Clarity with regard to language and logic. There can be no Contradiction or strong Counter evidence.
  • ...1 more annotation...
  • BELIEVED: The knowledge claim is a matter of Conviction. We must own our knowledge.
Javier E

Opinion | Humans Are Animals. Let's Get Over It. - The New York Times - 0 views

  • The separation of people from, and the superiority of people to, members of other species is a good candidate for the originating idea of Western thought. And a good candidate for the worst.
  • Like Plato, Hobbes associates anarchy with animality and civilization with the state, which gives to our merely animal motion moral content for the first time and orders us into a definite hierarchy.
  • It is rationality that gives us dignity, that makes a claim to moral respect that no mere animal can deserve. “The moral law reveals to me a life independent of animality,” writes Immanuel Kant in “Critique of Practical Reason.” In this assertion, at least, the Western intellectual tradition has been remarkably consistent.
  • ...15 more annotations...
  • the devaluation of animals and disconnection of us from them reflect a deeper devaluation of the material universe in general
  • In this scheme of things, we owe nature nothing; it is to yield us everything. This is the ideology of species annihilation and environmental destruction, and also of technological development.
  • Further trouble is caused when the distinctions between humans and animals are then used to draw distinctions among human beings
  • Some of us, in short, are animals — and some of us are better than that. This, it turns out, is a useful justification for colonialism, slavery and racism.
  • The classical source for this distinction is certainly Aristotle. In the “Politics,” he writes, “Where then there is such a difference as that between soul and body, or between men and animals (as in the case of those whose business is to use their body, and who can do nothing better), the lower sort are by nature slaves.”
  • Every human hierarchy, insofar as it can be justified philosophically, is treated by Aristotle by analogy to the relation of people to animals.
  • One difficult thing to face about our animality is that it entails our deaths; being an animal is associated throughout philosophy with dying purposelessly, and so with living meaninglessly.
  • this line of thought also happens to justify colonizing or even extirpating the “savage,” the beast in human form.
  • Our supposed fundamental distinction from “beasts,” “brutes” and “savages” is used to divide us from nature, from one another and, finally, from ourselves
  • In Plato’s “Republic,” Socrates divides the human soul into two parts. The soul of the thirsty person, he says, “wishes for nothing else than to drink.” But we can restrain ourselves. “That which inhibits such actions,” he concludes, “arises from the calculations of reason.” When we restrain or control ourselves, Plato argues, a rational being restrains an animal.
  • In this view, each of us is both a beast and a person — and the point of human life is to constrain our desires with rationality and purify ourselves of animality
  • These sorts of systematic self-divisions come to be refigured in Cartesian dualism, which separates the mind from the body, or in Sigmund Freud’s distinction between id and ego, or in the neurological contrast between the functions of the amygdala and the prefrontal cortex.
  • I don’t know how to refute it, exactly, except to say that I don’t feel myself to be a logic program running on an animal body; I’d like to consider myself a lot more integrated than that.
  • And I’d like to repudiate every political and environmental conclusion ever drawn by our supposed transcendence of the order of nature
  • There is no doubt that human beings are distinct from other animals, though not necessarily more distinct than other animals are from one another. But maybe we’ve been too focused on the differences for too long. Maybe we should emphasize what all us animals have in common.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
Javier E

Revisiting the prophetic work of Neil Postman about the media » MercatorNet - 1 views

  • The NYU professor was surely prophetic. “Our own tribe is undergoing a vast and trembling shift from the magic of writing to the magic of electronics,” he cautioned.
  • “We face the rapid dissolution of the assumptions of an education organised around the slow-moving printed word, and the equally rapid emergence of a new education based on the speed-of-light electronic message.”
  • What Postman perceived in television has been dramatically intensified by smartphones and social media
  • ...31 more annotations...
  • Postman also recognised that technology was changing our mental processes and social habits.
  • Today corporations like Google and Amazon collect data on Internet users based on their browsing history, the things they purchase, and the apps they use
  • Yet all citizens are undergoing this same transformation. Our digital devices undermine social interactions by isolating us,
  • “Years from now, it will be noticed that the massive collection and speed-of-light retrieval of data have been of great value to large-scale organisations, but have solved very little of importance to most people, and have created at least as many problems for them as they may have solved.”
  • “Television has by its power to control the time, attention, and cognitive habits of our youth gained the power to control their education.”
  • As a student of Canadian philosopher Marshall McLuhan, Postman believed that the medium of information was critical to understanding its social and political effects. Every technology has its own agenda. Postman worried that the very nature of television undermined American democratic institutions.
  • Many Americans tuned in to the presidential debate looking for something substantial and meaty
  • It was simply another manifestation of the incoherence and vitriol of cable news
  • “When, in short, a people become an audience and their public business a vaudeville act, then a nation finds itself at risk; culture-death is a clear possibility,” warned Postman.
  • Technology Is Never Neutral
  • As for new problems, we have increased addictions (technological and pornographic); increased loneliness, anxiety, and distraction; and inhibited social and intellectual maturation.
  • The average length of a shot on network television is only 3.5 seconds, so that the eye never rests, always has something new to see. Moreover, television offers viewers a variety of subject matter, requires minimal skills to comprehend it, and is largely aimed at emotional gratification.
  • This is far truer of the Internet and social media, where more than a third of Americans, and almost half of young people, now get their news.
  • with smartphones now ubiquitous, the Internet has replaced television as the “background radiation of the social and intellectual universe.”
  • Is There Any Solution?
  • Reading news or commentary in print, in contrast, requires concentration, patience, and careful reflection, virtues that our digital age vitiates.
  • Politics as Entertainment
  • “How television stages the world becomes the model for how the world is properly to be staged,” observed Postman. In the case of politics, television fashions public discourse into yet another form of entertainment
  • In America, the fundamental metaphor for political discourse is the television commercial. The television commercial is not at all about the character of products to be consumed. … They tell everything about the fears, fancies, and dreams of those who might buy them.
  • The television commercial has oriented business away from making products of value and towards making consumers feel valuable, which means that the business of business has now become pseudo-therapy. The consumer is a patient assured by psycho-dramas.
  • Such is the case with the way politics is “advertised” to different subsets of the American electorate. The “consumer,” depending on his political leanings, may be manipulated by fears of either an impending white-nationalist, fascist dictatorship, or a radical, woke socialist takeover.
  • This paradigm is aggravated by the hypersiloing of media content, which explains why Americans who read left-leaning media view the Proud Boys as a legitimate, existential threat to national civil order, while those who read right-leaning media believe the real immediate enemies of our nation are Antifa
  • Regardless of whether either of these groups represents a real public menace, the loss of any national consensus over what constitutes objective news means that Americans effectively talk past one another: they use the Proud Boys or Antifa as rhetorical barbs to smear their ideological opponents as extremists.
  • Yet these technologies are far from neutral. They are, rather, “equipped with a program for social change.
  • Postman’s analysis of technology is prophetic and profound. He warned of the trivialising of our media, defined by “broken time and broken attention,” in which “facts push other facts into and then out of consciousness at speeds that neither permit nor require evaluation.” He warned of “a neighborhood of strangers and pointless quantity.”
  • does Postman offer any solutions to this seemingly uncontrollable technological juggernaut?
  • Postman’s suggestions regarding education are certainly relevant. He unequivocally condemned education that mimics entertainment, and urged a return to learning that is hierarchical, meaning that it first gives students a foundation of essential knowledge before teaching “critical thinking.”
  • Postman also argued that education must avoid a lowest-common-denominator approach in favor of complexity and the perplexing: the latter method elicits in the student a desire to make sense of what perplexes him.
  • Finally, Postman promoted education of vigorous exposition, logic, and rhetoric, all being necessary for citizenship
  • Another course of action is to understand what these media, by their very nature, do to us and to public discourse.
  • We must, as Postman exhorts us, “demystify the data” and dominate our technology, lest it dominate us. We must identify and resist how television, social media, and smartphones manipulate our emotions, infantilise us, and weaken our ability to rebuild what 2020 has ravaged.
caelengrubb

Why it's time to stop worrying about the decline of the English language | Language | T... - 0 views

  • Now imagine that something even more fundamental than electricity or money is at risk: a tool we have relied on since the dawn of human history, enabling the very foundations of civilisation to be laid
  • I’m talking about our ability to communicate – to put our thoughts into words, and to use those words to forge bonds, to deliver vital information, to learn from our mistakes and build on the work done by others.
  • “Their language is deteriorating. They are lowering the bar. Our language is flying off at all tangents, without the anchor of a solid foundation.
  • ...20 more annotations...
  • Although it is at pains to point out that it does not believe language can be preserved unchanged, it worries that communication is at risk of becoming far less effective. “Some changes would be wholly unacceptable, as they would cause confusion and the language would lose shades of meaning
  • “Without grammar, we lose the agreed-upon standards about what means what. We lose the ability to communicate when respondents are not actually in the same room speaking to one another. Without grammar, we lose the precision required to be effective and purposeful in writing.”
  • At the same time, our laziness and imprecision are leading to unnecessary bloating of the language – “language obesity,”
  • That’s five writers, across a span of 400 years, all moaning about the same erosion of standards. And yet the period also encompasses some of the greatest works of English literature.
  • Since then, the English-speaking world has grown more prosperous, better educated and more efficiently governed, despite an increase in population. Most democratic freedoms have been preserved and intellectual achievement intensified.
  • Linguistic decline is the cultural equivalent of the boy who cried wolf, except the wolf never turns up
  • Our language will always be as flexible and sophisticated as it has been up to now. Those who warn about the deterioration of English haven’t learned about the history of the language, and don’t understand the nature of their own complaints – which are simply statements of preference for the way of doing things they have become used to.
  • But the problem is that writers at that time also felt they were speaking a degraded, faltering tongue
  • Seventy-odd years ago, people knew their grammar and knew how to talk clearly. And, if we follow the logic, they must also have been better at organising, finding things out and making things work.
  • Hand-wringing about standards is not restricted to English. The fate of every language in the world has been lamented by its speakers at some point or another.
  • “For more than 2,000 years, complaints about the decay of respective languages have been documented in literature, but no one has yet been able to name an example of a ‘decayed language’.” He has a point.
  • One common driver of linguistic change is a process called reanalysis.
  • Another form that linguistic change often takes is grammaticalisation: a process in which a common phrase is bleached of its independent meaning and made into a word with a solely grammatical function
  • One instance of this is the verb “to go”, when used for an action in the near future or an intention.
  • Human anatomy makes some changes to language more likely than others. The simple mechanics of moving from a nasal sound (m or n) to a non-nasal one can make a consonant pop up in between
  • The way our brain divides up words also drives change. We split them into phonemes (building blocks of sound that have special perceptual significance) and syllables (groups of phonemes).
  • Sound changes can come about as a result of social pressures: certain ways of saying things are seen as having prestige, while others are stigmatised. We gravitate towards the prestigious, and make efforts to avoid saying things in a way that is associated with undesirable qualities – often just below the level of consciousness
  • The problem arises when deciding what might be good or bad. There are, despite what many people feel, no objective criteria by which to judge what is better or worse in communication
  • Though we are all capable of adaptation, many aspects of the way we use language, including stylistic preferences, have solidified by our 20s. If you are in your 50s, you may identify with many aspects of the way people spoke 30-45 years ago.
  • The irony is, of course, that the pedants are the ones making the mistakes. To people who know how language works, pundits such as Douglas Rushkoff only end up sounding ignorant, having failed to really interrogate their views
kaylynfreeman

Opinion | The Social Sciences' 'Physics Envy' - The New York Times - 0 views

  • Economists, political scientists and sociologists have long suffered from an academic inferiority complex: physics envy. They often feel that their disciplines should be on a par with the “real” sciences and self-consciously model their work on them, using language (“theory,” “experiment,” “law”) evocative of physics and chemistry.
  • Many social scientists contend that science has a method, and if you want to be scientific, you should adopt it. The method requires you to devise a theoretical model, deduce a testable hypothesis from the model and then test the hypothesis against the world. If the hypothesis is confirmed, the theoretical model holds; if the hypothesis is not confirmed, the theoretical model does not hold. If your discipline does not operate by this method — known as hypothetico-deductivism — then in the minds of many, it’s not scientific.
  • it’s not even a good description of how the “hard” sciences work. It’s a high school textbook version of science, with everything messy and chaotic about scientific inquiry safely ignored.
  • ...9 more annotations...
  • For the sake of everyone who stands to gain from a better knowledge of politics, economics and society, the social sciences need to overcome their inferiority complex, reject hypothetico-deductivism and embrace the fact that they are mature disciplines with no need to emulate other sciences.
  • Or consider the famous “impossibility theorem,” developed by the economist Kenneth Arrow, which shows that no single voting system can simultaneously satisfy several important principles of fairness. There is no need to test this model with data — in fact, there is no way to test it — and yet the result offers policy makers a powerful lesson: there are unavoidable trade-offs in the design of voting systems.
  • Unfortunately, the belief that every theory must have its empirical support (and vice versa) now constrains the kinds of social science projects that are undertaken, alters the trajectory of academic careers and drives graduate training. Rather than attempt to imitate the hard sciences, social scientists would be better off doing what they do best: thinking deeply about what prompts human beings to behave the way they do.
  • theoretical models can be of great value even if they are never supported by empirical testing. In the 1950s, for instance, the economist Anthony Downs offered an elegant explanation for why rival political parties might adopt identical platforms during an election campaign. His model relied on the same strategic logic that explains why two competing gas stations or fast-food restaurants locate across the street from each other — if you don’t move to a central location but your opponent does, your opponent will nab those voters (customers). The best move is for competitors to mimic each other. This framework has proven useful to generations of political scientists even though Mr. Downs did not empirically test it and despite the fact that its main prediction, that candidates will take identical positions in elections, is clearly false. The model offered insight into why candidates move toward the center in competitive elections
  • Likewise, the analysis of empirical data can be valuable even in the absence of a grand theoretical model. Did the welfare reform championed by Bill Clinton in the 1990s reduce poverty? Are teenage employees adversely affected by increases in the minimum wage?
  • Answering such questions about the effects of public policies does not require sweeping theoretical claims, just careful attention to the data.
  • theories are like maps: the test of a map lies not in arbitrarily checking random points but in whether people find it useful to get somewhere.
  • The ideal of hypothetico-deductivism is flawed for many reasons. For one thing,