
TOK Friends / Group items tagged: insular


Javier E

How To Repel Tourism « The Dish - 0 views

  • In short: Demanding a visa from a country’s travelers in advance is associated with a 70 percent lower level of tourist entries than from a similar country where there is no visa requirement. The U.S. requires an advance visa from citizens of 81 percent of the world’s countries; if it waived that requirement, the researchers estimate, inbound tourism arrivals would more than double, and tourism expenditure would climb by $123 billion.
  • what it is like to enter the US as a non-citizen. It’s a grueling, off-putting, frightening, and often brutal process. Compared with entering a European country, it’s like entering a police state. When you add the sheer difficulty of getting a visa, the brusque, rude and contemptuous treatment you routinely get from immigration officials at the border, the sense that all visitors are criminals and potential terrorists unless proven otherwise, the US remains one of the most unpleasant places for anyone in the world to try and get access to.
  • And this, of course, is a function not only of a vast and all-powerful bureaucracy. It’s a function of this country’s paranoia and increasing insularity. It’s a thoroughly democratic decision to keep foreigners out as much as possible. And it’s getting worse and worse.
sissij

The Psychology of Scary Movies | FilmmakerIQ.com - 0 views

  • This may explain the shape of our movie monsters: creatures with sharp teeth or a snake-like appearance.
  • scary movies don’t actually activate fear responses in the amygdala at all. Instead, it was other parts of the brain that were firing: the visual cortex (the part of the brain responsible for processing visual information), the insular cortex (self-awareness), the thalamus (the relay switch between brain hemispheres), and the dorsal-medial prefrontal cortex (the part of the brain associated with planning, attention, and problem solving).
  • Unfortunately for Aristotle, research has shown the opposite – watching violence actually makes people MORE aggressive.
  • Experiments with adolescent boys found that they enjoyed a horror film more when their female companion (who was a research plant) was visibly scared.
  • Where there is no imagination – there is no horror
  •  
    I found this very interesting as it went deep into the psychology behind horror movies. It's especially astonishing for me to see that horror movies don't actually activate fear responses; instead they stimulate the prefrontal cortex of our brain. Also, this article offers a lot of possible explanations for why we are so attracted to horror movies. I think this can be related to our perceptions and our logic of survival, since horror movies can help us return to the most primitive state (trembling in the woods) and feel the impulse of the wild. --Sissi (11/14/2016)
Javier E

Surveying religious knowledge « The Immanent Frame - 0 views

  • Instead of concluding that Americans lack “religious knowledge” because they don’t know what social scientists think they should, we might want to ask what, if anything, the study reveals about lived religion. If, for example, 45 percent of U.S. Catholics “do not know that their church teaches that the bread and wine used in Communion do not merely symbolize but actually become the body and blood of Christ,” then perhaps it is a mistake simply to identify Catholicism with what Catholic bishops say it is. To conclude that Americans are “uninformed” about “their own traditions” betrays a subtle bias in favor of elites and begs the question of what constitutes one’s “own” religion: are we “illiterate,” or do we simply disagree about what belongs in the “canon”?
  • Is ‘having correct beliefs’ the main point of religion? There may be socially useful reasons to play that down in favor of encouraging shared values: compassion, service, and social justice.
  • People do not know who Maimonides was, I think, for the same reason they do not know the origin of “devil’s advocate.” Our culture has become more pluralistic, and people draw upon its elements in the way of the bricoleur to construct a web of meanings that is flexible, contextually activated, and what we would call “post-modern”
  • Perhaps more fundamental, the rules that shape how the bricoleur selects the various elements have also changed. Religion is far less formative of our public discourse and culture in the deep, constitutive way that it was even as recently as the postwar period, due in large part to the rise of neo-liberalism.
  • if we ask Americans a range of questions about matters that extend beyond their immediate horizons, we would be somewhat amazed by the blind spots in our thinking. Ignorance about religion might very well extend to ignorance about geopolitical concerns or about other cultures generally. What is striking, and simultaneously disturbing, is the degree to which our comfort and certainty about ourselves as Americans and about the world we inhabit enable us to settle into a kind of willful ignorance. So I am not inclined to single out religion in this regard; something more fundamental has been revealed
  • To take seriously the results of the Pew quiz leads us to question the substance of our national commitment to religious tolerance and pluralism. If our knowledge of other religions (even our own) is shoddy, then what constitutes the substance of our toleration of others? Is it simply a procedural concern? And, more importantly, if we fail to know basic facts about others, do we make it easier to retreat into the comfort of insular spaces, deaf to the claims of others? Do we expect, at the end of the day, no matter our public announcements to the contrary, that all others should sound and believe as the majority of Americans do?
  • The Pew Forum’s Religious Knowledge Survey examines one type of religious knowledge: knowledge-that. Respondents were asked whether certain propositions about world religions were true. But it is an open question whether this really is the sort of knowledge that we have in mind when we are talking about religious knowledge. At least sometimes, it seems like we mean knowledge-how.
  • The unwritten premise of the survey is that belief ought to be individual, considered, and fully-informed. But that premise fits neither religious experience nor human subjectivity over the long term. Thus to ask these questions in this way is to presume a particular kind of religious subject that is largely nonexistent, then to take pleasure in clucking over its nonexistence.
  • what if religion is not primarily about knowledge? What if the defining core of religion is more like a way of life, a nexus of action? What if, as per Charles Taylor, a religious orientation is more akin to a “social imaginary,” which functions as an “understanding” on a register that is somewhat inarticulable? Indeed, I think Taylor’s corpus offers multiple resources for criticizing what he would describe as the “intellectualism” of such approaches to religion—methodologies that treat human persons as “thinking things,” and thus reduce religious phenomena to a set of ideas, beliefs, and propositions. Taylor’s account of social imaginaries reminds us of a kind of understanding that is “carried” in practices, implicit in rituals and routines, and can never be adequately articulated or made explicit. If we begin to think about religion more like a social imaginary than a set of propositions and beliefs, then the methodologies of surveys of religious “knowledge” are going to look problematic.
  • I’m reminded of an observation Wittgenstein makes in the Philosophical Investigations: One could be a master of a game without being able to articulate the rules.
  • belief, faith and knowledge, Pew Forum on Religion and Public Life, religion and science, religion in the U.S., religious literacy, surveys, U.S. Religious Knowledge Survey
Keiko E

How Warm Temperatures Affect Us - WSJ.com - 0 views

  • Could a few feet of the cold stuff really have such a fundamental effect on beliefs and behavior? Absolutely, according to recent studies on how temperature influences us at an unconscious level. Researchers affiliated with the Center for Decision Sciences at Columbia Business School measured the public's changing attitudes about climate change
  • Those who felt that the current day was warmer than usual for the time of year were more likely to believe in and worry about global warming than those who thought it was cooler outside. They were also more likely to donate the money they earned from taking the survey to a charity that did work on climate change.
  • The researchers call this bias "attribute substitution," meaning that we take a simple judgment, like noting a warm or cold day, and apply it to a larger, more complex one, like whether the planet is headed for a meltdown
  • Physical warmth activates circuits in the brain associated with feelings of psychological warmth. The insular cortex, or insula, plays a critical role in this crossover between the outside world and our experience of it. This peach-sized region helps us to perceive whether a sensation is hot or cold, pleasurable or painful. It is activated when we crave chocolate, fall in love or get disgusted. The insula also guides us on social matters: whether to trust or feel guilty, to empathize or be embarrassed. People who meditate, according to some studies, have a thicker insula.
  • As it turns out, the insula's poetic merging of the physical and emotional helps to explain much of our unconscious behavior. People who are socially rejected—given the cold shoulder—get chills and crave warm food such as soup.
  • Warm springtime conditions are related to a better mood and expanded memory, but both take a plunge in the summer heat. Extreme temperatures make people hostile and aggressive, and violent crimes occur more often in the hotter months. Drivers honk more in heat waves. When you're hot and tired, you're more likely to interpret another person's neutral expression negatively.
  • our perception of reality still relies on sensory experience. Though we may wish for it to be otherwise, our minds cannot be separated from our bodies. And our bodies depend on the environment—what we encounter here on planet Earth
jongardner04

Ghost Illusion Created in the Lab | Neuroscience News Research Articles | Neuroscience ... - 0 views

  • Ghosts exist only in the mind, and scientists know just where to find them, an EPFL study suggests
  • In their experiment, Blanke’s team interfered with the sensorimotor input of participants in such a way that their brains no longer identified such signals as belonging to their own body, but instead interpreted them as those of someone else.
  • The researchers first analyzed the brains of 12 patients with neurological disorders – mostly epilepsy – who have experienced this kind of “apparition.” MRI analysis of the patients’ brains revealed interference with three cortical regions: the insular cortex, parietal-frontal cortex, and the temporo-parietal cortex.
  • The participants were unaware of the experiment’s purpose.
  • Instinctively, several subjects reported a strong “feeling of a presence,” even counting up to four “ghosts” where none existed.
  •  
    Scientists performed an experiment creating an illusion of a ghost. This relates to the idea of sense perception.
Javier E

Among the Disrupted - NYTimes.com - 0 views

  • Writers hover between a decent poverty and an indecent one; they are expected to render the fruits of their labors for little and even for nothing, and all the miracles of electronic dissemination somehow do not suffice for compensation, either of the fiscal or the spiritual kind.
  • Journalistic institutions slowly transform themselves into silent sweatshops in which words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability.
  • the discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms:
  • Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology
  • The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past.
  • even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science.
  • The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university
  • The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy.
  • So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.
  • are we becoming posthumanists?
  • In American culture right now, as I say, the worldview that is ascendant may be described as posthumanism.
  • The posthumanism of the 1970s and 1980s was more insular, an academic affair of “theory,” an insurgency of professors; our posthumanism is a way of life, a social fate.
  • In “The Age of the Crisis of Man: Thought and Fiction in America, 1933-1973,” the gifted essayist Mark Greif, who reveals himself to be also a skillful historian of ideas, charts the history of the 20th-century reckonings with the definition of “man.”
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware o
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
johnsonel7

The case for economics - by the numbers | MIT News - 0 views

  • In recent years, criticism has been levelled at economics for being insular and unconcerned about real-world problems. But a new study led by MIT scholars finds the field increasingly overlaps with the work of other disciplines, and, in a related development, has become more empirical and data-driven, while producing less work of pure theory.
  • In psychology journals, for instance, citations of economics papers have more than doubled since 2000. Public health papers now cite economics work twice as often as they did 10 years ago, and citations of economics research in fields from operations research to computer science have risen sharply as well.
  • As Angrist acknowledges, one impetus for the study was the wave of criticism the economics profession has faced over the last decade, after the banking crisis and the “Great Recession” of 2008-2009, which included the finance-sector crash of 2008. The paper’s title alludes to the film “Inside Job” — whose thesis holds that, as Angrist puts it, “economics scholarship as an academic enterprise was captured somehow by finance, and that academic economists should therefore be blamed for the Great Recession.”
  • “If you ask me, economics has never been better,” says Josh Angrist, an MIT economist who led the study. “It’s never been more useful. It’s never been more scientific and more evidence-based.”
  • The study also details the relationship between economics and four additional social science disciplines: anthropology, political science, psychology, and sociology. Among these, political science has overtaken sociology as the discipline most engaged with economics. Psychology papers now cite economics research about as often as they cite works of sociology. The new intellectual connectivity between economics and psychology appears to be a product of the growth of behavioral economics, which examines the irrational, short-sighted financial decision-making of individuals — a different paradigm than the assumptions about rational decision-making found in neoclassical economics.
  • “It really seems to be the diversity of economics that makes it do well in influencing other fields,” Ellison says. “Operations research, computer science, and psychology are paying a lot of attention to economic theory. Sociologists are paying a lot of attention to labor economics, marketing and management are paying attention to industrial organization, statisticians are paying attention to econometrics, and the public health people are paying attention to health economics. Just about everything in economics is influential somewhere.”
katherineharron

Tech-averse Supreme Court could be forced into modern era - CNNPolitics - 0 views

  • The coronavirus pandemic is forcing all courts to alter their procedures, but the US Supreme Court, imbued with an archaic, insular air and a majority of justices over age 65, will face a distinct challenge to keep operating and provide public access to proceedings.
  • The virus is bound to force Supreme Court justices into new territory. They may open their operations in more modern ways. Or, if they move in the opposite direction and shun any high-tech alternative, they might postpone all previously scheduled March and April oral argument sessions, a total 20 disputes, until next summer or fall.
  • This very practical dilemma comes as the justices already have one of the most substantively difficult slates of cases in years, testing abortion rights, anti-bias protections for LGBTQ workers, and the Trump administration's plan for deportation of certain undocumented immigrants who came to the US as children. (Those cases have already been argued, and the justices are drafting opinions to be released later this spring.)
  • If they are weighing a more sophisticated audio or visual connection -- to each other, and to the public -- the justices have the support of an on-site technology team and young law clerks, four per chamber. At the other end of the spectrum, they might weigh canceling the remaining argument sessions and resolve the cases based only on the written briefs filed. Those lengthy filings are more comprehensive than lawyers' presentations in hour-long oral sessions.
Javier E

The Brain Has a Special Kind of Memory for Past Infections - Scientific American - 0 views

  • immune cells from the periphery routinely patrol the central nervous system and support its function. In a new study, researchers showed for the first time that—just as the brain remembers people, places, smells, and so on—it also stores what they call “memory traces” of the body’s past infections. Reactivating the same brain cells that encode this information is enough to swiftly summon the peripheral immune system to defend at-risk tissues.
  • It is clear the peripheral immune system is capable of retaining information about past infections to fight off future ones—otherwise, vaccines would not work. But Asya Rolls, a neuroimmunologist at Technion–Israel Institute of Technology and the paper’s senior author, says the study expands this concept of classical immunologic memory. Initially, she was taken aback that the brain could store traces of immune activity and use them to trigger such a precise response. “I was amazed,” she says.
  • After the infection and immune response dissipated, the researchers injected the mice with a drug that artificially reactivated those same groups of brain cells. They were stunned by what they saw: upon reactivation, the insular cortex directed the immune system to mount a targeted response in the gut at the site of the original inflammation—even though, by that time, there was no infection, tissue damage or pathogen-initiated local inflammation to be found. The brain had retained some sort of memory of the infection and was prepared to reinitiate the fight.
  • The new study provides “unassailable” evidence that the central nervous system can control the peripheral immune system, Tracey says. “It’s an incredibly important contribution to the fields of neuroscience and immunology.”
  • Just as researchers have traced sensory and motor processing to specific brain regions, Tracey suspects that a similar neurological “map” of immunologic information also exists. This new study, he says, is the first direct evidence of that map. “It’s going to be really exciting to see what comes next,” he adds.
Javier E

Among the Disrupted - The New York Times - 0 views

  • even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science.
  • The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university,
  • So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods
  • The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy.
  • Greif’s book is a prehistory of our predicament, of our own “crisis of man.” (The “man” is archaic, the “crisis” is not.) It recognizes that the intellectual history of modernity may be written in part as the epic tale of a series of rebellions against humanism
  • We are not becoming transhumanists, obviously. We are too singular for the Singularity. But are we becoming posthumanists?
  • In American culture right now, as I say, the worldview that is ascendant may be described as posthumanism.
  • The posthumanism of the 1970s and 1980s was more insular, an academic affair of “theory,” an insurgency of professors; our posthumanism is a way of life, a social fate.
  • In “The Age of the Crisis of Man: Thought and Fiction in America, 1933-1973,” the gifted essayist Mark Greif, who reveals himself to be also a skillful historian of ideas, charts the history of the 20th-century reckonings with the definition of “man.”
  • Here is his conclusion: “Anytime your inquiries lead you to say, ‘At this moment we must ask and decide who we fundamentally are, our solution and salvation must lie in a new picture of ourselves and humanity, this is our profound responsibility and a new opportunity’ — just stop.” Greif seems not to realize that his own book is a lasting monument to precisely such inquiry, and to its grandeur
  • “Answer, rather, the practical matters,” he counsels, in accordance with the current pragmatist orthodoxy. “Find the immediate actions necessary to achieve an aim.” But before an aim is achieved, should it not be justified? And the activity of justification may require a “picture of ourselves.” Don’t just stop. Think harder. Get it right.
  • — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.
  • Who has not felt superior to humanism? It is the cheapest target of all: Humanism is sentimental, flabby, bourgeois, hypocritical, complacent, middlebrow, liberal, sanctimonious, constricting and often an alibi for power
  • what is humanism? For a start, humanism is not the antithesis of religion, as Pope Francis is exquisitely demonstrating
  • The worldview takes many forms: a philosophical claim about the centrality of humankind to the universe, and about the irreducibility of the human difference to any aspect of our animality
  • Here is a humanist proposition for the age of Google: The processing of information is not the highest aim to which the human spirit can aspire, and neither is competitiveness in a global economy. The character of our society cannot be determined by engineers.
  • And posthumanism? It elects to understand the world in terms of impersonal forces and structures, and to deny the importance, and even the legitimacy, of human agency.
  • There have been humane posthumanists and there have been inhumane humanists. But the inhumanity of humanists may be refuted on the basis of their own worldview
  • the condemnation of cruelty toward “man the machine,” to borrow the old but enduring notion of an 18th-century French materialist, requires the importation of another framework of judgment. The same is true about universalism, which every critic of humanism has arraigned for its failure to live up to the promise of a perfect inclusiveness
  • there has never been a universalism that did not exclude. Yet the same is plainly the case about every particularism, which is nothing but a doctrine of exclusion; and the correction of particularism, the extension of its concept and its care, cannot be accomplished in its own name. It requires an idea from outside, an idea external to itself, a universalistic idea, a humanistic idea.
  • Asking universalism to keep faith with its own principles is a perennial activity of moral life. Asking particularism to keep faith with its own principles is asking for trouble.
  • there is no more urgent task for American intellectuals and writers than to think critically about the salience, even the tyranny, of technology in individual and collective life
  • a methodological claim about the most illuminating way to explain history and human affairs, and about the essential inability of the natural sciences to offer a satisfactory explanation; a moral claim about the priority, and the universal nature, of certain values, not least tolerance and compassion
  • “Our very mastery seems to escape our mastery,” Michel Serres has anxiously remarked. “How can we dominate our domination; how can we master our own mastery?”
  • universal accessibility is not the end of the story, it is the beginning. The humanistic methods that were practiced before digitalization will be even more urgent after digitalization, because we will need help in navigating the unprecedented welter
  • Searches for keywords will not provide contexts for keywords. Patterns that are revealed by searches will not identify their own causes and reasons
  • The new order will not relieve us of the old burdens, and the old pleasures, of erudition and interpretation.
  • Is all this — is humanism — sentimental? But sentimentality is not always a counterfeit emotion. Sometimes sentiment is warranted by reality.
  • The persistence of humanism through the centuries, in the face of formidable intellectual and social obstacles, has been owed to the truth of its representations of our complexly beating hearts, and to the guidance that it has offered, in its variegated and conflicting versions, for a soulful and sensitive existence
  • a complacent humanist is a humanist who has not read his books closely, since they teach disquiet and difficulty. In a society rife with theories and practices that flatten and shrink and chill the human subject, the humanist is the dissenter.