TOK Friends: Group items tagged "guide"

Javier E

The Practical and the Theoretical - NYTimes.com - 1 views

  • Our society is divided into castes based upon a supposed division between theoretical knowledge and practical skill. The college professor holds forth on television, as the plumber fumes about detached ivory tower intellectuals.
  • There is a natural temptation to view these activities as requiring distinct capacities.
  • If these are distinct cognitive capacities, then knowing how to do something is not knowledge of a fact — that is, there is a distinction between practical and theoretical knowledge.
  • According to the model suggested by this supposed dichotomy, exercises of theoretical knowledge involve active reflection, engagement with the propositions or rules of the theory in question that guides the subsequent exercise of the knowledge. Think of the chess player following an instruction she has learned for an opening move in chess. In contrast, practical knowledge is exercised automatically and without reflection.
  • Additionally, the fact that exercises of theoretical knowledge are guided by propositions or rules seems to entail that they involve instructions that are universally applicable
  • when one reflects upon any exercise of knowledge, whether practical or theoretical, it appears to have the characteristics that would naïvely be ascribed to the exercise of both practical and intellectual capacities
  • Perhaps one way to distinguish practical knowledge and theoretical knowledge is by talking. When we acquire knowledge of how to do something, we may not be able to express our knowledge in words. But when we acquire knowledge of a truth, we are able to express this knowledge in words.
  • once one bears down on the supposed distinction between practical knowledge and knowledge of truths, it breaks down. The plumber’s or electrician’s activities are a manifestation of the same kind of intelligence as the scientist’s or historian’s latest articles — knowledge of truths.
  • these are distinctions along a continuum, rather than distinctions in kind, as the folk distinction between practical and theoretical pursuits is intended to be.
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real-time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute for or enhance some of our motivational powers.
  • beyond the practical issues lie a constellation of central ethical concerns.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • it is antithetical to the ideal of "resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that a lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that comes only when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self-narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions, can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

What's Wrong With the Teenage Mind? - WSJ.com - 1 views

  • What happens when children reach puberty earlier and adulthood later? The answer is: a good deal of teenage weirdness. Fortunately, developmental psychologists and neuroscientists are starting to explain the foundations of that weirdness.
  • The crucial new idea is that there are two different neural and psychological systems that interact to turn children into adults. Over the past two centuries, and even more over the past generation, the developmental timing of these two systems has changed. That, in turn, has profoundly changed adolescence and produced new kinds of adolescent woe. The big question for anyone who deals with young people today is how we can go about bringing these cogs of the teenage mind into sync once again
  • The first of these systems has to do with emotion and motivation. It is very closely linked to the biological and chemical changes of puberty and involves the areas of the brain that respond to rewards. This is the system that turns placid 10-year-olds into restless, exuberant, emotionally intense teenagers, desperate to attain every goal, fulfill every desire and experience every sensation. Later, it turns them back into relatively placid adults.
  • adolescents aren't reckless because they underestimate risks, but because they overestimate rewards—or, rather, find rewards more rewarding than adults do. The reward centers of the adolescent brain are much more active than those of either children or adults.
  • What teenagers want most of all are social rewards, especially the respect of their peers
  • Becoming an adult means leaving the world of your parents and starting to make your way toward the future that you will share with your peers. Puberty not only turns on the motivational and emotional system with new force, it also turns it away from the family and toward the world of equals.
  • The second crucial system in our brains has to do with control; it channels and harnesses all that seething energy. In particular, the prefrontal cortex reaches out to guide other parts of the brain, including the parts that govern motivation and emotion. This is the system that inhibits impulses and guides decision-making, that encourages long-term planning and delays gratification.
  • Today's adolescents develop an accelerator a long time before they can steer and brake.
  • Expertise comes with experience.
  • In gatherer-hunter and farming societies, childhood education involves formal and informal apprenticeship. Children have lots of chances to practice the skills that they need to accomplish their goals as adults, and so to become expert planners and actors.
  • In the past, to become a good gatherer or hunter, cook or caregiver, you would actually practice gathering, hunting, cooking and taking care of children all through middle childhood and early adolescence—tuning up just the prefrontal wiring you'd need as an adult. But you'd do all that under expert adult supervision and in the protected world of childhood
  • In contemporary life, the relationship between these two systems has changed dramatically. Puberty arrives earlier, and the motivational system kicks in earlier too. At the same time, contemporary children have very little experience with the kinds of tasks that they'll have to perform as grown-ups.
  • The experience of trying to achieve a real goal in real time in the real world is increasingly delayed, and the growth of the control system depends on just those experiences.
  • This control system depends much more on learning. It becomes increasingly effective throughout childhood and continues to develop during adolescence and adulthood, as we gain more experience.
  • An ever longer protected period of immaturity and dependence—a childhood that extends through college—means that young humans can learn more than ever before. There is strong evidence that IQ has increased dramatically as more children spend more time in school
  • children know more about more different subjects than they ever did in the days of apprenticeships.
  • Wide-ranging, flexible and broad learning, the kind we encourage in high school and college, may actually be in tension with the ability to develop finely honed, controlled, focused expertise in a particular skill, the kind of learning that once routinely took place in human societies.
  • this new explanation based on developmental timing elegantly accounts for the paradoxes of our particular crop of adolescents.
  • First, experience shapes the brain.
  • the brain is so powerful precisely because it is so sensitive to experience. It's as true to say that our experience of controlling our impulses makes the prefrontal cortex develop as it is to say that prefrontal development makes us better at controlling our impulses
  • Second, development plays a crucial role in explaining human nature
  • there is more and more evidence that genes are just the first step in complex developmental sequences, cascades of interactions between organism and environment, and that those developmental processes shape the adult brain. Even small changes in developmental timing can lead to big changes in who we become.
  • Brain research is often taken to mean that adolescents are really just defective adults—grown-ups with a missing part.
  • But the new view of the adolescent brain isn't that the prefrontal lobes just fail to show up; it's that they aren't properly instructed and exercised
  • Instead of simply giving adolescents more and more school experiences—those extra hours of after-school classes and homework—we could try to arrange more opportunities for apprenticeship
  • Summer enrichment activities like camp and travel, now so common for children whose parents have means, might be usefully alternated with summer jobs, with real responsibilities.
  • The two brain systems, the increasing gap between them, and the implications for adolescent education.
Javier E

untitled - 0 views

  • Scientists at Stanford University and the J. Craig Venter Institute have developed the first software simulation of an entire organism, a humble single-cell bacterium that lives in the human genital and respiratory tracts.
  • the work was a giant step toward developing computerized laboratories that could carry out many thousands of experiments much faster than is possible now, helping scientists penetrate the mysteries of diseases like cancer and Alzheimer’s.
  • cancer is not a one-gene problem; it’s a many-thousands-of-factors problem.”
  • This kind of modeling is already in use to study individual cellular processes like metabolism. But Dr. Covert said: “Where I think our work is different is that we explicitly include all of the genes and every known gene function. There’s no one else out there who has been able to include more than a handful of functions or more than, say, one-third of the genes.”
  • The simulation, which runs on a cluster of 128 computers, models the complete life span of the cell at the molecular level, charting the interactions of 28 categories of molecules — including DNA, RNA, proteins and small molecules known as metabolites, which are generated by cell processes.
  • They called the simulation an important advance in the new field of computational biology, which has recently yielded such achievements as the creation of a synthetic life form — an entire bacterial genome created by a team led by the genome pioneer J. Craig Venter. The scientists used it to take over an existing cell.
  • A decade ago, scientists developed simulations of metabolism that are now being used to study a wide array of cells, including bacteria, yeast and photosynthetic organisms. Other models exist for processes like protein synthesis.
  • “Right now, running a simulation for a single cell to divide only one time takes around 10 hours and generates half a gigabyte of data,” Dr. Covert wrote. “I find this fact completely fascinating, because I don’t know that anyone has ever asked how much data a living thing truly holds. We often think of the DNA as the storage medium, but clearly there is more to it than that.”
  • scientists chose an approach called object-oriented programming, which parallels the design of modern software systems. Software designers organize their programs in modules, which communicate with one another by passing data and instructions back and forth.
  • “The major modeling insight we had a few years ago was to break up the functionality of the cell into subgroups, which we could model individually, each with its own mathematics, and then to integrate these submodels together into a whole.” (A minimal illustrative sketch of this design follows below.)
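
A rough sense of that modular design can be conveyed in code. The sketch below is purely illustrative and is not the Stanford team's software: the submodel names, state variables, and update rules are invented placeholders for the idea of separately modeled processes, each with its own mathematics, integrated over a shared cell state.

```python
# Hypothetical sketch of a modular whole-cell simulation: each cellular
# process is a submodel with its own update rule, and an integrator
# advances them against a shared molecular state. All names and rates
# here are invented for illustration.

class Metabolism:
    def step(self, state, dt):
        # Toy rule: convert nutrients into metabolites at a fixed rate.
        made = min(state["nutrients"], 5.0 * dt)
        state["nutrients"] -= made
        state["metabolites"] += made

class ProteinSynthesis:
    def step(self, state, dt):
        # Toy rule: spend metabolites to build protein.
        used = min(state["metabolites"], 2.0 * dt)
        state["metabolites"] -= used
        state["protein"] += 0.5 * used

class Cell:
    """Integrates independent submodels over one shared state."""
    def __init__(self, submodels, state):
        self.submodels = submodels
        self.state = state

    def run(self, t_end, dt=0.1):
        t = 0.0
        while t < t_end:
            for m in self.submodels:  # each module updates the shared state
                m.step(self.state, dt)
            t += dt
        return self.state

cell = Cell([Metabolism(), ProteinSynthesis()],
            {"nutrients": 100.0, "metabolites": 0.0, "protein": 0.0})
print(cell.run(t_end=10.0))
```

The design choice mirrors the quote: each subgroup owns its own mathematics, and the integrator only coordinates the submodels and the state they share.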
Javier E

Book Club: A Guide To Living « The Dish - 0 views

  • He proves nothing that he doesn’t simultaneously subvert a little; he makes no over-arching argument about the way humans must live; he has no logician’s architecture or religious doctrine. He slips past all those familiar means of telling other people what’s good for them, and simply explains what has worked for him and others and leaves the reader empowered to forge her own future
  • You can see its eccentric power by considering the alternative ways of doing what Montaigne was doing. Think of contemporary self-help books – and all the fake certainty and rigid formulae they contain. Or think of a hideous idea like “the purpose-driven life” in which everything must be forced into the box of divine guidance in order to really live at all. Think of the stringency of Christian disciplines – say, the spiritual exercises of Ignatius of Loyola – and marvel at how Montaigne offers an entirely different and less compelling way to live. Think of the rigidity of Muslim practice and notice how much lee-way Montaigne gives to sin
  • This is a non-philosophical philosophy. It is a theory of practical life as told through one man’s random and yet not-so-random reflections on his time on earth. And it is shot through with doubt. Even the maxims that Montaigne embraces for living are edged with those critical elements of Montaigne’s thought that say “as far as I know”
  • Is this enough? Or is it rather a capitulation to relativism, a manifesto for political quietism, a worldview that treats injustice as something to be abhorred but not constantly fought against? This might be seen as the core progressive objection to the way of Montaigne. Or is his sensibility in an age of religious terror and violence and fanaticism the only ultimate solution we have?
  • here’s what we do know. We are fallible beings; we have nothing but provisional knowledge; and we will die. And this is enough. This does not mean we should give up inquiring or seeking to understand. Skepticism is not nihilism. It doesn’t posit that there is no truth; it merely notes that if truth exists, it is inherently beyond our ultimate grasp. And accepting those limits is the first step toward sanity, toward getting on with life. This is what I mean by conservatism.
  • you can find in philosophy any number of clues about how to live; you can even construct them into an ideology that explains all of human life and society – like Marxism or free market fundamentalism or a Nietzschean will to power. But as each totalist system broke down upon my further inspection, I found myself returning to Montaigne and the tradition of skepticism he represents
  • If I were to single out one theme of Montaigne’s work that has stuck with me, it would be this staring of death in the face, early and often, and never flinching. It is what our culture refuses to do much of the time, thereby disempowering us in the face of our human challenges.
Javier E

Why Our Memory Fails Us - NYTimes.com - 1 views

  • how we all usually respond when our memory is challenged. We have an abstract understanding that people can remember the same event differently.
  • But when our own memories are challenged, we may neglect all this and instead respond emotionally, acting as though we must be right and everyone else must be wrong.
  • It’s no accident that Oprah Winfrey’s latest best seller is called “What I Know For Sure,” rather than “Some Things That Might Be True.”
  • Our lack of appreciation for the fallibility of our own memories can lead to much bigger problems than a misattributed quote.
  • Memory failures that resemble Dr. Tyson’s mash-up of distinct experiences have led to false convictions, and even death sentences. Whose memories we believe and whose we disbelieve influence how we interpret controversial public events, as demonstrated most recently by the events in Ferguson, Mo.
  • For false memories, higher confidence was associated with lower accuracy.
  • In general, if you have seen something before, your confidence that you have seen it and your accuracy in recalling it are linked: The more confident you are in your memory, the more likely you are to be right
  • This fall the panel (which one of us, Daniel Simons, served on) released a comprehensive report that recommended procedures to minimize the chances of false memory and mistaken identification, including videotaping police lineups and improving jury instructions.
  • When we recall our own memories, we are not extracting a perfect record of our experiences and playing it back verbatim. Most people believe that memory works this way, but it doesn’t. Instead, we are effectively whispering a message from our past to our present, reconstructing it on the fly each time. We get a lot of details right, but when our memories change, we only “hear” the most recent version of the message, and we may assume that what we believe now is what we always believed.
  • Studies find that even our “flashbulb memories” of emotionally charged events can be distorted and inaccurate, but we cling to them with the greatest of confidence.
  • With each retrieval our memories can morph, and so can our confidence in them. This is why the National Academy of Sciences report strongly advised courts to rely on initial statements rather than courtroom proclamations:
  • In fact, the mere act of describing a person’s appearance can change how likely you are to pick him out of a lineup later. This finding, known as “verbal overshadowing,” had been controversial, but was recently verified in a collective effort by more than 30 separate research labs.
  • The science of memory distortion has become rigorous and reliable enough to help guide public policy. It should also guide our personal attitudes and actions.
  • It is just as misguided to conclude that someone who misremembers must be lying as it is to defend a false memory in the face of contradictory evidence. We should be more understanding of mistakes by others, and credit them when they admit they were wrong. We are all fabulists, and we must all get used to it.
  • Subliminal is a good book on this subject, as is Thinking, Fast and Slow. It is not merely that our memory is a game of telephone in which we garble the memory as we retrieve and re-store it. It appears to be a very useful part of existence, which has stayed with us or gotten stronger with evolution. There seems to be utility in forgetting, misremembering, and having memories coalesce with those of others in our social groups.
grayton downing

A Start to Saving Lives - Treating Sore Throats - NYTimes.com - 0 views

  • Getting parents to take sore throats more seriously and treating them more aggressively with penicillin could save thousands of lives in poor countries relatively cheaply, doctors from India and South Africa say.
  • The authors of a recent paper in the journal Global Heart estimate that a quarter of all sore throats are caused by strep A bacteria and that such infections lead to as many as 500,000 deaths a year, almost all of them in poor countries.
  • Cuba, Costa Rica and Martinique have sharply reduced rheumatic fever by public education about sore throat, screening for strep by symptoms, and treating quickly.
Javier E

Humans, Version 3.0 § SEEDMAGAZINE.COM - 0 views

  • Where are we humans going, as a species? If science fiction is any guide, we will genetically evolve like in X-Men, become genetically engineered as in Gattaca, or become cybernetically enhanced like General Grievous in Star Wars.
  • There is, however, another avenue for human evolution, one mostly unappreciated in both science and fiction. It is this unheralded mechanism that will usher in the next stage of human, giving future people exquisite powers we do not currently possess, powers worthy of natural selection itself. And, importantly, it doesn’t require us to transform into cyborgs or bio-engineered lab rats. It merely relies on our natural bodies and brains functioning as they have for millions of years. This mystery mechanism of human transformation is neuronal recycling, coined by neuroscientist Stanislas Dehaene, wherein the brain’s innate capabilities are harnessed for altogether novel functions.
  • The root of these misconceptions is the radical underappreciation of the design engineered by natural selection into the powers implemented by our bodies and brains, something central to my 2009 book, The Vision Revolution. For example, optical illusions (such as the Hering) are not examples of the brain’s poor hardware design, but, rather, consequences of intricate evolutionary software for generating perceptions that correct for neural latencies in normal circumstances.
  • Like all animal brains, human brains are not general-purpose universal learning machines, but, instead, are intricately structured suites of instincts optimized for the environments in which they evolved. To harness our brains, we want to let the brain’s brilliant mechanisms run as intended—i.e., not to be twisted. Rather, the strategy is to twist Y into a shape that the brain does know how to process.
  • there is a very good reason to be optimistic that the next stage of human will come via the form of adaptive harnessing, rather than direct technological enhancement: It has already happened. We have already been transformed via harnessing beyond what we once were. We’re already Human 2.0, not the Human 1.0, or Homo sapiens, that natural selection made us. We Human 2.0’s have, among many powers, three that are central to who we take ourselves to be today: writing, speech, and music (the latter perhaps being the pinnacle of the arts). Yet these three capabilities, despite having all the hallmarks of design, were not a result of natural selection, nor were they the result of genetic engineering or cybernetic enhancement to our brains. Instead, and as I argue in both The Vision Revolution and my forthcoming Harnessed, these are powers we acquired by virtue of harnessing, or neuronal recycling.
  • Although the step from Human 1.0 to 2.0 was via cultural selection, not via explicit human designers, does the transformation to Human 3.0 need to be entirely due to a process like cultural evolution, or might we have any hope of purposely guiding our transformation? When considering our future, that’s probably the most relevant question we should be asking ourselves.
  • One of my reasons for optimism is that nature-harnessing technologies (like writing, speech, and music) must mimic fundamental ecological features in nature, and that is a much easier task for scientists to tackle than emulating the exorbitantly complex mechanisms of the brain
Duncan H

Cancer Screening May Be More Popular Than Useful - NYTimes.com - 0 views

  • Now expert groups are proposing less screening for prostate, breast and cervical cancer and have emphasized that screening comes with harms as well as benefits.
  • the influential United States Preventive Services Task Force, which evaluates evidence and publishes screening guidelines, said that women in their 40s do not appear to benefit from mammograms and that women ages 50 to 74 should consider having them every two years instead.
  • Two recent clinical trials of prostate cancer screening cast doubt on whether many lives — or any — are saved. And it said that screening often leads to what can be disabling treatments for men whose cancer otherwise would never have harmed them. A new analysis of mammography concluded that while mammograms find cancer in 138,000 women each year, as many as 120,000 to 134,000 of those women either have cancers that are already lethal or have cancers that grow so slowly they do not need to be treated.
  • But these concepts are difficult for many to swallow. Specialists like urologists, radiologists and oncologists, who see patients who are sick and dying from cancer, often resist the idea of doing less screening. General practitioners, who may agree with the new guidelines, worry about getting involved in long conversations with patients trying to explain why they might reconsider having a mammogram every year or a P.S.A. test at all. Some doctors fear lawsuits if they do not screen and a patient develops a fatal cancer. Patients often say they will take their chances with screening’s harms if a test can save their lives.
  • And comments like Dr. Brawley’s give rise to other questions as well. Is all this happening now because of worries over costs? And in any case, is all this simply an academic argument, since most doctors, faced with real patients, still suggest frequent screening and their patients agree?
  • Who should get screening and when?
Javier E

A Meditation on the Art of Not Trying - NYTimes.com - 0 views

  • It’s the default prescription for any tense situation: a blind date, a speech, a job interview, the first dinner with the potential in-laws. Relax. Act natural. Just be yourself. But when you’re nervous, how can you be yourself?
  • Edward Slingerland. He has developed, quite deliberately, a theory of spontaneity based on millenniums of Asian philosophy and decades of research by psychologists and neuroscientists.
  • He calls it the paradox of wu wei, the Chinese term for “effortless action.”
  • Wu wei is integral to romance, religion, politics and commerce. It’s why some leaders have charisma and why business executives insist on a drunken dinner before sealing a deal.
  • the quest for wu wei has been going on ever since humans began living in groups larger than hunter-gathering clans. Unable to rely on the bonds of kinship, the first urban settlements survived by developing shared values, typically through religion, that enabled people to trust one another’s virtue and to cooperate for the common good.
  • But there was always the danger that someone was faking it and would make a perfectly rational decision to put his own interest first if he had a chance to shirk his duty.
  • To be trusted, it wasn’t enough just to be a sensible, law-abiding citizen, and it wasn’t even enough to dutifully strive to be virtuous. You had to demonstrate that your virtue was so intrinsic that it came to you effortlessly.
  • the discovery in 1993 of bamboo strips in a tomb in the village of Guodian in central China. The texts on the bamboo, composed more than three centuries before Christ, emphasize that following rules and fulfilling obligations are not enough to maintain social order.
  • These texts tell aspiring politicians that they must have an instinctive sense of their duties to their superiors: “If you try to be filial, this is not true filiality; if you try to be obedient, this is not true obedience. You cannot try, but you also cannot not try.”
  • is that authentic wu wei? Not according to the rival school of Taoists that arose around the same time as Confucianism, in the fifth century B.C. It was guided by the Tao Te Ching, “The Classic of the Way and Virtue,” which took a direct shot at Confucius: “The worst kind of Virtue never stops striving for Virtue, and so never achieves Virtue.”
  • Through willpower and the rigorous adherence to rules, traditions and rituals, the Confucian “gentleman” was supposed to learn proper behavior so thoroughly that it would eventually become second nature to him.
  • Taoists did not strive. Instead of following the rigid training and rituals required by Confucius, they sought to liberate the natural virtue within. They went with the flow. They disdained traditional music in favor of a funkier new style with a beat. They emphasized personal meditation instead of formal scholarship.
  • Variations of this debate would take place among Zen Buddhist, Hindu and Christian philosophers, and continue today among psychologists and neuroscientists arguing how much of morality and behavior is guided by rational choices or by unconscious feelings.
  • “Psychological science suggests that the ancient Chinese philosophers were genuinely on to something,” says Jonathan Schooler, a psychologist at the University of California, Santa Barbara. “Particularly when one has developed proficiency in an area, it is often better to simply go with the flow. Paralysis through analysis and overthinking are very real pitfalls that the art of wu wei was designed to avoid.”
  • Before signing a big deal, businesspeople often insist on getting to know potential partners at a boozy meal because alcohol makes it difficult to fake feelings.
  • Some people, like politicians and salespeople, can get pretty good at faking spontaneity, but we’re constantly looking for ways to expose them.
  • However wu wei is attained, there’s no debate about the charismatic effect it creates. It conveys an authenticity that makes you attractive, whether you’re addressing a crowd or talking to one person.
  • what’s the best strategy for wu wei — trying or not trying? Dr. Slingerland recommends a combination. Conscious effort is necessary to learn a skill, and the Confucian emphasis on following rituals is in accord with psychological research showing we have a limited amount of willpower. Training yourself to follow rules automatically can be liberating, because it conserves cognitive energy for other tasks.
  • He likes the compromise approach of Mencius, a Chinese philosopher in the fourth century B.C. who combined the Confucian and Taoist approaches: Try, but not too hard.
  • “But in many domains actual success requires the ability to transcend our training and relax completely into what we are doing, or simply forget ourselves as agents.”
  • The sprouts were Mencius’ conception of wu wei: Something natural that requires gentle cultivation. You plant the seeds and water the sprouts, but at some point you need to let nature take its course. Just let the sprouts be themselves.
Javier E

Why Teenagers Act Crazy - NYTimes.com - 1 views

  • there is a darker side to adolescence that, until now, was poorly understood: a surge during teenage years in anxiety and fearfulness. Largely because of a quirk of brain development, adolescents, on average, experience more anxiety and fear and have a harder time learning how not to be afraid than either children or adults.
  • the brain circuit for processing fear — the amygdala — is precocious and develops way ahead of the prefrontal cortex, the seat of reasoning and executive control. This means that adolescents have a brain that is wired with an enhanced capacity for fear and anxiety, but is relatively underdeveloped when it comes to calm reasoning.
  • the brain’s reward center, just like its fear circuit, matures earlier than the prefrontal cortex. That reward center drives much of teenagers’ risky behavior. This behavioral paradox also helps explain why adolescents are particularly prone to injury and trauma. The top three killers of teenagers are accidents, homicide and suicide.
  • The brain-development lag has huge implications for how we think about anxiety and how we treat it. It suggests that anxious adolescents may not be very responsive to psychotherapy that attempts to teach them to be unafraid, like cognitive behavior therapy
  • should also make us think twice — and then some — about the ever rising use of stimulants in young people, because these drugs may worsen anxiety and make it harder for teenagers to do what they are developmentally supposed to do: learn to be unafraid when it is appropriate
  • up to 20 percent of adolescents in the United States experience a diagnosable anxiety disorder, like generalized anxiety or panic attacks, probably resulting from a mix of genetic factors and environmental influences.
  • This isn’t to say that cognitive therapy is ineffective for teenagers, but that because of their relative difficulty in learning to be unafraid, it may not be the most effective treatment when used on its own.
  • Fear learning lies at the heart of anxiety and anxiety disorders. This primitive form of learning allows us to form associations between events and specific cues and environments that may predict danger.
  • once previously threatening cues or situations become safe, we have to be able to re-evaluate them and suppress our learned fear associations. People with anxiety disorders have trouble doing this and experience persistent fear in the absence of threat — better known as anxiety.
  • Dr. Casey discovered that adolescents had a much harder time “unlearning” the link between the colored square and the noise than children or adults did.
  • adolescents had trouble learning that a cue that was previously linked to something aversive was now neutral and “safe.” If you consider that adolescence is a time of exploration when young people develop greater autonomy, an enhanced capacity for fear and a more tenacious memory for threatening situations are adaptive and would confer survival advantage. In fact, the developmental gap between the amygdala and the prefrontal cortex that is described in humans has been found across mammalian species, suggesting that this is an evolutionary advantage.
  • As a psychiatrist, I’ve treated many adults with various anxiety disorders, nearly all of whom trace the origin of the problem to their teenage years. They typically report an uneventful childhood rudely interrupted by adolescent anxiety. For many, the anxiety was inexplicable and came out of nowhere.
  • prescription sales for stimulants increased more than fivefold between 2002 and 2012. This is of potential concern because it is well known from both human and animal studies that stimulants enhance learning and, in particular, fear conditioning.
Javier E

A Basic Guide for Curious Minds | Bill Gates - 1 views

  • Randall Munroe’s new book, Thing Explainer: Complicated Stuff in Simple Words,
  • Munroe sets out to explain various subjects—from how smartphones work to what the U.S. Constitution says—without any complicated terms. Instead he draws blueprint-style diagrams and annotates them using only the 1,000 most common words in the English language. A nuclear reactor is a “heavy metal power building.” A dishwasher is a “box that cleans food holders.”
sissij

Bacon Shortage? Calm Down. It's Fake News. - The New York Times - 2 views

  • The alarming headlines came quickly Wednesday morning: “Now It’s Getting Serious: 2017 Could See a Bacon Shortage.”
  • The source of the anxiety was a recent report from the U.S.D.A., boosted by the Ohio Pork Council, which reported that the country’s frozen pork belly inventory was at its lowest point in half a century.
  • To create a panic “was not our intent,” Mr. Deaton added with a laugh. “We can’t control how the news is interpreted.”
  • With the development of the Internet and social media, we find the news on websites, in print, and on TV more unreliable. Partly this is because we can easily find alternate statements that point out the flaws, but mostly it is because news today likes to use exaggeration to grab the attention of the general population. Media should consider the impact and panic they can cause in society before they report news. Although freedom of speech is appreciated, that doesn't mean the media can put aside their responsibility to guide the general population in a good direction. I remember there was a piece of fake news after the big earthquake in Japan saying that salt can prevent nuclear radiation; people panicked and bought salt. It was almost comical: in some places, people were even fighting over a pack of salt. The media should make sure that people won't misunderstand the message before they publish. --Sissi (2/1/2017)
Javier E

Opinion | Speaking as a White Male … - The New York Times - 0 views

  • If you go back to the intellectuals of the 1950s, you get the impression that they thought individuals could very much determine their own beliefs.
  • Busy fighting communism and fascism, people back then emphasized individual reason and were deeply allergic to groupthink.
  • We don’t think this way anymore, and in fact thinking this way can get you into trouble. I guess the first step was the rise of perspectivism
  • This is the belief, often traced back to Nietzsche, that what you believe is determined by where you stand: Our opinions are not guided by objective truth, because there is no such thing; they are guided by our own spot in society.
  • Then came Michel Foucault and critical race theorists and the rest, and the argument that society is structured by elites to preserve their privilege.
  • Now we are at a place where it is commonly assumed that your perceptions are something that come to you through your group, through your demographic identity.
  • What does that mean? After you’ve stated your group identity, what is the therefore that follows?
  • We’ve shifted from an emphasis on individual judgment toward a greater emphasis on collective experience.
  • Under what circumstances should we embrace the idea that collective identity shapes our thinking? Under what circumstances should we resist collective identity and insist on the primacy of individual discretion, and our common humanity?
  • On the one hand, the drive to bring in formerly marginalized groups has obviously been one of the great achievements of our era
  • Wider inclusion has vastly improved public debate
  • other times, group identity seems irrelevant to many issues
  • And there are other times when collective thinking seems positively corrupting. Why are people’s views of global warming, genetically modified foods and other scientific issues strongly determined by political label? That seems ridiculous.
  • Our whole education system is based on the idea that we train individuals to be critical thinkers. Our political system is based on the idea that persuasion and deliberation lead to compromise and toward truth. The basis of human dignity is our capacity to make up our own minds
  • One of the things I’ve learned in a lifetime in journalism is that people are always more unpredictable than their categories.
  • the notion that group membership determines opinion undermines all that.
  • If it’s just group against group, deliberation is a sham, beliefs are just masks groups use to preserve power structures, and democracy is a fraud.
  • The epistemological foundation of our system is in surprisingly radical flux.
Cecilia Ergueta

We Aren't Built to Live in the Moment - The New York Times - 2 views

  • Looking into the future, consciously and unconsciously, is a central function of our large brain, as psychologists and neuroscientists have discovered — rather belatedly, because for the past century most researchers have assumed that we’re prisoners of the past and the present.
  • Behavior, memory and perception can’t be understood without appreciating the central role of prospection. We learn not by storing static records but by continually retouching memories and imagining future possibilities
  • Our emotions are less reactions to the present than guides to future behavior
  • If traditional psychological theory had been correct, these people would have spent a lot of time ruminating. But they actually thought about the future three times more often than the past,
  • paid special attention to unexpected novelties because that was how they learned to avoid punishment and win rewards.
  • Your brain engages in the same sort of prospection to provide its own instant answers, which come in the form of emotions. The main purpose of emotions is to guide future behavior and moral judgments, according to researchers in a new field called prospective psychology. Emotions enable you to empathize with others by predicting their reactions.
  • If Homo prospectus takes the really long view, does he become morbid? That was a longstanding assumption in psychologists’ “terror management theory,” which held that humans avoid thinking about the future because they fear death. The theory was explored in hundreds of experiments assigning people to think about their own deaths. One common response was to become more assertive about one’s cultural values, like becoming more patriotic.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we're not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omertà backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way.
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
katedriscoll

Theory of Knowledge IB Guide | Part 5 | IB Blog - 0 views

  • All knowledge comes from somewhere. Even if we say it is innate (comes from within us) we still have to say how that knowledge appears. The Ways of Knowing are what they sound like, the methods through which knowledge becomes apparent to us. In the IB there are eight different ways of knowing: Language, Sense perception, Emotion, Reason, Imagination, Faith, Intuition and Memory. Although this might seem like a lot, the good news is that for the IB you’re only really advised to study four of them in depth (although it’s worth knowing how each of them works).
  • This quote from author Olivia Fox Cabane points out the power of the human imagination. What is being described here is what we traditionally call imagination: the ability to form a mental representation of a sense experience without the normal stimulus. There is another form of imagination, however. Propositional imagining is the idea of ‘imagining that’ things were different than they are, for example that the Cold War had never ended.
tongoscar

Reason Vs. Emotion - 0 views

  • we are all guided by both reason and emotion, and both play important parts.
  • emotional intelligence can be a stronger predictor of many dimensions of life success than IQ,
  • Emotions can be influenced by thought (the emphasis of Cognitive psychotherapies), and thoughts are influenced by emotion (an emphasis of Emotionally Focused therapies). A third element is behavior — which I believe also interplays similarly with thought and emotion.
  • Emotion and reason each have somewhat different, but complementary and interlaced roles. They both provide information and guide behavior.
  • Negative emotions are opportunities for learning and closeness.
  • Each emotion conveys its own message.
  • Again, what helps here is understanding that you each have different styles, that neither is right or wrong, and you can find ways to bridge a little bit.