TOK Friends / Group items matching "into" in title, tags, annotations or url

Javier E

Why Our Children Don't Think There Are Moral Facts - NYTimes.com - 1 views

  • I already knew that many college-aged students don’t believe in moral facts.
  • the overwhelming majority of college freshmen in their classrooms view moral claims as mere opinions that are not true or are true only relative to a culture.
  • where is the view coming from?
  • ...32 more annotations...
  • the Common Core standards used by a majority of K-12 programs in the country require that students be able to “distinguish among fact, opinion, and reasoned judgment in a text.”
  • So what’s wrong with this distinction and how does it undermine the view that there are objective moral facts?
  • For example, many people once thought that the earth was flat. It’s a mistake to confuse truth (a feature of the world) with proof (a feature of our mental lives)
  • Furthermore, if proof is required for facts, then facts become person-relative. Something might be a fact for me if I can prove it but not a fact for you if you can’t. In that case, E = mc² is a fact for a physicist but not for me.
  • worse, students are taught that claims are either facts or opinions. They are given quizzes in which they must sort claims into one camp or the other but not both. But if a fact is something that is true and an opinion is something that is believed, then many claims will obviously be both
  • How does the dichotomy between fact and opinion relate to morality?
  • Kids are asked to sort facts from opinions and, without fail, every value claim is labeled as an opinion.
  • Here’s a little test devised from questions available on fact vs. opinion worksheets online: are the following facts or opinions?
    — Copying homework assignments is wrong.
    — Cursing in school is inappropriate behavior.
    — All men are created equal.
    — It is worth sacrificing some personal liberties to protect our country from terrorism.
    — It is wrong for people under the age of 21 to drink alcohol.
    — Vegetarians are healthier than people who eat meat.
    — Drug dealers belong in prison.
  • The answer? In each case, the worksheets categorize these claims as opinions. The explanation on offer is that each of these claims is a value claim and value claims are not facts. This is repeated ad nauseam: any claim with good, right, wrong, etc. is not a fact.
  • In summary, our public schools teach students that all claims are either facts or opinions and that all value and moral claims fall into the latter camp. The punchline: there are no moral facts. And if there are no moral facts, then there are no moral truths.
  • It should not be a surprise that there is rampant cheating on college campuses: If we’ve taught our students for 12 years that there is no fact of the matter as to whether cheating is wrong, we can’t very well blame them for doing so later on.
  • If it’s not true that it’s wrong to murder a cartoonist with whom one disagrees, then how can we be outraged? If there are no truths about what is good or valuable or right, how can we prosecute people for crimes against humanity? If it’s not true that all humans are created equal, then why vote for any political system that doesn’t benefit you over others?
  • the curriculum sets our children up for doublethink. They are told that there are no moral facts in one breath even as the next tells them how they ought to behave.
  • Our children deserve a consistent intellectual foundation. Facts are things that are true. Opinions are things we believe. Some of our beliefs are true. Others are not. Some of our beliefs are backed by evidence. Others are not.
  • Value claims are like any other claims: either true or false, evidenced or not.
  • The hard work lies not in recognizing that at least some moral claims are true but in carefully thinking through our evidence for which of the many competing moral claims is correct.
  • Moral truths are not the same as scientific truths or mathematical truths. Yet they may still be used as guiding principles for our individual lives as well as our laws. But there is equal danger in giving moral judgments the designation of truth as in not doing so. Many people believe that abortion is murder on the same level as shooting someone with a gun. But many others do not. So is it true that abortion is murder? Moral principles can become generally accepted and then form the basis for our laws. But many long-accepted moral principles were later rejected as faulty. "Separate but equal" is an example. Judging homosexual relationships as immoral is another example.
  • Whoa! That Einstein derived an equation is a fact. But the equation represents a theory that may have to be tweaked at some point in the future. It may be a fact that the equation foretold the violence of atomic explosions, but there are aspects of nature that elude the equation. Remember "the theory of everything?"
  • Here is a moral fact: this is a sermon masquerading as a philosophical debate on facts, opinions and truth. This professor of religion is asserting that the government, via the Common Core, is teaching atheism through the opinion-vs.-fact distinction. He is arguing, dishonestly, that public schools should be teaching moral facts. Of course, "moral facts" is code for the Ten Commandments.
  • As a fourth grade teacher, I try to teach students to read critically, including distinguishing between facts and opinions as they read (and have been doing this long before the Common Core arrived, by the way). It's not always easy for children to grasp the difference. I can only imagine the confusion that would ensue if I introduced a third category -- moral "facts" that can't be proven but are true nonetheless!
  • horrible acts occur not because of moral uncertainty, but because people are too sure that their views on morality are 100% true, and that anyone who fails to recognize and submit to them is a heathen who deserves death. I can't think of any case where a society has suffered because people were too thoughtful and open-minded about different perspectives on moral truth. In any case, it's not an elementary school's job to teach "moral truths."
  • The characterization of moral anti-realism as some sort of fringe view in philosophy is misleading. Claims that can be true or false are, it seems, 'made true' by features of the world. It's not clear to many in philosophy (like me) just what features of the world could make our moral claims true. We are more likely to see people's value claims as making claims about, and enforcing conformity to, our own (contingent) social norms. This is not to hold, as Mr. McBrayer seems to think follows, that there are no reasons to endorse or criticize these social norms.
  • This is nonsense. Giving kids the tools to distinguish between fact and opinion is hard enough in an age when Republicans actively deny reality on Fox News every night. The last thing we need is to muddy their thinking with the concept of "moral facts." A fact is a belief that everyone _should_ agree upon because it is observable and testable. Morals are not agreed upon by all. Consider the hot-button issue of abortion.
  • Truthfully, I'm not terribly concerned that third graders will end up taking these lessons in the definition of fact versus opinion to the extremes considered here, or take them as a license to cheat. That will come much later, when they figure out, as people always have, what they can get away with. But Prof. McBrayer, with his blithe expectation that all the grownups know that there are moral "facts"? He scares the heck out of me.
  • I've long chafed at the language of "fact" v. "opinion", which is grounded in a very particular, limited view of human cognition. In my own ethics courses, I work actively to undermine the distinction, focusing instead on considered judgment . . . or even more narrowly, on consideration itself. (See http://wp.me/p5Ag0i-6M )
  • The real waffle here is the very concept of "moral facts." Our statements of values, even very important ones, are obviously not facts. Trying to dress them up as if they were facts argues, to me, for a pretty serious moral weakness on the part of those advancing the idea.
  • Our core values are not important because they are facts. They are important because we collectively hold them and cherish them. To lean on the false crutch of "moral facts" is to admit the weakness of your own moral convictions.
  • I would like to believe that there is a core of moral facts/values upon which all humanity can agree, but it would be tough to identify exactly what those are.
  • For the ancient philosophers, reality comprised the Good, the True, and the Beautiful (what we might now call ethics, science and art), seeing these as complementary and inseparable, though distinct, realms. With the ascendancy of science in our culture as the only valid measure of reality, to the detriment of ethics and art (that is, if it is not observable and provable, it is not real), we have turned the good and the beautiful into mere "social constructs" that have no validity on their own. While I am sympathetic in many ways with Dr. McBrayer's objections, I think he falls into the trap of discounting the Good and the Beautiful as valid in and of themselves, and tries, instead, to find ways to give them validity through the True. I think his argument would have been stronger had he used the language of validity rather than the language of truth. Goodness, Truth and Beauty each have their own validity, though interdependent and inseparable. When we artificially extract one of these and give it primacy, we distort reality and alienate ourselves from it.
  • Professor McBrayer seems to miss the major point of the Common Core concern: can students distinguish between premises based on (reasonably construed) fact and premises based on emotion when evaluating conclusions? I would prefer that students learn to reason rather than be taught moral 'truth' that follows Professor McBrayer's logic.
  • Moral issues cannot scientifically be treated on the level that Prof. McBrayer is attempting to use in this column: true or false, fact or opinion or both. Instead, they should be treated as important characteristics of the systematic working of a society or of a group of people in general. One can compare the working of two groups of people: one in which e.g. cheating and lying is acceptable, and one in which they are not. One can use historical or model examples to show the consequences and the working of specific systems of morals. I think that this method - suitably adjusted - can be used even in second grade.
  • Relativism has nothing to do with liberalism. The second point is that I'm not sure it does all that much harm, because I have yet to encounter a student who thought that he or she had to withhold judgment on those who hold opposing political views!
Javier E

[Six Questions] | Astra Taylor on The People's Platform: Taking Back Power and Culture in the Digital Age | Harper's Magazine - 1 views

  • Astra Taylor, a cultural critic and the director of the documentaries Zizek! and Examined Life, challenges the notion that the Internet has brought us into an age of cultural democracy. While some have hailed the medium as a platform for diverse voices and the free exchange of information and ideas, Taylor shows that these assumptions are suspect at best. Instead, she argues, the new cultural order looks much like the old: big voices overshadow small ones, content is sensationalist and powered by advertisements, quality work is underfunded, and corporate giants like Google and Facebook rule. The Internet does offer promising tools, Taylor writes, but a cultural democracy will be born only if we work collaboratively to develop the potential of this powerful resource
  • Most people don’t realize how little information can be conveyed in a feature film. The transcripts of both of my movies are probably equivalent in length to a Harper’s cover story.
  • why should Amazon, Apple, Facebook, and Google get a free pass? Why should we expect them to behave any differently over the long term? The tradition of progressive media criticism that came out of the Frankfurt School, not to mention the basic concept of political economy (looking at the way business interests shape the cultural landscape), was nowhere to be seen, and that worried me. It’s not like political economy became irrelevant the second the Internet was invented.
  • ...15 more annotations...
  • How do we reconcile our enjoyment of social media even as we understand that the corporations who control them aren’t always acting in our best interests?
  • That was because the underlying economic conditions hadn’t been changed or “disrupted,” to use a favorite Silicon Valley phrase. Google has to serve its shareholders, just like NBCUniversal does. As a result, many of the unappealing aspects of the legacy-media model have simply carried over into a digital age — namely, commercialism, consolidation, and centralization. In fact, the new system is even more dependent on advertising dollars than the one that preceded it, and digital advertising is far more invasive and ubiquitous
  • the popular narrative — new communications technologies would topple the establishment and empower regular people — didn’t accurately capture reality. Something more complex and predictable was happening. The old-media dinosaurs weren’t dying out, but were adapting to the online environment; meanwhile the new tech titans were coming increasingly to resemble their predecessors
  • I use lots of products that are created by companies whose business practices I object to and that don’t act in my best interests, or the best interests of workers or the environment — we all do, since that’s part of living under capitalism. That said, I refuse to invest so much in any platform that I can’t quit without remorse
  • these services aren’t free even if we don’t pay money for them; we pay with our personal data, with our privacy. This feeds into the larger surveillance debate, since government snooping piggybacks on corporate data collection. As I argue in the book, there are also negative cultural consequences (e.g., when advertisers are paying the tab we get more of the kind of culture marketers like to associate themselves with and less of the stuff they don’t) and worrying social costs. For example, the White House and the Federal Trade Commission have both recently warned that the era of “big data” opens new avenues of discrimination and may erode hard-won consumer protections.
  • I’m resistant to the tendency to place this responsibility solely on the shoulders of users. Gadgets and platforms are designed to be addictive, with every element from color schemes to headlines carefully tested to maximize clickability and engagement. The recent news that Facebook tweaked its algorithms for a week in 2012, showing hundreds of thousands of users only “happy” or “sad” posts in order to study emotional contagion — in other words, to manipulate people’s mental states — is further evidence that these platforms are not neutral. In the end, Facebook wants us to feel the emotion of wanting to visit Facebook frequently
  • social inequalities that exist in the real world remain meaningful online. What are the particular dangers of discrimination on the Internet?
  • That it’s invisible or at least harder to track and prove. We haven’t figured out how to deal with the unique ways prejudice plays out over digital channels, and that’s partly because some folks can’t accept the fact that discrimination persists online. (After all, there is no sign on the door that reads Minorities Not Allowed.)
  • just because the Internet is open doesn’t mean it’s equal; offline hierarchies carry over to the online world and are even amplified there. For the past year or so, there has been a lively discussion taking place about the disproportionate and often outrageous sexual harassment women face simply for entering virtual space and asserting themselves there — research verifies that female Internet users are dramatically more likely to be threatened or stalked than their male counterparts — and yet there is very little agreement about what, if anything, can be done to address the problem.
  • What steps can we take to encourage better representation of independent and non-commercial media? We need to fund it, first and foremost. As individuals this means paying for the stuff we believe in and want to see thrive. But I don’t think enlightened consumption can get us where we need to go on its own. I’m skeptical of the idea that we can shop our way to a better world. The dominance of commercial media is a social and political problem that demands a collective solution, so I make an argument for state funding and propose a reconceptualization of public media. More generally, I’m struck by the fact that we use these civic-minded metaphors, calling Google Books a “library” or Twitter a “town square” — or even calling social media “social” — but real public options are off the table, at least in the United States. We hand the digital commons over to private corporations at our peril.
  • You advocate for greater government regulation of the Internet. Why is this important?
  • I’m for regulating specific things, like Internet access, which is what the fight for net neutrality is ultimately about. We also need stronger privacy protections and restrictions on data gathering, retention, and use, which won’t happen without a fight.
  • I challenge the techno-libertarian insistence that the government has no productive role to play and that it needs to keep its hands off the Internet for fear that it will be “broken.” The Internet and personal computing as we know them wouldn’t exist without state investment and innovation, so let’s be real.
  • there’s a pervasive and ill-advised faith that technology will promote competition if left to its own devices (“competition is a click away,” tech executives like to say), but that’s not true for a variety of reasons. The paradox of our current media landscape is this: our devices and consumption patterns are ever more personalized, yet we’re simultaneously connected to this immense, opaque, centralized infrastructure. We’re all dependent on a handful of firms that are effectively monopolies — from Time Warner and Comcast on up to Google and Facebook — and we’re seeing increased vertical integration, with companies acting as both distributors and creators of content. Amazon aspires to be the bookstore, the bookshelf, and the book. Google isn’t just a search engine, a popular browser, and an operating system; it also invests in original content
  • So it’s not that the Internet needs to be regulated but that these big tech corporations need to be subject to governmental oversight. After all, they are reaching farther and farther into our intimate lives. They’re watching us. Someone should be watching them.
kortanekev

The Crisis Of Market Fundamentalism - 0 views

  • At the same time, a second, more momentous, intellectual revolution will be needed regarding government intervention in social outcomes and economic structures. Market fundamentalism conceals a profound contradiction. Free trade, technological progress, and other forces that promote economic “efficiency” are presented as beneficial to society, even if they harm individual workers or businesses, because growing national incomes allow winners to compensate losers, ensuring that nobody is left worse off.
  •  
    This article calls into question the definition of a "good outcome." Who does it benefit? Does some solution benefit the masses but take advantage of a minority? The highlighted passage shows the ideal of Pareto efficiency: no one is left worse off after a change has been made. So even if only one group has benefitted, the other groups are no worse off (unless one takes into account discrimination, etc.).  Evie K 3/1/17
sissij

Turning Negative Thinkers Into Positive Ones - The New York Times - 0 views

  • I leave the Y grinning from ear to ear, uplifted not just by my own workout but even more so by my interaction with these darling representatives of the next generation.
  • I lived for half a century with a man who suffered from periodic bouts of depression, so I understand how challenging negativism can be.
  • “micro-moments of positivity,”
  • ...6 more annotations...
  • The research that Dr. Fredrickson and others have done demonstrates that the extent to which we can generate positive emotions from even everyday activities can determine who flourishes and who doesn’t.
  • Clearly, there are times and situations that naturally result in negative feelings in the most upbeat of individuals. Worry, sadness, anger and other such “downers” have their place in any normal life.
  • Negative feelings activate a region of the brain called the amygdala, which is involved in processing fear and anxiety and other emotions.
  • He, Dr. Fredrickson, and their colleagues have demonstrated that the brain is “plastic,” or capable of generating new cells and pathways, and it is possible to train the circuitry in the brain to promote more positive responses.
  • reinforce positivity
  • Practice mindfulness. Ruminating on past problems or future difficulties drains mental resources and steals attention from current pleasures.
  •  
    The distance between a negative attitude and a positive attitude is not that far. Just by changing a few words in a sentence, we can describe an event in a really positive manner. From my personal experience, attitude is like a habit. If you always think negatively, your brain tends to give pessimistic responses to events. So sometimes you have to train your brain to think positively. As we learned in TOK, we tend to see things and think in patterns, so it is very important to create a good pattern for our thinking. --Sissi (4/3/2017)
Javier E

What's Wrong With the Teenage Mind? - WSJ.com - 1 views

  • What happens when children reach puberty earlier and adulthood later? The answer is: a good deal of teenage weirdness. Fortunately, developmental psychologists and neuroscientists are starting to explain the foundations of that weirdness.
  • The crucial new idea is that there are two different neural and psychological systems that interact to turn children into adults. Over the past two centuries, and even more over the past generation, the developmental timing of these two systems has changed. That, in turn, has profoundly changed adolescence and produced new kinds of adolescent woe. The big question for anyone who deals with young people today is how we can go about bringing these cogs of the teenage mind into sync once again
  • The first of these systems has to do with emotion and motivation. It is very closely linked to the biological and chemical changes of puberty and involves the areas of the brain that respond to rewards. This is the system that turns placid 10-year-olds into restless, exuberant, emotionally intense teenagers, desperate to attain every goal, fulfill every desire and experience every sensation. Later, it turns them back into relatively placid adults.
  • ...23 more annotations...
  • adolescents aren't reckless because they underestimate risks, but because they overestimate rewards—or, rather, find rewards more rewarding than adults do. The reward centers of the adolescent brain are much more active than those of either children or adults.
  • What teenagers want most of all are social rewards, especially the respect of their peers
  • Becoming an adult means leaving the world of your parents and starting to make your way toward the future that you will share with your peers. Puberty not only turns on the motivational and emotional system with new force, it also turns it away from the family and toward the world of equals.
  • The second crucial system in our brains has to do with control; it channels and harnesses all that seething energy. In particular, the prefrontal cortex reaches out to guide other parts of the brain, including the parts that govern motivation and emotion. This is the system that inhibits impulses and guides decision-making, that encourages long-term planning and delays gratification.
  • Today's adolescents develop an accelerator a long time before they can steer and brake.
  • Expertise comes with experience.
  • In gatherer-hunter and farming societies, childhood education involves formal and informal apprenticeship. Children have lots of chances to practice the skills that they need to accomplish their goals as adults, and so to become expert planners and actors.
  • In the past, to become a good gatherer or hunter, cook or caregiver, you would actually practice gathering, hunting, cooking and taking care of children all through middle childhood and early adolescence—tuning up just the prefrontal wiring you'd need as an adult. But you'd do all that under expert adult supervision and in the protected world of childhood
  • In contemporary life, the relationship between these two systems has changed dramatically. Puberty arrives earlier, and the motivational system kicks in earlier too. At the same time, contemporary children have very little experience with the kinds of tasks that they'll have to perform as grown-ups.
  • The experience of trying to achieve a real goal in real time in the real world is increasingly delayed, and the growth of the control system depends on just those experiences.
  • This control system depends much more on learning. It becomes increasingly effective throughout childhood and continues to develop during adolescence and adulthood, as we gain more experience.
  • An ever longer protected period of immaturity and dependence—a childhood that extends through college—means that young humans can learn more than ever before. There is strong evidence that IQ has increased dramatically as more children spend more time in school
  • children know more about more different subjects than they ever did in the days of apprenticeships.
  • Wide-ranging, flexible and broad learning, the kind we encourage in high-school and college, may actually be in tension with the ability to develop finely-honed, controlled, focused expertise in a particular skill, the kind of learning that once routinely took place in human societies.
  • this new explanation based on developmental timing elegantly accounts for the paradoxes of our particular crop of adolescents.
  • First, experience shapes the brain.
  • the brain is so powerful precisely because it is so sensitive to experience. It's as true to say that our experience of controlling our impulses makes the prefrontal cortex develop as it is to say that prefrontal development makes us better at controlling our impulses
  • Second, development plays a crucial role in explaining human nature
  • there is more and more evidence that genes are just the first step in complex developmental sequences, cascades of interactions between organism and environment, and that those developmental processes shape the adult brain. Even small changes in developmental timing can lead to big changes in who we become.
  • Brain research is often taken to mean that adolescents are really just defective adults—grown-ups with a missing part.
  • But the new view of the adolescent brain isn't that the prefrontal lobes just fail to show up; it's that they aren't properly instructed and exercised
  • Instead of simply giving adolescents more and more school experiences—those extra hours of after-school classes and homework—we could try to arrange more opportunities for apprenticeship
  • Summer enrichment activities like camp and travel, now so common for children whose parents have means, might be usefully alternated with summer jobs, with real responsibilities.
  •  
    The two brain systems, the increasing gap between them, and the implications for adolescent education.
Javier E

What Have We Learned, If Anything? by Tony Judt | The New York Review of Books - 0 views

  • During the Nineties, and again in the wake of September 11, 2001, I was struck more than once by a perverse contemporary insistence on not understanding the context of our present dilemmas, at home and abroad; on not listening with greater care to some of the wiser heads of earlier decades; on seeking actively to forget rather than remember, to deny continuity and proclaim novelty on every possible occasion. We have become stridently insistent that the past has little of interest to teach us. Ours, we assert, is a new world; its risks and opportunities are without precedent.
  • the twentieth century that we have chosen to commemorate is curiously out of focus. The overwhelming majority of places of official twentieth-century memory are either avowedly nostalgo-triumphalist—praising famous men and celebrating famous victories—or else, and increasingly, they are opportunities for the recollection of selective suffering.
  • The problem with this lapidary representation of the last century as a uniquely horrible time from which we have now, thankfully, emerged is not the description—it was in many ways a truly awful era, an age of brutality and mass suffering perhaps unequaled in the historical record. The problem is the message: that all of that is now behind us, that its meaning is clear, and that we may now advance—unencumbered by past errors—into a different and better era.
  • ...19 more annotations...
  • Today, the “common” interpretation of the recent past is thus composed of the manifold fragments of separate pasts, each of them (Jewish, Polish, Serb, Armenian, German, Asian-American, Palestinian, Irish, homosexual…) marked by its own distinctive and assertive victimhood.
  • The resulting mosaic does not bind us to a shared past, it separates us from it. Whatever the shortcomings of the national narratives once taught in school, however selective their focus and instrumental their message, they had at least the advantage of providing a nation with past references for present experience. Traditional history, as taught to generations of schoolchildren and college students, gave the present a meaning by reference to the past: today’s names, places, inscriptions, ideas, and allusions could be slotted into a memorized narrative of yesterday. In our time, however, this process has gone into reverse. The past now acquires meaning only by reference to our many and often contrasting present concerns.
  • the United States thus has no modern memory of combat or loss remotely comparable to that of the armed forces of other countries. But it is civilian casualties that leave the most enduring mark on national memory and here the contrast is piquant indeed
  • Today, the opposite applies. Most people in the world outside of sub-Saharan Africa have access to a near infinity of data. But in the absence of any common culture beyond a small elite, and not always even there, the fragmented information and ideas that people select or encounter are determined by a multiplicity of tastes, affinities, and interests. As the years pass, each one of us has less in common with the fast-multiplying worlds of our contemporaries, not to speak of the world of our forebears.
  • What is significant about the present age of transformations is the unique insouciance with which we have abandoned not merely the practices of the past but their very memory. A world just recently lost is already half forgotten.
  • In the US, at least, we have forgotten the meaning of war. There is a reason for this.
  • Until the last decades of the twentieth century most people in the world had limited access to information; but—thanks to national education, state-controlled radio and television, and a common print culture—within any one state or nation or community people were all likely to know many of the same things.
  • it was precisely that claim, that “it’s torture, and therefore it’s no good,” which until very recently distinguished democracies from dictatorships. We pride ourselves on having defeated the “evil empire” of the Soviets. Indeed so. But perhaps we should read again the memoirs of those who suffered at the hands of that empire—the memoirs of Eugen Loebl, Artur London, Jo Langer, Lena Constante, and countless others—and then compare the degrading abuses they suffered with the treatments approved and authorized by President Bush and the US Congress. Are they so very different?
  • As a consequence, the United States today is the only advanced democracy where public figures glorify and exalt the military, a sentiment familiar in Europe before 1945 but quite unknown today
  • the complacent neoconservative claim that war and conflict are things Americans understand—in contrast to naive Europeans with their pacifistic fantasies—seems to me exactly wrong: it is Europeans (along with Asians and Africans) who understand war all too well. Most Americans have been fortunate enough to live in blissful ignorance of its true significance.
  • That same contrast may account for the distinctive quality of much American writing on the cold war and its outcome. In European accounts of the fall of communism, from both sides of the former Iron Curtain, the dominant sentiment is one of relief at the closing of a long, unhappy chapter. Here in the US, however, the story is typically recorded in a triumphalist key.
  • For many American commentators and policymakers the message of the twentieth century is that war works. Hence the widespread enthusiasm for our war on Iraq in 2003 (despite strong opposition to it in most other countries). For Washington, war remains an option—on that occasion the first option. For the rest of the developed world it has become a last resort.
  • Ignorance of twentieth-century history does not just contribute to a regrettable enthusiasm for armed conflict. It also leads to a misidentification of the enemy.
  • This abstracting of foes and threats from their context—this ease with which we have talked ourselves into believing that we are at war with “Islamofascists,” “extremists” from a strange culture, who dwell in some distant “Islamistan,” who hate us for who we are and seek to destroy “our way of life”—is a sure sign that we have forgotten the lesson of the twentieth century: the ease with which war and fear and dogma can bring us to demonize others, deny them a common humanity or the protection of our laws, and do unspeakable things to them.
  • How else are we to explain our present indulgence for the practice of torture? For indulge it we assuredly do.
  • “But what would I have achieved by proclaiming my opposition to torture?” he replied. “I have never met anyone who is in favor of torture.” Well, times have changed. In the US today there are many respectable, thinking people who favor torture—under the appropriate circumstances and when applied to those who merit it.
  • American civilian losses (excluding the merchant navy) in both world wars amounted to less than 2,000 dead.
  • We are slipping down a slope. The sophistic distinctions we draw today in our war on terror—between the rule of law and “exceptional” circumstances, between citizens (who have rights and legal protections) and noncitizens to whom anything can be done, between normal people and “terrorists,” between “us” and “them”—are not new. The twentieth century saw them all invoked. They are the selfsame distinctions that licensed the worst horrors of the recent past: internment camps, deportation, torture, and murder—those very crimes that prompt us to murmur “never again.” So what exactly is it that we think we have learned from the past? Of what possible use is our self-righteous cult of memory and memorials if the United States can build its very own internment camp and torture people there?
  • We need to learn again—or perhaps for the first time—how war brutalizes and degrades winners and losers alike and what happens to us when, having heedlessly waged war for no good reason, we are encouraged to inflate and demonize our enemies in order to justify that war’s indefinite continuance.
Javier E

George Packer: Is Amazon Bad for Books? : The New Yorker - 0 views

  • Amazon is a global superstore, like Walmart. It’s also a hardware manufacturer, like Apple, and a utility, like Con Edison, and a video distributor, like Netflix, and a book publisher, like Random House, and a production studio, like Paramount, and a literary magazine, like The Paris Review, and a grocery deliverer, like FreshDirect, and someday it might be a package service, like U.P.S. Its founder and chief executive, Jeff Bezos, also owns a major newspaper, the Washington Post. All these streams and tributaries make Amazon something radically new in the history of American business
  • Amazon is not just the “Everything Store,” to quote the title of Brad Stone’s rich chronicle of Bezos and his company; it’s more like the Everything. What remains constant is ambition, and the search for new things to be ambitious about.
  • It wasn’t a love of books that led him to start an online bookstore. “It was totally based on the property of books as a product,” Shel Kaphan, Bezos’s former deputy, says. Books are easy to ship and hard to break, and there was a major distribution warehouse in Oregon. Crucially, there are far too many books, in and out of print, to sell even a fraction of them at a physical store. The vast selection made possible by the Internet gave Amazon its initial advantage, and a wedge into selling everything else.
  • ...38 more annotations...
  • it’s impossible to know for sure, but, according to one publisher’s estimate, book sales in the U.S. now make up no more than seven per cent of the company’s roughly seventy-five billion dollars in annual revenue.
  • A monopoly is dangerous because it concentrates so much economic power, but in the book business the prospect of a single owner of both the means of production and the modes of distribution is especially worrisome: it would give Amazon more control over the exchange of ideas than any company in U.S. history.
  • “The key to understanding Amazon is the hiring process,” one former employee said. “You’re not hired to do a particular job—you’re hired to be an Amazonian. Lots of managers had to take the Myers-Briggs personality tests. Eighty per cent of them came in two or three similar categories, and Bezos is the same: introverted, detail-oriented, engineer-type personality. Not musicians, designers, salesmen. The vast majority fall within the same personality type—people who graduate at the top of their class at M.I.T. and have no idea what to say to a woman in a bar.”
  • According to Marcus, Amazon executives considered publishing people “antediluvian losers with rotary phones and inventory systems designed in 1968 and warehouses full of crap.” Publishers kept no data on customers, making their bets on books a matter of instinct rather than metrics. They were full of inefficiencies, starting with overpriced Manhattan offices.
  • For a smaller house, Amazon’s total discount can go as high as sixty per cent, which cuts deeply into already slim profit margins. Because Amazon manages its inventory so well, it often buys books from small publishers with the understanding that it can’t return them, for an even deeper discount
  • According to one insider, around 2008—when the company was selling far more than books, and was making twenty billion dollars a year in revenue, more than the combined sales of all other American bookstores—Amazon began thinking of content as central to its business. Authors started to be considered among the company’s most important customers. By then, Amazon had lost much of the market in selling music and videos to Apple and Netflix, and its relations with publishers were deteriorating
  • In its drive for profitability, Amazon did not raise retail prices; it simply squeezed its suppliers harder, much as Walmart had done with manufacturers. Amazon demanded ever-larger co-op fees and better shipping terms; publishers knew that they would stop being favored by the site’s recommendation algorithms if they didn’t comply. Eventually, they all did.
  • Brad Stone describes one campaign to pressure the most vulnerable publishers for better terms: internally, it was known as the Gazelle Project, after Bezos suggested “that Amazon should approach these small publishers the way a cheetah would pursue a sickly gazelle.”
  • Without dropping co-op fees entirely, Amazon simplified its system: publishers were asked to hand over a percentage of their previous year’s sales on the site, as “marketing development funds.”
  • The figure keeps rising, though less for the giant pachyderms than for the sickly gazelles. According to the marketing executive, the larger houses, which used to pay two or three per cent of their net sales through Amazon, now relinquish five to seven per cent of gross sales, pushing Amazon’s percentage discount on books into the mid-fifties. Random House currently gives Amazon an effective discount of around fifty-three per cent.
  • In December, 1999, at the height of the dot-com mania, Time named Bezos its Person of the Year. “Amazon isn’t about technology or even commerce,” the breathless cover article announced. “Amazon is, like every other site on the Web, a content play.” Yet this was the moment, Marcus said, when “content” people were “on the way out.”
  • By 2010, Amazon controlled ninety per cent of the market in digital books—a dominance that almost no company, in any industry, could claim. Its prohibitively low prices warded off competition
  • In 2004, he set up a lab in Silicon Valley that would build Amazon’s first piece of consumer hardware: a device for reading digital books. According to Stone’s book, Bezos told the executive running the project, “Proceed as if your goal is to put everyone selling physical books out of a job.”
  • Lately, digital titles have levelled off at about thirty per cent of book sales.
  • The literary agent Andrew Wylie (whose firm represents me) says, “What Bezos wants is to drag the retail price down as low as he can get it—a dollar-ninety-nine, even ninety-nine cents. That’s the Apple play—‘What we want is traffic through our device, and we’ll do anything to get there.’ ” If customers grew used to paying just a few dollars for an e-book, how long before publishers would have to slash the cover price of all their titles?
  • As Apple and the publishers see it, the ruling ignored the context of the case: when the key events occurred, Amazon effectively had a monopoly in digital books and was selling them so cheaply that it resembled predatory pricing—a barrier to entry for potential competitors. Since then, Amazon’s share of the e-book market has dropped, levelling off at about sixty-five per cent, with the rest going largely to Apple and to Barnes & Noble, which sells the Nook e-reader. In other words, before the feds stepped in, the agency model introduced competition to the market
  • But the court’s decision reflected a trend in legal thinking among liberals and conservatives alike, going back to the seventies, that looks at antitrust cases from the perspective of consumers, not producers: what matters is lowering prices, even if that goal comes at the expense of competition. Barry Lynn, a market-policy expert at the New America Foundation, said, “It’s one of the main factors that’s led to massive consolidation.”
  • Publishers sometimes pass on this cost to authors, by redefining royalties as a percentage of the publisher’s receipts, not of the book’s list price. Recently, publishers say, Amazon began demanding an additional payment, amounting to approximately one per cent of net sales
  • brick-and-mortar retailers employ forty-seven people for every ten million dollars in revenue earned; Amazon employs fourteen.
  • Since the arrival of the Kindle, the tension between Amazon and the publishers has become an open battle. The conflict reflects not only business antagonism amid technological change but a division between the two coasts, with different cultural styles and a philosophical disagreement about what techies call “disruption.”
  • Bezos told Charlie Rose, “Amazon is not happening to bookselling. The future is happening to bookselling.”
  • In Grandinetti’s view, the Kindle “has helped the book business make a more orderly transition to a mixed print and digital world than perhaps any other medium.” Compared with people who work in music, movies, and newspapers, he said, authors are well positioned to thrive. The old print world of scarcity—with a limited number of publishers and editors selecting which manuscripts to publish, and a limited number of bookstores selecting which titles to carry—is yielding to a world of digital abundance. Grandinetti told me that, in these new circumstances, a publisher’s job “is to build a megaphone.”
  • it offers an extremely popular self-publishing platform. Authors become Amazon partners, earning up to seventy per cent in royalties, as opposed to the fifteen per cent that authors typically make on hardcovers. Bezos touts the biggest successes, such as Theresa Ragan, whose self-published thrillers and romances have been downloaded hundreds of thousands of times. But one survey found that half of all self-published authors make less than five hundred dollars a year.
  • The business term for all this clear-cutting is “disintermediation”: the elimination of the “gatekeepers,” as Bezos calls the professionals who get in the customer’s way. There’s a populist inflection to Amazon’s propaganda, an argument against élitist institutions and for “the democratization of the means of production”—a common line of thought in the West Coast tech world
  • “Book publishing is a very human business, and Amazon is driven by algorithms and scale,” Sargent told me. When a house gets behind a new book, “well over two hundred people are pushing your book all over the place, handing it to people, talking about it. A mass of humans, all in one place, generating tremendous energy—that’s the magic potion of publishing. . . . That’s pretty hard to replicate in Amazon’s publishing world, where they have hundreds of thousands of titles.”
  • By producing its own original work, Amazon can sell more devices and sign up more Prime members—a major source of revenue.
  • Like the publishing venture, Amazon Studios set out to make the old “gatekeepers”—in this case, Hollywood agents and executives—obsolete. “We let the data drive what to put in front of customers,” Carr told the Wall Street Journal. “We don’t have tastemakers deciding what our customers should read, listen to, and watch.”
  • book publishers have been consolidating for several decades, under the ownership of media conglomerates like News Corporation, which squeeze them for profits, or holding companies such as Rivergroup, which strip them to service debt. The effect of all this corporatization, as with the replacement of independent booksellers by superstores, has been to privilege the blockbuster.
  • The combination of ceaseless innovation and low-wage drudgery makes Amazon the epitome of a successful New Economy company. It’s hiring as fast as it can—nearly thirty thousand employees last year.
  • the long-term outlook is discouraging. This is partly because Americans don’t read as many books as they used to—they are too busy doing other things with their devices—but also because of the relentless downward pressure on prices that Amazon enforces.
  • The digital market is awash with millions of barely edited titles, most of them dreck.
  • Amazon believes that its approach encourages ever more people to tell their stories to ever more people, and turns writers into entrepreneurs; the price per unit might be cheap, but the higher number of units sold, and the accompanying royalties, will make authors wealthier
  • In Friedman’s view, selling digital books at low prices will democratize reading: “What do you want as an author—to sell books to as few people as possible for as much as possible, or for as little as possible to as many readers as possible?”
  • The real talent, the people who are writers because they happen to be really good at writing—they aren’t going to be able to afford to do it.”
  • Seven-figure bidding wars still break out over potential blockbusters, even though these battles often turn out to be follies. The quest for publishing profits in an economy of scarcity drives the money toward a few big books. So does the gradual disappearance of book reviewers and knowledgeable booksellers, whose enthusiasm might have rescued a book from drowning in obscurity. When consumers are overwhelmed with choices, some experts argue, they all tend to buy the same well-known thing.
  • These trends point toward what the literary agent called “the rich getting richer, the poor getting poorer.” A few brand names at the top, a mass of unwashed titles down below, the middle hollowed out: the book business in the age of Amazon mirrors the widening inequality of the broader economy.
  • “If they did, in my opinion they would save the industry. They’d lose thirty per cent of their sales, but they would have an additional thirty per cent for every copy they sold, because they’d be selling directly to consumers. The industry thinks of itself as Procter & Gamble. What gave publishers the idea that this was some big goddam business? It’s not—it’s a tiny little business, selling to a bunch of odd people who read.”
  • Bezos is right: gatekeepers are inherently élitist, and some of them have been weakened, in no small part, because of their complacency and short-term thinking. But gatekeepers are also barriers against the complete commercialization of ideas, allowing new talent the time to develop and learn to tell difficult truths. When the last gatekeeper but one is gone, will Amazon care whether a book is any good? ♦
Javier E

How to Make Your Own Luck | Brain Pickings - 0 views

  • editor Jocelyn Glei and her team at Behance’s 99U pull together another package of practical wisdom from 21 celebrated creative entrepreneurs. Despite the somewhat self-helpy, SEO-skewing title, this compendium of advice is anything but contrived. Rather, it’s a no-nonsense, experience-tested, life-approved cookbook for creative intelligence, exploring everything from harnessing the power of habit to cultivating meaningful relationships that enrich your work to overcoming the fear of failure.
  • If the twentieth-century career was a ladder that we climbed from one predictable rung to the next, the twenty-first-century career is more like a broad rock face that we are all free-climbing. There’s no defined route, and we must use our own ingenuity, training, and strength to rise to the top. We must make our own luck.
  • Lucky people take advantage of chance occurrences that come their way. Instead of going through life on cruise control, they pay attention to what’s happening around them and, therefore, are able to extract greater value from each situation… Lucky people are also open to novel opportunities and willing to try things outside of their usual experiences. They’re more inclined to pick up a book on an unfamiliar subject, to travel to less familiar destinations, and to interact with people who are different than themselves.
  • ...14 more annotations...
  • The book frames the primary benefit of a diary as a purely pragmatic record of your workday productivity and progress — most dedicated diarists would counter that the core benefits are spiritual and psychoemotional — but it does offer some valuable insight into the psychology of how journaling elevates our experience of everyday life:
  • We can’t, however, simply will ourselves into better habits. Since willpower is a limited resource, whenever we’ve overexerted our self-discipline in one domain, a concept known as “ego depletion” kicks in and renders us mindless automata in another
  • the key to changing a habit is to invest heavily in the early stages of habit-formation so that the behavior becomes automated and we later default into it rather than exhausting our willpower wrestling with it. Young also cautions that it’s a self-defeating strategy to try changing several habits at once. Rather, he advises, spend one month on each habit alone before moving on to the next
  • a diary boosts your creativity
  • This is one of the most important reasons to keep a diary: it can make you more aware of your own progress, thus becoming a wellspring of joy in your workday.
  • The second reason is focalism. When we contemplate failure from afar, according to Gilbert and Wilson, we tend to overemphasize the focal event (i.e., failure) and overlook all the other episodic details of daily life that help us move on and feel better. The threat of failure is so vivid that it consumes our attention
  • the authors point to a pattern that reveals the single most important motivator: palpable progress on meaningful work: On the days when these professionals saw themselves moving forward on something they cared about — even if the progress was a seemingly incremental “small win” — they were more likely to be happy and deeply engaged in their work. And, being happier and more deeply engaged, they were more likely to come up with new ideas and solve problems creatively.
  • Although the act of reflecting and writing, in itself, can be beneficial, you’ll multiply the power of your diary if you review it regularly — if you listen to what your life has been telling you. Periodically, maybe once a month, set aside time to get comfortable and read back through your entries. And, on New Year’s Day, make an annual ritual of reading through the previous year.
  • This, they suggest, can yield profound insights into the inner workings of your own mind — especially if you look for specific clues and patterns, trying to identify the richest sources of meaning in your work and the types of projects that truly make your heart sing. Once you understand what motivates you most powerfully, you’ll be able to prioritize this type of work in going forward. Just as important, however, is cultivating a gratitude practice and acknowledging your own accomplishments in the diary:
  • Fields argues that if we move along the Uncertainty Curve either too fast or too slowly, we risk robbing the project of its creative potential and ending up in mediocrity. Instead, becoming mindful of the psychology of that process allows us to pace ourselves better and master that vital osmosis between freedom and constraint.
  • Schwalbe reminds us of the “impact bias” — our tendency to greatly overestimate the intensity and extent of our emotional reactions, which causes us to expect failures to be more painful than they actually are and thus to fear them more than we should.
  • When we think about taking a risk, we rarely consider how good we will be at reframing a disappointing outcome. In short, we underestimate our resilience.
  • what you do every day is best seen as an iceberg, with a small fraction of conscious decision sitting atop a much larger foundation of habits and behaviors.
  • don’t let yourself forget that the good life, the meaningful life, the truly fulfilling life, is the life of presence, not of productivity.
Javier E

Why Are Hundreds of Harvard Students Studying Ancient Chinese Philosophy? - Christine Gross-Loh - The Atlantic - 0 views

  • Puett's course Classical Chinese Ethical and Political Theory has become the third most popular course at the university. The only classes with higher enrollment are Intro to Economics and Intro to Computer Science.
  • the class fulfills one of Harvard's more challenging core requirements, Ethical Reasoning. It's clear, though, that students are also lured in by Puett's bold promise: “This course will change your life.”
  • Puett uses Chinese philosophy as a way to give undergraduates concrete, counter-intuitive, and even revolutionary ideas, which teach them how to live a better life. 
  • ...18 more annotations...
  • Puett puts a fresh spin on the questions that Chinese scholars grappled with centuries ago. He requires his students to closely read original texts (in translation) such as Confucius’s Analects, the Mencius, and the Daodejing and then actively put the teachings into practice in their daily lives. His lectures use Chinese thought in the context of contemporary American life to help 18- and 19-year-olds who are struggling to find their place in the world figure out how to be good human beings; how to create a good society; how to have a flourishing life. 
  • Puett began offering his course to introduce his students not just to a completely different cultural worldview but also to a different set of tools. He told me he is seeing more students who are “feeling pushed onto a very specific path towards very concrete career goals”
  • Puett tells his students that being calculating and rationally deciding on plans is precisely the wrong way to make any sort of important life decision. The Chinese philosophers they are reading would say that this strategy makes it harder to remain open to other possibilities that don’t fit into that plan.
  • Students who do this “are not paying enough attention to the daily things that actually invigorate and inspire them, out of which could come a really fulfilling, exciting life,” he explains. If what excites a student is not the same as what he has decided is best for him, he becomes trapped on a misguided path, slated to begin an unfulfilling career.
  • He teaches them that the smallest actions have the most profound ramifications.
  • From a Chinese philosophical point of view, these small daily experiences provide us with endless opportunities to understand ourselves. When we notice and understand what makes us tick, react, feel joyful or angry, we develop a better sense of who we are that helps us when approaching new situations. Mencius, a late Confucian thinker (4th century B.C.E.), taught that if you cultivate your better nature in these small ways, you can become an extraordinary person with an incredible influence
  • Decisions are made from the heart. Americans tend to believe that humans are rational creatures who make decisions logically, using our brains. But in Chinese, the words for “mind” and “heart” are the same.
  • If the body leads, the mind will follow. Behaving kindly (even when you are not feeling kindly), or smiling at someone (even if you aren’t feeling particularly friendly at the moment) can cause actual differences in how you end up feeling and behaving, even ultimately changing the outcome of a situation.
  • In the same way that one deliberately practices the piano in order to eventually play it effortlessly, through our everyday activities we train ourselves to become more open to experiences and phenomena so that eventually the right responses and decisions come spontaneously, without angst, from the heart-mind.
  • Whenever we make decisions, from the prosaic to the profound (what to make for dinner; which courses to take next semester; what career path to follow; whom to marry), we will make better ones when we intuit how to integrate heart and mind and let our rational and emotional sides blend into one. 
  • Aristotle said, “We are what we repeatedly do,” a view shared by thinkers such as Confucius, who taught that the importance of rituals lies in how they inculcate a certain sensibility in a person.
  • “The Chinese philosophers we read taught that the way to really change lives for the better is from a very mundane level, changing the way people experience and respond to the world, so what I try to do is to hit them at that level. I’m not trying to give my students really big advice about what to do with their lives. I just want to give them a sense of what they can do daily to transform how they live.”
  • Their assignments are small ones: to first observe how they feel when they smile at a stranger, hold open a door for someone, engage in a hobby. He asks them to take note of what happens next: how every action, gesture, or word dramatically affects how others respond to them. Then Puett asks them to pursue more of the activities that they notice arouse positive, excited feelings.
  • Once they’ve understood themselves better and discovered what they love to do they can then work to become adept at those activities through ample practice and self-cultivation. Self-cultivation is related to another classical Chinese concept: that effort is what counts the most, more than talent or aptitude. We aren’t limited to our innate talents; we all have enormous potential to expand our abilities if we cultivate them
  • Being interconnected, focusing on mundane, everyday practices, and understanding that great things begin with the very smallest of acts: these are radical ideas for young people living in a society that pressures them to think big and achieve individual excellence.
  • One of Puett’s former students, Adam Mitchell, was a math and science whiz who went to Harvard intending to major in economics. At Harvard specifically and in society in general, he told me, “we’re expected to think of our future in this rational way: to add up the pros and cons and then make a decision. That leads you down the road of ‘Stick with what you’re good at’”—a road with little risk but little reward.
  • after his introduction to Chinese philosophy during his sophomore year, he realized this wasn’t the only way to think about the future. Instead, he tried courses he was drawn to but wasn’t naturally adroit at because he had learned how much value lies in working hard to become better at what you love. He became more aware of the way he was affected by those around him, and how they were affected by his own actions in turn. Mitchell threw himself into foreign language learning, feels his relationships have deepened, and is today working towards a master’s degree in regional studies.
  • “I can happily say that Professor Puett lived up to his promise, that the course did in fact change my life.”
Javier E

The Singular Mind of Terry Tao - The New York Times - 0 views

  • reflecting on his career so far, Tao told me that his view of mathematics has utterly changed since childhood. ‘‘When I was growing up, I knew I wanted to be a mathematician, but I had no idea what that entailed,’’ he said in a lilting Australian accent. ‘‘I sort of imagined a committee would hand me problems to solve or something.’’
  • But it turned out that the work of real mathematicians bears little resemblance to the manipulations and memorization of the math student. Even those who experience great success through their college years may turn out not to have what it takes. The ancient art of mathematics, Tao has discovered, does not reward speed so much as patience, cunning and, perhaps most surprising of all, the sort of gift for collaboration and improvisation that characterizes the best jazz musicians
  • Tao now believes that his younger self, the prodigy who wowed the math world, wasn’t truly doing math at all. ‘‘It’s as if your only experience with music were practicing scales or learning music theory,’’ he said, looking into light pouring from his window. ‘‘I didn’t learn the deeper meaning of the subject until much later.’’
  • The true work of the mathematician is not experienced until the later parts of graduate school, when the student is challenged to create knowledge in the form of a novel proof. It is common to fill page after page with an attempt, the seasons turning, only to arrive precisely where you began, empty-handed — or to realize that a subtle flaw of logic doomed the whole enterprise from its outset. The steady state of mathematical research is to be completely stuck. It is a process that Charles Fefferman of Princeton, himself a onetime math prodigy turned Fields medalist, likens to ‘‘playing chess with the devil.’’ The rules of the devil’s game are special, though: The devil is vastly superior at chess, but, Fefferman explained, you may take back as many moves as you like, and the devil may not. You play a first game, and, of course, ‘‘he crushes you.’’ So you take back moves and try something different, and he crushes you again, ‘‘in much the same way.’’ If you are sufficiently wily, you will eventually discover a move that forces the devil to shift strategy; you still lose, but — aha! — you have your first clue.
  • Tao has emerged as one of the field’s great bridge-builders. At the time of his Fields Medal, he had already made discoveries with more than 30 different collaborators. Since then, he has also become a prolific math blogger with a decidedly non-Gaussian ebullience: He celebrates the work of others, shares favorite tricks, documents his progress and delights at any corrections that follow in the comments. He has organized cooperative online efforts to work on problems. ‘‘Terry is what a great 21st-century mathematician looks like,’’ Jordan Ellenberg, a mathematician at the University of Wisconsin, Madison, who has collaborated with Tao, told me. He is ‘‘part of a network, always communicating, always connecting what he is doing with what other people are doing.’’
  • Most mathematicians tend to specialize, but Tao ranges widely, learning from others and then working with them to make discoveries. Markus Keel, a longtime collaborator and close friend, reaches to science fiction to explain Tao’s ability to rapidly digest and employ mathematical ideas: Seeing Tao in action, Keel told me, reminds him of the scene in ‘‘The Matrix’’ when Neo has martial arts downloaded into his brain and then, opening his eyes, declares, ‘‘I know kung fu.’’ The citation for Tao’s Fields Medal, awarded in 2006, is a litany of boundary hopping and notes particularly ‘‘beautiful work’’ on Horn’s conjecture, which Tao completed with a friend he had played foosball with in graduate school. It was a new area of mathematics for Tao, at a great remove from his known stamping grounds. ‘‘This is akin,’’ the citation read, ‘‘to a leading English-language novelist suddenly producing the definitive Russian novel.’’
  • For their work, Tao and Green salvaged a crucial bit from an earlier proof done by others, which had been discarded as incorrect, and aimed at a different goal. Other maneuvers came from masterful proofs by Timothy Gowers of England and Endre Szemerédi of Hungary. Their work, in turn, relied on contributions from Erdős, Klaus Roth and Frank Ramsey, an Englishman who died at age 26 in 1930, and on and on, into history. Ask mathematicians about their experience of the craft, and most will talk about an intense feeling of intellectual camaraderie. ‘‘A very central part of any mathematician’s life is this sense of connection to other minds, alive today and going back to Pythagoras,’’ said Steven Strogatz, a professor of mathematics at Cornell University. ‘‘We are having this conversation with each other going over the millennia.’’
  • As a group, the people drawn to mathematics tend to value certainty and logic and a neatness of outcome, so this game becomes a special kind of torture. And yet this is what any would-be mathematician must summon the courage to face down: weeks, months, years on a problem that may or may not even be possible to unlock. You find yourself sitting in a room without doors or windows, and you can shout and carry on all you want, but no one is listening.
  • An effort to prove that 1 equals 0 is not likely to yield much fruit, it’s true, but the hacker’s mind-set can be extremely useful when doing math. Long ago, mathematicians invented a number that when multiplied by itself equals negative 1, an idea that seemed to break the basic rules of multiplication. It was so far outside what mathematicians were doing at the time that they called it ‘‘imaginary.’’ Yet imaginary numbers proved a powerful invention, and modern physics and engineering could not function without them. (A short sketch after this list shows the idea directly.)
  • Early encounters with math can be misleading. The subject seems to be about learning rules — how and when to apply ancient tricks to arrive at an answer. Four cookies remain in the cookie jar; the ball moves at 12.5 feet per second. Really, though, to be a mathematician is to experiment. Mathematical research is a fundamentally creative act. Lore has it that when David Hilbert, arguably the most influential mathematician of fin de siècle Europe, heard that a colleague had left to pursue fiction, he quipped: ‘‘He did not have enough imagination for mathematics.’’
  • Many people think that substantial progress on Navier-Stokes may be impossible, and years ago, Tao told me, he wrote a blog post concurring with this view. Now he has some small bit of hope. The twin-prime conjecture had the same feel, a sense of breaking through the wall of intimidation that has scared off many aspirants. Outside the world of mathematics, both Navier-Stokes and the twin-prime conjecture are described as problems. But for Tao and others in the field, they are more like opponents. Tao’s opponent has been known to taunt him, convincing him that he is overlooking the obvious, or to fight back, making quick escapes when none should be possible. Now the opponent appears to have revealed a weakness. But Tao said he has been here before, thinking he has found a way through the defenses, when in fact he was being led into an ambush. ‘‘You learn to get suspicious,’’ Tao said. ‘‘You learn to be on the lookout.’’
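
A quick way to see the point about imaginary numbers is to compute with them. The sketch below is my own illustration, not anything from the article; it uses Python's built-in complex type, which implements exactly the invention described: a number whose square is negative 1.

```python
# Python writes the imaginary unit as 1j. Squaring it really does
# give -1, the property that once seemed to break the rules of
# multiplication.
i = 1j
print(i * i)  # (-1+0j)

# One reason the "powerful invention" stuck: multiplying by a complex
# number of magnitude 1 performs a rotation, a staple of physics and
# engineering.
import cmath

point = 1 + 0j                               # a point on the unit circle
quarter_turn = cmath.exp(1j * cmath.pi / 2)  # e^(i*pi/2): a 90-degree turn
print(point * quarter_turn)                  # approximately (0+1j)
```

The second half hints at why the invention proved so practical: rotation becomes plain multiplication, which is the sense in which modern physics and engineering could not function without it.
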
Javier E

Big Think Interview With Nicholas Carr | Nicholas Carr | Big Think - 0 views

  • Neurologically, how does our brain adapt itself to new technologies? Nicholas Carr: A couple of types of adaptations take place in your brain. One is a strengthening of the synaptical connections between the neurons involved in using that instrument, in using that tool. And basically these are chemical – neural chemical changes. So you know, cells in our brain communicate by transmitting electrical signals between them and those electrical signals are actually activated by the exchange of chemicals, neurotransmitters in our synapses. And so when you begin to use a tool, for instance, you have much stronger electrochemical signals being processed in those – through those synaptical connections. And then the second, and even more interesting adaptation is in actual physical changes, anatomical changes. Your neurons, you may grow new neurons that are then recruited into these circuits or your existing neurons may grow new synaptical terminals. And again, that also serves to strengthen the activity in those, in those particular pathways that are being used – new pathways. On the other hand, you know, the brain likes to be efficient and so even as it’s strengthening the pathways you’re exercising, it’s pulling – it’s weakening the connections in other ways between the cells that supported old ways of thinking or working or behaving, or whatever that you’re not exercising so much.
  • And it was only in around the year 800 or 900 that we saw the introduction of word spaces. And suddenly reading became, in a sense, easier and suddenly you had the arrival of silent reading, which changed the act of reading from just transcription of speech to something that every individual did on their own. And suddenly you had this whole ideal of the silent solitary reader who was improving their mind, expanding their horizons, and so forth. And when Gutenberg invented the printing press around 1450, what that served to do was take this new very attentive, very deep form of reading, which had been limited to just, you know, monasteries and universities, and by making books much cheaper and much more available, spread that way of reading out to a much larger audience. And so we saw, for the last 500 years or so, one of the central facts of culture was deep solitary reading.
  • What the book does as a technology is shield us from distraction. The only thing going on is the, you know, the progression of words and sentences across page after page and so suddenly we see this immersive kind of very attentive thinking, whether you are paying attention to a story or to an argument, or whatever. And what we know about the brain is the brain adapts to these types of tools.
  • we adapt to the environment of the internet, which is an environment of kind of constant immersion and information and constant distractions, interruptions, juggling lots of messages, lots of bits of information.
  • Because it’s no longer just a matter of personal choice, of personal discipline, though obviously those things are always important, but what we’re seeing and we see this over and over again in the history of technology, is that the technology – the technology of the web, the technology of digital media, gets entwined very, very deeply into social processes, into expectations. So more and more, for instance in our work lives. You know, if our boss and all our colleagues are constantly exchanging messages, constantly checking email on their Blackberry or iPhone or their Droid or whatever, then it becomes very difficult to say, I’m not going to be as connected because you feel like your career is going to take a hit.
  • With the arrival – with the transfer now of text more and more onto screens, we see, I think, a new and in some ways more primitive way of reading. In order to take in information off a screen, when you are also being bombarded with all sorts of other information and when there are links in the text where you have to think even for just a fraction of a second, you know, do I click on this link or not, suddenly reading again becomes a more cognitively intensive act, the way it was back when there were no spaces between words.
  • If all your friends are planning their social lives through texts and Facebook and Twitter and so forth, then to back away from that means to feel socially isolated. And of course for all people, particularly for young people, there’s kind of nothing worse than feeling socially isolated, that your friends are, you know, having these conversations and you’re not involved. So it’s easy to say the solution, which is to, you know, become a little bit more disconnected. What’s hard is actually doing that.
  • if you want to change your brain, you change your habits. You change your habits of thinking. And that means, you know, setting aside time to engage in more contemplative, more reflective ways of thinking, to be – to screen out distractions. And that means retreating from digital media and from the web and from Smart Phones and texting and Facebook and Tweeting and everything else.
  • The Thinker was, you know, in a contemplative pose and was concentrating deeply, and wasn’t, you know, multi-tasking. And because that is something that, until recently anyway, people always thought was the deepest and most distinctly human way of thinking.
  • we may end up finding that those are actually the most valuable ways of thinking that are available to us as human beings.
  • the ability to pay attention also is very important for our ability to build memories, to transfer information from our short-term memory to our long-term memory. And only when we do that do we weave new information into everything else we have stored in our brains. All the other facts we’ve learned, all the other experiences we’ve had, emotions we’ve felt. And that’s how you build, I think, a rich intellect and a rich intellectual life.
  • On the other hand, there is a cost. We lose – we begin to lose the facilities that we don’t exercise. So adaptation has a very, very positive side, but also a potentially negative side because ultimately our brain is qualitatively neutral. It doesn’t care what it’s strengthening or what it’s weakening, it just responds to the way we’re exercising our mind.
  • the book in some ways is the most interesting from our own present standpoint, particularly when we want to think about the way the internet is changing us. It’s interesting to think about how the book changed us.
  • So we become, after the arrival of the printing press in general, more attentive, more attuned to contemplative ways of thinking. And that’s a very unnatural way of using our mind. You know, paying attention, filtering out distractions.
  • what we lose is the ability to pay deep attention to one thing for a sustained period of time, to filter out distractions.
Javier E

Among the Disrupted - NYTimes.com - 0 views

  • Writers hover between a decent poverty and an indecent one; they are expected to render the fruits of their labors for little and even for nothing, and all the miracles of electronic dissemination somehow do not suffice for compensation, either of the fiscal or the spiritual kind.
  • Journalistic institutions slowly transform themselves into silent sweatshops in which words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability.
  • the discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms:
  • Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology
  • The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past.
  • even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science.
  • The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university
  • The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy.
  • So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.
  • are we becoming posthumanists?
  • In American culture right now, as I say, the worldview that is ascendant may be described as posthumanism.
  • The posthumanism of the 1970s and 1980s was more insular, an academic affair of “theory,” an insurgency of professors; our posthumanism is a way of life, a social fate.
  • In “The Age of the Crisis of Man: Thought and Fiction in America, 1933-1973,” the gifted essayist Mark Greif, who reveals himself to be also a skillful historian of ideas, charts the history of the 20th-century reckonings with the definition of “man.”
Javier E

Why Kids Sext - Hanna Rosin - The Atlantic - 0 views

  • Within an hour, the deputies realized just how common the sharing of nude pictures was at the school. “The boys kept telling us, ‘It’s nothing unusual. It happens all the time,’ ” Lowe recalls. Every time someone they were interviewing mentioned another kid who might have naked pictures on his or her phone, they had to call that kid in for an interview. After just a couple of days, the deputies had filled multiple evidence bins with phones, and they couldn’t see an end to it. Fears of a cabal got replaced by a more mundane concern: what to do with “hundreds of damned phones. I told the deputies, ‘We got to draw the line somewhere or we’re going to end up talking to every teenager in the damned county!’ ”
  • Nor did the problem stop at the county’s borders. Several boys, in an effort to convince Lowe that they hadn’t been doing anything rare or deviant, showed him that he could type the hashtag symbol (#) into Instagram followed by the name of pretty much any nearby county and then thots, and find a similar account.
  • In some he sensed low self-esteem—for example, the girl who’d sent her naked picture to a boy, unsolicited: “It just showed up! I guess she was hot after him?” A handful of senior girls became indignant during the course of the interview. “This is my life and my body and I can do whatever I want with it,” or, “I don’t see any problem with it. I’m proud of my body,” Lowe remembers them saying. A few, as far as he could tell, had taken pictures especially for the Instagram accounts and had actively tried to get them posted.
  • What seemed to mortify them most was having to talk about what they’d done with a “police officer outside their age group.”
  • Most of the girls on Instagram fell into the same category as Jasmine. They had sent a picture to their boyfriend, or to someone they wanted to be their boyfriend, and then he had sent it on to others. For the most part, they were embarrassed but not devastated, Lowe said. They felt betrayed, but few seemed all that surprised that their photos had been passed around.
  • Lowe’s team explained to both the kids pictured on Instagram and the ones with photos on their phones the serious legal consequences of their actions. Possessing or sending a nude photo of a minor—even if it’s a photo of yourself—can be prosecuted as a felony under state child-porn laws. He explained that 10 years down the road they might be looking for a job or trying to join the military, or sitting with their families at church, and the pictures could wash back up; someone who had the pictures might even try to blackmail them.
  • yet the kids seemed strikingly blasé. “They’re just sitting there thinking, Wah, wah, wah,” Lowe said, turning his hands into flapping lips. “It’s not sinking in. Remember at that age, you think you’re invincible, and you’re going to do whatever the hell you want to do? We just couldn’t get them past that.”
  • while adults send naked pictures too, of course, the speed with which teens have incorporated the practice into their mating rituals has taken society by surprise.
Javier E

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 1 views

  • Skinner's approach stressed the historical associations between a stimulus and the animal's response -- an approach easily framed as a kind of empirical statistical analysis, predicting the future as a function of the past. (A toy sketch of this style of prediction appears after this list.)
  • Chomsky's conception of language, on the other hand, stressed the complexity of internal representations, encoded in the genome, and their maturation in light of the right data into a sophisticated computational system, one that cannot be usefully broken down into a set of associations.
  • Chomsky acknowledged that the statistical approach might have practical value, just as in the example of a useful search engine, and is enabled by the advent of fast computers capable of processing massive data. But as far as a science goes, Chomsky would argue it is inadequate, or more harshly, kind of shallow
  • David Marr, a neuroscientist colleague of Chomsky's at MIT, defined a general framework for studying complex biological systems (like the brain) in his influential book Vision,
  • a complex biological system can be understood at three distinct levels. The first level ("computational level") describes the input and output to the system, which define the task the system is performing. In the case of the visual system, the input might be the image projected on our retina and the output might be our brain's identification of the objects present in the image we had observed. The second level ("algorithmic level") describes the procedure by which an input is converted to an output, i.e. how the image on our retina can be processed to achieve the task described by the computational level. Finally, the third level ("implementation level") describes how our own biological hardware of cells implements the procedure described by the algorithmic level.
  • The emphasis here is on the internal structure of the system that enables it to perform a task, rather than on external association between past behavior of the system and the environment. The goal is to dig into the "black box" that drives the system and describe its inner workings, much like how a computer scientist would explain how a cleverly designed piece of software works and how it can be executed on a desktop computer.
  • As written today, the history of cognitive science is a story of the unequivocal triumph of an essentially Chomskyian approach over Skinner's behaviorist paradigm -- an achievement commonly referred to as the "cognitive revolution,"
  • While this may be a relatively accurate depiction in cognitive science and psychology, behaviorist thinking is far from dead in related disciplines. Behaviorist experimental paradigms and associationist explanations for animal behavior are used routinely by neuroscientists
  • Chomsky critiqued the field of AI for adopting an approach reminiscent of behaviorism, except in more modern, computationally sophisticated form. Chomsky argued that the field's heavy use of statistical techniques to pick regularities in masses of data is unlikely to yield the explanatory insight that science ought to offer. For Chomsky, the "new AI" -- focused on using statistical learning techniques to better mine and predict data -- is unlikely to yield general principles about the nature of intelligent beings or about cognition.
  • Behaviorist principles of associations could not explain the richness of linguistic knowledge, our endlessly creative use of it, or how quickly children acquire it with only minimal and imperfect exposure to language presented by their environment.
  • it has been argued -- in my view rather plausibly, though neuroscientists don't like it -- that neuroscience for the last couple hundred years has been on the wrong track.
  • Implicit in this endeavor is the assumption that with enough sophisticated statistical tools and a large enough collection of data, signals of interest can be weeded out from the noise in large and poorly understood biological systems.
  • Brenner, a contemporary of Chomsky who also participated in the same symposium on AI, was equally skeptical about new systems approaches to understanding the brain. When describing an up-and-coming systems approach to mapping brain circuits called Connectomics, which seeks to map the wiring of all neurons in the brain (i.e. diagramming which nerve cells are connected to others), Brenner called it a "form of insanity."
  • These debates raise an old and general question in the philosophy of science: What makes a satisfying scientific theory or explanation, and how ought success be defined for science?
  • Ever since Isaiah Berlin's famous essay, it has become a favorite pastime of academics to place various thinkers and scientists on the "Hedgehog-Fox" continuum: the Hedgehog, a meticulous and specialized worker, driven by incremental progress in a clearly defined field versus the Fox, a flashier, ideas-driven thinker who jumps from question to question, ignoring field boundaries and applying his or her skills where they seem applicable.
  • Chomsky's work has had tremendous influence on a variety of fields outside his own, including computer science and philosophy, and he has not shied away from discussing and critiquing the influence of these ideas, making him a particularly interesting person to interview.
  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • An unlikely pair, systems biology and artificial intelligence both face the same fundamental task of reverse-engineering a highly complex system whose inner workings are largely a mystery
  • neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
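
As a concrete companion to the annotations above: the statistical approach Chomsky criticizes, in its most stripped-down form, predicts what comes next purely from counted associations in past data. The sketch below is my own toy illustration, not anything from the article or from any real AI system; the corpus is invented.

```python
# A toy next-word predictor: pure association, "predicting the future
# as a function of the past," with no internal model of grammar at all.
from collections import Counter, defaultdict

# An invented miniature corpus, used only for illustration.
corpus = "the dog barks . the dog sleeps . the cat sleeps .".split()

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict(word):
    """Return the word that most often followed `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict("the"))  # 'dog' -- it followed 'the' twice, 'cat' once
```

However well such counts predict, they say nothing about the internal structure that generates the data -- Marr's algorithmic and implementation levels are left untouched, which is precisely Chomsky's complaint.
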
Javier E

History News Network | An Open Letter to the Harvard Business School Dean Who Gave Historians an Assignment - 0 views

  • I would like to make some gratuitous curricular and pedagogical suggestions for business schools.
  • Foremost, business schools, at least those that purport to mold leaders, should stop training and start educating. Their graduates should be able to think and problem-solve for themselves, not just copy the latest fad.
  • Business schools generally do not cultivate or even select for general intelligence and breadth of understanding but instead breed shrewdness and encourage narrow technical knowledge.
  • To try to cover up the obvious shortcomings of their profoundly antisocial pedagogical model, many business schools tack on courses in ethics, corporate social responsibility, and the like, then shrug their collective shoulders when their graduates behave in ways that would make Vikings and pirates blush.
  • The only truly socially responsible management curriculum would be one built from the ground up out of the liberal arts – economics, of course, but also history, philosophy, political science, psychology, and sociology – because those are the core disciplines of social scientific and humanistic inquiry.
  • Properly understood, they are not “subjects” but ways of thinking about human beings, their behaviors, their institutions (of which businesses are just a small subset), and the ways they interact with the natural world. Only intelligent people with broad and deep backgrounds in the liberal arts can consistently make ethical decisions that are good for stakeholders, consumers, and the world they share.
  • Precisely because they are not deeply rooted in the liberal arts, many business schools try to inculcate messages into the brains of their students that are unscientific, mere fashions that cycle into and then out of popularity.
  • No one can seriously purport to understand corporate X (finance, formation, governance, social responsibility, etc.) today who does not understand X’s origins and subsequent development. Often, then, the historian of corporate X is the real expert, not the business school professor who did a little consulting, a few interviews, and a survey.
  • Lurking somewhere in the background of most great business leaders, ones who helped their companies, their customers, and the world, is a liberal arts education.
  • Instead of forcing students to choose between a broad liberal arts degree or a business career, business schools and liberal arts departments ought to work together to integrate all methods of knowing into a seamless whole focused on key questions and problems important to us all
  • There is not a single question of importance in the business world that does not have economic, historical, philosophical, political, psychological, and sociological components that are absolutely crucial to making good (right and moral) decisions. So why continue to divide understanding of the real world into hoary compartments?
kushnerha

How to Cultivate the Art of Serendipity - The New York Times - 0 views

  • A surprising number of the conveniences of modern life were invented when someone stumbled upon a discovery or capitalized on an accident
  • wonder whether we can train ourselves to become more serendipitous. How do we cultivate the art of finding what we’re not seeking?
  • Croatian has no word to capture the thrill of the unexpected discovery, so she was delighted when — after moving to the United States on a Fulbright scholarship in the 1980s — she learned the English word “serendipity.”
  • Today we think of serendipity as something like dumb luck. But its original meaning was very different.
  • suggested that this old tale contained a crucial idea about human genius: “As their highnesses travelled, they were always making discoveries, by accident and sagacity, of things which they were not in quest of.” And he proposed a new word — “serendipity” — to describe this princely talent for detective work. At its birth, serendipity meant a skill rather than a random stroke of good fortune.
  • sees serendipity as something people do. In the mid-1990s, she began a study of about 100 people to find out how they created their own serendipity, or failed to do so.
  • As people dredge the unknown, they are engaging in a highly creative act. What an inventor “finds” is always an expression of him- or herself.
  • You become a super-encounterer, according to Dr. Erdelez, in part because you believe that you are one — it helps to assume that you possess special powers of perception
  • “gathering string” is just another way of talking about super-encountering. After all, “string” is the stuff that accumulates in a journalist’s pocket. It’s the note you jot down in your car after the interview, the knickknack you notice on someone’s shelf, or the anomaly that jumps out at you in Appendix B of an otherwise boring research study.
  • came up with the term super-encounterer to give us a way to talk about the people rather than just the discoveries. Without such words, we tend to become dazzled by the happy accident itself, to think of it as something that exists independent of an observer.
  • We can slip into a twisted logic in which we half-believe the penicillin picked Alexander Fleming to be its emissary, or that the moons of Jupiter wanted to be seen by Galileo. But discoveries are products of the human mind.
  • subjects fell into three distinct groups. Some she called “non-encounterers”; they saw through a tight focus, a kind of chink hole, and they tended to stick to their to-do lists when searching for information rather than wandering off into the margins. Other people were “occasional encounterers,” who stumbled into moments of serendipity now and then. Most interesting were the “super-encounterers,” who reported that happy surprises popped up wherever they looked.
  • One survey of patent holders (the PatVal study of European inventors, published in 2005) found that an incredible 50 percent of patents resulted from what could be described as a serendipitous process. Thousands of survey respondents reported that their idea evolved when they were working on an unrelated project — and often when they weren’t even trying to invent anything.
  • need to develop a new, interdisciplinary field — call it serendipity studies — that can help us create a taxonomy of discoveries in the chemistry lab, the newsroom, the forest, the classroom, the particle accelerator and the hospital. By observing and documenting the many different “species” of super-encounterers, we might begin to understand their minds.
  • Of course, even if we do organize the study of serendipity, it will always be a whimsical undertaking, given that the phenomenon is difficult to define, amazingly variable and hard to capture in data. The clues will no doubt emerge where we least expect them
Javier E

This dark side of the Internet is costing young people their jobs and social lives - The Washington Post - 1 views

  • A recent study by Common Sense Media, a parent advocacy group, found that 59 percent of parents think their teens are addicted to mobile devices. Meanwhile, 50 percent of teenagers feel the same way. The study surveyed nearly 1,300 parents and children this year.
  • “It’s not as obvious as substance addiction, but it’s very, very real,” said Alex, a 22-year-old who had been at reSTART for five days with a familiar story: He withdrew from college because he put playing games or using the Internet ahead of going to class or work.
  • His parents, he said, had always encouraged him to use technology, without realizing the harm it could do. They were just trying to raise their son in a world soaked in technology that didn’t exist when they were his age.
  • Those who say they suffer from Internet addiction share many symptoms with other types of addicts, in terms of which chemicals are released into the brain, experts say. The pleasure centers of the brain light up when introduced to the stimulus. Addicts lose interest in other hobbies or, sometimes, never develop any. When not allowed to go online, they experience withdrawal symptoms such as irritability, depression or even physical shaking. They retreat into corners of the Internet where they can find quick success — a dominant ranking in a game or a well-liked Facebook post — that they don’t have in the real world
  • Peter’s tech dependence started when he was 13, after his father died. He retreated into gaming to cope, playing from sunup until sundown, sometimes without taking breaks to eat or even to use the bathroom.
  • Gaming offered him a euphoric escape from reality. He spent more and more time playing games, watching online videos, and getting into arguments on social media and forums. He withdrew from the rest of the world, avoiding the pain and feelings of total worthlessness that hit him when he tried to address his problems. His schoolwork suffered. His physical health declined because he never learned to cook, to clean, to exercise — or, as he put it, “to live in an adult way.” That helped push his relationship with his mother to its breaking point, he said.
  • Many of her young clients have poor impulse control and an inability to plan for the future. Even the thought of having to plan a meal, Cash said, can lock some of her patients up with fear.
krystalxu

Why Study Philosophy? 'To Challenge Your Own Point of View' - The Atlantic - 1 views

  • Goldstein’s forthcoming book, Plato at the Googleplex: Why Philosophy Won’t Go Away, offers insight into the significant—and often invisible—progress that philosophy has made. I spoke with Goldstein about her take on the science vs. philosophy debates, how we can measure philosophy’s advances, and why an understanding of philosophy is critical to our lives today.
  • One of the things about philosophy is that you don’t have to give up on any other field. Whatever field there is, there’s a corresponding field of philosophy. Philosophy of language, philosophy of politics, philosophy of math. All the things I wanted to know about I could still study within a philosophical framework.
  • There’s a peer pressure that sets in at a certain age. They so much want to be like everybody else. But what I’ve found is that if you instill this joy of thinking, the sheer intellectual fun, it will survive even the adolescent years and come back in fighting form. It’s empowering.
  • One thing that’s changed tremendously is the presence of women and the change in focus because of that. There’s a lot of interest in literature and philosophy, and using literature as a philosophical examination. It makes me so happy! Because I was seen as a hard-core analytic philosopher, and when I first began to write novels people thought, Oh, and we thought she was serious! But that’s changed entirely. People take literature seriously, especially in moral philosophy, as thought experiments. A lot of the most developed and effective thought experiments come from novels. Also, novels contribute to making moral progress, changing people’s emotions.
  • The other thing that’s changed is that there’s more applied philosophy. Let’s apply philosophical theory to real-life problems, like medical ethics, environmental ethics, gender issues. This is a real change from when I was in school and it was only theory.
  • There’s a lot of philosophical progress, it’s just a progress that’s very hard to see. It’s very hard to see because we see with it. We incorporate philosophical progress into our own way of viewing the world.
  • Plato would be constantly surprised by what we know. And not only what we know scientifically, or by our technology, but what we know ethically. We take a lot for granted. It’s obvious to us, for example, that individuals’ ethical truths are equally important.
  • it’s usually philosophical arguments that first introduce the very outlandish idea that we need to extend rights. And it takes more, it takes a movement, and activism, and emotions, to effect real social change. It starts with an argument, but then it becomes obvious. The tracks of philosophy’s work are erased because it becomes intuitively obvious
  • The arguments against slavery, against cruel and unusual punishment, against unjust wars, against treating children cruelly—these all took arguments.
  • About 30 years ago, the philosopher Peter Singer started to argue about the way animals are treated in our factory farms. Everybody thought he was nuts. But I’ve watched this movement grow; I’ve watched it become emotional. It has to become emotional. You have to draw empathy into it. But here it is, right in our time—a philosopher making the argument, everyone dismissing it, but then people start discussing it. Even criticizing it, or saying it’s not valid, is taking it seriously
  • This is what we have to teach our children. Even things that go against their intuition they need to take seriously. What was intuition two generations ago is no longer intuition; and it’s arguments that change it.
  • We are very inertial creatures. We do not like to change our thinking, especially if it’s inconvenient for us. And certainly the people in power never want to wonder whether they should hold power.
  • I’m really trying to draw the students out, make them think for themselves. The more they challenge me, the more successful I feel as a teacher. It has to be very active
  • Plato used the metaphor that in teaching philosophy, there needs to be a fire in the teacher, and the sheer heat will help the fire grow in the student. It’s something that’s kindled because of the proximity to the heat.
  • how can you make the case that they should study philosophy?
  • It enriches your inner life. You have lots of frameworks to apply to problems, and so many ways to interpret things. It makes life so much more interesting. It’s us at our most human. And it helps us increase our humanity. No matter what you do, that’s an asset.
  • What do you think are the biggest philosophical issues of our time? The growth in scientific knowledge presents new philosophical issues.
  • The idea of the multiverse. Where are we in the universe? Physics is blowing our minds about this.
  • The question of whether some of these scientific theories are really even scientific. Can we get predictions out of them?
  • And with the growth in cognitive science and neuroscience. We’re going into the brain and getting these images of the brain. Are we discovering what we really are? Are we solving the problem of free will? Are we learning that there isn’t any free will? How much do the advances in neuroscience tell us about the deep philosophical issues?
  • With the decline of religion is there a sense of the meaninglessness of life and the easy consumerist answer that’s filling the space religion used to occupy? This is something that philosophers ought to be addressing.
Javier E

I Downloaded the Information That Facebook Has on Me. Yikes. - The New York Times - 0 views

  • When I downloaded a copy of my Facebook data last week, I didn’t expect to see much. My profile is sparse, I rarely post anything on the site, and I seldom click on ads
  • With a few clicks, I learned that about 500 advertisers — many that I had never heard of, like Bad Dad, a motorcycle parts store, and Space Jesus, an electronica band — had my contact information
  • Facebook also had my entire phone book, including the number to ring my apartment buzzer. The social network had even kept a permanent record of the roughly 100 people I had deleted from my friends list over the last 14 years, including my exes.
  • During his testimony, Mr. Zuckerberg repeatedly said Facebook has a tool for downloading your data that “allows people to see and take out all the information they’ve put into Facebook.”
  • Most basic information, like my birthday, could not be deleted. More important, the pieces of data that I found objectionable, like the record of people I had unfriended, could not be removed from Facebook, either.
  • “They don’t delete anything, and that’s a general policy,” said Gabriel Weinberg, the founder of DuckDuckGo, which offers internet privacy tools. He added that data was kept around to eventually help brands serve targeted ads.
  • When you download a copy of your Facebook data, you will see a folder containing multiple subfolders and files. The most important one is the “index” file, which is essentially a raw data set of your Facebook account, where you can click through your profile, friends list, timeline and messages, among other features. (A small script for taking stock of these files appears after this list.)
  • Upon closer inspection, it turned out that Facebook had stored my entire phone book because I had uploaded it when setting up Facebook’s messaging app, Messenger.
  • Facebook also kept a history of each time I opened Facebook over the last two years, including which device and web browser I used. On some days, it even logged my locations, like when I was at a hospital two years ago or when I visited Tokyo last year.
  • what bothered me was the data that I had explicitly deleted but that lingered in plain sight. On my friends list, Facebook had a record of “Removed Friends,” a dossier of the 112 people I had removed along with the date I clicked the “Unfriend” button. Why should Facebook remember the people I’ve cut off from my life?
  • Facebook said unfamiliar advertisers might appear on the list because they might have obtained my contact information from elsewhere, compiled it into a list of people they wanted to target and uploaded that list into Facebook
  • Brands can obtain your information in many different ways. Those include:
  • ■ Buying information from a data provider like Acxiom, which has amassed one of the world’s largest commercial databases on consumers. Brands can buy different types of customer data sets from a provider, like contact information for people who belong to a certain demographic, and take that information to Facebook to serve targeted ads
  • ■ Using tracking technologies like web cookies and invisible pixels that load in your web browser to collect information about your browsing activities. There are many different trackers on the web, and Facebook offers 10 different trackers to help brands harvest your information, according to Ghostery, which offers privacy tools that block ads and trackers.
  • ■ Getting your information in simpler ways, too. Someone you shared information with could share it with another entity. Your credit card loyalty program, for example
  • I also downloaded copies of my Google data with a tool called Google Takeout. The data sets were exponentially larger than my Facebook data.
  • For my personal email account alone, Google’s archive of my data measured eight gigabytes, enough to hold about 2,000 hours of music. By comparison, my Facebook data was about 650 megabytes, the equivalent of about 160 hours of music.
  • In a folder labeled Ads, Google kept a history of many news articles I had read, like a Newsweek story about Apple employees walking into glass walls and a New York Times story about the editor of our Modern Love column. I didn’t click on ads for either of these stories, but the search giant logged them because the sites had loaded ads served by Google.
  • In another folder, labeled Android, Google had a record of apps I had opened on an Android phone since 2015, along with the date and time. This felt like an extraordinary level of detail.
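
For anyone repeating the exercise, the download arrives as an ordinary folder of files, so a few lines of standard-library Python can inventory it. A minimal sketch, assuming the archive was unzipped to a folder named facebook-data; the folder name and internal layout of your own download may differ.

```python
# Walk a downloaded data archive and tally how much is stored where.
# "facebook-data" is a hypothetical folder name -- point ARCHIVE at
# wherever your own download was unzipped.
import os

ARCHIVE = "facebook-data"

total_bytes = 0
for root, _dirs, files in os.walk(ARCHIVE):
    for name in files:
        path = os.path.join(root, name)
        size = os.path.getsize(path)
        total_bytes += size
        print(f"{size:>12,} bytes  {path}")

print(f"Total: {total_bytes / 1_000_000:.1f} MB")
```

Pointed at a Google Takeout folder instead, the same script reproduces the size comparison made in the last two annotations.
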
kushnerha

Ignore the GPS. That Ocean Is Not a Road. - The New York Times - 2 views

  • Faith is a concept that often enters the accounts of GPS-induced mishaps. “It kept saying it would navigate us a road,” said a Japanese tourist in Australia who, while attempting to reach North Stradbroke Island, drove into the Pacific Ocean. A man in West Yorkshire, England, who took his BMW off-road and nearly over a cliff, told authorities that his GPS “kept insisting the path was a road.” In perhaps the most infamous incident, a woman in Belgium asked GPS to take her to a destination less than two hours away. Two days later, she turned up in Croatia.
  • These episodes naturally inspire incredulity, if not outright mockery. After a couple of Swedes mistakenly followed their GPS to the city of Carpi (when they meant to visit Capri), an Italian tourism official dryly noted to the BBC that “Capri is an island. They did not even wonder why they didn’t cross any bridge or take any boat.” An Upper West Side blogger’s account of the man who interpreted “turn here” to mean onto a stairway in Riverside Park was headlined “GPS, Brain Fail Driver.”
  • several studies have demonstrated empirically what we already know instinctively. Cornell researchers who analyzed the behavior of drivers using GPS found drivers “detached” from the “environments that surround them.” Their conclusion: “GPS eliminated much of the need to pay attention.”
  • There is evidence that one’s cognitive map can deteriorate. A widely reported study published in 2006 demonstrated that the brains of London taxi drivers have larger than average amounts of gray matter in the area responsible for complex spatial relations. Brain scans of retired taxi drivers suggested that the volume of gray matter in those areas also decreases when that part of the brain is no longer being used as frequently. “I think it’s possible that if you went to someone doing a lot of active navigation, but just relying on GPS,” Hugo Spiers, one of the authors of the taxi study, hypothesized to me, “you’d actually get a reduction in that area.”
  • A consequence is a possible diminution of our “cognitive map,” a term introduced in 1948 by the psychologist Edward Tolman of the University of California, Berkeley. In a groundbreaking paper, Dr. Tolman analyzed several laboratory experiments involving rats and mazes. He argued that rats had the ability to develop not only cognitive “strip maps” — simple conceptions of the spatial relationship between two points — but also more comprehensive cognitive maps that encompassed the entire maze.
  • Could society’s embrace of GPS be eroding our cognitive maps? For Julia Frankenstein, a psychologist at the University of Freiburg’s Center for Cognitive Science, the danger of GPS is that “we are not forced to remember or process the information — as it is permanently ‘at hand,’ we need not think or decide for ourselves.” She has written that we “see the way from A to Z, but we don’t see the landmarks along the way.” In this sense, “developing a cognitive map from this reduced information is a bit like trying to get an entire musical piece from a few notes.” GPS abets a strip-map level of orientation with the world.
  • We seem driven (so to speak) to transform cars, conveyances that show us the world, into machines that also see the world for us.
  • For Dr. Tolman, the cognitive map was a fluid metaphor with myriad applications. He identified with his rats. Like them, a scientist runs the maze, turning strip maps into comprehensive maps — increasingly accurate models of the “great God-given maze which is our human world,” as he put it. The countless examples of “displaced aggression” he saw in that maze — “the poor Southern whites, who take it out on the Negros,” “we psychologists who criticize all other departments,” “Americans who criticize the Russians and the Russians who criticize us” — were all, to some degree, examples of strip-map comprehension, a blinkered view that failed to comprehend the big picture. “What in the name of Heaven and Psychology can we do about it?” he wrote. “My only answer is to preach again the virtues of reason — of, that is, broad cognitive maps.”