
TOK Friends: group items tagged "secrets"


Javier E

The Scoreboards Where You Can't See Your Score - NYTimes.com - 0 views

  • The characters in Gary Shteyngart’s novel “Super Sad True Love Story” inhabit a continuously surveilled and scored society.
  • Consider the protagonist, Lenny Abramov, age 39. A digital dossier about him accumulates his every health condition (high cholesterol, depression), liability (mortgage: $560,330), purchase (“bound, printed, nonstreaming media artifact”), tendency (“heterosexual, nonathletic, nonautomotive, nonreligious”) and probability (“life span estimated at 83”). And that profile is available for perusal by employers, friends and even strangers in bars.
  • Even before the appearance of these books, a report called “The Scoring of America” by the World Privacy Forum showed how analytics companies now offer categorization services like “churn scores,” which aim to predict which customers are likely to forsake their mobile phone carrier or cable TV provider for another company; “job security scores,” which factor a person’s risk of unemployment into calculations of his or her ability to pay back a loan; “charitable donor scores,” which foundations use to identify the households likeliest to make large donations; and “frailty scores,” which are typically used to predict the risk of medical complications and death in elderly patients who have surgery.
  • ...12 more annotations...
  • In two nonfiction books, scheduled to be published in January, technology experts examine similar consumer-ranking techniques already in widespread use.
  • While a federal law called the Fair Credit Reporting Act requires consumer reporting agencies to provide individuals with copies of their credit reports on request, many other companies are free to keep their proprietary consumer scores to themselves.
  • Befitting the founder of a firm that markets reputation management, Mr. Fertik contends that individuals have some power to influence commercial scoring systems.
  • “This will happen whether or not you want to participate, and these scores will be used by others to make major decisions about your life, such as whether to hire, insure, or even date you,”
  • “Important corporate actors have unprecedented knowledge of the minutiae of our daily lives,” he writes in “The Black Box Society: The Secret Algorithms That Control Money and Information” (Harvard University Press), “while we know little to nothing about how they use this knowledge to influence important decisions that we — and they — make.”
  • Data brokers amass dossiers with thousands of details about individual consumers, like age, religion, ethnicity, profession, mortgage size, social networks, estimated income and health concerns such as impotence and irritable bowel syndrome. Then analytics engines can compare patterns in those variables against computer forecasting models. Algorithms are used to assign consumers scores — and to recommend offering, or withholding, particular products, services or fees — based on predictions about their behavior. (A minimal sketch of such a scoring model appears after these annotations.)
  • It’s a fictional forecast of a data-deterministic culture in which computer algorithms constantly analyze consumers’ profiles, issuing individuals numeric rankings that may benefit or hinder them.
  • Think of this technique as reputation engine optimization. If an algorithm incorrectly pegs you as physically unfit, for instance, the book suggests that you can try to mitigate the wrong. You can buy a Fitbit fitness tracker, for instance, and upload the exercise data to a public profile — or even “snap that Fitbit to your dog” and “you’ll quickly be the fittest person in your town.”
  • Professor Pasquale offers a more downbeat reading. Companies, he says, are using such a wide variety of numerical rating systems that it would be impossible for average people to significantly influence their scores.
  • “Corporations depend on automated judgments that may be wrong, biased or destructive,” Professor Pasquale writes. “Faulty data, invalid assumptions and defective models can’t be corrected when they are hidden.”
  • Moreover, trying to influence scoring systems could backfire. If a person attached a fitness device to a dog and tried to claim the resulting exercise log, he suggests, an algorithm might be able to tell the difference and issue that person a high score for propensity toward fraudulent activity.
  • “People shouldn’t think they can outwit corporations with hundreds of millions of dollars,” Professor Pasquale said in a phone interview. Consumers would have more control, he argues, if Congress extended the right to see and correct credit reports to other kinds of rankings.
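
The consumer-scoring pipeline described in the annotations above amounts to fitting a predictive model to dossier attributes and reading off a probability as the "score." The sketch below is a minimal, hypothetical illustration in Python: the feature names, the toy records, and the use of scikit-learn's LogisticRegression are my own assumptions for demonstration, not details taken from the article or the books it discusses.

```python
# Minimal illustration of the kind of scoring pipeline described above.
# All feature names and records are invented; real data brokers use
# thousands of attributes and proprietary models.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy "dossier" features: [age, estimated_income_k, months_with_carrier, support_calls]
X = np.array([
    [39, 52, 6, 4],
    [58, 110, 48, 0],
    [24, 31, 3, 7],
    [45, 75, 30, 1],
    [31, 44, 10, 5],
    [62, 90, 60, 0],
])
# 1 = customer later switched carriers ("churned"), 0 = stayed
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Score a new consumer profile: the predicted probability of churning
# becomes the "churn score" that drives which offers or fees they see.
new_profile = np.array([[39, 52, 8, 3]])
churn_score = model.predict_proba(new_profile)[0, 1]
print(f"churn score: {churn_score:.2f}")
```

In practice, as the annotations note, such models run over thousands of attributes, and the resulting scores feed decisions about offers, fees and eligibility that the scored individuals never get to inspect.
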
carolinewren

Book Review: 'A New History of Life' by Peter Ward and Joe Kirschvink - WSJ - 0 views

  • I imagine that physicists are similarly deluged with revelations about how to build a perpetual-motion machine or about the hitherto secret truth behind relativity. And so I didn’t view the arrival of “A New History of Life” with great enthusiasm.
  • subtitle breathlessly promises “radical new discoveries about the origins and evolution of life on earth,” while the jacket copy avers that “our current paradigm for understanding the history of life on Earth dates back to Charles Darwin’s time, yet scientific advances of the last few decades have radically reshaped that aging picture.”
  • authors Peter Ward and Joe Kirschvink are genuine scientists—paleontologists, to be exact. And they can write.
  • ...16 more annotations...
  • even genuine scientists are human and as such susceptible to the allure of offering up new paradigms (as the historian of science Thomas Kuhn put it)
  • paleontologist Stephen Jay Gould insisted that his conception of “punctuated equilibria” (a kind of Marxist biology that blurred the lines between evolution and revolution), which he developed along with fellow paleontologist Niles Eldredge, upended the traditional Darwinian understanding of how natural selection works.
  • This notion doesn’t constitute a fundamental departure from plain old evolution by natural selection; it simply italicizes that sometimes the process is comparatively rapid, other times slower.
  • In addition, they have long had a peculiar perspective on evolution, because of the limitations of the fossil record
  • Darwin was a pioneering geologist as well as the greatest of all biologists, and his insights were backgrounded by the key concept of uniformitarianism, as advocated by Charles Lyell, his friend and mentor
  • previously regnant paradigm among geologists had been “catastrophism
  • fossil record was therefore seen as reflecting the creation and extinction of new species by an array of dramatic and “unnatural” dei ex machina.
  • Of late, however, uniformitarianism has been on a losing streak. Catastrophism is back, with a bang . . . or a flood, or a burst of extraterrestrial radiation, or an onslaught of unpleasant, previously submerged chemicals
  • This emphasis on catastrophes is the first of a triad of novelties on which “A New History of Life” is based. The second involves an enhanced role for some common but insufficiently appreciated inorganic molecules, notably carbon dioxide, oxygen and hydrogen sulfide.
  • Life didn’t so much unfold smoothly over hundreds of millions of years as lurch chaotically in response to diverse crises and opportunities: too much oxygen, too little carbon dioxide, too little oxygen, too much carbon dioxide, too hot, too cold
  • So far, so good, except that in their eagerness to emphasize what is new and different, the authors teeter on the verge of the same trap as Gould: exaggerating the novelty of their own ideas.
  • Things begin to unravel when it comes to the third leg of Messrs. Ward and Kirschvink’s purported paradigmatic novelty: a supposed role for ecosystems—rain forests, deserts, rivers, coral reefs, deep-sea vents—as units of evolutionary change
  • “While the history of life may be populated by species,” they write, “it has been the evolution of ecosystems that has been the most influential factor in arriving at the modern-day assemblage of life. . . . [W]e know that on occasion in the deep past entirely new ecosystems appear, populated by new kinds of life.” True enough, but it is those “new kinds of life,” not whole ecosystems, upon which natural selection acts.
  • One of the most common popular misconceptions about evolution is that it proceeds “for the good of the species.”
  • The problem is that smaller, nimbler units are far more likely to reproduce differentially than are larger, clumsier, more heterogeneous ones. Insofar as ecosystems are consequential for evolution—and doubtless they are—it is because, like occasional catastrophes, they provide the immediate environment within which something not-so-new is acted out.
  • This is natural selection doing its same-old, same-old thing: acting by a statistically potent process of variation combined with selective retention and differential reproduction, a process that necessarily operates within the particular ecosystem that a given lineage occupies.
Javier E

Facebook Has All the Power - Julie Posetti - The Atlantic - 0 views

  • scholars covet thy neighbor's data. They're attracted to the very large and often fascinating data sets that private companies have developed.
  • It's the companies that own and manage this data. The only standards we know they have to follow are in the terms-of-service that users accept to create an account, and the law as it stands in different countries.
  • the "sexiness" of the Facebook data that led Cornell University and the Proceedings of the National Academy of Sciences (PNAS) into an ethically dubious arrangement, where, for example, Facebook's unreadable 9,000-word terms-of-service are said to be good enough to meet the standard for "informed consent."
  • ...9 more annotations...
  • When the study drew attention and controversy, there was a moment when they both could have said: "We didn't look carefully enough at this the first time. Now we can see that it doesn't meet our standards." Instead they allowed Facebook and the PR people to take the lead in responding to the controversy.
  • What should this reality signal to Facebook users? Is it time to pull back? You have (almost) no rights. You have (almost) no control. You have no idea what they're doing to you or with you. You don't even know who's getting the stuff you are posting, and you're not allowed to know. Trade secret!
  • Are there any particular warnings here for journalists and editors in terms of their exposure on Facebook? Yeah. Facebook has all the power. You have almost none. Just keep that in mind in all your dealings with it, as an individual with family and friends, as a journalist with a story to file, and as a news organization that is "on" Facebook.
  • I am not in a commercial situation where I have to maximize my traffic, so I can opt out. Right now my choice is to keep my account, but use it cynically. 
  • does this level of experimentation indicate the prospect of a further undermining of audience-driven news priorities and traditional news values? The right way to think about it is a loss of power—for news producers and their priorities. As I said, Facebook thinks it knows better than I do what "my" 180,000 subscribers should get from me.
  • Facebook has "where else are they going to go?" logic now. And they have good reason for this confidence. (It's called network effects.) But "where else are they going to go?" is a long way from trust and loyalty. It is less a durable business model than a statement of power. 
  • I distinguished between the "thin" legitimacy that Facebook operates under and the "thick" legitimacy that the university requires to be the institution it was always supposed to be. (Both are distinct from il-legitimacy.) News organizations should learn to make this distinction more often. Normal PR exists to muddle it. Which is why you don't hand a research crisis over to university PR people.
  • some commentators have questioned the practice of A/B headline testing in the aftermath of this scandal—is there a clear connection? The connection to me is that both are forms of behaviourism. Behaviourism is a view of human beings in which, as Hannah Arendt said, they are reduced to the level of a conditioned and "behaving" animal—an animal that responds to these stimuli but not those. This is why a popular shorthand for Facebook's study was that users were being treated as lab rats.
  • Journalism is supposed to be about informing people so they can understand the world and take action when necessary. Action and behaviour are not the same thing at all. One is a conscious choice, the other a human tendency. There's a tension, then, between commercial behaviourism, which may be deeply functional in some ways for the news industry, and informing people as citizens capable of understanding their world well enough to improve it, which is the deepest purpose of journalism. A/B testing merely highlights this tension. (A minimal mechanical sketch of an A/B headline test follows these annotations.)
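
Mechanically, the A/B headline testing mentioned in the last two annotations is just random assignment plus a click-through comparison. The sketch below is a purely illustrative simulation: the headlines, click probabilities and reader counts are invented, and it does not represent any particular publisher's testing system.

```python
# Minimal sketch of A/B headline testing: each reader is randomly assigned
# one headline variant, and the variant with the higher click-through rate wins.
# Headlines and click probabilities are invented for illustration.
import random

random.seed(42)

variants = {
    "A": {"headline": "Study finds sleep matters", "true_ctr": 0.04, "views": 0, "clicks": 0},
    "B": {"headline": "You won't believe what sleep does", "true_ctr": 0.06, "views": 0, "clicks": 0},
}

# Simulate 10,000 readers; each is shown one variant at random.
for _ in range(10_000):
    key = random.choice(list(variants))
    variant = variants[key]
    variant["views"] += 1
    if random.random() < variant["true_ctr"]:  # simulated reader behaviour
        variant["clicks"] += 1

for key, variant in variants.items():
    ctr = variant["clicks"] / variant["views"]
    print(f"{key}: {variant['headline']!r} -> CTR {ctr:.2%}")
```

The annotations' objection is not that this loop is sophisticated but that it treats readers as conditioned responders: the "winner" is simply whichever headline elicited more clicks.
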
julia rhodes

"Carrot and Stick" Motivation Revisited by New Research | Psychology Today - 1 views

  • We continue to revisit the issue of motivation and, specifically, the “carrot and stick” aspect. New research seems to indicate that brain chemicals may control behavior, and that for people to learn and adapt in the world, both punishment and reward may be necessary.
  • The real question is, which route would you choose—positive or negative? Most people are taught to refrain from engaging in a certain behavior by being given punishments that create negative feelings.
  • Different players use different strategies. It all depends on their genetic material. People's tendency to change their choice immediately after receiving a punishment depends on which serotonin gene variant they inherited from their parents. The dopamine gene variant, on the other hand, exerts influence on whether people can stop themselves making the choice that was previously rewarded, but no longer is
  • ...7 more annotations...
  • What do we mean by motivation? It's been defined as a predisposition to behave in a purposeful manner to achieve specific, unmet needs and the will to achieve, and the inner force that drives individuals to accomplish personal and organizational goals. And why do we need motivated employees? The answer is survival.
  • It turns out that people are motivated by interesting work, challenge, and increasing responsibility—intrinsic factors. People have a deep-seated need for growth and achievement.
  • Even understanding what constitutes human motivation has been a centuries-old puzzle, addressed as far back as Aristotle.
  • Pink concludes that extrinsic motivators work only in a surprisingly narrow band of circumstances; rewards often destroy creativity and employee performance; and the secret to high performance isn’t reward and punishment but that unseen intrinsic drive—the drive to do something because it is meaningful.
  • true motivation boils down to three elements: Autonomy, the desire to direct our own lives; mastery, the desire to continually improve at something that matters to us, and purpose, the desire to do things in service of something larger than ourselves.
  • The carrot-and-stick approach worked well for typical tasks of the early 20th century —routine, unchallenging and highly controlled. For these tasks, where the process is straightforward and lateral thinking is not required, rewards can provide a small motivational boost without any harmful side effects
  • Jobs in the 21st century have changed dramatically. They have become more complex, more interesting and more self-directed, and this is where the carrot-and-stick approach has become unstuck.
Javier E

How to Raise a University's Profile: Pricing and Packaging - NYTimes.com - 0 views

  • I talked to a half-dozen of Hugh Moren’s fellow students. A highly indebted senior who was terrified of the weak job market described George Washington, where he had invested considerable time getting and doing internships, as “the world’s most expensive trade school.” Another mentioned the abundance of rich students whose parents were giving them a fancy-sounding diploma the way they might a new car. There are serious students here, he acknowledged, but: “You can go to G.W. and essentially buy a degree.”
  • A recent study from the Organization for Economic Cooperation and Development found that, on average, American college graduates score well below college graduates from most other industrialized countries in mathematics. In literacy (“understanding, evaluating, using and engaging with written text”), scores are just average. This comes on the heels of Richard Arum and Josipa Roksa’s “Academically Adrift,” a study that found “limited or no learning” among many college students.
  • Instead of focusing on undergraduate learning, colleges have been engaged in the kind of building spree I saw at George Washington. Recreation centers with world-class workout facilities and lazy rivers rise out of construction pits even as students and parents are handed staggeringly large tuition bills. Colleges compete to hire famous professors even as undergraduates wander through academic programs that often lack rigor or coherence. Campuses vie to become the next Harvard — or at least the next George Washington — while ignoring the growing cost and suspect quality of undergraduate education.
  • ...58 more annotations...
  • Mr. Trachtenberg understood the centrality of the university as a physical place. New structures were a visceral sign of progress. They told visitors, donors and civic leaders that the institution was, like beams and scaffolding rising from the earth, ascending. He added new programs, recruited more students, and followed the dictate of constant expansion.
  • the American research university had evolved into a complicated and somewhat peculiar organization. It was built to be all things to all people: to teach undergraduates, produce knowledge, socialize young men and women, train workers for jobs, anchor local economies, even put on weekend sports events. And excellence was defined by similarity to old, elite institutions. Universities were judged by the quality of their scholars, the size of their endowments, the beauty of their buildings and the test scores of their incoming students.
  • John Silber embarked on a huge building campaign while bringing luminaries like Saul Bellow and Elie Wiesel on board to teach and lend their prestige to the B.U. name, creating a bigger, more famous and much more costly institution. He had helped write a game plan for the aspiring college president.
  • GWU is, for all intents and purposes, a for-profit organization. Best example: study abroad. Their top program, a partnering with Sciences Po, costs each student (30 of them, on a program with 'prestige' status?) a full semester's tuition. It costs GW, according to the Sciences Po website, €1000. A neat $20,000 profit per student (who is digging her/himself deeper and deeper in debt). Moreover, the school takes a $500 admin fee for the study abroad application! With no guarantee that all credits transfer. Students often lose a partial semester; GW profits again. Nor does GW offer help with an antiquated, one-shot/no-transfers, tricky registration process. It's tough luck in gay Paris. Just one of many examples. Dorms with extreme mold, off-campus housing impossible for freshmen and sophomores. Required meal plan: Chick-fil-A etc. Classes with over 300 students (required). This is not Harvard, but costs the same. Emotional problems? Counselors too few. Suicides continue and are not appropriately addressed. Caring environment? Extension so-and-so, please hold. It's an impressive campus; I'm an alum. If you apply, make sure the DC experience is worth the price: good are the internships, a few colleges like the Elliott School, post-grad. GWU uses undergrad $$ directly for building projects, like the medical center to which students have NO access. (Student health facility is underfunded, outsourced.) Outstanding professors still make a difference. But is that enough?
  • Mr. Trachtenberg, however, understood something crucial about the modern university. It had come to inhabit a market for luxury goods. People don’t buy Gucci bags merely for their beauty and functionality. They buy them because other people will know they can afford the price of purchase. The great virtue of a luxury good, from the manufacturer’s standpoint, isn’t just that people will pay extra money for the feeling associated with a name brand. It’s that the high price is, in and of itself, a crucial part of what people are buying.
  • Mr. Trachtenberg convinced people that George Washington was worth a lot more money by charging a lot more money. Unlike most college presidents, he was surprisingly candid about his strategy. College is like vodka, he liked to explain.
  • The Absolut Rolex plan worked. The number of applicants surged from some 6,000 to 20,000, the average SAT score of students rose by nearly 200 points, and the endowment jumped from $200 million to almost $1 billion.
  • The university became a magnet for the children of new money who didn’t quite have the SATs or family connections required for admission to Stanford or Yale. It also aggressively recruited international students, rich families from Asia and the Middle East who believed, as nearly everyone did, that American universities were the best in the world.
  • U.S. News & World Report now ranks the university at No. 54 nationwide, just outside the “first tier.”
  • The watch and vodka analogies are correct. Personally, I used car analogies when discussing college choices with my kids. We were in the fortunate position of being able to comfortably send our kids to any college in the country and have them leave debt free. Notwithstanding, I told them that they would be going to a state school unless they were able to get into one of about 40 schools that I felt, in whatever arbitrary manner I decided, was worth the extra cost. They both ended up going to state schools. College is by and large a commodity and you get out of it what you put into it. Both of my kids worked hard in college and were involved in school life. They both left the schools better people and the schools better schools for them being there. They are both now successful adults. I believe too many people look for the prestige of a named school and that is not what college should be primarily about.
  • In 2013, only 14 percent of the university’s 10,000 undergraduates received a grant — a figure on a par with elite schools but far below the national average. The average undergraduate borrower leaves with about $30,800 in debt.
  • When I talk to the best high school students in my state I always stress the benefits of the honors college experience at an affordable public university. For students who won't qualify for a public honors college, the regular public university experience is far preferable to the huge debt of places like GW.
  • Carey would do well to look beyond high ticket private universities (which after all are still private enterprises) and what he describes as the Olympian heights of higher education (which for some reason seems also to embitter him) and look at the system overall. The withdrawal of public support was never a policy choice; it was a political choice, "packaged and branded" as some tax cutting palaver all wrapped up in the argument that a free market should decide how much college should cost and how many seats we need. In such an environment, trustees at private universities are no more solely responsible for turning their degrees into commodities than the administrations of state universities are for raising the number of out-of-state students in order to offset the loss of support from their legislatures. No doubt, we will hear more about market based solutions and technology from Mr. Carey.
  • I went to GW back in the 60s. It was affordable and it got me away from home in New York. While I was there, Newsweek famously published an article about the DC universities - GW, Georgetown, American and Catholic - dubbing them the Pony League, the schools for the children of wealthy middle class New Yorkers who couldn't get into the Ivy League. Nobody really complained. But that wasn't me. I went because I wanted to be where the action was in the 60s, and as we used to say - "GW was literally a stone's throw from the White House. And we could prove it." Back then, the two biggest alumni names were Jackie Kennedy, who'd taken some classes there, and J. Edgar Hoover. Now, according to the glossy magazine they send me each month, it's the actress Kerry Washington. There's some sort of progress there, but I'm a GW alum and not properly trained to understand it.
  • This explains a lot of the modern, emerging mentality. It encompasses the culture of enforced grade inflation, cheating and anti-intellectualism in much of higher education. It is consistent with our culture of misleading statistics and information, cronyism and fake quality, the "best and the brightest" being only schemers and glad handers. The wisdom and creativity engendered by an honest, rigorous academic education are replaced by the disingenuous quick fix, the winner-take-all mentality that neglects the common good.
  • I attended nearby Georgetown University and graduated in 1985. Relative to state schools and elite schools, it was expensive then. I took out loans. I had Pell grants. I had work-study and GSL. I paid my debt of $15,000 off in ten years. Would I have done it differently? Yes: I would have continued on to graduate school and not worried about paying off those big loans right after college. My career worked out and I am grateful for the education I received and paid for. But I would not recommend to my nieces and nephews debts north of $100,000 for a BA in liberal arts. Go community. Then go state. Then punch your ticket to Harvard, Yale or Stanford — if you are good enough.
  • American universities appear to have more and more drifted away from educating individuals and citizens to becoming high priced trade schools and purveyors of occupational licenses. Lost in the process is the concept of expanding a student's ability to appreciate broadly and deeply, as well as the belief that a republican democracy needs an educated citizenry, not a trained citizenry, to function well. Both the Heisman Trophy winner and the producer of a successful tech I.P.O. likely have much in common, a college education whose rewards are limited to the financial. I don't know if I find this more sad on the individual level or more worrisome for the future of America.
  • This is now a consumer world for everything, including institutions once thought to float above the Shakespearean briars of the work-a-day world such as higher education, law and medicine. Students get this. Parents get this. Everything is negotiable: financial aid, a spot in the nicest dorm, tix to the big game. But through all this, there are faculty - lots of 'em - who work away from the fluff to link the ambitions of the students with the reality and rigor of the 21st century. The job of the student is to get beyond the visible hype of the surroundings and find those faculty members. They will make sure your investment is worth it
  • My experience in managing or working with GW alumni in their 20's or 30's has not been good. Virtually all have been mentally lazy and/or had a stunning sense of entitlement. Basically they've been all talk and no results. That's been quite a contrast to the graduates from VA/MD state universities.
  • More and more, I notice what my debt-financed contributions to the revenue streams of my vendors earn them, not me. My banks earned enough to pay ridiculous bonuses to employees for reckless risk-taking. My satellite tv operator earned enough to overpay ESPN for sports programming that I never watch--and that, in turn, overpays these idiotic pro athletes and college sports administrators. My health insurer earned enough to defeat one-payor insurance; to enable the opaque, inefficient billing practices of hospitals and other providers; and to feed the behemoth pharmaceutical industry. My church earned enough to buy the silence of sex abuse victims and oppose progressive political candidates. And my govt earned enough to continue ag subsidies, inefficient defense spending, and obsolete transportation and energy policies.
  • As the parent of a GWU freshman, I am grateful for every opportunity afforded her. She has a generous merit scholarship, is in the honors program with some small classes, and has access to internships that can be done while at school. GWU also gave her AP credits to advance her to sophomore status. Had she attended the state flagship school (where she was accepted into that exclusive honors program) she would have a great education but little else. It's not possible to do a foreign affairs-related internship far from D.C. or Manhattan. She went to a very competitive high school where, for the one or two Ivy League schools in which she was interested, she didn't have the same level of connections or wealth as many of her peers. Whether because of the Common Application or other factors, getting into a good school with financial help is difficult for a middle class student like my daughter, who had a 4.0 GPA and 2300 on the SAT. She also worked after school. The bottom line - GWU offered more money than perceived "higher tier" universities, and brought tuition to almost that of our state school system. And by the way, I think she is also getting a very good education.
  • This article reinforces something I have learned during my daughter's college application process. Most students choose a school based on emotion (reputation) and not value. This luxury good analogy holds up.
  • The entire education problem can be solved by MOOCs, lots and lots of them, plus a few closely monitored tests and personal interviews with people. Of course, many, many people make MONEY off of our entirely inefficient way of "educating" -- are we even really doing that? Getting a degree does NOT mean one is actually educated.
  • As a first-generation college graduate I entered GW ambitious but left saddled with debt, and crestfallen at the hard-hitting realization that my four undergraduate years were an aberration from what life is actually like post-college: not as simple as getting an [unpaid] internship with a fancy titled institution, as most Colonials do. I knew how to get in to college, but what do you do after the recess of life ends? I learned more about networking, resume plumping (designated responses to constituents...errr....replied to emails), and elevator pitches than actual theory, economic principles, strong writing skills, critical thinking, analysis, and philosophy. While it is relatively easy to get a job after graduating (for many with a GW degree this is sadly not the case), sustaining one and excelling in it is much harder. It's never enough just to be able to open a new door; you also need to be prepared to navigate your way through that next opportunity.
  • this is a very telling article. Aimless and directionless high school graduates are matched only by aimless and directionless institutes of higher learning. Each child and each parent should start with a goal - before handing over their hard earned tuition dollars, and/or leaving a trail of broken debt in the aftermath of a substandard, unfocused education.
  • It is no longer the most expensive university in America. It is the 46th. Others have been implementing the Absolut Rolex Plan. John Sexton turned New York University into a global higher-education player by selling the dream of downtown living to students raised on “Sex and the City.” Northeastern followed Boston University up the ladder. Under Steven B. Sample, the University of Southern California became a U.S. News top-25 university. Washington University in St. Louis did the same.
  • I currently attend GW, and I have to say, this article completely misrepresents the situation. I have yet to meet a single person who is paying the full $60k tuition - I myself am paying $30k, because the school gave me $30k in grants. As for the quality of education, Foreign Policy rated GW the #8 best school in the world for undergraduate education in international affairs, Princeton Review ranks it as one of the best schools for political science, and U.S. News ranks the law school #20. The author also ignores the role that an expanding research profile plays in growing a university's prestige and educational power.
  • And in hundreds of regional universities and community colleges, presidents and deans and department chairmen have watched this spectacle of ascension and said to themselves, “That could be me.” Agricultural schools and technical institutes are lobbying state legislatures for tuition increases and Ph.D. programs, fitness centers and arenas for sport. Presidents and boards are drawing up plans to raise tuition, recruit “better” students and add academic programs. They all want to go in one direction — up! — and they are all moving with a single vision of what they want to be.
  • this is the same playbook used by hospitals the past 30 years or so. It is how Hackensack Hospital became Hackensack Medical Center and McComb Hospital became Southwest Mississippi Regional Medical Center. No wonder the results have been the same in healthcare and higher education; both have priced themselves out of reach for average Americans.
  • A world where a college is rated not by the quality of its output, but instead by the quality of its inputs. A world where there is practically no work to be done by the administration because the college's reputation is made before the first class even begins! This is insanity! But this is the swill that the mammoth college marketing departments nationwide have shoved down America's throat. Colleges are ranked not by the quality of their graduates, but rather by the test scores of their incoming students!
  • The Pew Foundation has been doing surveys on what students learn, how much homework they do, how much time they spend with professors, etc. All good stuff to know before a student chooses a school. It is called the National Survey of Student Engagement (NSSE - called Nessy). It turns out that the higher ranked schools do NOT allow their information to be released to the public. It is SECRET. Why do you think that is?
  • The article blames "the standard university organizational model left teaching responsibilities to autonomous academic departments and individual faculty members, each of which taught and tested in its own way." This is the view of someone who has never taught at a university, nor thought much about how education there actually happens. Once undergraduates get beyond the general requirements, their educations _have_ to depend on "autonomous departments" because only those departments know what the requirements for a given degree can be, and can grant the necessary accreditation of a given student. The idea that some administrator could know what's necessary for degrees in everything from engineering to fiction writing is nonsense, except that's what the people who only know the theory of education (but not its practice) actually seem to think. In the classroom itself, you have tremendously talented people, who nevertheless have their own particular strengths and approaches. Don't you think it's a good idea to let them do what they do best rather than trying to make everyone teach the same way? Don't you think supervision of young teachers by older colleagues, who actually know their field and its pedagogy, rather than some administrator, who knows nothing of the subject, is a good idea?
  • it makes me very sad to see how expensive some public schools have become. Used to be you could work your way through a public school without loans, but not any more. Like you, I had the advantage of a largely-scholarship paid undergraduate education at a top private college. However, I was also offered a virtually free spot in my state university's (then new) honors college
  • My daughter attended a good community college for a couple of classes during her senior year of high school and I could immediately see how such places are laboratories for failure. They seem like high schools in atmosphere and appearance. Students rush in by car and rush out again when the class is over. The four year residency college creates a completely different feel. On arrival, you get the sense that you are engaging in something important, something apart and one that will require your full attention. I don't say this is for everyone or that the model is not flawed in some ways (students actually only spend 2 1/2 yrs. on campus to get the four yr. degree). College is supposed to be a 60 hour per week job. Anything less than that and the student is seeking himself or herself.
  • This. Is. STUNNING. I have always wondered, especially as my kids have approached college age, why American colleges have felt justified in raising tuition at a rate that has well exceeded inflation, year after year after year. (Nobody needs a dorm with luxury suites and a lazy river pool at college!) And as it turns out, they did it to become luxury brands. Just that simple. Incredible. I don't even blame this guy at GWU for doing what he did. He wasn't made responsible for all of American higher ed. But I do think we all need to realize what happened, and why. This is front page stuff.
  • I agree with you, but, unfortunately, given the choice between low tuition, primitive dorms, and no athletic center VS expensive & luxurious, the customers (and their parents) are choosing the latter. As long as this is the case, there is little incentive to provide bare-bones and cheap education.
  • Wesleyan University in CT is one school that is moving down the rankings. Syracuse University is another. Reed College is a third. Why? Because these schools try hard to stay out of the marketing game. (With its new president, Syracuse has jumped back into the game.) Bryn Mawr College, outside Philadelphia hasn't fared well over the past few decades in the rankings, which is true of practically every women's college. Wellesley is by far the highest ranked women's college, but even there the acceptance rate is significantly higher than one finds at comparable coed liberal arts colleges like Amherst & Williams. University of Chicago is another fascinating case for Mr. Carey to study (I'm sure he does in his forthcoming book, which I look forward to reading). Although it has always enjoyed an illustrious academic reputation, until recently Chicago's undergraduate reputation paled in comparison to peer institutions on the two coasts. A few years ago, Chicago changed its game plan to more closely resemble Harvard and Stanford in undergraduate amenities, and lo and behold, its rankings shot up. It was a very cynical move on the president's part to reassemble the football team, but it was a shrewd move because athletics draw more money than academics ever can (except at engineering schools like Cal Tech & MIT), and more money draws richer students from fancier secondary schools with higher test scores, which lead to higher rankings - and the beat goes on.
  • College INDUSTRY is out of control. Sorry, NYU, GW, BU are not worth the price. Are state schools any better? We have the University of Michigan, which is really not a state school, but a university that gives a discount to people who live in Michigan. Why? When you have an undergraduate body 40+% out-of-state that pays tuition of over $50K/year, you tell me? Perhaps the solution is two years of community college followed by two at places like U of M or Michigan State - get the same diploma at the end for much less and beat the system.
  • In one recent yr., the majority of undergrad professors at Harvard, according to Boston.com, were adjuncts. That means low pay, no benefits, no office, temp workers. Harvard. Easily available student loans fueled this arms race of amenities and frills in which colleges now engage. They moved the cost of education onto the backs of people, kids, who don't understand what they are doing. Students in colleges these days are customers and the customers must be able to get through. If it requires dumbing things down, so be it. On top of tuition, G.W.U. is known by its students as the land of added fees on top of added fees. The joke around campus was that they would soon be installing pay toilets in the student union. No one was laughing.
  • You could have written the same story about my alma mater, American University. The place reeked of ambition and upward mobility decades ago and still does. Whoever's running it now must look at its measly half-billion-dollar endowment and compare it to GWU's $1.5 billion and seethe with envy, while GWU's president sets his sights on an Ivy League-size endowment. And both get back to their real jobs: 24/7 fundraising, which is what university presidents are all about these days. Money - including million-dollar salaries for themselves (GWU's president made more than Harvard's in 2011) - pride, cachet, power, a mansion, first-class all the way. They should just be honest about it and change their university's motto to Ostende mihi pecuniam! (please excuse my questionable Latin) Whether the students are actually learning anything is up to them, I guess - if they do, it's thanks to the professors, adjuncts and the administrative staff, who do the actual work of educating and keep the school running.
  • When I was in HS (70s), many of my richer friends went to GW and I was then of the impression that GW was a 'good' school. As I age, I have come to realize that this place is just another façade to the emptiness that has become America. All too often are we faced with a dilemma: damned if we do, damned if we don't. Yep, 'education' has become a trap for all too many of our citizens.
  • I transferred to GWU from a state school. I am forever grateful that I did. I wanted to get a good rigorous education and go to one of the best International Affairs schools in the world. Even though the state school I went to was dirt-cheap, the education and the faculty were awful. I transferred to GW and was amazed at the professors at that university. An ambassador or a prominent IA scholar taught every class. GW is an expensive school, but that is the free market. If you want a good education you need to be willing to pay for it or join the military. I did the latter and my school was completely free with no debt, and I received an amazing education. If young people aren't willing to make some sort of sacrifice to get ahead, or just expect everything to be given to them, our country is in a sad state. We need to stop blaming universities like GWU that strive to attract better students, better professors, and better infrastructure. They are doing what is expected in America: to better oneself.
  • "Whether the students are actually learning anything is up to them, I guess." How could it possibly be otherwise??? I am glad that you are willing to give credit to teachers and administrators, but it is not they who "do the actual work of educating." From this fallacy comes its corollary, that we should blame teachers first for "under-performing schools". This long-running show of scapegoating may suit the wallets and vanity of American parents, but it is utterly senseless. When, if ever, American culture stops reeking of arrogance, greed and anti-intellectualism, things may improve, and we may resume the habit of bothering to learn. Until then, nothing doing.
  • Universities sell knowledge and grade students on how much they have learned. Fundamentally, there is a conflict of interest in this setup. Moreover, students who are poorly educated, even if they know this, will not criticize their school, because doing so would make it harder for them to have a career. As such, many problems with higher education remain unexposed to the public.
  • I've lectured and taught in at least five different countries on three continents, and the shortest perusal of what goes on abroad would totally undermine most of these speculations. For one thing, American universities are unique in their dedication to a broad based liberal arts type education. In France, Italy or Germany, for example, you select a major like mathematics or physics, and then in your four years you will not take even one course in another subject. The amount of work that you do that is critically evaluated by an instructor is a tiny fraction of what is done in an American university. While half-educated critics, based on profoundly incomplete research, write criticism like this, universities in Germany, Italy, the Netherlands, South Korea and Japan, as well as France, have appointed committees and made studies to explain why the American system of higher education so drastically outperforms their own. Elsewhere students do get a rather nice dose of general education, but it ends in secondary school and it has the narrowness and formulaic quality that we would just normally associate with that. The character who wrote this article probably never set foot on a "campus" of the University of Paris or Rome.
  • The university is part of a complex economic system and it is responding to the demands of that system. For example, students and parents choose universities that have beautiful campuses and buildings. So universities build beautiful campuses. State support of universities has greatly declined, and this decline in funding is the greatest cause of increased tuition. Therefore universities must compete for dollars and must build to attract students and parents. Also, universities are not ranked based on how they educate students -- that's difficult to measure so it is not measured. Instead universities are ranked on research publications. So while universities certainly put much effort into teaching, research has to have a priority in order for the university to survive. Also universities do not force students and parents to attend high price institutions. Reasonably priced state institutions and community colleges are available to every student. Community colleges have an advantage because they are funded by property taxes. Finally learning requires good teaching, but it also requires students that come to the university funded, prepared, and engaged. This often does not happen. Conclusion- universities have to participate in profile raising actions in order to survive. The day that funding is provided for college, ranking is based on education, and students choose campuses with simple buildings, then things will change at the university.
  • This is the inevitable result of privatizing higher education. In the not-so-distant past, we paid for great state universities through our taxes, not tuition. Then the states shifted funding to prisons and the Federal government radically cut research support and the GI bill. Instead, today we expect universities to support themselves through tuition, and to the extent that we offer students support, it is through non-dischargeable loans. To make matters worse, the interest rates on those loans are far above the government's cost of funds -- so in effect the loans are an excise tax on education (most of which is used to support a handful of for-profit institutions that account for the most student defaults). This "consumer sovereignty" privatized model of funding education works no better than privatizing California's electrical system did in the era of Enron, or our privatized funding of medical service, or our increasingly privatized prison system: it drives up costs at the same time that it replaces quality with marketing.
  • There are data in some instances on student learning, but the deeper problem, as I suspect the author already knows, is that there is nothing like a consensus on how to measure that learning, or even on when is the proper end point to emphasize (a lot of what I teach -- I know this from what students have told me -- tends to come into sharp focus years after graduation).
  • Michael (Baltimore) has hit the nail on the head. Universities are increasingly corporatized institutions in the credentialing business. Knowledge, for those few who care about it (often not those paying for the credentials) is available freely because there's no profit in it. Like many corporate entities, it is increasingly run by increasingly highly paid administrators, not faculty.
  • GWU has not defined itself in any unique way, it has merely embraced the bland, but very expensive, accoutrements of American private education: luxury dorms, food courts, spa-like gyms, endless extracurricular activities, etc. But the real culprit for this bloat that students have to bear financially is the college ranking system by US News, Princeton Review, etc. An ultimately meaningless exercise in competition that has nevertheless pushed colleges and universities to be more like one another. A sad state of affairs, and an extremely expensive one for students
  • It is long past time to realize the failure of the Reaganomics-neoliberal program of private profits over the public good. In education, we need to return to public institutions publicly funded. Just as we need to recognize that Medicare, Social Security, the post office, public utilities, fire departments, the interstate highway system, Veterans Administration hospitals and the GI bill are models to be improved and expanded, not destroyed.
  • George Washington is actually not a Rolex watch; it is a counterfeit Rolex. The real Rolexes of higher education -- places like Hopkins, Georgetown, Duke, the Ivies etc. -- have real endowments and real financial aid. No middle class kid is required to borrow $100,000 to get a degree from those schools, because they offer generous need-based financial aid in the form of grants, not loans. The tuition at the real Rolexes is really a sticker price that only the wealthy pay -- everybody else is on a sliding scale. For middle class kids who are fortunate enough to get in, Penn actually ends up costing considerably less than a state university. The fake Rolexes -- BU, NYU, Drexel in Philadelphia -- don't have the sliding scale. They bury middle class students in debt. And really, though it is foolish to borrow $100,000 or $120,000 for an undergraduate degree, I don't find the transaction morally wrong. What is morally wrong is our federal government making that loan non-dischargeable in bankruptcy, so many of these kids will be having their wages garnished for the REST OF THEIR LIVES. There is a very simple solution to this, by the way: cap the amount of non-dischargeable student loan debt at, say, $50,000.
  • The slant of this article is critical of the growth of research universities. Couldn't disagree more. Modern research universities are incredible engines of economic opportunity, not only for the students (who pay the bills) but also for the community, via the creation of blue and white collar jobs. Large research universities employ tens of thousands of locals, from custodial and food service workers right up to high level administrators and specialists in finance, computer services, buildings and facilities management, etc. Johns Hopkins University and the University of Maryland system employ more people than any other industry in Maryland -- including the government. Research universities typically have hospitals providing cutting-edge medical care to the community. Local businesses (from cafes to property rental companies) benefit from a built-in, long-term client base as well as an educated workforce. And of course they are the foundry of new knowledge, which is critical for the future growth of our country. Check out the work of famed economist Dr. Julia Lane on modeling the economic value of the research university. In a nutshell, there are few better investments America can make in herself than research universities. We are the envy of the world in that regard -- and with good reason. How many *industries* (let alone jobs) has Stanford University alone catalyzed?
  • What universities have the monopoly on is the credential. Anyone can learn from books, from free lectures on the internet, from this newspaper, etc. But only universities can endow you with the cherished degree. For some reason, people are willing to pay more for one of these pieces of paper with a certain name on it -- Ivy League, Stanford, even GW -- than another -- Generic State U -- though there is no evidence one is actually worth more in the marketplace of reality than the other. But, by the laws of economics, these places are actually underpriced: after all, something like 20 times more people are trying to buy a Harvard education than are allowed to purchase one. Usually that means you raise your price.
  • Overall a good article, except for: "This comes on the heels of Richard Arum and Josipa Roksa’s “Academically Adrift,” a study that found “limited or no learning” among many college students." The measure of learning you report was a general thinking skills exam. That's not a good measure of college gains. Most psychologists and cognitive scientists worth their salt would tell you that improvement in critical thinking skills is going to be limited to specific areas. In other words, learning critical thinking skills in math will make little change in critical thinking about political science or biology. Thus we should not expect huge improvements in general critical thinking skills, but rather improvements in a student's major and other areas of focus, such as a minor. Although who has time for a minor when it is universally acknowledged that the purpose of a university is to please and profit an employer or, if one is lucky, an investor. Finally, improved critical thinking skills are not the end-all and be-all of a college education, even given this profit-centered perspective. Learning and mastering the cumulative knowledge of past generations is arguably the most important thing to be gained, and most universities still tend to excel at that, even with the increasing mandate to run education like a business and cultivate and cull the college "consumer".
  • As for community colleges, there was an article in the Times several years ago that said it much better than I could have said it myself: community colleges are places where dreams are put on hold. Without making the full commitment to study, without leaving the home environment, many, if not most, community college students are caught betwixt and between, trying to balance work responsibilities, caring for a young child or baby and attending classes. For males, the classic "end of the road" in community college is to get a car, a job and a girlfriend, one who is not in college, and that is the end of the dream. Some can make it, but most cannot.
  • As a scientist, I disagree with the claim that undergrad tuition subsidizes basic research. Nearly all lab equipment and research personnel (grad students, technicians, anyone with the title "research scientist" or similar) on campus are paid for through federal grants. Professors often spend all their time outside teaching and administration writing grant proposals, as the limited federal grant funds mean ~85% of proposals must be rejected. What is more, out of each successful grant the university levies a "tax", called "overhead", of 30-40%, nominally to pay for basic operations (utilities, office space, administrators). So in fact one might say research helps fund the university rather than the other way around. (A small worked example of this overhead arithmetic follows these annotations.)
  • It's certainly overrated as a research and graduate level university. Whether it is good for getting an undergraduate education is unclear, but a big part of the appeal is getting to live in D.C..while attending college instead of living in some small college town in the corn fields.
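
To make the overhead levy described two annotations above concrete, here is a tiny, purely illustrative calculation. The award size and the 40% rate are hypothetical, and real indirect-cost arrangements vary by institution and agency; this simply follows the commenter's framing of overhead as a cut taken out of each grant.

```python
# Toy illustration of the "overhead" levy described in the comment above.
# The $500,000 award and the 40% rate are hypothetical; actual indirect-cost
# rates and how they are applied vary widely.
def split_award(total_award: float, overhead_rate: float) -> tuple[float, float]:
    """Split a grant award into the university's overhead cut and the
    remainder left for the research itself, per the commenter's framing."""
    overhead = total_award * overhead_rate
    research = total_award - overhead
    return overhead, research

overhead, research = split_award(500_000, 0.40)
print(f"overhead to university: ${overhead:,.0f}")   # $200,000
print(f"left for the research:  ${research:,.0f}")   # $300,000
```
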
Javier E

Great Scientists Don't Need Math - WSJ - 0 views

  • Without advanced math, how can you do serious work in the sciences? Well, I have a professional secret to share: Many of the most successful scientists in the world today are mathematically no more than semiliterate.
  • I was reassured by the discovery that superior mathematical ability is similar to fluency in foreign languages. I might have become fluent with more effort and sessions talking with the natives, but being swept up with field and laboratory research, I advanced only by a small amount.
  • Far more important throughout the rest of science is the ability to form concepts, during which the researcher conjures images and processes by intuition.
  • ...9 more annotations...
  • exceptional mathematical fluency is required in only a few disciplines, such as particle physics, astrophysics and information theory
  • When something new is encountered, the follow-up steps usually require mathematical and statistical methods to move the analysis forward. If that step proves too technically difficult for the person who made the discovery, a mathematician or statistician can be added as a collaborator
  • Ideas in science emerge most readily when some part of the world is studied for its own sake. They follow from thorough, well-organized knowledge of all that is known or can be imagined of real entities and processes within that fragment of existence
  • Ramped up and disciplined, fantasies are the fountainhead of all creative thinking. Newton dreamed, Darwin dreamed, you dream. The images evoked are at first vague. They may shift in form and fade in and out. They grow a bit firmer when sketched as diagrams on pads of paper, and they take on life as real examples are sought and found.
  • Over the years, I have co-written many papers with mathematicians and statisticians, so I can offer the following principle with confidence. Call it Wilson's Principle No. 1: It is far easier for scientists to acquire needed collaboration from mathematicians and statisticians than it is for mathematicians and statisticians to find scientists able to make use of their equations.
  • If your level of mathematical competence is low, plan to raise it, but meanwhile, know that you can do outstanding scientific work with what you have. Think twice, though, about specializing in fields that require a close alternation of experiment and quantitative analysis. These include most of physics and chemistry, as well as a few specialties in molecular biology.
  • Newton invented calculus in order to give substance to his imagination
  • Darwin had little or no mathematical ability, but with the masses of information he had accumulated, he was able to conceive a process to which mathematics was later applied.
  • For aspiring scientists, a key first step is to find a subject that interests them deeply and focus on it. In doing so, they should keep in mind Wilson's Principle No. 2: For every scientist, there exists a discipline for which his or her level of mathematical competence is enough to achieve excellence.
maddieireland334

Is your teen using apps to keep this secret? - CNN.com - 0 views

  • And if you think the only teens who sext are the ones engaging in high-risk behaviors, like drinking, using drugs or skipping school, keep reading. Studies suggest that sexting is more common than many parents might realize or want to admit.
Javier E

The Narrative Frays for Theranos and Elizabeth Holmes - The New York Times - 1 views

  • Few people, let alone those just 31 years old, have amassed the accolades and riches bestowed on Elizabeth Holmes, founder and chief executive of the blood-testing start-up Theranos.
  • This year President Obama named her a United States ambassador for global entrepreneurship. She gave the commencement address at Pepperdine University. She was the youngest person ever to be awarded the Horatio Alger Award in recognition of “remarkable achievements accomplished through honesty, hard work, self-reliance and perseverance over adversity.” She is on the Board of Fellows of Harvard Medical School.
  • Time named her one of the 100 Most Influential People in the World this year. She was the subject of lengthy profiles in The New Yorker and Fortune. Over the last week, she appeared on the cover of T: The New York Times Style Magazine, and Glamour anointed her one of its eight Women of the Year. She has been on “Charlie Rose,” as well as on stage at the Clinton Global Initiative, the World Economic Forum at Davos and the Aspen Ideas Festival, among numerous other conferences.
  • ...14 more annotations...
  • Theranos, which she started after dropping out of Stanford at age 19, has raised more than $400 million in venture capital and has been valued at $9 billion, which makes Ms. Holmes’s 50 percent stake worth $4.5 billion. Forbes put her on the cover of its Forbes 400 issue, ranking her No. 121 on the list of wealthiest Americans.
  • Thanks to an investigative article in The Wall Street Journal this month by John Carreyrou, one of the company’s central claims, and the one most exciting to many investors and doctors, is being called into question. Theranos has acknowledged it was only running a limited number of tests on a microsample of blood using its finger-prick technology. Since then, it said it had stopped using its proprietary methods on all but one relatively simple test for herpes.
  • “The constant was that nobody had any idea how this works or even if it works,” Mr. Loria told me this week. “People in medicine couldn’t understand why the media and technology worlds were so in thrall to her.”
  • that so many eminent authorities — from Henry Kissinger, who had served on the company’s board; to prominent investors like the Oracle founder Larry Ellison; to the Cleveland Clinic — appear to have embraced Theranos with minimal scrutiny is a testament to the ageless power of a great story.
  • Ms. Holmes seems to have perfectly executed the current Silicon Valley playbook: Drop out of a prestigious college to pursue an entrepreneurial vision; adopt an iconic uniform; embrace an extreme diet; and champion a humanitarian mission, preferably one that can be summed up in one catchy phrase.
  • She stays relentlessly on message, as a review of her numerous conference and TV appearances makes clear, while at the same time saying little of scientific substance.
  • The natural human tendency to fit complex facts into a simple, compelling narrative has grown stronger in the digital age of 24/7 news and social media,
  • “We’re deluged with information even as pressure has grown to make snap decisions,”
  • “People see a TED talk. They hear this amazing story of a 30-something-year-old woman with a wonder procedure. They see the Cleveland Clinic is on board. A switch goes off and they make an instant decision that everything is fine. You see this over and over: Really smart and wealthy people start to believe completely implausible things with 100 percent certainty.”
  • Ms. Holmes’s story also fits into a broader narrative underway in medicine, in which new health care entrepreneurs are upending ossified hospital practices with the goal of delivering more effective and patient-oriented care.
  • as a medical technology company, Theranos has bumped up against something else: the scientific method, which puts a premium on verification over narrative.
  • “You have to subject yourself to peer review. You can’t just go in a stealthy mode and then announce one day that you’ve got technology that’s going to disrupt the world.”
  • Professor Yeo said that he and his colleagues wanted to see data and testing in independent labs. “We have a small army of people ready and willing to test Theranos’s products if they’d ask us,” he said. “And that can be done without revealing any trade secrets.”
  • “Every other company in this field has gone through peer review,” said Mr. Cherny of Evercore. “Why hold back so much of the platform if your goal is the greater good of humanity?”
silveiragu

BBC - Future - The countries that don't exist - 2 views

  • In the deep future, every territory we know could eventually become a country that doesn’t exist.
    • silveiragu
       
      Contrary to the human expectation that situations remain constant. 
  • There really is a secret world of hidden independent nations
  • Middleton, however, is here to talk about countries missing from the vast majority of books and maps for sale here. He calls them the “countries that don’t exist”
    • silveiragu
       
      Reminds us of our strange relationship with nationalism: that we forget how artificial countries' boundaries are.
  • ...21 more annotations...
  • The problem, he says, is that we don’t have a watertight definition of what a country is. “Which as a geographer, is kind of shocking
  • The globe, it turns out, is full of small (and not so small) regions that have all the trappings of a real country
  • and are ignored on most world maps.
  • Middleton, a geographer at the University of Oxford, has now charted these hidden lands in his new book, An Atlas of Countries that Don’t Exist
  • Middleton’s quest began, appropriately enough, with Narnia
    • silveiragu
       
      Interesting connection to imagination as a way of knowing.
  • a defined territory, a permanent population, a government, and “the capacity to enter into relations with other states”.
  • In Australia, meanwhile, the Republic of Murrawarri was founded in 2013, after the indigenous tribe wrote a letter to Queen Elizabeth II asking her to prove her legitimacy to govern their land.
  • Yet many countries that meet these criteria aren't members of the United Nations (commonly accepted as the final seal of a country’s statehood).
  • many of them are instead members of the “Unrepresented United Nations” – an alternative body to champion their rights.
  • A handful of the names will be familiar to anyone who has read a newspaper: territories such as Taiwan, Tibet, Greenland, and Northern Cyprus.
  • The others are less famous, but they are by no means less serious
    • silveiragu
       
      By what criterion, "serious"?
  • One of the most troubling histories, he says, concerns the Republic of Lakotah (with a population of 100,000). Bang in the centre of the United States of America (just east of the Rocky Mountains), the republic is an attempt to reclaim the sacred Black Hills for the Lakota Sioux tribe.
  • Their plight began in the 18th Century, and by 1868 they had finally signed a deal with the US government that promised the right to live on the Black Hills. Unfortunately, they hadn’t accounted for a gold rush
  • Similar battles are being fought across every continent.
  • In fact, you have almost certainly, unknowingly, visited one.
  • Christiania, an enclave in the heart of Copenhagen.
  • On 26 September that year, they declared it independent, with its own “direct democracy”, in which each of the inhabitants (now numbering 850) could vote on any important matter.
    • silveiragu
       
      Interesting reminder that the label "country" does not only have to arise from military or economic struggles, as is tempting to think in our study of history. Also, interesting reminder that the label of "country", by itself, means nothing.
  • a blind eye to the activities
    • silveiragu
       
      That is really why any interest is demonstrated towards this topic. Not that some country named Christiania exists in the heart of Denmark, but that they can legitimately call themselves a nation. We have grown up, and our parents have grown up, with a rigid definition of nationalism, and the strange notion that the lines in an atlas were always there. One interpretation of the Danish government's response to Christiania is simply that they do not know what to think. Although probably not geopolitically significant, such enclave states represent a challenge to our perception of countries, one which fascinates Middleton's readers because it disconcerts them.
  • perhaps we need to rethink the concept of the nation-state altogether? He points to Antarctica, a continent shared peacefully among the international community
    • silveiragu
       
      A sign of progress, perhaps, from the industrialism-spurred cycle of dividing land, industrializing, and repeating, even if the chief reason is the region's climate.
  • The last pages of Middleton’s Atlas contain two radical examples that question everything we think we mean by the word ‘country’.
    • silveiragu
       
      These "nonexistent countries", and our collective disregard for them, are reminiscent of the 17th and 18th centuries: then, the notion of identifying by national lines was almost as strange and artificial as these countries' borders seem to us today.
  • “They all raise the possibility that countries as we know them are not the only legitimate basis for ordering the planet,
anonymous

'Mona Lisa': Hidden portraits 'found underneath' - CNN.com - 0 views

  • smile
  • But does it belong to an entirely different woman?
  • The hidden picture shows a woman looking into the distance, with no trace of the characteristic smile.
  • ...2 more annotations...
  • "My scientific imagery technique (L.A.M.) takes us into the heart of the paint-layers of the world's most famous picture and reveals secrets that have remained hidden for 500 years,"
  • The scientist used a multispectral camera to project intense lights on to the painting while measuring the reflections.
silveiragu

BBC - Future - The man who studies the spread of ignorance - 0 views

  • is the study of wilful acts to spread confusion and deceit, usually to sell a product or win favour.
    • silveiragu
       
      Interesting data point on the irrationality of words: individuals invent words all the time, purposefully or not. Additionally, this word is USEFUL, as it reveals a clear deficiency in public understanding of, say, the tobacco industry. So, why have I not been able to find any dictionary recognizing its existence? 
  • Ignorance is power
  • Agnotology is as important today as it was back when Proctor studied
  • ...18 more annotations...
  • politically motivated doubt was sown over US President Barack Obama’s nationality for many months by opponents until he revealed his birth certificate in 2011.
  • ignorance can often be propagated under the guise of balanced debate. For example, the common idea that there will always be two opposing views does not always result in a rational conclusion.
    • silveiragu
       
      What's the exploited heuristic? There must be one.
  • a false picture of the truth, hence ignorance.
  • For example, says Proctor, many of the studies linking carcinogens in tobacco were conducted in mice initially, and the tobacco industry responded by saying that studies into mice did not mean that people were at risk,
  • Even though knowledge is ‘accessible’, it does not mean it is accessed, he warns
  • often comes from faith or tradition, or propaganda
    • silveiragu
       
      If there are Ways of Knowing, what are the Ways of Not Knowing?
  • a scientifically illiterate society will probably be more susceptible to the tactics used by those wishing to confuse and cloud the truth.
  • It’s not just about the facts, it’s about what is imagined to flow from and into such facts,
  • Another academic studying ignorance is David Dunning, from Cornell University.
  • "While some smart people will profit from all the information now just a click away, many will be misled into a false sense of expertise
  • US presidential candidate Donald Trump's solutions that are either unworkable or unconstitutional are an example of agnotology, says Dunning
    • silveiragu
       
      Or, rather, the analysis of US Presidential candidate Donald Trump...is.
  • today the need for both a word and the study of human ignorance is as strong as ever
  • 1979, a secret memo from the tobacco industry was revealed to the public.
  • How do people or companies with vested interests spread ignorance and obfuscate knowledge? Georgina Kenyon finds there is a term which defines this phenomenon.
  • it revealed many of the tactics employed by big tobacco to counter “anti-cigarette forces”
  • “Doubt is our product since it is the best means of competing with the ‘body of fact’ that exists in the mind of the general public. It is also the means of establishing a controversy.”
  • revelation piqued the interest of Robert Proctor, a science historian from Stanford University
  • led him to create a word for the study of deliberate propagation of ignorance: agnotology.
kushnerha

'Run, Hide, Fight' Is Not How Our Brains Work - The New York Times - 0 views

  • One suggestion, promoted by the Federal Bureau of Investigation and Department of Homeland Security, and now widely disseminated, is “run, hide, fight.” The idea is: Run if you can; hide if you can’t run; and fight if all else fails. This three-step program appeals to common sense, but whether it makes scientific sense is another question.
  • Underlying the idea of “run, hide, fight” is the presumption that volitional choices are readily available in situations of danger. But the fact is, when you are in danger, whether it is a bicyclist speeding at you or a shooter locked and loaded, you may well find yourself frozen, unable to act and think clearly.
  • Freezing is not a choice. It is a built-in impulse controlled by ancient circuits in the brain involving the amygdala and its neural partners, and is automatically set into motion by external threats. By contrast, the kinds of intentional actions implied by “run, hide, fight” require newer circuits in the neocortex.
  • ...7 more annotations...
  • Contemporary science has refined the old “fight or flight” concept — the idea that those are the two hard-wired options when in mortal danger — to the updated “freeze, flee, fight.”
  • Why do we freeze? It’s part of a predatory defense system that is wired to keep the organism alive. Not only do we do it, but so do other mammals and other vertebrates. Even invertebrates — like flies — freeze. If you are freezing, you are less likely to be detected if the predator is far away, and if the predator is close by, you can postpone the attack (movement by the prey is a trigger for attack)
  • The freezing reaction is accompanied by a hormonal surge that helps mobilize your energy and focus your attention. While the hormonal and other physiological responses that accompany freezing are there for good reason, in highly stressful situations the secretions can be excessive and create impediments to making informed choices.
  • Sometimes freezing is brief and sometimes it persists. This can reflect the particular situation you are in, but also your individual predisposition. Some people naturally have the ability to think through a stressful situation, or to even be motivated by it, and will more readily run, hide or fight as required.
  • we have created a version of this predicament using rats. The animals have been trained, through trial and error, to “know” how to escape in a certain dangerous situation. But when they are actually placed in the dangerous situation, some rats simply cannot execute the response — they stay frozen. If, however, we artificially shut down a key subregion of the amygdala in these rats, they are able to overcome the built-in impulse to freeze and use their “knowledge” about what to do.
  • shown that if people cognitively reappraise a situation, it can dampen their amygdala activity. This dampening may open the way for conceptually based actions, like “run, hide, fight,” to replace freezing and other hard-wired impulses.
  • How to encourage this kind of cognitive reappraisal? Perhaps we could harness the power of social media to conduct a kind of collective cultural training in which we learn to reappraise the freezing that occurs in dangerous situations. In most of us, freezing will occur no matter what. It’s just a matter of how long it will last.
Javier E

Opinion | What Do We Actually Know About the Economy? (Wonkish) - The New York Times - 0 views

  • Among economists more generally, a lot of the criticism seems to amount to the view that macroeconomics is bunk, and that we should stick to microeconomics, which is the real, solid stuff. As I’ll explain in a moment, that’s all wrong
  • in an important sense the past decade has been a huge validation for textbook macroeconomics; meanwhile, the exaltation of micro as the only “real” economics both gives microeconomics too much credit and is largely responsible for the ways macroeconomic theory has gone wrong.
  • Finally, many outsiders and some insiders have concluded from the crisis that economic theory in general is bunk, that we should take guidance from people immersed in the real world – say, business leaders — and/or concentrate on empirical results and skip the models
  • ...28 more annotations...
  • And while empirical evidence is important and we need more of it, the data almost never speak for themselves – a point amply illustrated by recent monetary events.
  • Schwinger, as I remember the story, was never seen to use a Feynman diagram. But he had a locked room in his house, and the rumor was that that room was where he kept the Feynman diagrams he used in secret.
  • What’s the equivalent of Feynman diagrams? Something like IS-LM, which is the simplest model you can write down of how interest rates and output are jointly determined, and is how most practicing macroeconomists actually think about short-run economic fluctuations. It’s also how they talk about macroeconomics to each other. But it’s not what they put in their papers, because the journals demand that your model have “microfoundations.”
  • The Bernanke Fed massively expanded the monetary base, by a factor of almost five. There were dire warnings that this would cause inflation and “debase the dollar.” But prices went nowhere, and not much happened to broader monetary aggregates (a result that, weirdly, some economists seemed to find deeply puzzling even though it was exactly what should have been expected.)
  • What about fiscal policy? Traditional macro said that at the zero lower bound there would be no crowding out – that deficits wouldn’t drive up interest rates, and that fiscal multipliers would be larger than under normal conditions. The first of these predictions was obviously borne out, as rates stayed low even when deficits were very large. The second prediction is a bit harder to test, for reasons I’ll get into when I talk about the limits of empiricism. But the evidence does indeed suggest large positive multipliers.
  • The overall story, then, is one of overwhelming predictive success. Basic, old-fashioned macroeconomics didn’t fail in the crisis – it worked extremely well
  • In fact, it’s hard to think of any other example of economic models working this well – making predictions that most non-economists (and some economists) refused to believe, indeed found implausible, but which came true. Where, for example, can you find any comparable successes in microeconomics?
  • Meanwhile, the demand that macro become ever more rigorous in the narrow, misguided sense that it look like micro led to useful approaches being locked up in Schwinger’s back room, and in all too many cases forgotten. When the crisis struck, it was amazing how many successful academics turned out not to know things every economist would have known in 1970, and indeed resurrected 1930-vintage fallacies in the belief that they were profound insights.
  • mainly I think it reflected the general unwillingness of human beings (a category that includes many though not necessarily all economists) to believe that so many people can be so wrong about something so big.
  • To normal human beings the study of international trade and that of international macroeconomics might sound like pretty much the same thing. In reality, however, the two fields used very different models, had very different intellectual cultures, and tended to look down on each other. Trade people tended to consider international macro people semi-charlatans, doing ad hoc stuff devoid of rigor. International macro people considered trade people boring, obsessed with proving theorems and offering little of real-world use.
  • does microeconomics really deserve its reputation of moral and intellectual superiority? No
  • Even before the rise of behavioral economics, any halfway self-aware economist realized that utility maximization – indeed, the very concept of utility — wasn’t a fact about the world; it was more of a thought experiment, whose conclusions should always have been stated in the subjunctive.
  • But, you say, we didn’t see the Great Recession coming. Well, what do you mean “we,” white man? OK, what’s true is that few economists realized that there was a huge housing bubble
  • True, a model doesn’t have to be perfect to provide hugely important insights. But here’s my question: where are the examples of microeconomic theory providing strong, counterintuitive, successful predictions on the same order as the success of IS-LM macroeconomics after 2008? Maybe there are some, but I can’t come up with any.
  • The point is not that micro theory is useless and we should stop doing it. But it doesn’t deserve to be seen as superior to macro modeling.
  • And the effort to make macro more and more like micro – to ground everything in rational behavior – has to be seen now as destructive. True, that effort did lead to some strong predictions: e.g., only unanticipated money should affect real output, transitory income changes shouldn’t affect consumer spending, government spending should crowd out private demand, etc. But all of those predictions have turned out to be wrong.
  • Kahneman and Tversky and Thaler and so on deserved all the honors they received for helping to document the specific ways in which utility maximization falls short, but even before their work we should never have expected perfect maximization to be a good description of reality.
  • But data never speak for themselves, for a couple of reasons. One, which is familiar, is that economists don’t get to do many experiments, and natural experiments are rare
  • The other problem is that even when we do get something like natural experiments, they often took place under economic regimes that aren’t relevant to current problems.
  • Both of these problems were extremely relevant in the years following the 2008 crisis.
  • you might be tempted to conclude that the empirical evidence is that monetary expansion is inflationary, indeed roughly one-for-one.
  • But the question, as the Fed embarked on quantitative easing, was what effect this would have on an economy at the zero lower bound. And while there were many historical examples of big monetary expansion, examples at the ZLB were much rarer – in fact, basically two: the U.S. in the 1930s and Japan in the early 2000s
  • These examples told a very different story: that expansion would not, in fact, be inflationary, that it would work out the way it did.
  • The point is that empirical evidence can only do certain things. It can certainly prove that your theory is wrong! And it can also make a theory much more persuasive in those cases where the theory makes surprising predictions, which the data bear out. But the data can never absolve you from the necessity of having theories.
  • Over this past decade, I’ve watched a number of economists try to argue from authority: I am a famous professor, therefore you should believe what I say. This never ends well. I’ve also seen a lot of nihilism: economists don’t know anything, and we should tear the field down and start over.
  • Obviously I differ with both views. Economists haven’t earned the right to be snooty and superior, especially if their reputation comes from the ability to do hard math: hard math has been remarkably little help lately, if ever.
  • On the other hand, economists do turn out to know quite a lot: they do have some extremely useful models, usually pretty simple ones, that have stood up well in the face of evidence and events. And they definitely shouldn’t defer to important and/or rich people on policy
  • Compare Janet Yellen’s macroeconomic track record with that of the multiple billionaires who warned that Bernanke would debase the dollar. Or take my favorite Business Week headline from 2010: “Krugman or [John] Paulson: Who You Gonna Bet On?” Um. The important thing is to be aware of what we do know, and why.
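The IS-LM framework mentioned in the excerpts above can be written down as two linear equations, one for goods-market equilibrium and one for money-market equilibrium, and solved for output and the interest rate. The sketch below is a toy version with made-up parameters, offered only to illustrate the kind of simple model being described; none of the numbers come from the column.

```python
# A toy linear IS-LM model, solved for equilibrium output Y and interest rate r.
# All parameter values are hypothetical.
#
#   IS:  Y = a + b*(Y - T) + (e - d*r) + G     (goods market)
#   LM:  M/P = k*Y - h*r                        (money market)

import numpy as np

a, b, T = 200.0, 0.6, 100.0   # consumption: C = a + b*(Y - T)
e, d, G = 300.0, 40.0, 250.0  # investment: I = e - d*r; government spending G
k, h = 0.5, 60.0              # money demand: L = k*Y - h*r
M_over_P = 400.0              # real money supply

# Rearranged: (1-b)*Y + d*r = a - b*T + e + G   and   k*Y - h*r = M/P
A = np.array([[1.0 - b,  d],
              [k,       -h]])
rhs = np.array([a - b*T + e + G, M_over_P])

Y, r = np.linalg.solve(A, rhs)
print(f"Equilibrium output Y = {Y:.1f}, interest rate r = {r:.2f}")
```

Shifting G or M/P and re-solving shows the textbook comparative statics (fiscal expansion raises both Y and r; monetary expansion raises Y and lowers r), which is the sense in which such a simple model generates the predictions discussed above.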
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • ...97 more annotations...
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
Javier E

Philosophy isn't dead yet | Raymond Tallis | Comment is free | The Guardian - 1 views

  • Fundamental physics is in a metaphysical mess and needs help. The attempt to reconcile its two big theories, general relativity and quantum mechanics, has stalled for nearly 40 years. Endeavours to unite them, such as string theory, are mathematically ingenious but incomprehensible even to many who work with them. This is well known.
  • A better-kept secret is that at the heart of quantum mechanics is a disturbing paradox – the so-called measurement problem, arising ultimately out of the Uncertainty Principle – which apparently demonstrates that the very measurements that have established and confirmed quantum theory should be impossible. Oxford philosopher of physics David Wallace has argued that this threatens to make quantum mechanics incoherent, a threat that can be remedied only by vastly multiplying worlds.
  • there is the failure of physics to accommodate conscious beings. The attempt to fit consciousness into the material world, usually by identifying it with activity in the brain, has failed dismally, if only because there is no way of accounting for the fact that certain nerve impulses are supposed to be conscious (of themselves or of the world) while the overwhelming majority (physically essentially the same) are not. In short, physics does not allow for the strange fact that matter reveals itself to material objects (such as physicists).
  • ...3 more annotations...
  • then there is the mishandling of time. The physicist Lee Smolin's recent book, Time Reborn, links the crisis in physics with its failure to acknowledge the fundamental reality of time. Physics is predisposed to lose time because its mathematical gaze freezes change. Tensed time, the difference between a remembered or regretted past and an anticipated or feared future, is particularly elusive. This worried Einstein: in a famous conversation, he mourned the fact that the present tense, "now", lay "just outside of the realm of science".
  • Recent attempts to explain how the universe came out of nothing, which rely on questionable notions such as spontaneous fluctuations in a quantum vacuum, the notion of gravity as negative energy, and the inexplicable free gift of the laws of nature waiting in the wings for the moment of creation, reveal conceptual confusion beneath mathematical sophistication. They demonstrate the urgent need for a radical re-examination of the invisible frameworks within which scientific investigations are conducted.
  • we should reflect on how a scientific image of the world that relies on up to 10 dimensions of space and rests on ideas, such as fundamental particles, that have neither identity nor location, connects with our everyday experience. This should open up larger questions, such as the extent to which mathematical portraits capture the reality of our world – and what we mean by "reality".
Javier E

Parents' Dilemma: When to Give Children Smartphones - WSJ - 0 views

  • Experience has already shown parents that ceding control over the devices has reshaped their children’s lives, allowing an outside influence on school work, friendships, recreation, sleep, romance, sex and free time.
  • Nearly 75% of teenagers had access to smartphones, concluded a 2015 study by Pew Research Center—unlocking the devices about 95 times a day on average.
  • They spent, on average, close to nine hours a day tethered to screens large and small.
  • ...15 more annotations...
  • The more screen time, the more revenue.
  • The goal of Facebook Inc., Alphabet Inc.’s Google, Snap Inc. and their peers is to create or host captivating experiences that keep users glued to their screens, whether for Instagram, YouTube, Snapchat or Facebook
  • Snapchat users 25 and younger, for example, were spending 40 minutes a day on the app, Chief Executive Evan Spiegel said in August. Alphabet boasted to investors recently that YouTube’s 1.5 billion users were spending an average 60 minutes a day on mobile.
  • Facebook’s stock slid 4.5% to close at $179 Friday after CEO Mark Zuckerberg announced plans Thursday to overhaul the Facebook news feed in a way that could reduce the time users spend.
  • Tech companies are working to instill viewing habits earlier than ever. The number of users of YouTube Kids is soaring. Facebook recently launched Messenger Kids, a messaging app for children as young as 6.
  • Ms. Ho’s 16-year-old son, Brian, is an Eagle Scout and chorister who at times finds it hard to break away from online videogames, even at 3 a.m. The teen recently told his mother he thinks he is addicted. Ms. Ho’s daughter, Samantha, 14, is also glued to her device, in conversations with friends.
  • “You think you’re buying a piece of technology,” Ms. Shepardson said. “Now it’s like oxygen to her.”
  • Psychologists say social media creates anxiety among children when they are away from their phones—what they call “fear of missing out,” whether on social plans, conversations or damaging gossip teens worry could be about themselves.
  • About half the teens in a survey of 620 families in 2016 said they felt addicted to their smartphones. Nearly 80% said they checked the phones more than hourly and felt the need to respond instantly to messages
  • Children set up Instagram accounts under pseudonyms that friends but not parents recognize. Some teens keep several of these so-called Finsta accounts without their parents knowing.
  • An app called Secret Calculator looks and works like an iPhone calculator but doubles as a private vault to hide files, photos and videos.
  • Mr. Zuckerberg told investors late last year that Facebook planned to boost video offerings, noting that live video generates 10 times as many user interactions. Netflix Inc. chief executive Reed Hastings, speaking in April about the addictiveness of its shows, said the company was “competing with sleep on the margins.”
  • Keeping children away from disturbing content, though, is easier than keeping them off their phones.
  • About 16% of the nation’s high-school students were bullied online in 2015, according to the U.S. Centers for Disease Control and Prevention. Children who are cyberbullied are three times more likely to contemplate suicide
  • Smartphones “bring the outside in,” said Ms. Ahn, whose husband works for a major tech company. “We want the family to be the center of gravity.”
Javier E

Seven Lessons In Economic Leadership From Ancient Egypt - 0 views

  • Although there are plenty of grounds for rage against the big banks, the challenge is to sort out which activities grow the real economy of goods and services, and which are essentially a zero-sum game of socially useless gambling.
  • The situation today is that the zero-sum games of the financial sector aren’t just a tiny sideshow. They have grown exponentially and have become almost the main game of the financial sector.
  • When finance becomes the end, not the means, then the result is what analyst Gautam Mukunda calls “excessive financialization” of the economy, as his excellent article “The Price of Wall Street Power” in the June 2014 issue of Harvard Business Review makes clear.
  • ...15 more annotations...
  • Quite apart from the “unbalanced power” of the financial sector, and the tendency of a super-sized financial sector to cause increasingly bad global financial crashes, excessive financialization leads to resources being misallocated. “In many of the financial sector’s segments that have grown fastest since deregulation—like investment banks—the transactions are primarily zero-sum.”
  • However, in times of rapid technological transformation like today, the economic priesthood’s role in protecting its own interests can become massively destabilizing.
  • Thus we know from the history of the last couple of hundred years that in times of rapid technological transformation, the financial sector tends to become disconnected from the real economy
  • This has occurred a number of times in the last few hundred years, including the Canal Mania (England—1790s), the Rail Mania (England—1840s), the Gilded Age (US—1880s to early 1900s), the Roaring Twenties (US—1920s) and the Big Banks of today.
  • Getting to safety is not made any easier by the fact that the modern economic priesthood—the managers of large firms and the banks—has, like its ancient Egyptian forebears, found ways to participate in the casino economy and benefit from “making money out of money”, even as the economy as a whole suffers. As Upton Sinclair wrote, “It is difficult to get a man to understand something, when his salary depends upon his not understanding it.”
  • Just as the ancient Egyptian economic priesthood clung to power as the economy stagnated, so today the economic priesthood shows no signs of relinquishing their gains or their power. The appetite and expectation of extraordinary returns is still there.
  • “Corporate chieftains rationally choose financial engineering—debt-financed share buybacks, for example—over capital investment in property, plants and equipment. Financial markets reward shareholder activism. Institutional investors extend their risk parameters to beat their benchmarks… But real economic growth—averaging just a bit above 2 percent for the fifth year in a row—remains sorely lacking.”
  • As a result, the economy remains in the “Great Stagnation” (Tyler Cowen), also known as the “Secular Stagnation” (Larry Summers). It is running on continuing life support from the Federal Reserve. Large enterprises still appear to be profitable. The appearance, though not the reality, of economic well-being has been sufficient to make the stock market soar.
  • Just as no change was possible in ancient Egyptian society so long as the economic priesthood colluded to preserve the status quo, so the excesses and prevarications of the Financial Sector will continue so long as the regulators remain its cheerleaders.
  • Just listen to the chair of the Securities and Exchange Commission (SEC), Mary Jo White, speaking to directors at Stanford University’s Rock Center for Corporate Governance. In her speech, she makes no secret of her view that the overall corporate arrangements are sound. The job of the SEC, as outlined in the speech, is to find the odd individual who might be doing something wrong. The idea that the large-scale activities of the major banks might be socially corrosive is not even alluded to.
  • Thus in times of transformational technology, there is a huge expansion of investment, driven by the financial sector. Wealthy investors begin to expect outsized returns and so there is over-investment. The resulting bubbles in due course burst
  • Just as in ancient Egypt, no progress was possible so long as the myths and rituals of the economic priesthood and their offerings to the gods were widely accepted as real indicators of what was going on, so today no progress is possible so long as the myths and rituals of the modern economic priesthood still have a pervasive hold on people’s minds.
  • In the modern economy, the myths and rituals of the economic priesthood are built on the notion that the purpose of a firm is to maximize shareholder value and the notion that if the share price is increasing, things are going well. These ideas are the intellectual underpinnings of the zero-sum activities of the financial sector for “making money out of money”, by whatever means possible
  • Like the myths and rituals of the priests of ancient Egypt, shareholder value theory is espoused with religious overtones. Shareholder value, which even Jack Welch has called “the dumbest idea in the world,” remains pervasive in business, even though it is responsible for massive offshoring of manufacturing, thereby destroying major segments of the US economy, undermining US capacity to compete in international markets and killing the economic recovery.
  • If instead society decides that the financial sector should concentrate on its socially important function of financing the real economy and providing financial security for an ever wider circle of citizens and enterprises, we could enjoy an era of growth and lasting prosperity.
Javier E

How Tech Can Turn Doctors Into Clerical Workers - The New York Times - 0 views

  • what I see in my colleague is disillusionment, and it has come too early, and I am seeing too much of it.
  • In America today, the patient in the hospital bed is just the icon, a place holder for the real patient who is not in the bed but in the computer. That virtual entity gets all our attention. Old-fashioned “bedside” rounds conducted by the attending physician too often take place nowhere near the bed but have become “card flip” rounds
  • My young colleague slumping in the chair in my office survived the student years, then three years of internship and residency and is now a full-time practitioner and teacher. The despair I hear comes from being the highest-paid clerical worker in the hospital: For every one hour we spend cumulatively with patients, studies have shown, we spend nearly two hours on our primitive Electronic Health Records, or “E.H.R.s,” and another hour or two during sacred personal time.
  • ...23 more annotations...
  • The living, breathing source of the data and images we juggle, meanwhile, is in the bed and left wondering: Where is everyone? What are they doing? Hello! It’s my body, you know
  • Our $3.4 trillion health care system is responsible for more than a quarter of a million deaths per year because of medical error, the rough equivalent of, say, a jumbo jet’s crashing every day.
  • I can get cash and account details all over America and beyond. Yet I can’t reliably get a patient record from across town, let alone from a hospital in the same state, even if both places use the same brand of E.H.R
  • the leading E.H.R.s were never built with any understanding of the rituals of care or the user experience of physicians or nurses. A clinician will make roughly 4,000 keyboard clicks during a busy 10-hour emergency-room shift
  • In the process, our daily progress notes have become bloated cut-and-paste monsters that are inaccurate and hard to wade through. A half-page, handwritten progress note of the paper era might in a few lines tell you what a physician really thought
  • so much of the E.H.R., but particularly the physical exam it encodes, is a marvel of fiction, because we humans don’t want to leave a check box empty or leave gaps in a template.
  • For a study, my colleagues and I at Stanford solicited anecdotes from physicians nationwide about patients for whom an oversight in the exam (a “miss”) had resulted in real consequences, like diagnostic delay, radiation exposure, therapeutic or surgical misadventure, even death. They were the sorts of things that would leave no trace in the E.H.R. because the recorded exam always seems complete — and yet the omission would be glaring and memorable to other physicians involved in the subsequent care. We got more than 200 such anecdotes.
  • The reason for these errors? Most of them resulted from exams that simply weren’t done as claimed. “Food poisoning” was diagnosed because the strangulated hernia in the groin was overlooked, or patients were sent to the catheterization lab for chest pain because no one saw the shingles rash on the left chest.
  • I worry that such mistakes come because we’ve gotten trapped in the bunker of machine medicine. It is a preventable kind of failure
  • How we salivated at the idea of searchable records, of being able to graph fever trends, or white blood counts, or share records at a keystroke with another institution — “interoperability”
  • The seriously ill patient has entered another kingdom, an alternate universe, a place and a process that is frightening, infantilizing; that patient’s greatest need is both scientific state-of-the-art knowledge and genuine caring from another human being. Caring is expressed in listening, in the time-honored ritual of the skilled bedside exam — reading the body — in touching and looking at where it hurts and ultimately in localizing the disease for patients not on a screen, not on an image, not on a biopsy report, but on their bodies.
  • What if the computer gave the nurse the big picture of who he was both medically and as a person?
  • a professor at M.I.T. whose current interest in biomedical engineering is “bedside informatics,” marvels at the fact that in an I.C.U., a blizzard of monitors from disparate manufacturers display EKG, heart rate, respiratory rate, oxygen saturation, blood pressure, temperature and more, and yet none of this is pulled together, summarized and synthesized anywhere for the clinical staff to use
  • What these monitors do exceedingly well is sound alarms, an average of one alarm every eight minutes, or more than 180 per patient per day. What is our most common response to an alarm? We look for the button to silence the nuisance because, unlike those in a Boeing cockpit, say, our alarms are rarely diagnosing genuine danger.
  • By some estimates, more than 50 percent of physicians in the United States have at least one symptom of burnout, defined as a syndrome of emotional exhaustion, cynicism and decreased efficacy at work
  • It is on the increase, up by 9 percent from 2011 to 2014 in one national study. This is clearly not an individual problem but a systemic one, a 4,000-key-clicks-a-day problem.
  • The E.H.R. is only part of the issue: Other factors include rapid patient turnover, decreased autonomy, merging hospital systems, an aging population, the increasing medical complexity of patients. Even if the E.H.R. is not the sole cause of what ails us, believe me, it has become the symbol of burnout.
  • burnout is one of the largest predictors of physician attrition from the work force. The total cost of recruiting a physician can be nearly $90,000, but the lost revenue per physician who leaves is between $500,000 and $1 million, even more in high-paying specialties.
  • I hold out hope that artificial intelligence and machine-learning algorithms will transform our experience, particularly if natural-language processing and video technology allow us to capture what is actually said and done in the exam room.
  • as with any lab test, what A.I. will provide is at best a recommendation that a physician using clinical judgment must decide how to apply.
  • True clinical judgment is more than addressing the avalanche of blood work, imaging and lab tests; it is about using human skills to understand where the patient is in the trajectory of a life and the disease, what the nature of the patient’s family and social circumstances is and how much they want done.
  • Much of that is a result of poorly coordinated care, poor communication, patients falling through the cracks, knowledge not being transferred and so on, but some part of it is surely from failing to listen to the story and diminishing skill in reading the body as a text.
  • As he was nearing death, Avedis Donabedian, a guru of health care metrics, was asked by an interviewer about the commercialization of health care. “The secret of quality,” he replied, “is love.”
Javier E

Skeptics read Jordan Peterson's '12 Rules for Life' - The Washington Post - 0 views

  • I do think that women tend to spend more time thinking about their lives, planning for the future, sort of sorting themselves out — and know how to do so. So they don’t need Peterson’s basic life advice as much as men do.
  • Emba: These days, young men seem far more lost than young women. And we’re seeing the results of that all over the place — men disappearing into video games, or pornography, or dropping out of the workforce, or succumbing to depression and despair. So maybe they need this more.
  • Rubin made it sound as though Peterson held some *hidden knowledge,* but there’s no secret to “stand up straight and make sure the people you keep around you pull you up rather than drag you down.”
  • ...12 more annotations...
  • I actually think Peterson was right to observe that it’s remarkable how many students at the universities where they tested some of his theories hadn’t been told these things. Though I thought it was interesting that he seemed to think that teaching this kind of thing was a job for the educational system rather than the parents
  • I think perhaps we’re both lucky in that though our backgrounds are different, we both come from relatively stable families with parents and surrounding adults who inculcated these “rules” intrinsically, from our youth on. So the Peterson gospel doesn’t feel new to us.
  • The fact is that there are whole swaths of our generation who are advantaged by already knowing this information about how to make your life better, and another whole swath that is being left behind, character- and life-formation-wise, because they don't. And they are left to rely on Jordan Peterson.
  • He is convinced of the importance and significance of these stories, these words — and religion, and its significance. At one point he stated that he didn’t have a materialist view of the world, but actually a “deeply religious” one.
  • One thing that’s definitely central to the book is telling people (particularly men) that life is hard, and you need to get it together.
  • largely the message you come away with is that if you don’t like the way things are going, it’s your fault and your fault alone. And that’s an easier message to believe when you’re a white male and systemic obstacles aren’t really a thing you run into.
  • Jordan Peterson professes not to be religious, but he is. His book is built on what he describes as archetypal myths from different cultures, but leans *very* heavily on Judeo-Christian ones especially — Cain and Abel and the stories of Jesus’s life, from his temptation in the desert to his death and resurrection.
  • This tendency was even more pronounced in his live lecture. Basically every line, every piece of advice he gave, was supported by a Bible verse. At one point, he quoted the gospel of Matthew: “Knock and the door will be opened to you” — and said, “This is how life works, ACTUALLY” — basically glaring at the crowd and daring them to disagree.
  • Just in the week or so I was reading “12 Rules,” I had several men my age come up to me on buses or in coffee shops and strike up conversations with me about Peterson — the one thing they all talked about right away was how the book had a lot of “hard truths” that they needed to hear
  • He’s not keeping great company. But I think his personal work and statements are generally benign, in many cases actually helpful, in that they urge young people to seek out a better-structured and more meaningful life.
  • I agree it’s inaccurate to label him as alt-right, though that is a low bar to clear. Frankly I see him more as a mainstream conservative. I think part of the reason people get this wrong is that there’s a big gap between what boosted his fame and what the central thrust of his book is
  • I think “traditionalist” is probably the best label for him — both because his views are traditionalist and because his worldview is so dependent on traditions (or at least what he sees as traditions).
Javier E

Google Has Picked an Answer for You-Too Bad It's Often Wrong - WSJ - 1 views

  • Google became the world’s go-to source of information by ranking billions of links from millions of sources. Now, for many queries, the internet giant is presenting itself as the authority on truth by promoting a single search result as the answer.
  • The promoted answers, called featured snippets, are outlined in boxes above other results and presented in larger type, often with images. Google’s voice assistant sometimes reads them aloud
  • They give Google’s secret algorithms even greater power to shape public opinion, given that surveys show people consider search engines their most-trusted source of information, over traditional media or social media.
  • ...7 more annotations...
  • Google’s featured answers are feeding a raging global debate about the ability of Silicon Valley companies to influence society. Google and other internet giants are under intensifying scrutiny over the power of their products and their vulnerability to bias or manipulation.
  • Featured snippets are “generated algorithmically and [are] a reflection of what people are searching for and what’s available on the web,” the company said in an April blog post. “This can sometimes lead to results that are unexpected, inaccurate or offensive.”
  • Google, a unit of Alphabet Inc., handles almost all internet searches. Featured snippets appear on about 40% of results for searches formed as questions
  • An algorithm chooses featured snippets from websites in part by how closely they appear to satisfy a user’s question, factoring in Google’s measure of a source’s authority and its ranking in the search results (a toy illustration of this scoring idea appears after this list).
  • By answering questions directly, Google aims to make the search engine more appealing to users and the advertisers that chase them. The answers’ real estate is so attractive that there is a budding marketing industry around tailoring content so it becomes a featured snippet.
  • as Google expanded the use of featured snippets, it has relied more often on less authoritative sources, such as purveyors of top-10 lists and gossipy clickbait.
  • “For them to wield their algorithm like this is very worrisome,” she said. “This is how people learn about the world.”
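The annotation above describes snippet selection only in broad strokes. As a purely illustrative sketch (not Google's actual algorithm, whose details are secret), the toy Python below combines a crude relevance estimate, a hypothetical source-authority weight, and the page's search rank into a single score, then surfaces one candidate only if it clears a threshold. Every name, weight, and threshold here is an invented assumption.

```python
# Toy illustration only; NOT Google's actual algorithm (which is secret).
# It mimics the idea described in the annotation above: pick one "featured"
# candidate by combining (a) how well a passage appears to answer the query,
# (b) a hypothetical source-authority weight, and (c) the page's position in
# the ordinary ranked results. All weights and thresholds are invented.
from dataclasses import dataclass


@dataclass
class Candidate:
    url: str
    passage: str
    authority: float  # assumed 0..1 source-authority score
    rank: int         # 1-based position in the normal search results


def relevance(query: str, passage: str) -> float:
    """Crude lexical overlap between query terms and the passage."""
    q_terms = set(query.lower().split())
    p_terms = set(passage.lower().split())
    return len(q_terms & p_terms) / max(len(q_terms), 1)


def featured_snippet(query: str, candidates: list[Candidate]) -> Candidate | None:
    """Return the best-scoring candidate, or None if nothing clears a threshold."""
    def score(c: Candidate) -> float:
        rank_bonus = 1.0 / c.rank  # earlier-ranked pages count for more
        return 0.6 * relevance(query, c.passage) + 0.3 * c.authority + 0.1 * rank_bonus

    best = max(candidates, key=score, default=None)
    return best if best is not None and score(best) > 0.4 else None


if __name__ == "__main__":
    results = [
        Candidate("https://example.org/a",
                  "A featured snippet answers the question directly.", 0.8, 1),
        Candidate("https://example.net/b",
                  "Top 10 gossipy claims about search engines.", 0.3, 2),
    ]
    pick = featured_snippet("what is a featured snippet", results)
    print(pick.url if pick else "no snippet shown")
```

However such a scorer is actually tuned, the design choice the article questions stays the same: a single winner is promoted above all other results, so small weighting decisions determine what counts as "the answer" for millions of searches.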