TOK Friends: Group items tagged distractions


Javier E

From Sports Illustrated, the Latest Body Part for Women to Fix - NYTimes.com - 0 views

  • At 44, I am old enough to remember when reconstruction was something you read about in history class, when a muffin top was something delicious you ate at the bakery, a six-pack was how you bought your beer, camel toe was something one might glimpse at the zoo, a Brazilian was someone from the largest country in South America and terms like thigh gap and bikini bridge would be met with blank looks.
  • Now, each year brings a new term for an unruly bit of body that women are expected to subdue through diet and exercise.
  • Girls’ and women’s lives matter. Their safety and health and their rights matter. Whether every inch of them looks like a magazine cover? That, my sisters, does not matter at all.
  • there’s no profit in leaving things as they are. Show me a body part, I’ll show you someone who’s making money by telling women that theirs looks wrong and they need to fix it. Tone it, work it out, tan it, bleach it, tattoo it, lipo it, remove all the hair, lose every bit of jiggle.
  • As a graphic designer and Photoshop teacher, I also have to note that Photoshop is used HEAVILY in these kinds of publications. Even on women with incredibly beautiful (by pop culture standards) bodies. It's quite sad because the imagery we're expected to live up to (or approximate) by cultural standards, is illustration. It's not even real. My boyfriend and I had a big laugh over a Playboy cover a few months ago where the Photoshopping was so extreme (thigh gap and butt cheek) it was anatomically impossible and looked ridiculous. I work in the industry.. I know what the Liquify filter and the Spot Healing Brush can do!
  • We may harp on gender inequality while pursuing stupid fetishes. Well into our middle age, we still try to forcefully wriggle into a size 2 pair of jeans. We foolishly spend tonnes of money on fake (these guys should be sued for false advertising) age-defying, anti-wrinkle creams. Why do we have to have our fuzz and bush disappear while the men have forests on their chests, abdomens, butts, arms and legs? For that we have only ourselves to blame. We just cannot get out of this mindset of being objectified. And we pass on this foolishness to our daughters and granddaughters. They get trapped, never satisfied with what they see in the mirror. Don't expect the men to change anytime soon. They will always maintain the status quo. It is for us women to get out of this rut. We have to 'snatch' gender equality. It will never be handed to us. PERIOD
  • I spent years dieting and exercising to look good--or really to not look bad. I knew the calories (and probably still do) in thousands of foods. How I regret the time I spent on that and the boyfriends who cared about that. And how much more I had to give to the world. With unprecedented economic injustice, ecosystems collapsing, war breaking out everywhere, nations going under water, people starving in refugee camps, the keys to life, behavior, and disease being unlocked in the biological sciences . . . this is what we think women should spend their time worrying about? Talk about a poverty of ambition. No more. Won't even look at these demeaning magazines when I get my hair cut. If that's what a woman cares about, I try to tell her to stop wasting her time. If that's what a man cares about, he is a waste of my time. What a depressing way to distract women from achieving more in this world. Really wish I'd known this at 12.
  • we believe we're all competing against one another to procreate and participate in evolution. So women (and men) compete ferociously, and body image is a subset of all that. Then there's Lamarckian evolutionary theory and epigenetics: http://en.wikipedia.org/wiki/Lamarckism and http://en.wikipedia.org/wiki/Epigenetics. Bottom line is that we can't stop this train any more easily than we can stop the Anthropocene's climate change. Human beings are tempted. Sometimes we win the battle, other times we give in to vanity, hedonism, and ego. This is all a subset of much larger forces at play. Men and women make choices and act within that environment. Deal with it.
Javier E

Vitamins Hide the Low Quality of Our Food - NYTimes.com - 0 views

  • we fail to notice that food marketers use synthetic vitamins to sell unhealthful products. Not only have we become dependent on these synthetic vitamins to keep ourselves safe from deficiencies, but the eating habits they encourage are having disastrous consequences on our health.
  • vitamins spread from the labs of scientists to the offices of food marketers, and began to take on a life of their own.
  • Nutritionists are correct when they tell us that most of us don’t need to be taking multivitamins. But that’s only because multiple vitamins have already been added to our food.
  • Given the poor quality of the typical American diet, this fortification is far from superfluous. In fact, for products like milk and flour, where fortification and enrichment have occurred for so long that they’ve become invisible, it would be almost irresponsible not to add synthetic vitamins.
  • synthetic vitamins are as essential to food companies as they are to us. To be successful in today’s market, food manufacturers must create products that can be easily transported over long distances and stored for extended periods.
  • They also need to be sure that their products offer some nutritional value so that customers don’t have to go elsewhere to meet their vitamin needs. But the very processing that’s necessary to create long shelf lives destroys vitamins, among other important nutrients. It’s nearly impossible to create foods that can sit for months in a supermarket that are also naturally vitamin-rich.
  • Today, it would be easy to blame food marketers for using vitamins to deceive us into buying their products. But our blindness is largely our own fault.
  • we’ve entered into a complicit agreement with them: They depend on us to buy their products, and we depend on the synthetic vitamins they add to those products to support eating habits that might otherwise leave us deficient
  • extra vitamins do not protect us from the long-term “diseases of civilization” that are currently ravaging our country, including obesity, heart disease and Type 2 diabetes — many of which are strongly associated with diet.
  • natural foods contain potentially protective substances such as phytochemicals and polyunsaturated fat that also are affected by processing, but that are not usually replaced. If these turn out to be as important as many researchers suspect, then our exclusive focus on vitamins could mean we’re protecting ourselves against the wrong dangers. It’s as if we’re taking out earthquake insurance policies in an area more at risk for floods.
  • And adding back vitamins after the fact ignores the issue of synergy: how nutrients work naturally as opposed to when they are isolated. A 2011 study on broccoli, for example, found that giving subjects fresh broccoli florets led them to absorb and metabolize seven times more of the anticancer compounds known as glucosinolates, present in broccoli and other cruciferous vegetables
  • And yet we refuse to change our eating habits in the ways that would actually protect us, which would require refocusing our diets on minimally processed foods that are naturally nutrient-rich.
  • The popularity of dietary supplements and vitamin-enhanced processed “health” foods means that even those of us who try to do right by our health are often getting it wrong.
  • we mustn’t let it distract us from an even more fundamental question: how we’ve allowed the word “vitamin” to become synonymous with “health.”
sgardner35

A High School Is Making All Of Its Female Students Get Their Prom Dresses 'Pre-Approved... - 0 views

  • While students at Delone Catholic High School in Pennsylvania will have a prom to attend, they will not necessarily
  • be allowed to wear what they want. The school has a new policy—instituted this year—that requires “all young women” who plan on attending the prom—whether they attend the school or are the guest of a student—to “submit a photo of the gown that will be worn to the prom for pre-approval.”
  • The petition reads, “Our children will not undergo scrutiny of prom gowns based on outdated, unrealistic expectations and rules implemented at such short notice.”
  • “distracting” and “unacceptable” and the #ClothingHasNoGender campaign that was launched by his friends in response.
Javier E

The Eight-Second Attention Span - The New York Times - 3 views

  • A survey of Canadian media consumption by Microsoft concluded that the average attention span had fallen to eight seconds, down from 12 in the year 2000. We now have a shorter attention span than goldfish, the study found.
  • “The true scarce commodity” of the near future, he said, will be “human attention.”
  • there seems little doubt that our devices have rewired our brains. We think in McNugget time. The trash flows, unfiltered, along with the relevant stuff, in an eternal stream.
  • I can no longer wait in a grocery store line, or linger for a traffic light, or even pause long enough to let a bagel pop from the toaster, without reflexively reaching for my smartphone.
  • You see it in our politics, with fear-mongering slogans replacing anything that requires sustained thought. And the collapse of a fact-based democracy, where, for example, 60 percent of Trump supporters believe Obama was born in another country, has to be a byproduct of the pick-and-choose news from the buffet line of our screens.
  • I’ve found a pair of antidotes, very old school, for my shrinking attention span.
  • You plant something in the cold, wet soil of the fall
  • The second is deep reading
Javier E

How Meditation Changes the Brain and Body - The New York Times - 0 views

  • a study published in Biological Psychiatry brings scientific thoroughness to mindfulness meditation and for the first time shows that, unlike a placebo, it can change the brains of ordinary people and potentially improve their health.
  • One difficulty of investigating meditation has been the placebo problem. In rigorous studies, some participants receive treatment while others get a placebo: They believe they are getting the same treatment when they are not. But people can usually tell if they are meditating. Dr. Creswell, working with scientists from a number of other universities, managed to fake mindfulness.
  • Half the subjects were then taught formal mindfulness meditation at a residential retreat center; the rest completed a kind of sham mindfulness meditation that was focused on relaxation and distracting oneself from worries and stress.
  • follow-up brain scans showed differences in only those who underwent mindfulness meditation. There was more activity, or communication, among the portions of their brains that process stress-related reactions and other areas related to focus and calm. Four months later, those who had practiced mindfulness showed much lower levels in their blood of a marker of unhealthy inflammation than the relaxation group, even though few were still meditating.
  • Dr. Creswell and his colleagues believe that the changes in the brain contributed to the subsequent reduction in inflammation, although precisely how remains unknown.
  • When it comes to how much mindfulness is needed to improve health, Dr. Creswell says, “we still have no idea about the ideal dose.”
kushnerha

The Next Genocide - The New York Times - 1 views

  • But sadly, the anxieties of our own era could once again give rise to scapegoats and imagined enemies, while contemporary environmental stresses could encourage new variations on Hitler’s ideas, especially in countries anxious about feeding their growing populations or maintaining a rising standard of living.
  • The quest for German domination was premised on the denial of science. Hitler’s alternative to science was the idea of Lebensraum.
    • kushnerha
       
      "Lebensraum linked a war of extermination to the improvement of lifestyle." Additionally, "The pursuit of peace and plenty through science, he claimed in "Mein Kampf," was a Jewish plot to distract Germans from the necessity of war."
  • Climate change has also brought uncertainties about food supply back to the center of great power politics.
  • China today, like Germany before the war, is an industrial power incapable of feeding its population from its own territory
    • kushnerha
       
      And "could make China's population susceptible to a revival of ideas like Lebensraum."
  • The risk is that a developed country able to project military power could, like Hitler’s Germany, fall into ecological panic, and take drastic steps to protect its existing standard of living.
  • United States has done more than any other nation to bring about the next ecological panic, yet it is the only country where climate science is still resisted by certain political and business elites. These deniers tend to present the empirical findings of scientists as a conspiracy and question the validity of science — an intellectual stance that is uncomfortably close to Hitler’s.
  • The Kremlin, which is economically dependent on the export of hydrocarbons to Europe, is now seeking to make gas deals with individual European states one by one in order to weaken European unity and expand its own influence.
  • Putin waxes nostalgic for the 1930s, while Russian nationalists blame gays, cosmopolitans and Jews for antiwar sentiment. None of this bodes well for Europe’s future
  • The Nazi scenario of 1941 will not reappear in precisely the same form, but several of its causal elements have already begun to assemble.
  • not difficult to imagine ethnic mass murder in Africa
    • kushnerha
       
      also no longer difficult to imagine the "triumph of a violent totalitarian strain of Islamism in the parched Middle East," a "Chinese play for resources in Africa or Russia or Eastern Europe that involves removing the people already living there," and a "growing global ecological panic if America abandons climate science or the European Union falls apart"
  • Denying science imperils the future by summoning the ghosts of the past.
    • kushnerha
       
      Americans must make the "crucial choice between science and ideology"
Javier E

The Tech Industry's Psychological War on Kids - Member Feature Stories - Medium - 0 views

  • she cried, “They took my f***ing phone!” Attempting to engage Kelly in conversation, I asked her what she liked about her phone and social media. “They make me happy,” she replied.
  • Even though they were loving and involved parents, Kelly’s mom couldn’t help feeling that they’d failed their daughter and must have done something terribly wrong that led to her problems.
  • My practice as a child and adolescent psychologist is filled with families like Kelly’s. These parents say their kids’ extreme overuse of phones, video games, and social media is the most difficult parenting issue they face — and, in many cases, is tearing the family apart.
  • What none of these parents understand is that their children’s and teens’ destructive obsession with technology is the predictable consequence of a virtually unrecognized merger between the tech industry and psychology.
  • Dr. B.J. Fogg, is a psychologist and the father of persuasive technology, a discipline in which digital machines and apps — including smartphones, social media, and video games — are configured to alter human thoughts and behaviors. As the lab’s website boldly proclaims: “Machines designed to change humans.”
  • These parents have no idea that lurking behind their kids’ screens and phones are a multitude of psychologists, neuroscientists, and social science experts who use their knowledge of psychological vulnerabilities to devise products that capture kids’ attention for the sake of industry profit.
  • psychology — a discipline that we associate with healing — is now being used as a weapon against children.
  • This alliance pairs the consumer tech industry’s immense wealth with the most sophisticated psychological research, making it possible to develop social media, video games, and phones with drug-like power to seduce young users.
  • Likewise, social media companies use persuasive design to prey on the age-appropriate desire of preteen and teen kids, especially girls, to be socially successful. This drive is built into our DNA, since real-world relational skills have fostered human evolution.
  • Called “the millionaire maker,” Fogg has groomed former students who have used his methods to develop technologies that now consume kids’ lives. As he recently touted on his personal website, “My students often do groundbreaking projects, and they continue having impact in the real world after they leave Stanford… For example, Instagram has influenced the behavior of over 800 million people. The co-founder was a student of mine.”
  • Persuasive technology (also called persuasive design) works by deliberately creating digital environments that users feel fulfill their basic human drives — to be social or obtain goals — better than real-world alternatives.
  • Kids spend countless hours in social media and video game environments in pursuit of likes, “friends,” game points, and levels — because it’s stimulating, they believe that this makes them happy and successful, and they find it easier than doing the difficult but developmentally important activities of childhood.
  • While persuasion techniques work well on adults, they are particularly effective at influencing the still-maturing child and teen brain.
  • “Video games, better than anything else in our culture, deliver rewards to people, especially teenage boys,” says Fogg. “Teenage boys are wired to seek competency. To master our world and get better at stuff. Video games, in dishing out rewards, can convey to people that their competency is growing, you can get better at something second by second.”
  • it’s persuasive design that’s helped convince this generation of boys they are gaining “competency” by spending countless hours on game sites, when the sad reality is they are locked away in their rooms gaming, ignoring school, and not developing the real-world competencies that colleges and employers demand.
  • Persuasive technologies work because of their apparent triggering of the release of dopamine, a powerful neurotransmitter involved in reward, attention, and addiction.
  • As she says, “If you don’t get 100 ‘likes,’ you make other people share it so you get 100…. Or else you just get upset. Everyone wants to get the most ‘likes.’ It’s like a popularity contest.”
  • there are costs to Casey’s phone obsession, noting that the “girl’s phone, be it Facebook, Instagram or iMessage, is constantly pulling her away from her homework, sleep, or conversations with her family.
  • Casey says she wishes she could put her phone down. But she can’t. “I’ll wake up in the morning and go on Facebook just… because,” she says. “It’s not like I want to or I don’t. I just go on it. I’m, like, forced to. I don’t know why. I need to. Facebook takes up my whole life.”
  • B.J. Fogg may not be a household name, but Fortune Magazine calls him a “New Guru You Should Know,” and his research is driving a worldwide legion of user experience (UX) designers who utilize and expand upon his models of persuasive design.
  • “No one has perhaps been as influential on the current generation of user experience (UX) designers as Stanford researcher B.J. Fogg.”
  • the core of UX research is about using psychology to take advantage of our human vulnerabilities.
  • As Fogg is quoted in Kosner’s Forbes article, “Facebook, Twitter, Google, you name it, these companies have been using computers to influence our behavior.” However, the driving force behind behavior change isn’t computers. “The missing link isn’t the technology, it’s psychology,” says Fogg.
  • UX researchers not only follow Fogg’s design model, but also his apparent tendency to overlook the broader implications of persuasive design. They focus on the task at hand, building digital machines and apps that better demand users’ attention, compel users to return again and again, and grow businesses’ bottom line.
  • the “Fogg Behavior Model” is a well-tested method to change behavior and, in its simplified form, involves three primary factors: motivation, ability, and triggers.
  • “We can now create machines that can change what people think and what people do, and the machines can do that autonomously.”
  • Regarding ability, Fogg suggests that digital products should be made so that users don’t have to “think hard.” Hence, social networks are designed for ease of use
  • Finally, Fogg says that potential users need to be triggered to use a site. This is accomplished by a myriad of digital tricks, including the sending of incessant notifications
  • moral questions about the impact of turning persuasive techniques on children and teens are not being asked. For example, should the fear of social rejection be used to compel kids to compulsively use social media? Is it okay to lure kids away from school tasks that demand a strong mental effort so they can spend their lives on social networks or playing video games that don’t make them think much at all?
  • Describing how his formula is effective at getting people to use a social network, the psychologist says in an academic paper that a key motivator is users’ desire for “social acceptance,” although he says an even more powerful motivator is the desire “to avoid being socially rejected.”
  • the startup Dopamine Labs boasts about its use of persuasive techniques to increase profits: “Connect your app to our Persuasive AI [Artificial Intelligence] and lift your engagement and revenue up to 30% by giving your users our perfect bursts of dopamine,” and “A burst of Dopamine doesn’t just feel good: it’s proven to re-wire user behavior and habits.”
  • Ramsay Brown, the founder of Dopamine Labs, says in a KQED Science article, “We have now developed a rigorous technology of the human mind, and that is both exciting and terrifying. We have the ability to twiddle some knobs in a machine learning dashboard we build, and around the world hundreds of thousands of people are going to quietly change their behavior in ways that, unbeknownst to them, feel second-nature but are really by design.”
  • Programmers call this “brain hacking,” as it compels users to spend more time on sites even though they mistakenly believe it’s strictly due to their own conscious choices.
  • Banks of computers employ AI to “learn” which of a countless number of persuasive design elements will keep users hooked
  • A persuasion profile of a particular user’s unique vulnerabilities is developed in real time and exploited to keep users on the site and make them return again and again for longer periods of time. This drives up profits for consumer internet companies whose revenue is based on how much their products are used.
  • “The leaders of Internet companies face an interesting, if also morally questionable, imperative: either they hijack neuroscience to gain market share and make large profits, or they let competitors do that and run away with the market.”
  • Social media and video game companies believe they are compelled to use persuasive technology in the arms race for attention, profits, and survival.
  • Children’s well-being is not part of the decision calculus.
  • one breakthrough occurred in 2017 when Facebook documents were leaked to The Australian. The internal report crafted by Facebook executives showed the social network boasting to advertisers that by monitoring posts, interactions, and photos in real time, the network is able to track when teens feel “insecure,” “worthless,” “stressed,” “useless” and a “failure.”
  • The report also bragged about Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.”
  • These design techniques provide tech corporations a window into kids’ hearts and minds to measure their particular vulnerabilities, which can then be used to control their behavior as consumers. This isn’t some strange future… this is now.
  • The official tech industry line is that persuasive technologies are used to make products more engaging and enjoyable. But the revelations of industry insiders can reveal darker motives.
  • Revealing the hard science behind persuasive technology, Hopson says, “This is not to say that players are the same as rats, but that there are general rules of learning which apply equally to both.”
  • After penning the paper, Hopson was hired by Microsoft, where he helped lead the development of the Xbox Live, Microsoft’s online gaming system
  • “If game designers are going to pull a person away from every other voluntary social activity or hobby or pastime, they’re going to have to engage that person at a very deep level in every possible way they can.”
  • This is the dominant effect of persuasive design today: building video games and social media products so compelling that they pull users away from the real world to spend their lives in for-profit domains.
  • Persuasive technologies are reshaping childhood, luring kids away from family and schoolwork to spend more and more of their lives sitting before screens and phones.
  • “Since we’ve figured to some extent how these pieces of the brain that handle addiction are working, people have figured out how to juice them further and how to bake that information into apps.”
  • Today, persuasive design is likely distracting adults from driving safely, productive work, and engaging with their own children — all matters which need urgent attention
  • Still, because the child and adolescent brain is more easily controlled than the adult mind, the use of persuasive design is having a much more hurtful impact on kids.
  • But to engage in a pursuit at the expense of important real-world activities is a core element of addiction.
  • younger U.S. children now spend 5 ½ hours each day with entertainment technologies, including video games, social media, and online videos.
  • Even more, the average teen now spends an incredible 8 hours each day playing with screens and phones
  • U.S. kids only spend 16 minutes each day using the computer at home for school.
  • Quietly, using screens and phones for entertainment has become the dominant activity of childhood.
  • Younger kids spend more time engaging with entertainment screens than they do in school
  • teens spend even more time playing with screens and phones than they do sleeping
  • kids are so taken with their phones and other devices that they have turned their backs to the world around them.
  • many children are missing out on real-life engagement with family and school — the two cornerstones of childhood that lead them to grow up happy and successful
  • persuasive technologies are pulling kids into often toxic digital environments
  • A too frequent experience for many is being cyberbullied, which increases their risk of skipping school and considering suicide.
  • And there is growing recognition of the negative impact of FOMO, or the fear of missing out, as kids spend their social media lives watching a parade of peers who look to be having a great time without them, feeding their feelings of loneliness and being less than.
  • The combined effects of the displacement of vital childhood activities and exposure to unhealthy online environments is wrecking a generation.
  • as the typical age when kids get their first smartphone has fallen to 10, it’s no surprise to see serious psychiatric problems — once the domain of teens — now enveloping young kids
  • Self-inflicted injuries, such as cutting, that are serious enough to require treatment in an emergency room, have increased dramatically in 10- to 14-year-old girls, up 19% per year since 2009.
  • While girls are pulled onto smartphones and social media, boys are more likely to be seduced into the world of video gaming, often at the expense of a focus on school
  • it’s no surprise to see this generation of boys struggling to make it to college: a full 57% of college admissions are granted to young women compared with only 43% to young men.
  • Economists working with the National Bureau of Economic Research recently demonstrated how many young U.S. men are choosing to play video games rather than join the workforce.
  • The destructive forces of psychology deployed by the tech industry are making a greater impact on kids than the positive uses of psychology by mental health providers and child advocates. Put plainly, the science of psychology is hurting kids more than helping them.
  • Hope for this wired generation has seemed dim until recently, when a surprising group has come forward to criticize the tech industry’s use of psychological manipulation: tech executives
  • Tristan Harris, formerly a design ethicist at Google, has led the way by unmasking the industry’s use of persuasive design. Interviewed in The Economist’s 1843 magazine, he says, “The job of these companies is to hook people, and they do that by hijacking our psychological vulnerabilities.”
  • Marc Benioff, CEO of the cloud computing company Salesforce, is one of the voices calling for the regulation of social media companies because of their potential to addict children. He says that just as the cigarette industry has been regulated, so too should social media companies. “I think that, for sure, technology has addictive qualities that we have to address, and that product designers are working to make those products more addictive, and we need to rein that back as much as possible,”
  • “If there’s an unfair advantage or things that are out there that are not understood by parents, then the government’s got to come forward and illuminate that.”
  • Since millions of parents, for example the parents of my patient Kelly, have absolutely no idea that devices are used to hijack their children’s minds and lives, regulation of such practices is the right thing to do.
  • Another improbable group to speak out on behalf of children is tech investors.
  • How has the consumer tech industry responded to these calls for change? By going even lower.
  • Facebook recently launched Messenger Kids, a social media app that will reach kids as young as five years old. Suggestive that harmful persuasive design is now homing in on very young children is the declaration of Messenger Kids Art Director, Shiu Pei Luu, “We want to help foster communication [on Facebook] and make that the most exciting thing you want to be doing.”
  • the American Psychological Association (APA) — which is tasked with protecting children and families from harmful psychological practices — has been essentially silent on the matter
  • APA Ethical Standards require the profession to make efforts to correct the “misuse” of the work of psychologists, which would include the application of B.J. Fogg’s persuasive technologies to influence children against their best interests
  • Manipulating children for profit without their own or parents’ consent, and driving kids to spend more time on devices that contribute to emotional and academic problems is the embodiment of unethical psychological practice.
  • “Never before in history have basically 50 mostly men, mostly 20–35, mostly white engineer designer types within 50 miles of where we are right now [Silicon Valley], had control of what a billion people think and do.”
  • Some may argue that it’s the parents’ responsibility to protect their children from tech industry deception. However, parents have no idea of the powerful forces aligned against them, nor do they know how technologies are developed with drug-like effects to capture kids’ minds
  • Others will claim that nothing should be done because the intention behind persuasive design is to build better products, not manipulate kids
  • similar circumstances exist in the cigarette industry, as tobacco companies intend to profit from the sale of their product, not to hurt children. Nonetheless, because cigarettes and persuasive design predictably harm children, actions should be taken to protect kids from their effects.
  • in a 1998 academic paper, Fogg describes what should happen if things go wrong, saying, if persuasive technologies are “deemed harmful or questionable in some regard, a researcher should then either take social action or advocate that others do so.”
  • I suggest turning to President John F. Kennedy’s prescient guidance: He said that technology “has no conscience of its own. Whether it will become a force for good or ill depends on man.”
  • The APA should begin by demanding that the tech industry’s behavioral manipulation techniques be brought out of the shadows and exposed to the light of public awareness
  • Changes should be made in the APA’s Ethics Code to specifically prevent psychologists from manipulating children using digital machines, especially if such influence is known to pose risks to their well-being.
  • Moreover, the APA should follow its Ethical Standards by making strong efforts to correct the misuse of psychological persuasion by the tech industry and by user experience designers outside the field of psychology.
  • It should join with tech executives who are demanding that persuasive design in kids’ tech products be regulated
  • The APA also should make its powerful voice heard amongst the growing chorus calling out tech companies that intentionally exploit children’s vulnerabilities.
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • ...97 more annotations...
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
Javier E

Franklin Foer Has A Score To Settle With Facebook - The Forward - 0 views

  • he argues that we are pressed into conformity. By constantly interacting with these companies’ products, we have allowed them to intrude upon our inner lives, destroy contemplation and manipulate our behaviors.
  • I think it’s impossible to think metaphysically, impossible to think about the things that go beyond the world of appearance, if your attention is constantly being directed and if you’re constantly being distracted. So I think that contemplation is the necessary ingredient that makes a spiritual life possible.
  • privacy is something that everybody claims to want, but nobody articulates why. Privacy is beyond just having somebody get a peek through your window. The threat isn’t just that your space is being crowded and violated. What Brandeis was worried about was that idea that the fear of somebody looking over your shoulder as you think would start to affect your thought — that as soon as we know we have an audience, we start to bend our opinions to try to please our audience.
Javier E

After Surgery in Germany, I Wanted Vicodin, Not Herbal Tea - The New York Times - 0 views

  • I took two ibuprofens that first day. In hindsight, I didn’t need them, but I felt like I should take something. What I really needed was patience pills, and a few distractions
  • Come to think of it, I bring a lot of medicine with me from the United States, all over the counter, all intended to take away discomfort. The German doctors were telling me that being uncomfortable is O.K.
  • It reminded me of the poster in my doctor’s waiting room, the one informing us that herbal tea is the first remedy to try when we have a cold. The first remedy I try is the decongestants I bring with me from the United States. I can’t find those in Germany, nor can I find the children’s cough medicine that makes my child drowsy. I also import that.
  • ...6 more annotations...
  • “I do have another question,” I said. “Stool softeners — certainly, you prescribe those? That’s pretty standard with anesthesia throughout the modern world, I believe.”
  • “You won’t need those,” he answered in his calm voice. “Your body will function just fine. Just give it a day or two. Drink a cup of coffee, slowly
  • “Pain is a part of life. We cannot eliminate it nor do we want to. The pain will guide you. You will know when to rest more; you will know when you are healing.
  • If I give you Vicodin, you will no longer feel the pain, yes, but you will no longer know what your body is telling you. You might overexert yourself because you are no longer feeling the pain signals. All you need is rest.
  • And please be careful with ibuprofen. It’s not good for your kidneys. Only take it if you must. Your body will heal itself with rest.
  • I didn’t mention that I use ibuprofen like candy. Why else do they come in such jumbo sizes at American warehouse stores?
Javier E

Don't be fooled. Giuliani has a strategy. - The Washington Post - 0 views

  • There is madness in Rudolph W. Giuliani’s incoherence on behalf of President Trump, but there is also method. He’s following the Trump playbook: Confuse, distract, provoke and flood the zone with factoids and truthiness until nobody can be sure what’s real and what’s not.
  • Giuliani is obfuscating, not clarifying. He’s making it harder to know even what the president claims, let alone what the truth might be. As a legal strategy, this would be insane. But it’s really a political strategy.
  • Congress poses the only serious threat to Trump, in the form of impeachment. If the president’s loyal base can be flimflammed into thinking this is all a big witch hunt, Republican lawmakers will stay in line. At least for now.
Javier E

The teaching of economics gets an overdue overhaul - 0 views

  • Change, however, has been slow to reach the university economics curriculum. Many institutions still pump students through introductory courses untainted by recent economic history or the market shortcomings it illuminates.
  • A few plucky reformers are working to correct that: a grand and overdue idea. Overhauling the way economics is taught ought to produce students more able to understand the modern world. Even better, it should improve economics itself.
  • Yet the standard curriculum is hardly calibrated to impart these lessons. Most introductory texts begin with the simplest of models. Workers are paid according to their productivity; trade never makes anyone worse off; and government interventions in the market always generate a “deadweight loss”. Practising economists know that these statements are more true at some times than at others
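That textbook “deadweight loss” is simple enough to compute directly. A toy numeric sketch (my illustration, not the article's; the curves and the tax rate are invented), using linear demand P = a - b*Q and linear supply P = c*Q:

```python
def deadweight_loss(demand_intercept, demand_slope, supply_slope, tax):
    """Deadweight loss of a per-unit tax with linear demand P = a - b*Q
    and linear supply P = c*Q: the classic surplus triangle, 1/2 * tax * dQ."""
    q_free = demand_intercept / (demand_slope + supply_slope)          # a - b*Q = c*Q
    q_taxed = (demand_intercept - tax) / (demand_slope + supply_slope)  # wedge of size `tax`
    return 0.5 * tax * (q_free - q_taxed)

# Demand P = 10 - Q, supply P = Q, and a 2-unit tax: quantity traded
# falls from 5 to 4, and the lost-surplus triangle has area 0.5 * 2 * 1.
print(deadweight_loss(10, 1, 1, 2))  # -> 1.0
```

The reform curricula's point survives the arithmetic: the formula is exact only under the assumptions baked into it (perfect competition, linear curves, no externalities), which is precisely what introductory courses tend to leave unsaid.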
  • ...17 more annotations...
  • Economics teaches that incentives matter and trade-offs are unavoidable. It shows how naive attempts to fix social problems, from poverty to climate change, can have unintended consequences. Introductory economics, at its best, enables people to see the unstated assumptions and hidden costs behind the rosy promises of politicians and businessmen.
  • A Chilean professor, Oscar Landerretche, worked with other economists to design a new curriculum. He, Sam Bowles, of the Santa Fe Institute, Wendy Carlin, of University College London (UCL), and Margaret Stevens, of Oxford University, painstakingly knitted contributions from economists around the world into a text that is free, online and offers interactive charts and videos of star economists. That text is the basis of economics modules taught by a small but growing number of instructors.
  • Students pay $300 or more for textbooks explaining that in competitive markets the price of a good should fall to the cost of producing an additional unit, and unsurprisingly regurgitate the expected answers. A study of 170 economics modules taught at seven universities found that marks in exams favoured the ability to “operate a model” over proofs of independent judgment.
  • “The Economy”, as the book is economically titled, covers the usual subjects, but in a very different way. It begins with the biggest of big pictures, explaining how capitalism and industrialisation transformed the world, inviting students to contemplate how it arrived at where it is today.
  • That could mean, eventually, a broader array of perspectives within economics departments, bigger and bolder research questions—and fewer profession-shaking traumas in future.
  • Messy complications, from environmental damage to inequality, are placed firmly in the foreground.
  • It explains cost curves, as other introductory texts do, but in the context of the Industrial Revolution, thus exposing students to debates about why industrialisation kicked off when and where it did.
  • Thomas Malthus’s ideas are used to teach students the uses and limitations of economic models, combining technical instruction with a valuable lesson from the history of economic thought.
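Malthus's model is small enough to run as a few lines of code, which is what makes it a good teaching device for both the uses and the limits of modelling. A sketch in that spirit (all parameters are invented for illustration): population compounds geometrically while food grows arithmetically, so food per head must eventually fall below subsistence:

```python
def malthus(pop=100.0, food=150.0, growth=0.03, food_gain=5.0, years=500):
    """Geometric population growth against arithmetic growth in food supply;
    returns the first year in which food per head drops below subsistence (1.0),
    or None if it never does within the horizon."""
    for year in range(1, years + 1):
        pop *= 1 + growth      # population compounds
        food += food_gain      # food grows by a fixed increment
        if food / pop < 1.0:
            return year
    return None

print(malthus())  # the crisis year implied by these made-up parameters
```

The limitation the curriculum wants students to see is equally visible here: the conclusion is driven entirely by the assumption that `food_gain` is constant, which technological change falsified.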
  • “The Economy” does not dumb down economics; it uses maths readily, keeping students engaged through the topicality of the material. Quite early on, students have lessons in the weirdness in economics—from game theory to power dynamics within firms—that makes the subject fascinating and useful but are skimmed over in most introductory courses.
  • Homa Zarghamee, also at Barnard, appreciates having to spend less time “unteaching”, ie, explaining to students why the perfect-competition result they learned does not actually hold in most cases. A student who does not finish the course will not be left with a misleading idea of economics, she notes.
  • But the all-important exceptions are taught quite late in the curriculum—or, often, only in more advanced courses taken by those pursuing an economics degree.
  • Far from an unintended result of ill-conceived policies, she argues, the roughly 4m deaths from hunger in 1932 and 1933 were part of a deliberate campaign by Josef Stalin and the Bolshevik leadership to crush Ukrainian national aspirations, literally starving actual or potential bearers of those aspirations into submission to the Soviet order
  • The politics in this case was the Sovietisation of Ukraine; the means was starvation. Food supply was not mismanaged by Utopian dreamers. It was weaponised.
  • “Red Famine” presents a Bolshevik government so hell-bent on extracting wealth and controlling labour that it was willing to confiscate the last remaining grain from hungry peasants (mostly but not exclusively in Ukraine) and then block them from fleeing famine-afflicted areas to search for food.
  • Stalin was not only aware of the ensuing mass death (amounting to roughly 13% of Ukraine’s population). He actively sought to suppress knowledge of it (including banning the publication of census data), so as not to distract from the campaign to collectivise Soviet agriculture and extend the Communist Party’s reach into the countryside—a campaign Ms Applebaum calls a “revolution...more profound and more shocking than the original Bolshevik revolution itself”
  • The book’s most powerful passages describe the moral degradation that resulted from sustained hunger, as family solidarity and village traditions of hospitality withered in the face of the overwhelming desire to eat. Under a state of siege by Soviet authorities, hunger-crazed peasants took to consuming grass, animal hides, manure and occasionally each other. People became indifferent to the sight of corpses lying in streets, and eventually to their own demise.
  • While stressing Stalin’s goal of crushing Ukrainian nationalism, moreover, Ms Applebaum passes over a subtler truth. For along with its efforts to root out “bourgeois” nationalisms, the Kremlin relentlessly promoted a Soviet version of Ukrainian identity, as it did with most other ethnic minorities. Eight decades on, that legacy has done even more to shape today’s Ukraine than the Holodomor.
Javier E

The Coming Software Apocalypse - The Atlantic - 1 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
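The failure pattern the FCC report describes — a counter reaches a fixed ceiling, and the rejection path performs no automated corrective action — can be sketched in a few lines. Everything here (the class, the names, the threshold value) is invented for illustration; the real Intrado system was vastly more complex:

```python
MAX_CALLS = 40_000_000  # a fixed ceiling chosen when the system was deployed

class Router:
    """Hypothetical 911 call router (names and threshold are invented)."""
    def __init__(self):
        self.calls_routed = 0

    def route(self, call):
        if self.calls_routed >= MAX_CALLS:
            # The software does exactly what it was told to do: stop
            # assigning calls.  Nothing on this code path switches over
            # to the backup router -- that step was left to a human.
            return "rejected"
        self.calls_routed += 1
        return "routed"

r = Router()
print(r.route("first call"))   # -> routed
r.calls_routed = MAX_CALLS     # simulate the counter hitting its ceiling
print(r.route("next call"))    # -> rejected, with no corrective action taken
```

The bug, in the article's sense, is not in any line shown: each line does what it was told. The failure lives in what was never written.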
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • ...52 more annotations...
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • What made programming so difficult was that it required you to think like a computer.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated top of his class at the California Institute of Technology for electrical engineering,
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • software experts spent 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it
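The bit-flip scenario is easy to demonstrate in miniature. A toy sketch (not Toyota's code; the variable and the choice of bit are invented): toggling a single high-order bit in memory turns a small, safe value into a large one:

```python
def flip_bit(value, bit):
    """Return `value` with a single bit toggled, as a hardware fault might."""
    return value ^ (1 << bit)

throttle = 3                       # a small 'throttle position' value
corrupted = flip_bit(throttle, 7)  # one flipped bit -- cosmic ray, bad RAM...
print(throttle, corrupted)         # -> 3 131
```

This is why Barr's team faulted the missing mitigations: in safety-critical code, such corruption is normally caught with redundant copies or checksums, and a fail-safe that survives it.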
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
  • In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop.
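The whiteboard-style diagram described above can be sketched as an explicit transition table in ordinary code. This is an illustrative toy, not output from Bantegnie’s tool; the state and event names are invented here:

```python
# Toy state machine for the elevator rule described above.
# States and events are hypothetical names, not from any real tool.
TRANSITIONS = {
    "door_open":   {"close_door": "door_closed"},
    "door_closed": {"open_door": "door_open", "start_moving": "moving"},
    "moving":      {"stop": "door_closed"},
}

def step(state, event):
    """Advance one transition; illegal events fail loudly instead of silently."""
    allowed = TRANSITIONS[state]
    if event not in allowed:
        raise ValueError(f"event '{event}' not allowed in state '{state}'")
    return allowed[event]

# The only path to "moving" goes through "door_closed":
state = "door_open"
state = step(state, "close_door")    # now "door_closed"
state = step(state, "start_moving")  # now "moving"
```

Because every legal transition sits in one table, the rule that the elevator can only move with the door closed is visible at a glance, which is the point of drawing the model instead of burying the logic in scattered conditionals.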
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • This practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months was spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
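Newcombe’s point about scale can be checked with back-of-envelope arithmetic; the figures below are illustrative stand-ins, not numbers from his paper:

```python
# Illustrative arithmetic for the quoted claim: at web scale, "extremely
# rare" per-request events recur constantly. Numbers are made up for scale.
requests_per_second = 1_000_000
p_per_request = 1e-9            # a "one in a billion" combination of events

events_per_day = requests_per_second * 86_400 * p_per_request
seconds_between_events = 1.0 / (requests_per_second * p_per_request)

print(f"~{events_per_day:.1f} occurrences per day")        # ~86.4 per day
print(f"one every ~{seconds_between_events:.0f} seconds")  # roughly 17 minutes
```

Human intuition treats a one-in-a-billion event as never; at a million requests per second it happens dozens of times a day.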
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy
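What “exhaustively” means here can be shown in miniature. The sketch below is a toy breadth-first model checker in Python, not TLA+ or its TLC checker, and the lost-update counter is an invented example: two threads each perform a non-atomic increment, and the checker visits every interleaving:

```python
# Miniature illustration of exhaustive checking (a toy, NOT TLA+/TLC).
# Two threads each do a non-atomic increment: read x, then write read + 1.
# State = (pc1, pc2, x, r1, r2), where pc* is each thread's program counter.
from collections import deque

def next_states(s):
    pc1, pc2, x, r1, r2 = s
    succ = []
    if pc1 == "read":  succ.append(("write", pc2, x, x, r2))       # thread 1 reads x
    if pc1 == "write": succ.append(("done", pc2, r1 + 1, r1, r2))  # thread 1 writes
    if pc2 == "read":  succ.append((pc1, "write", x, r1, x))       # thread 2 reads x
    if pc2 == "write": succ.append((pc1, "done", r2 + 1, r1, r2))  # thread 2 writes
    return succ

def check(init, invariant):
    """Breadth-first walk of EVERY reachable state; return a violation or None."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if not invariant(s):
            return s
        for t in next_states(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return None

# Invariant: once both threads finish, x should be 2. The checker finds
# the interleaving where both threads read 0 and the final x is 1.
bad = check(("read", "read", 0, 0, 0),
            lambda s: not (s[0] == s[1] == "done") or s[2] == 2)
```

A test suite might run for years without hitting the unlucky interleaving; the exhaustive search provably visits it, which is what the excerpt means by testing “not just thoroughly, but exhaustively.”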
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
anonymous

Opinion | I Survived 18 Years in Solitary Confinement - The New York Times - 0 views

  • I Survived 18 Years in Solitary Confinement
  • Mr. Manuel is an author, activist and poet. When he was 14 years old, he was sentenced to life in prison with no parole and spent 18 years in solitary confinement.
  • Imagine living alone in a room the size of a freight elevator for almost two decades.
  • ...33 more annotations...
  • As a 15-year-old, I was condemned to long-term solitary confinement in the Florida prison system, which ultimately lasted for 18 consecutive years
  • From age 15 to 33.
  • For 18 years I didn’t have a window in my room to distract myself from the intensity of my confinement
  • I wasn’t permitted to talk to my fellow prisoners or even to myself. I didn’t have healthy, nutritious food; I was given just enough to not die
  • These circumstances made me think about how I ended up in solitary confinement.
  • United Nations standards on the treatment of prisoners prohibit solitary confinement for more than 15 days, declaring it “cruel, inhuman or degrading.”
  • For this I was arrested and charged as an adult with armed robbery and attempted murder.
  • My court-appointed lawyer advised me to plead guilty, telling me that the maximum sentence would be 15 years. So I did. But my sentence wasn’t 15 years — it was life imprisonment without the possibility of parole.
  • But a year and a half later, at age 15, I was put back into solitary confinement after being written up for a few minor infractions.
  • Florida has different levels of solitary confinement; I spent the majority of that time in one of the most restrictive
  • I was finally released from prison in 2016 thanks to my lawyer, Bryan Stevenson
  • Researchers have long concluded that solitary confinement causes post-traumatic stress disorder and impairs prisoners’ ability to adjust to society long after they leave their cell.
  • In the summer of 1990, shortly after finishing seventh grade, I was directed by a few older kids to commit a robbery. During the botched attempt, I shot a woman. She suffered serious injuries to her jaw and mouth but survived. It was reckless and foolish on my part, the act of a 13-year-old in crisis, and I’m simply grateful no one died.
  • More aggressive change is needed in state prison systems
  • In 2016, the Obama administration banned juvenile solitary confinement in federal prisons, and a handful of states have advanced similar reforms for both children and adults.
  • Yet the practice, even for minors, is still common in the United States, and efforts to end it have been spotty
  • Because solitary confinement is hidden from public view and the broader prison population, egregious abuses are left unchecked
  • I watched a corrections officer spray a blind prisoner in the face with chemicals simply because he was standing by the door of his cell as a female nurse walked by. The prisoner later told me that to justify the spraying, the officer claimed the prisoner masturbated in front of the nurse.
  • I also witnessed the human consequences of the harshness of solitary firsthand: Some people would resort to cutting their stomachs open with a razor and sticking a plastic spork inside their intestines just so they could spend a week in the comfort of a hospital room with a television
  • On occasion, I purposely overdosed on Tylenol so that I could spend a night in the hospital. For even one night, it was worth the pain.
  • Another time, I was told I’d be switching dorms, and I politely asked to remain where I was because a guard in the new area had been overly aggressive with me. In response, four or five officers handcuffed me, picked me up by my feet and shoulders, and marched with me to my new dorm — using my head to ram the four steel doors on the way there.
  • The punishments were wholly disproportionate to the infractions. Before I knew it, months in solitary bled into years, years into almost two decades.
  • As a child, I survived these conditions by conjuring up stories of what I’d do when I was finally released. My mind was the only place I found freedom from my reality
  • the only place I could play basketball with my brother or video games with my friends, and eat my mother’s warm cherry pie on the porch.
  • No child should have to use their imagination this way — to survive.
  • It is difficult to know the exact number of children in solitary confinement today. The Liman Center at Yale Law School estimated that 61,000 Americans (adults and children) were in solitary confinement in the fall of 2017
  • No matter the count, I witnessed too many people lose their minds while isolated. They’d involuntarily cross a line and simply never return to sanity. Perhaps they didn’t want to. Staying in their mind was the better, safer, more humane option.
  • Solitary confinement is cruel and unusual punishment, something prohibited by the Eighth Amendment, yet prisons continue to practice it.
  • When it comes to children, elimination is the only moral option. And if ending solitary confinement for adults isn’t politically viable, public officials should at least limit the length of confinement to 15 days or fewer, in compliance with the U.N. standards
  • As I try to reintegrate into society, small things often awaken painful memories from solitary. Sometimes relationships feel constraining. It’s difficult to maintain the attention span required for a rigid 9-to-5 job. At first, crossing the street and seeing cars and bikes racing toward me felt terrifying.
  • I will face PTSD and challenges big and small for the rest of my life because of what I was subjected to.
  • And some things I never will — most of all, that this country can treat human beings, especially children, as cruelly as I was treated.
  • Sadly, solitary confinement for juveniles is still permissible in many states. But we have the power to change that — to ensure that the harrowing injustice I suffered as a young boy never happens to another child in America.
  •  
    A very eye-opening article and story told by a victim about young children facing solitary confinement.
clairemann

Read This If You Wake Up Hungry In The Middle Of The Night | HuffPost Life - 1 views

  • Anyone who’s woken up at 3 a.m. with their stomach growling has probably wondered, what gives? Is it even normal to be this hungry at this hour?
  • Hunger levels, which are regulated in part by our circadian rhythms, generally rise throughout the day, are highest in the evening and decline throughout the night and into the morning. So if you’re getting up in the dead of night with major hunger pangs, experts say it’s worth investigating.
  • Waking up for a midnight snack every once in a while probably isn’t anything to be concerned about. But if nocturnal noshing has become a more regular pattern — and an annoying one at that — then you may be looking for answers.
  • ...5 more annotations...
  • “If you’re dieting or restricting in any way, like by skipping meals or over-exercising, then you may not be eating enough calories,” said registered dietitian Alissa Rumsey, the author of “Unapologetic Eating.”
  • “If someone is not obtaining enough sleep on a regular basis, this could result in increased appetite throughout the day and night.”
  • Sometimes it’s not physical hunger that’s rousing you mid-slumber; it may be your body’s response to a period of stress in your life. Once you’re awake, you might be turning to food to soothe yourself.
  • “[Emotional eating] can certainly slow down your thought processes — if your body is digesting food, then maybe you’re distracted from the anxiety that’s keeping you awake,”
  • Those with NES consume a significant portion of their daily calories after dinner (with little desire to eat earlier in the day), may deal with mood issues that worsen in the evening, have trouble falling or staying asleep and believe they must eat something in order to fall back asleep at night. It’s estimated that 1.5% of the population may suffer from night eating syndrome, though the actual figure could be higher than that.
Javier E

Revisiting the prophetic work of Neil Postman about the media » MercatorNet - 1 views

  • The NYU professor was surely prophetic. “Our own tribe is undergoing a vast and trembling shift from the magic of writing to the magic of electronics,” he cautioned.
  • “We face the rapid dissolution of the assumptions of an education organised around the slow-moving printed word, and the equally rapid emergence of a new education based on the speed-of-light electronic message.”
  • What Postman perceived in television has been dramatically intensified by smartphones and social media
  • ...31 more annotations...
  • Postman also recognised that technology was changing our mental processes and social habits.
  • Today corporations like Google and Amazon collect data on Internet users based on their browsing history, the things they purchase, and the apps they use
  • Yet all citizens are undergoing this same transformation. Our digital devices undermine social interactions by isolating us,
  • “Years from now, it will be noticed that the massive collection and speed-of-light retrieval of data have been of great value to large-scale organisations, but have solved very little of importance to most people, and have created at least as many problems for them as they may have solved.”
  • “Television has by its power to control the time, attention, and cognitive habits of our youth gained the power to control their education.”
  • As a student of Canadian philosopher Marshall McLuhan, Postman believed that the medium of information was critical to understanding its social and political effects. Every technology has its own agenda. Postman worried that the very nature of television undermined American democratic institutions.
  • Many Americans tuned in to the presidential debate looking for something substantial and meaty
  • It was simply another manifestation of the incoherence and vitriol of cable news
  • “When, in short, a people become an audience and their public business a vaudeville act, then a nation finds itself at risk; culture-death is a clear possibility,” warned Postman.
  • Technology Is Never Neutral
  • As for new problems, we have increased addictions (technological and pornographic); increased loneliness, anxiety, and distraction; and inhibited social and intellectual maturation.
  • The average length of a shot on network television is only 3.5 seconds, so that the eye never rests, always has something new to see. Moreover, television offers viewers a variety of subject matter, requires minimal skills to comprehend it, and is largely aimed at emotional gratification.
  • This is far truer of the Internet and social media, where more than a third of Americans, and almost half of young people, now get their news.
  • with smartphones now ubiquitous, the Internet has replaced television as the “background radiation of the social and intellectual universe.”
  • Is There Any Solution?
  • Reading news or commentary in print, in contrast, requires concentration, patience, and careful reflection, virtues that our digital age vitiates.
  • Politics as Entertainment
  • “How television stages the world becomes the model for how the world is properly to be staged,” observed Postman. In the case of politics, television fashions public discourse into yet another form of entertainment
  • In America, the fundamental metaphor for political discourse is the television commercial. The television commercial is not at all about the character of products to be consumed. … They tell everything about the fears, fancies, and dreams of those who might buy them.
  • The television commercial has oriented business away from making products of value and towards making consumers feel valuable, which means that the business of business has now become pseudo-therapy. The consumer is a patient assured by psycho-dramas.
  • Such is the case with the way politics is “advertised” to different subsets of the American electorate. The “consumer,” depending on his political leanings, may be manipulated by fears of either an impending white-nationalist, fascist dictatorship, or a radical, woke socialist takeover.
  • This paradigm is aggravated by the hypersiloing of media content, which explains why Americans who read left-leaning media view the Proud Boys as a legitimate, existential threat to national civil order, while those who read right-leaning media believe the real immediate enemies of our nation are Antifa
  • Regardless of whether either of these groups represents a real public menace, the loss of any national consensus over what constitutes objective news means that Americans effectively talk past one another: they use the Proud Boys or Antifa as rhetorical barbs to smear their ideological opponents as extremists.
  • Yet these technologies are far from neutral. They are, rather, “equipped with a program for social change.
  • Postman’s analysis of technology is prophetic and profound. He warned of the trivialising of our media, defined by “broken time and broken attention,” in which “facts push other facts into and then out of consciousness at speeds that neither permit nor require evaluation.” He warned of “a neighborhood of strangers and pointless quantity.”
  • does Postman offer any solutions to this seemingly uncontrollable technological juggernaut?
  • Postman’s suggestions regarding education are certainly relevant. He unequivocally condemned education that mimics entertainment, and urged a return to learning that is hierarchical, meaning that it first gives students a foundation of essential knowledge before teaching “critical thinking.”
  • Postman also argued that education must avoid a lowest-common-denominator approach in favor of complexity and the perplexing: the latter method elicits in the student a desire to make sense of what perplexes him.
  • Finally, Postman promoted education of vigorous exposition, logic, and rhetoric, all being necessary for citizenship
  • Another course of action is to understand what these media, by their very nature, do to us and to public discourse.
  • We must, as Postman exhorts us, “demystify the data” and dominate our technology, lest it dominate us. We must identify and resist how television, social media, and smartphones manipulate our emotions, infantilise us, and weaken our ability to rebuild what 2020 has ravaged.
caelengrubb

Believing in Overcoming Cognitive Biases | Journal of Ethics | American Medical Associa... - 0 views

  • Cognitive biases contribute significantly to diagnostic and treatment errors
  • A 2016 review of their roles in decision making lists 4 domains of concern for physicians: gathering evidence, interpreting evidence, taking action, and evaluating decisions
  • Confirmation bias is the selective gathering and interpretation of evidence consistent with current beliefs and the neglect of evidence that contradicts them.
  • ...14 more annotations...
  • It can occur when a physician refuses to consider alternative diagnoses once an initial diagnosis has been established, despite contradicting data, such as lab results. This bias leads physicians to see what they want to see
  • Anchoring bias is closely related to confirmation bias and comes into play when interpreting evidence. It refers to physicians’ practices of prioritizing information and data that support their initial impressions, even when first impressions are wrong
  • When physicians move from deliberation to action, they are sometimes swayed by emotional reactions rather than rational deliberation about risks and benefits. This is called the affect heuristic, and, while heuristics can often serve as efficient approaches to problem solving, they can sometimes lead to bias
  • Further down the treatment pathway, outcomes bias can come into play. This bias refers to the practice of believing that good or bad results are always attributable to prior decisions, even when there is no valid reason to do so
  • The dual-process theory, a cognitive model of reasoning, can be particularly relevant in matters of clinical decision making
  • This theory is based on the argument that we use 2 different cognitive systems, intuitive and analytical, when reasoning. The former is quick and uses information that is readily available; the latter is slower and more deliberate.
  • Consideration should be given to the difficulty physicians face in employing analytical thinking exclusively. Beyond constraints of time, information, and resources, many physicians are also likely to be sleep deprived, work in an environment full of distractions, and be required to respond quickly while managing heavy cognitive loads
  • Simply increasing physicians’ familiarity with the many types of cognitive biases—and how to avoid them—may be one of the best strategies to decrease bias-related errors
  • The same review suggests that cognitive forcing strategies may also have some success in improving diagnostic outcomes
  • Afterwards, the resident physicians were debriefed on both case-specific details and on cognitive forcing strategies, interviewed, and asked to complete a written survey. The results suggested that resident physicians further along in their training (ie, postgraduate year three) gained more awareness of cognitive strategies than resident physicians in earlier years of training, suggesting that this tool could be more useful after a certain level of training has been completed
  • A 2013 study examined the effect of a 3-part, 1-year curriculum on recognition and knowledge of cognitive biases and debiasing strategies in second-year residents
  • Cognitive biases in clinical practice have a significant impact on care, often in negative ways. They sometimes manifest as physicians seeing what they want to see rather than what is actually there. Or they come into play when physicians make snap decisions and then prioritize evidence that supports their conclusions, as opposed to drawing conclusions from evidence
  • Fortunately, cognitive psychology provides insight into how to prevent biases. Guided reflection and cognitive forcing strategies deflect bias through close examination of our own thinking processes.
  • During medical education and consistently thereafter, we must provide physicians with a full appreciation of the cost of biases and the potential benefits of combatting them.
caelengrubb

Cognitive Bias and Public Health Policy During the COVID-19 Pandemic | Critical Care Me... - 0 views

  • As the coronavirus disease 2019 (COVID-19) pandemic abates in many countries worldwide, and a new normal phase arrives, critically assessing policy responses to this public health crisis may promote better preparedness for the next wave or the next pandemic
  • A key lesson is revealed by one of the earliest and most sizeable US federal responses to the pandemic: the investment of $3 billion to build more ventilators. These extra ventilators, even had they been needed, would likely have done little to improve population survival because of the high mortality among patients with COVID-19 who require mechanical ventilation and diversion of clinicians away from more health-promoting endeavors.
  • Why are so many people distressed at the possibility that a patient in plain view—such as a person presenting to an emergency department with severe respiratory distress—would be denied an attempt at rescue because of a ventilator shortfall, but do not mount similarly impassioned concerns regarding failures to implement earlier, more aggressive physical distancing, testing, and contact tracing policies that would have saved far more lives?
  • ...12 more annotations...
  • These cognitive errors, which distract leaders from optimal policy making and citizens from taking steps to promote their own and others’ interests, cannot merely be ascribed to repudiations of science.
  • The first error that thwarts effective policy making during crises stems from what economists have called the “identifiable victim effect.” Humans respond more aggressively to threats to identifiable lives, ie, those that an individual can easily imagine being their own or belonging to people they care about (such as family members) or care for (such as a clinician’s patients) than to the hidden, “statistical” deaths reported in accounts of the population-level tolls of the crisis
  • Yet such views represent a second reason for the broad endorsement of policies that prioritize saving visible, immediately jeopardized lives: that humans are imbued with a strong and neurally mediated3 tendency to predict outcomes that are systematically more optimistic than observed outcomes
  • A third driver of misguided policy responses is that humans are present biased, ie, people tend to prefer immediate benefits to even larger benefits in the future.
  • Even if the tendency to prioritize visibly affected individuals could be resisted, many people would still place greater value on saving a life today than a life tomorrow.
  • Similar psychology helps explain the reluctance of many nations to limit refrigeration and air conditioning, forgo fuel-inefficient transportation, and take other near-term steps to reduce the future effects of climate change
  • The fourth contributing factor is that virtually everyone is subject to omission bias, which involves the tendency to prefer that a harm occur by failure to take action rather than as direct consequence of the actions that are taken
  • Although those who set policies for rationing ventilators and other scarce therapies do not intend the deaths of those who receive insufficient priority for these treatments, such policies nevertheless prevent clinicians from taking all possible steps to save certain lives.
  • An important goal of governance is to mitigate the effects of these and other biases on public policy and to effectively communicate the reasons for difficult decisions to the public. However, health systems’ routine use of the wartime terminology of “standing up” and “standing down” intensive care units illustrates problematic messaging aimed at the need to address immediate danger
  • Second, had governments, health systems, and clinicians better understood the “identifiable victim effect,” they may have realized that promoting flattening the curve as a way to reduce pressure on hospitals and health care workers would be less effective than promoting early restaurant and retail store closures by saying “The lives you save when you close your doors include your own.”
  • Third, these leaders’ routine use of terms such as “nonpharmaceutical interventions”9 portrays public health responses negatively by labeling them according to what they are not. Instead, support for heavily funding contact tracing could have been generated by communicating such efforts as “lifesaving.
  • Fourth, although errors of human cognition are challenging to surmount, policy making, even in a crisis, occurs over a sufficient period to be meaningfully improved by deliberate efforts to counter untoward biases