TOK Friends: Group items tagged "medium"

Javier E

It's Time for a Real Code of Ethics in Teaching - Noah Berlatsky - The Atlantic - 3 views

  • A defendant in the Atlanta Public Schools case turns herself in at the Fulton County Jail on April 2. (David Goldman/AP) Earlier this week at The Atlantic, Emily Richmond asked whether high-stakes testing caused the Atlanta schools cheating scandal. The answer, I would argue, is yes... just not in the way you might think. Tests don't cause unethical behavior. But they did cause the Atlanta cheating scandal, and they are doing damage to the teaching profession. The argument that tests do not cause unethical behavior is fairly straightforward, and has been articulated by a number of writers. Jonathan Chait quite correctly points out that unethical behavior occurs in virtually all professions -- and that it occurs particularly when there are clear incentives to succeed. Incentivizing any field increases the impetus to cheat. Suppose journalism worked the way teaching traditionally had. You get hired at a newspaper, and your advancement and pay are dictated almost entirely by your years on the job, with almost no chance of either becoming a star or of getting fired for incompetence. Then imagine journalists changed that and instituted the current system, where you can get really successful if your bosses like you or be fired if they don't. You could look around and see scandal after scandal -- phone hacking! Jayson Blair! NBC's exploding truck! Janet Cooke! Stephen Glass! -- that could plausibly be attributed to this frightening new world in which journalists had an incentive to cheat in order to get ahead. It holds true of any field. If Major League Baseball instituted tenure, and maybe used tee-ball rules where you can't keep score and everybody gets a chance to hit, it could stamp out steroid use. Students have been cheating on tests forever -- massive, systematic cheating, you could say. Why? Because they have an incentive to do well. Give teachers and administrators an incentive for their students to do well, and more of them will cheat. For Chait, then, teaching has just been made more like journalism or baseball; it has gone from an incentiveless occupation to one with incentives.
  • Chait refers to violations of journalistic ethics -- like the phone-hacking scandal -- and suggests they are analogous to Major-League steroid use, and that both are similar to teachers (or students) cheating on tests. But is phone hacking "cheating"?
  • Phone hacking was, then, not an example of cheating. It was a violation of professional ethics. And those ethics are not arbitrarily imposed, but are intrinsic to the practice of journalism as a profession committed to public service and to truth.
  • Behaving ethically matters, but how it matters, and what it means, depends strongly on the context in which it occurs.
  • Ethics for teachers is not, apparently, first and foremost about educating their students, or broadening their minds. Rather, ethics for teachers in our current system consists in following the rules. The implicit, linguistic signal being given is that teachers are not like journalists or doctors, committed to a profession and to the moral code needed to achieve their professional goals. Instead, they are like athletes playing games, or (as Chait says) like children taking tests.
  • Using "cheating" as an ethical lens tends to both trivialize and infantilize teachers' work
  • Professions with social respect and social capital, like doctors and lawyers, collaborate in the creation of their own standards. The assumption is that those standards are intrinsic to the profession's goals, and that, therefore, professionals themselves are best equipped to establish and monitor them. Teachers' standards, though, are imposed from outside -- as if teachers are children, or as if teaching is a game.
  • High-stakes testing, then, does lead to cheating. It does not create unethical behavior -- but it does create the particular unethical behavior of "cheating."
  • We have reached a point where we can only talk about the ethics of the profession in terms of cheating or not cheating, as if teachers' main ethical duty is to make sure that scantron bubbles get filled in correctly. Teachers, like journalists, should have a commitment to truth; like doctors, they have a duty of care. Translating those commitments and duties into a bureaucratized measure of cheating-or-not-cheating diminishes ethics.
  • For teachers it is, literally, demoralizing. It severs the moral experience of teaching from the moral evaluation of teaching, which makes it almost impossible for good teachers (in all the senses of "good") to stay in the system.
  • We need better ethics for teachers -- ethics that treat them as adults and professionals, not like children playing games.
Javier E

'ContraPoints' Is Political Philosophy Made for YouTube - The Atlantic - 1 views

  • While Wynn positions herself on the left, she is no dogmatic ideologue, readily admitting to points on the right and criticizing leftist arguments when warranted
  • She has described her work as “edutainment” and “propaganda,” and it’s both
  • But what makes her videos unique is the way Wynn combines those two elements: high standards of rational argument and not-quite-rational persuasion. ContraPoints offers compelling speech aimed at truth, rendered in the raucous, meme-laden idiom of the internet
  • In 2014, Wynn noticed a trend on YouTube that disturbed her: Videos with hyperbolic titles like “why feminism ruins everything,” “SJW cringe compilation,” and “Ben Shapiro DESTROYS Every College Snowflake” were attracting millions of views and spawning long, jeering comment threads. Wynn felt she was watching the growth of a community of outrage that believes feminists, Marxists, and multiculturalists are conspiring to destroy freedom of speech, liquidate gender norms, and demolish Western civilization
  • Wynn created ContraPoints to offer entertaining, coherent rebuttals to these kinds of ideas. Her videos also explain left-wing talking points—like rape culture and cultural appropriation—and use philosophy to explore topics that are important to Wynn, such as the meaning of gender for trans people.
  • Wynn thinks it’s a mistake to assume that viewers of angry, right-wing videos are beyond redemption. “It’s quite difficult to get through to the people who are really committed to these anti-progressive beliefs,” Wynn told me recently. However, she said, she believes that many viewers find such ideas “psychologically resonant” without being hardened reactionaries. This broad, not fully committed center—comprising people whose minds can still be changed—is Wynn’s target audience.
  • Usually, the videos to which Wynn is responding take the stance of dogged reason cutting through the emotional excesses of so-called “political correctness.” For example, the American conservative commentator Ben Shapiro, who is a target of a recent ContraPoints video, has made “facts don’t care about your feelings” his motto. Wynn’s first step in trying to win over those who find anti-progressive views appealing is to show that these ideas often rest on a flimsy foundation. To do so, she fully adopts the rational standards of argument that her rivals pride themselves on following, and demonstrates how they fail to achieve them
  • Wynn dissects her opponents’ positions, holding up fallacies, evasions, and other rhetorical tricks for examination, all the while providing a running commentary on good argumentative method.
  • The host defends her own positions according to the same principles. Wynn takes on the strongest version of her opponent’s argument, acknowledges when she thinks her opponents are right and when she has been wrong, clarifies when misunderstood, and provides plenty of evidence for her claims
  • Wynn is a former Ph.D. student in philosophy, and though her videos are too rich with dick jokes for official settings, her argumentative practice would pass muster in any grad seminar.
  • she critiques many of her leftist allies for being bad at persuasion.
  • Socrates persuaded by both the logic of argument and the dynamic of fandom. Wynn is beginning to grow a dedicated following of her own: Members of online discussion groups refer to her as “mother” and “the queen,” produce fan art, and post photos of themselves dressed as characters from her videos.
  • she shares Socrates’s view that philosophy is more an erotic art than a martial one
  • As she puts it, she’s not trying to destroy the people she addresses, but seduce them
  • for Wynn, the true key to persuasion is to engage her audience on an emotional level.
  • One thing she has come across repeatedly is a disdain for the left’s perceived moral superiority. Anti-progressives of all stripes, Wynn told me, show an “intense defensiveness against being told what to do” and a “repulsion in response to moralizing.”
  • Matching her speech to the audience's tastes presents a prickly rhetorical challenge. In an early video, Contra complains: “The problem is this medium. These goddamn savages demand a circus, and I intend to give them one, but behind the curtain, I really just want to have a conversation.”
  • Philosophical conversation requires empathy and good-faith engagement. But the native tongue of political YouTube is ironic antagonism. It’s Wynn’s inimitable way of combining these two ingredients that gives ContraPoints its distinctive mouthfeel.
  • Wynn spends weeks in the online communities of her opponents—whether they’re climate skeptics or trans-exclusionary feminists—trying to understand what they believe and why they believe it. In Socrates’s words, she’s studying the souls of her audience.
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform – after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.” (A toy sketch of this variable-reward loop appears at the end of this list.)
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day.”
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
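The variable-reward (“slot machine”) mechanic Harris describes in the pull-to-refresh annotation above can be made concrete in a few lines of code. This is only a toy sketch, not any real product's code: the function name pull_to_refresh, the probabilities, and the feed contents are invented for illustration.

```python
import random

def pull_to_refresh(feed):
    """Simulate one pull-to-refresh: sometimes a reward, often nothing."""
    roll = random.random()
    if roll < 0.3:                       # ~30% of pulls: a genuinely rewarding hit
        feed.append("new post you care about")
        return "reward"
    if roll < 0.5:                       # ~20% of pulls: filler (an ad)
        feed.append("sponsored post")
        return "filler"
    return "nothing"                     # the rest: no payoff at all

feed = []
for pull in range(1, 11):
    print(f"pull {pull}: {pull_to_refresh(feed)}")
```

The point of the sketch is that the payoff is unpredictable by design; on a variable-ratio schedule it is the uncertainty, not the content, that keeps the user pulling.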
Javier E

Look At Me by Patricia Snow | Articles | First Things - 0 views

  • Maurice stumbles upon what is still the gold standard for the treatment of infantile autism: an intensive course of behavioral therapy called applied behavioral analysis that was developed by psychologist O. Ivar Lovaas at UCLA in the 1970s
  • in a little over a year’s time she recovers her daughter to the point that she is indistinguishable from her peers.
  • Let Me Hear Your Voice is not a particularly religious or pious work. It is not the story of a miracle or a faith healing
  • Maurice discloses her Catholicism, and the reader is aware that prayer undergirds the therapy, but the book is about the therapy, not the prayer. Specifically, it is about the importance of choosing methods of treatment that are supported by scientific data. Applied behavioral analysis is all about data: its daily collection and interpretation. The method is empirical, hard-headed, and results-oriented.
  • on a deeper level, the book is profoundly religious, more religious perhaps than its author intended. In this reading of the book, autism is not only a developmental disorder afflicting particular individuals, but a metaphor for the spiritual condition of fallen man.
  • Maurice’s autistic daughter is indifferent to her mother
  • In this reading of the book, the mother is God, watching a child of his wander away from him into darkness: a heartbroken but also a determined God, determined at any cost to bring the child back
  • the mother doesn’t turn back, concedes nothing to the condition that has overtaken her daughter. There is no political correctness in Maurice’s attitude to autism; no nod to “neurodiversity.” Like the God in Donne’s sonnet, “Batter my heart, three-personed God,” she storms the walls of her daughter’s condition
  • Like God, she sets her sights high, commits both herself and her child to a demanding, sometimes painful therapy (life!), and receives back in the end a fully alive, loving, talking, and laughing child
  • the reader realizes that for God, the harrowing drama of recovery is never a singular, or even a twice-told tale, but a perennial one. Every child of his, every child of Adam and Eve, wanders away from him into darkness
  • we have an epidemic of autism, or “autism spectrum disorder,” which includes classic autism (Maurice’s children’s diagnosis); atypical autism, which exhibits some but not all of the defects of autism; and Asperger’s syndrome, which is much more common in boys than in girls and is characterized by average or above average language skills but impaired social skills.
  • At the same time, all around us, we have an epidemic of something else. On the street and in the office, at the dinner table and on a remote hiking trail, in line at the deli and pushing a stroller through the park, people go about their business bent over a small glowing screen, as if praying.
  • This latter epidemic, or experiment, has been going on long enough that people are beginning to worry about its effects.
  • for a comprehensive survey of the emerging situation on the ground, the interested reader might look at Sherry Turkle’s recent book, Reclaiming Conversation: The Power of Talk in a Digital Age.
  • she also describes in exhaustive, chilling detail the mostly horrifying effects recent technology has had on families and workplaces, educational institutions, friendships and romance.
  • many of the promises of technology have not only not been realized, they have backfired. If technology promised greater connection, it has delivered greater alienation. If it promised greater cohesion, it has led to greater fragmentation, both on a communal and individual level.
  • If thinking that the grass is always greener somewhere else used to be a marker of human foolishness and a temptation to be resisted, today it is simply a possibility to be checked out. The new phones, especially, turn out to be portable Pied Pipers, irresistibly pulling people away from the people in front of them and the tasks at hand.
  • all it takes is a single phone on a table, even if that phone is turned off, for the conversations in the room to fade in number, duration, and emotional depth.
  • an infinitely malleable screen isn’t an invitation to stability, but to restlessness
  • Current media, and the fear of missing out that they foster (a motivator now so common it has its own acronym, FOMO), drive lives of continual interruption and distraction, of virtual rather than real relationships, and of “little” rather than “big” talk
  • if you may be interrupted at any time, it makes sense, as a student explains to Turkle, to “keep things light.”
  • we are reaping deficits in emotional intelligence and empathy; loneliness, but also fears of unrehearsed conversations and intimacy; difficulties forming attachments but also difficulties tolerating solitude and boredom
  • consider the testimony of the faculty at a reputable middle school where Turkle is called in as a consultant
  • The teachers tell Turkle that their students don’t make eye contact or read body language, have trouble listening, and don’t seem interested in each other, all markers of autism spectrum disorder
  • Like much younger children, they engage in parallel play, usually on their phones. Like autistic savants, they can call up endless information on their phones, but have no larger context or overarching narrative in which to situate it
  • Students are so caught up in their phones, one teacher says, “they don’t know how to pay attention to class or to themselves or to another person or to look in each other’s eyes and see what is going on.”
  • “It is as though they all have some signs of being on an Asperger’s spectrum. But that’s impossible. We are talking about a schoolwide problem.”
  • Can technology cause Asperger’s?
  • “It is not necessary to settle this debate to state the obvious. If we don’t look at our children and engage them in conversation, it is not surprising if they grow up awkward and withdrawn.”
  • In the protocols developed by Ivar Lovaas for treating autism spectrum disorder, every discrete trial in the therapy, every drill, every interaction with the child, however seemingly innocuous, is prefaced by this clear command: “Look at me!”
  • If absence of relationship is a defining feature of autism, connecting with the child is both the means and the whole goal of the therapy. Applied behavioral analysis does not concern itself with when exactly, how, or why a child becomes autistic, but tries instead to correct, do over, and even perhaps actually rewire what went wrong, by going back to the beginning
  • Eye contact—which we know is essential for brain development, emotional stability, and social fluency—is the indispensable prerequisite of the therapy, the sine qua non of everything that happens.
  • There are no shortcuts to this method; no medications or apps to speed things up; no machines that can do the work for us. This is work that only human beings can do
  • it must not only be started early and be sufficiently intensive, but it must also be carried out in large part by parents themselves. Parents must be trained and involved, so that the treatment carries over into the home and continues for most of the child’s waking hours.
  • there are foundational relationships that are templates for all other relationships, and for learning itself.
  • Maurice’s book, in other words, is not fundamentally the story of a child acquiring skills, though she acquires them perforce. It is the story of the restoration of a child’s relationship with her parents
  • it is also impossible to overstate the time and commitment that were required to bring it about, especially today, when we have so little time, and such a faltering, diminished capacity for sustained engagement with small children
  • The very qualities that such engagement requires, whether our children are sick or well, are the same qualities being bred out of us by technologies that condition us to crave stimulation and distraction, and by a culture that, through a perverse alchemy, has changed what was supposed to be the freedom to work anywhere into an obligation to work everywhere.
  • In this world of total work (the phrase is Josef Pieper’s), the work of helping another person become fully human may be work that is passing beyond our reach, as our priorities, and the technologies that enable and reinforce them, steadily unfit us for the work of raising our own young.
  • in Turkle’s book, as often as not, it is young people who are distressed because their parents are unreachable. Some of the most painful testimony in Reclaiming Conversation is the testimony of teenagers who hope to do things differently when they have children, who hope someday to learn to have a real conversation, and so on.
  • it was an older generation that first fell under technology’s spell. At the middle school Turkle visits, as at many other schools across the country, it is the grown-ups who decide to give every child a computer and deliver all course content electronically, meaning that they require their students to work from the very medium that distracts them, a decision the grown-ups are unwilling to reverse, even as they lament its consequences.
  • we have approached what Turkle calls the robotic moment, when we will have made ourselves into the kind of people who are ready for what robots have to offer. When people give each other less, machines seem less inhuman.
  • robot babysitters may not seem so bad. The robots, at least, will be reliable!
  • If human conversations are endangered, what of prayer, a conversation like no other? All of the qualities that human conversation requires—patience and commitment, an ability to listen and a tolerance for aridity—prayer requires in greater measure.
  • this conversation—the Church exists to restore. Everything in the traditional Church is there to facilitate and nourish this relationship. Everything breathes, “Look at me!”
  • there is a second path to God, equally enjoined by the Church, and that is the way of charity to the neighbor, but not the neighbor in the abstract.
  • “Who is my neighbor?” a lawyer asks Jesus in the Gospel of Luke. Jesus’s answer is, the one you encounter on the way.
  • Virtue is either concrete or it is nothing. Man’s path to God, like Jesus’s path on the earth, always passes through what the Jesuit Jean Pierre de Caussade called “the sacrament of the present moment,” which we could equally call “the sacrament of the present person,” the way of the Incarnation, the way of humility, or the Way of the Cross.
  • The tradition of Zen Buddhism expresses the same idea in positive terms: Be here now.
  • Both of these privileged paths to God, equally dependent on a quality of undivided attention and real presence, are vulnerable to the distracting eye-candy of our technologies
  • Turkle is at pains to show that multitasking is a myth, that anyone trying to do more than one thing at a time is doing nothing well. We could also call what she was doing multi-relating, another temptation or illusion widespread in the digital age. Turkle’s book is full of people who are online at the same time that they are with friends, who are texting other potential partners while they are on dates, and so on.
  • This is the situation in which many people find themselves today: thinking that they are special to someone because of something that transpired, only to discover that the other person is spread so thin, the interaction was meaningless. There is a new kind of promiscuity in the world, in other words, that turns out to be as hurtful as the old kind.
  • Who can actually multitask and multi-relate? Who can love everyone without diluting or cheapening the quality of love given to each individual? Who can love everyone without fomenting insecurity and jealousy? Only God can do this.
  • When an individual needs to be healed of the effects of screens and machines, it is real presence that he needs: real people in a real world, ideally a world of God’s own making
  • Nature is restorative, but it is conversation itself, unfolding in real time, that strikes these boys with the force of revelation. More even than the physical vistas surrounding them on a wilderness hike, unrehearsed conversation opens up for them new territory, open-ended adventures. “It was like a stream,” one boy says, “very ongoing. It wouldn’t break apart.”
  • in the waters of baptism, the new man is born, restored to his true parent, and a conversation begins that over the course of his whole life reminds man of who he is, that he is loved, and that someone watches over him always.
  • Even if the Church could keep screens out of her sanctuaries, people strongly attached to them would still be people poorly positioned to take advantage of what the Church has to offer. Anxious people, unable to sit alone with their thoughts. Compulsive people, accustomed to checking their phones, on average, every five and a half minutes. As these behaviors increase in the Church, what is at stake is man’s relationship with truth itself.
Javier E

Can Social Networks Do Better? We Don't Know Because They Haven't Tried - Talking Point... - 0 views

  • it’s not fair to say it’s Facebook or a Facebook problem. Facebook is just the latest media and communications medium. We hardly blame the technology of the book for spreading anti-Semitism via the notorious Protocols of the Elders of Zion
  • But of course, it’s not that simple. Social media platforms have distinct features that earlier communications media did not. The interactive nature of the media, the collection of data which is then run through algorithms and artificial intelligence creates something different.
  • All social media platforms are engineered with two basic goals: maximize the time you spend on the platform and make advertising as effective and thus as lucrative as possible. This means that social media can never be simply a common carrier, a distribution technology that has no substantial influence over the nature of the communication that travels over it.
  • it’s a substantial difference which deprives social media platforms of the kind of hands-off logic that would make it ridiculous to say phones are bad or the phone company is responsible if planning for a mass murder was carried out over the phone.
  • the Internet doesn’t ‘do’ anything more than make the distribution of information more efficient and radically lower the formal, informal and financial barriers to entry that used to stand in the way of various marginalized ideas.
  • Social media can never plead innocence like this because the platforms are designed to addict you and convince you of things.
  • If the question is: what can social media platforms do to protect against government-backed subversion campaigns like the one we saw in the 2016 campaign the best answer is, we don’t know. And we don’t know for a simple reason: they haven’t tried.
  • The point is straightforward: the mass collection of data, harnessed to modern computing power and the chance to amass unimaginable wealth has spurred vast technological innovation.
Javier E

The meaning of life in a world without work | Technology | The Guardian - 0 views

  • As artificial intelligence outperforms humans in more and more tasks, it will replace humans in more and more jobs.
  • Many new professions are likely to appear: virtual-world designers, for example. But such professions will probably require more creativity and flexibility, and it is unclear whether 40-year-old unemployed taxi drivers or insurance agents will be able to reinvent themselves as virtual-world designers
  • The crucial problem isn’t creating new jobs. The crucial problem is creating new jobs that humans perform better than algorithms. Consequently, by 2050 a new class of people might emerge – the useless class. People who are not just unemployed, but unemployable.
  • The same technology that renders humans useless might also make it feasible to feed and support the unemployable masses through some scheme of universal basic income.
  • The real problem will then be to keep the masses occupied and content. People must engage in purposeful activities, or they go crazy. So what will the useless class do all day?
  • One answer might be computer games. Economically redundant people might spend increasing amounts of time within 3D virtual reality worlds, which would provide them with far more excitement and emotional engagement than the “real world” outside.
  • This, in fact, is a very old solution. For thousands of years, billions of people have found meaning in playing virtual reality games. In the past, we have called these virtual reality games “religions”.
  • Muslims and Christians go through life trying to gain points in their favorite virtual reality game. If you pray every day, you get points. If you forget to pray, you lose points. If by the end of your life you gain enough points, then after you die you go to the next level of the game (aka heaven).
  • As religions show us, the virtual reality need not be encased inside an isolated box. Rather, it can be superimposed on the physical reality. In the past this was done with the human imagination and with sacred books, and in the 21st century it can be done with smartphones.
  • Consumerism too is a virtual reality game. You gain points by acquiring new cars, buying expensive brands and taking vacations abroad, and if you have more points than everybody else, you tell yourself you won the game.
  • we saw two other kids on the street who were hunting the same Pokémon, and we almost got into a fight with them. It struck me how similar the situation was to the conflict between Jews and Muslims about the holy city of Jerusalem. When you look at the objective reality of Jerusalem, all you see are stones and buildings. There is no holiness anywhere. But when you look through the medium of smartbooks (such as the Bible and the Qur’an), you see holy places and angels everywhere.
  • In the end, the real action always takes place inside the human brain. Does it matter whether the neurons are stimulated by observing pixels on a computer screen, by looking outside the windows of a Caribbean resort, or by seeing heaven in our mind’s eyes?
  • Indeed, one particularly interesting section of Israeli society provides a unique laboratory for how to live a contented life in a post-work world. In Israel, a significant percentage of ultra-orthodox Jewish men never work. They spend their entire lives studying holy scriptures and performing religious rituals. They and their families don’t starve to death partly because the wives often work, and partly because the government provides them with generous subsidies. Though they usually live in poverty, government support means that they never lack for the basic necessities of life.
  • That’s universal basic income in action. Though they are poor and never work, in survey after survey these ultra-orthodox Jewish men report higher levels of life-satisfaction than any other section of Israeli society.
  • Hence virtual realities are likely to be key to providing meaning to the useless class of the post-work world. Maybe these virtual realities will be generated inside computers. Maybe they will be generated outside computers, in the shape of new religions and ideologies. Maybe it will be a combination of the two. The possibilities are endless
  • In any case, the end of work will not necessarily mean the end of meaning, because meaning is generated by imagining rather than by working.
  • People in 2050 will probably be able to play deeper games and to construct more complex virtual worlds than in any previous time in history.
  • But what about truth? What about reality? Do we really want to live in a world in which billions of people are immersed in fantasies, pursuing make-believe goals and obeying imaginary laws? Well, like it or not, that’s the world we have been living in for thousands of years already.
Javier E

The Coming Software Apocalypse - The Atlantic - 1 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.” (A toy illustration of this kind of hard-coded threshold failure appears at the end of this list.)
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • What made programming so difficult was that it required you to think like a computer.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated top of his class at the California Institute of Technology for electrical engineering.
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • software experts spent 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
  • In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop. (A minimal sketch of such a transition-table model appears at the end of this list.)
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • The practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months was spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy. [A toy illustration of this kind of exhaustive checking appears after this list.]
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
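A minimal Python sketch of the elevator state model described in the notes above. This is illustrative only—not Bantegnie’s tool or its generated code—and the state names and transition rules are assumptions taken from the elevator example:

```python
# Illustrative sketch of the model-based view of the elevator: the "model" is
# just a table of allowed transitions, and the program cannot step outside it.

TRANSITIONS = {
    ("door_open", "close_door"): "door_closed",
    ("door_closed", "open_door"): "door_open",
    ("door_closed", "start"): "moving",
    ("moving", "stop"): "door_closed",
}

def step(state, event):
    """Apply an event to the current state; refuse anything the model forbids."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"illegal transition: {event!r} in state {state!r}")

if __name__ == "__main__":
    state = "door_open"
    for event in ["close_door", "start", "stop", "open_door"]:
        state = step(state, event)
        print(f"{event} -> {state}")
    # step("door_open", "start") would raise: the model makes it impossible
    # to get the elevator moving without first closing the door.
```

Just by reading the transition table you can see the rules the diagram makes obvious: the only way to get the elevator moving is to close the door, and the only way to get the door open is to stop.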
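The notes on TLA+ above describe checking a design exhaustively before any code exists. The sketch below is not TLA+ (real specifications use its own mathematical notation and the TLC model checker); it is a deliberately tiny Python illustration of the same idea, brute-forcing every reachable state of an assumed elevator model and verifying one invariant in all of them:

```python
# Toy illustration of exhaustive state-space checking, in the spirit of TLA+.
# The model (a door/motor pair and its allowed actions) is an assumption made
# up for this sketch, not something taken from the article.

from collections import deque

def actions(state):
    """Yield (action name, next state) pairs allowed from a given state."""
    door, motor = state
    if door == "open":
        yield ("close_door", ("closed", motor))
    if door == "closed" and motor == "off":
        yield ("open_door", ("open", motor))
        yield ("start", (door, "on"))
    if motor == "on":
        yield ("stop", (door, "off"))

def invariant(state):
    door, motor = state
    return not (door == "open" and motor == "on")  # never move with the door open

def check(initial=("open", "off")):
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        assert invariant(state), f"invariant violated in {state}"
        for _name, nxt in actions(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

if __name__ == "__main__":
    states = check()
    print(f"checked {len(states)} reachable states; the invariant holds in all of them")
```

Because every reachable state is visited, a violated invariant cannot hide in a “rare” combination of events—which is exactly the class of subtle design bug Newcombe describes.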
Javier E

How 'The Good Place' Goes Beyond 'The Trolley Problem' - The Atlantic - 1 views

  • A sitcom may seem like an unlikely vehicle for serious discussions about moral philosophy, which viewers might expect to find in medical and legal dramas (albeit in less literal, didactic forms). But the subject and medium are surprisingly compatible. A comedy can broach otherwise tedious-sounding ideas with levity and self-awareness, and has more leeway to use contrived or exaggerated scenarios to bring concepts to life
  • bringing digestible ethics lessons to the masses can be seen as a moral act, ensuring that those who don’t spend hours poring over Kant and Judith Jarvis Thomson are also privy to what’s gained from understanding how people think.
  • The Good Place’s focus on ethics wouldn’t mean as much if it weren’t also remarkable in other ways—the performances, the top-notch writing, the wordplay and pun-laden jokes, the willingness to formally experiment with the sitcom genre.
  • “While we’re discussing the issues that I want to discuss, I also know that I have a responsibility to the audience to tell a story. The goal is not to change the world; the goal of this is to make a high-quality, entertaining show that has good-quality acting.” On that front, Season 2 has certainly succeeded
clairemann

Photos of Snowflakes Like You've Never Seen Them Before - The New York Times - 0 views

  • Sextillions of snowflakes fell from the sky this winter. That’s billions of trillions of them, now mostly melted away as spring approaches.
  • “How do snowflakes form?” Dr. Libbrecht said during an online talk on Feb. 23 that was hosted by the Bruce Museum in Greenwich, Conn. “And how do these structures appear — and just, as I like to say, literally out of thin air?”
  • The “damn thing” was the camera system for photographing snowflakes. He wanted to use the best digital sensors, ones that captured a million pixels. “The real snowflake is very, very fragile,” he said. “It’s super intricate. So you want high resolution.”
  • Dr. Myhrvold also found a special LED, manufactured by a company in Japan for industrial uses, that would emit bursts of light 1/1,000th as long as a typical camera flash. This minimizes heat emitted from the flash, which might melt the snowflake a bit.
  • Water is a simple molecule consisting of two hydrogen atoms and one oxygen. When temperatures drop below 32 degrees Fahrenheit, the molecules start sticking to one another — that is, they freeze.
  • “Because it has this complicated path through the clouds, it gives a complicated shape,” Dr. Libbrecht said. “They’re all following different paths, and so each one looks a little different, depending on the path.”
  • To counter Dr. Myhrvold’s claims, Mr. Komarechka took an image that he says was even higher resolution. Dr. Myhrvold responded with a lengthy rebuttal explaining why his images were, nonetheless, more detailed.
Javier E

FaceApp helped a middle-aged man become a popular younger woman. His fan base has never... - 1 views

  • Soya’s fame illustrated a simple truth: that social media is less a reflection of who we are, and more a performance of who we want to be.
  • It also seemed to herald a darker future where our fundamental senses of reality are under siege: The AI that allows anyone to fabricate a face can also be used to harass women with “deepfake” pornography, invent fraudulent LinkedIn personas and digitally impersonate political enemies.
  • As the photos began receiving hundreds of likes, Soya’s personality and style began to come through. She was relentlessly upbeat. She never sneered or bickered or trolled. She explored small towns, savored scenic vistas, celebrated roadside restaurants’ simple meals.
  • She took pride in the basic things, like cleaning engine parts. And she only hinted at the truth: When one fan told her in October, “It’s great to be young,” Soya replied, “Youth does not mean a certain period of life, but how to hold your heart.”
  • She seemed, well, happy, and FaceApp had made her that way. Creating the lifelike impostor had taken only a few taps: He changed the “Gender” setting to “Female,” the “Age” setting to “Teen,” and the “Impression” setting — a mix of makeup filters — to a glamorous look the app calls “Hollywood.”
  • Soya pouted and scowled on rare occasions when Nakajima himself felt frustrated. But her baseline expression was an extra-wide smile, activated with a single tap.
  • Nakajima grew his shimmering hair below his shoulders and raided his local convenience store for beauty supplies he thought would make the FaceApp images more convincing: blushes, eyeliners, concealers, shampoos.
  • “When I compare how I feel when I started to tweet as a woman and now, I do feel that I’m gradually gravitating toward this persona … this fantasy world that I created,” Nakajima said. “When I see photos of what I tweeted, I feel like, ‘Oh. That’s me.’ ”
  • The sensation Nakajima was feeling is so common that there’s a term for it: the Proteus effect, named for the shape-shifting Greek god. Stanford University researchers first coined it in 2007 to describe how people inhabiting the body of a digital avatar began to act the part
  • People made to appear taller in virtual-reality simulations acted more assertively, even after the experience ended. Prettier characters began to flirt.
  • What is it about online disguises? Why are they so good at bending people’s sense of self-perception?
  • they tap into this “very human impulse to play with identity and pretend to be someone you’re not.”
  • Users in the Internet’s early days rarely had any presumptions of authenticity, said Melanie C. Green, a University of Buffalo professor who studies technology and social trust. Most people assumed everyone else was playing a character clearly distinguished from their real life.
  • “This identity play was considered one of the huge advantages of being online,” Green said. “You could switch your gender and try on all of these different personas. It was a playground for people to explore.”
  • It wasn’t until the rise of giant social networks like Facebook — which used real identities to, among other things, supercharge targeted advertising — that this big game of pretend gained an air of duplicity. Spaces for playful performance shrank, and the biggest Internet watering holes began demanding proof of authenticity as a way to block out malicious intent.
  • The Web’s big shift from text to visuals — the rise of photo-sharing apps, live streams and video calls — seemed at first to make that unspoken rule of real identities concrete. It seemed too difficult to fake one’s appearance when everyone’s face was on constant display.
  • Now, researchers argue, advances in image-editing artificial intelligence have done for the modern Internet what online pseudonyms did for the world’s first chat rooms. Facial filters have allowed anyone to mold themselves into the character they want to play.
  • researchers fear these augmented reality tools could end up distorting the beauty standards and expectations of actual reality.
  • Some political and tech theorists worry this new world of synthetic media threatens to detonate our concept of truth, eroding our shared experiences and infusing every online relationship with suspicion and self-doubt.
  • Deceptive political memes, conspiracy theories, anti-vaccine hoaxes and other scams have torn the fabric of our democracy, culture and public health.
  • But she also thinks about her kids, who assume “that everything online is fabricated,” and wonders whether the rules of online identity require a bit more nuance — and whether that generational shift is already underway.
  • “Bots pretending to be people, automated representations of humanity — that, they perceive as exploitative,” she said. “But if it’s just someone engaging in identity experimentation, they’re like: ‘Yeah, that’s what we’re all doing.'
  • To their generation, “authenticity is not about: ‘Does your profile picture match your real face?’ Authenticity is: ‘Is your voice your voice?’
  • “Their feeling is: ‘The ideas are mine. The voice is mine. The content is mine. I’m just looking for you to receive it without all the assumptions and baggage that comes with it.’ That’s the essence of a person’s identity. That’s who they really are.”
  • But wasn’t this all just a big con? Nakajima had tricked people with a “cool girl” stereotype to boost his Twitter numbers. He hadn’t elevated the role of women in motorcycling; if anything, he’d supplanted them. And the character he’d created was paper thin: Soya had no internal complexity outside of what Nakajima had projected, just that eternally superimposed smile.
  • Perhaps he should have accepted his irrelevance and faded into the digital sunset, sharing his life for few to see. But some of Soya’s followers have said they never felt deceived: It was Nakajima — his enthusiasm, his attitude about life — they’d been charmed by all along. “His personality,” as one Twitter follower said, “shined through.”
  • In Nakajima’s mind, he’d used the tools of a superficial medium to craft genuine connections. He had not felt real until he had become noticed for being fake.
  • Nakajima said he doesn’t know how long he’ll keep Soya alive. But he said he’s grateful for the way she helped him feel: carefree, adventurous, seen.
Javier E

Revisiting the prophetic work of Neil Postman about the media » MercatorNet - 1 views

  • The NYU professor was surely prophetic. “Our own tribe is undergoing a vast and trembling shift from the magic of writing to the magic of electronics,” he cautioned.
  • “We face the rapid dissolution of the assumptions of an education organised around the slow-moving printed word, and the equally rapid emergence of a new education based on the speed-of-light electronic message.”
  • What Postman perceived in television has been dramatically intensified by smartphones and social media
  • Postman also recognised that technology was changing our mental processes and social habits.
  • Today corporations like Google and Amazon collect data on Internet users based on their browsing history, the things they purchase, and the apps they use
  • Yet all citizens are undergoing this same transformation. Our digital devices undermine social interactions by isolating us,
  • “Years from now, it will be noticed that the massive collection and speed-of-light retrieval of data have been of great value to large-scale organisations, but have solved very little of importance to most people, and have created at least as many problems for them as they may have solved.”
  • “Television has by its power to control the time, attention, and cognitive habits of our youth gained the power to control their education.”
  • As a student of Canadian philosopher Marshall McLuhan, Postman believed that the medium of information was critical to understanding its social and political effects. Every technology has its own agenda. Postman worried that the very nature of television undermined American democratic institutions.
  • Many Americans tuned in to the presidential debate looking for something substantial and meaty
  • It was simply another manifestation of the incoherence and vitriol of cable news
  • “When, in short, a people become an audience and their public business a vaudeville act, then a nation finds itself at risk; culture-death is a clear possibility,” warned Postman.
  • Technology Is Never Neutral
  • As for new problems, we have increased addictions (technological and pornographic); increased loneliness, anxiety, and distraction; and inhibited social and intellectual maturation.
  • The average length of a shot on network television is only 3.5 seconds, so that the eye never rests, always has something new to see. Moreover, television offers viewers a variety of subject matter, requires minimal skills to comprehend it, and is largely aimed at emotional gratification.
  • This is far truer of the Internet and social media, where more than a third of Americans, and almost half of young people, now get their news.
  • with smartphones now ubiquitous, the Internet has replaced television as the “background radiation of the social and intellectual universe.”
  • Is There Any Solution?
  • Reading news or commentary in print, in contrast, requires concentration, patience, and careful reflection, virtues that our digital age vitiates.
  • Politics as Entertainment
  • “How television stages the world becomes the model for how the world is properly to be staged,” observed Postman. In the case of politics, television fashions public discourse into yet another form of entertainment
  • In America, the fundamental metaphor for political discourse is the television commercial. The television commercial is not at all about the character of products to be consumed. … They tell everything about the fears, fancies, and dreams of those who might buy them.
  • The television commercial has oriented business away from making products of value and towards making consumers feel valuable, which means that the business of business has now become pseudo-therapy. The consumer is a patient assured by psycho-dramas.
  • Such is the case with the way politics is “advertised” to different subsets of the American electorate. The “consumer,” depending on his political leanings, may be manipulated by fears of either an impending white-nationalist, fascist dictatorship, or a radical, woke socialist takeover.
  • This paradigm is aggravated by the hypersiloing of media content, which explains why Americans who read left-leaning media view the Proud Boys as a legitimate, existential threat to national civil order, while those who read right-leaning media believe the real immediate enemies of our nation are Antifa
  • Regardless of whether either of these groups represents a real public menace, the loss of any national consensus over what constitutes objective news means that Americans effectively talk past one another: they use the Proud Boys or Antifa as rhetorical barbs to smear their ideological opponents as extremists.
  • Yet these technologies are far from neutral. They are, rather, “equipped with a program for social change.
  • Postman’s analysis of technology is prophetic and profound. He warned of the trivialising of our media, defined by “broken time and broken attention,” in which “facts push other facts into and then out of consciousness at speeds that neither permit nor require evaluation.” He warned of “a neighborhood of strangers and pointless quantity.”
  • does Postman offer any solutions to this seemingly uncontrollable technological juggernaut?
  • Postman’s suggestions regarding education are certainly relevant. He unequivocally condemned education that mimics entertainment, and urged a return to learning that is hierarchical, meaning that it first gives students a foundation of essential knowledge before teaching “critical thinking.”
  • Postman also argued that education must avoid a lowest-common-denominator approach in favor of complexity and the perplexing: the latter method elicits in the student a desire to make sense of what perplexes him.
  • Finally, Postman promoted education in vigorous exposition, logic, and rhetoric, all necessary for citizenship
  • Another course of action is to understand what these media, by their very nature, do to us and to public discourse.
  • We must, as Postman exhorts us, “demystify the data” and dominate our technology, lest it dominate us. We must identify and resist how television, social media, and smartphones manipulate our emotions, infantilise us, and weaken our ability to rebuild what 2020 has ravaged.
katedriscoll

Knowledge and Language - TOK 2022: THEORY OF KNOWLEDGE WEBSITE FOR THE IBDP - 0 views

  • Language is a medium through which we pass on most knowledge. You could ask yourself how much you would know if you had no language to gather or express knowledge. Our daily language is heavily influenced by the discourse of the most dominant groups in our communities, even though we may not always be aware of this fact. The language we speak can be used to pass on knowledge and values that exist within our community, but it also influences to some extent how we know. Even though the limitations of the Sapir-Whorf Hypothesis's linguistic determinism have been pointed out, new research (eg by Boroditsky, see below) reveals how the language we speak may shape the way we think
caelengrubb

How Einstein Challenged Newtonian Physics - 0 views

  • Any discussion of Einstein should begin with what is probably his single greatest contribution to physics—the theory of relativity.
  • Between the late 1600s and the beginning of the 20th century, the field of physics was dominated by the ideas of Isaac Newton. The Newtonian laws of motion and gravitation had, up to that point in time, been the most successful scientific theory in all of history.
  • Newton’s ideas were, of course, challenged from time to time during those two centuries, but these ideas always seemed to hold up
  • There were many new phenomena that were discovered and that came to be understood in the centuries that followed Newton’s era. Take electricity and magnetism, for example. Until the 19th century, we didn’t really know what electricity or magnetism were, or how they worked. Isaac Newton certainly didn’t have a clue.
  • To many physicists around the turn of the 20th century, the state of physics seemed very settled. The Newtonian worldview had been very successful, and for a very long time.
  • In 1905, however, a revolution in physics did come. And perhaps even more surprising than the revolution itself was where that revolution came from. In 1905, Albert Einstein was not working as a professor at some prestigious university. He was not famous, or even well-known among other physicists.
  • Things didn’t stay this way for long, however. In 1905, Einstein wrote not one or two, but four absolutely groundbreaking papers. Any one of these four papers would have made him a star within the field of physics, and would have certainly secured him a position of prominence in the history of science.
  • It seems that having so many breakthroughs of this magnitude in such a short period of time had never happened before, and has never happened since. In the first of Einstein’s 1905 papers, he proposed that light doesn’t only behave like a wave, but that it is also made up of individual pieces or particles.
  • But Einstein’s paper provided concrete empirical evidence that atoms were, in fact, real and tangible objects. He was even able to use these arguments to make a pretty good estimate for the size and mass of atoms and molecules. It was a huge step forward.
  • The equations that physicists use to describe the propagation of light waves—what are known as Maxwell’s equations—predict that light should move through space at a speed of about 670 million miles per hour. And more interestingly, these equations don’t make any reference to any medium that the light waves propagate through. [A short calculation reproducing that figure appears after this list.]
  • Although no experiment had ever detected this aether, they argued that it must fill virtually all of space. After all, they argued, the light from a distant star could only reach us if there was a continuous path filled with aether, extending all the way from the star to us.
  • Eventually, though, physicists discovered that there was no aether. It would be Einstein who would come up with an equation to explain this conundrum.
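As a quick check on the figure quoted above, the speed Maxwell’s equations predict follows from two measured constants of the vacuum, c = 1/√(μ₀ε₀). The short Python sketch below reproduces the roughly 670-million-miles-per-hour value; the constants are standard reference values, not numbers taken from the article:

```python
import math

# Vacuum permeability and permittivity (SI units, standard reference values).
mu_0 = 1.25663706212e-6       # H/m
epsilon_0 = 8.8541878128e-12  # F/m

c = 1 / math.sqrt(mu_0 * epsilon_0)  # Maxwell: speed of electromagnetic waves
mph = c * 3600 / 1609.344            # metres per second -> miles per hour

print(f"c ≈ {c:.4g} m/s ≈ {mph / 1e6:.0f} million mph")
# prints roughly: c ≈ 2.998e+08 m/s ≈ 671 million mph
```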
Javier E

'I Like to Watch,' by Emily Nussbaum book review - The Washington Post - 0 views

  • Nussbaum’s case: That television could be great, and not because it was “novelistic” or “cinematic” but because it was, simply, television, “episodic, collaborative, writer-driven, and formulaic” by design.
  • According to Nussbaum, a TV show achieved greatness not despite these facts (which assumes they are limitations) but because of them (which sees them as an infrastructure that provokes creativity and beauty — “the sort that govern sonnets,”
  • Nussbaum’s once-iconoclastic views have become mainstream.
  • It is increasingly common to find yourself apologizing not for watching too much TV but for having failed to spend 70 hours of your precious, finite life binge-watching one of the Golden Age of Television’s finest offerings.
  • Nussbaum writes of her male classmates at NYU, where she was a literature doctoral student in the late 1990s. These men worshiped literature and film; they thought TV was trash. These men “were also, not coincidentally, the ones whose opinions tended to dominate mainstream media conversation.”
  • the same forces that marginalize the already-marginalized still work to keep TV shows by and about women, people of color, and LGBTQ+ individuals on a lower tier than those about cis, straight, white men: Your Tony Sopranos, your Walter Whites, your Don Drapers, your True Detectives
  • Over and over, Nussbaum pushes back against a hierarchy that rewards dramas centered on men and hyperbolically masculine pursuits (dealing drugs, being a cop, committing murders, having sex with beautiful women) and shoves comedies and whatever scans as “female” to the side.
  • Nussbaum sticks up for soaps, rom-coms, romance novels and reality television, “the genres that get dismissed as fluff, which is how our culture regards art that makes women’s lives look like fun.
  • Nussbaum’s writing consistently comes back to the question of “whose stories carried weight . . . what kind of creativity counted as ambitious, and who . . . deserved attention . . . Whose story counted as universal?
  • What does it mean to think morally about the art we consume — and, by extension, financially support, and center in our emotional and imaginative lives? The art that informs, on some near-cellular level, who we want to know and love and be?
  • maybe the next frontier of cultural thought is in thinking more cohesively about what we’ve long compartmentalized — of not stashing conflicting feelings about good art by bad men in some dark corner of our minds, but in holding our discomfort and contradictions up to the light, for a clearer view.
Javier E

Tell all the truth slant - Philosophy and Life - 1 views

  • “Tell all the truth, but tell it slant,” wrote the poet Emily Dickinson: “Success in circuit lies.” The advice is itself a truth, a commendation in the art of looking sideways.
  • Dickinson lived in an age when it was becoming impossible to find truth straightforwardly,
  • What is striking about Dickinson, though, is that she both experienced the darkness of that doubt, and found a way to transform it into an experience that produced meaning. It’s all about the pursuit of the circuitous.
  • That her medium was poetry is no mere detail. It is almost the whole story. Poetry not only allows her to express herself – her desire for consolation, her anxiety about what’s disappearing. It is also the form of writing par excellence that can keep an eye open for what is peripheral. It can discern truths that words otherwise struggle to articulate. It glimpses, and hopes.
  • to know Socrates was to know someone who sought all the truth, and in so doing, realised it mostly lies out of sight.
  • Moses too does not see anything directly. He apparently doesn’t see anything at all. Instead, an oblique experience is granted to him. It is better described as a kind of unknowing, rather than knowing. He must leave behind what he has previously observed because this seeing consists in not seeing. That which is sought transcends all knowledge.
  • What she realises is that the truth which is beyond us, which is discerned only indirectly, is the only truth that is truly worth seeking. That which we can readily grasp and manipulate is too easy for us. It’s humdrum. It leaves life too small for us, the creature with an eye for the transcendent. But look further, and what you are offered is what she calls truth’s ‘superb surprise’. That’s why success lies in circuit. Our humanity is spoken to, from a direction – a source – that we had not expected. And our humanity expands as a result.
sanderk

Global economy will suffer for years to come, says OECD - BBC News - 0 views

  • The world will take years to recover from the coronavirus pandemic, the Organisation for Economic Co-operation and Development has warned. Angel Gurría, OECD secretary general, said the economic shock was already bigger than the financial crisis. He told the BBC it was "wishful thinking" to believe that countries would bounce back quickly.
  • Mr Gurría said a recent warning that a serious outbreak could halve global growth to 1.5% already looked too optimistic.
  • While the number of job losses and company failures remains uncertain, Mr Gurría said countries would be dealing with the economic fallout "for years to come".
  • "Even if you don't get a worldwide recession, you're going to get either no growth or negative growth in many of the economies of the world, including some of the larger ones, and therefore you're going to get not only low growth this year, but also it's going to take longer to pick up in the in the future,"
  • the reason is that we don't know how much it's going to take to fix the unemployment because we don't know how many people are going to end up unemployed. We also don't know how much it's going to take to fix the hundreds of thousands of small and medium enterprises who are already suffering
  • Mr Gurría called on governments to rip up borrowing rules and "throw everything we got at it" to deal with the crisis.
  • However, he warned that bigger deficits and larger debt piles would also weigh on heavily indebted countries for years to come.
  • Mr Gurría said that just weeks ago, policymakers from the G20 club of rich nations believed the recovery would take a 'V' shape - with a short, sharp drop in economic activity followed swiftly by a rebound in growth."It was already then mostly wishful thinking," he said.
  • It's going to be more in the best of cases like a 'U' with a long trench in the bottom before it gets to the recovery period. We can avoid it looking like an 'L', if we take the right decisions today."
katherineharron

Donald Trump's twisted definition of toughness - CNNPolitics - 0 views

  • "Today, I have strongly recommended to every governor to deploy the National Guard in sufficient numbers that we dominate the streets," he said.
  • "One law and order, and that is what it is, one law. We have one, beautiful law," he said.
  • D.C. had no problems last night," Trump tweeted Tuesday morning. "Many arrests. Great job done by all. Overwhelming force. Domination. Likewise, Minneapolis was great (thank you President Trump!)."
  • The whole thing -- the speech punctuated with talk of "law and order" and the need to "dominate," the walk across ground that had been the site of protests moments before -- was orchestrated to push back against a story that had broken over the weekend: That amid the protests on Friday night outside the White House, Trump had been taken to the bunker under the White House for his protection.
  • The image of Trump cowering in a bunker while people take to the streets to protest the death of a(nother) unarmed black man immediately became fodder for Trump's two preferred mediums of communication: cable TV and Twitter. "Trump's Bunker" trended on Twitter. Cable TV repeatedly ran the story of a President being whisked away to safety.
  • And the world is split between people willing to use their power over others and those too afraid to exert it.
  • On the campaign trail in 2016, Trump repeatedly defended the use of waterboarding and other methods of torture to get information out of enemy combatants. "Don't tell me it doesn't work — torture works,"
  • Trump urged officers to treat arrested gang members rougher. He said this: "When you guys put somebody in the car and you're protecting their head, you know, the way you put their hand over? Like, don't hit their head, and they just killed somebody -- don't hit their head," Trump continued. "I said, you can take the hand away, OK?"
  • Throw them out into the cold," Trump famously/infamously said of protesters at a rally in Burlington, Vermont, in January 2016. "Don't give them their coats. No coats! Confiscate their coats."
  • Get tough Democrat Mayors and Governors," Trump urged in response to the protests. "These people are ANARCHISTS. Call in our National Guard NOW. The World is watching and laughing at you and Sleepy Joe. Is this what America wants? NO!!!"
  • There is nothing Trump cares more about -- and, of course, fears more -- than being perceived as weak and being mocked and laughed at for it. He is willing to say and do absolutely anything to keep from being put in that situation. So when he was being mocked for retreating to the White House bunker, his response was immediate: I'll show them. ... I'll walk right across the ground they were protesting on!
  • Toughness is not always about exerting your dominance because you can. True strength is rooted in the actions you don't take, the ability to understand that brute force should be your last resort, not your first instinct.
  • But it's especially true for a President of the United States faced with protests on American streets driven by the death of yet another black man at the hands of the police. Truly tough people, truly strong people -- they don't need to show and tell everyone how strong and tough they are. It's in their restraint, in their understanding that might doesn't make right that their true strength shines through.Donald Trump doesn't know that.
sanderk

How Does Light Travel? - Universe Today - 0 views

  • However, there remains many fascinating and unanswered questions when it comes to light, many of which arise from its dual nature. For instance, how is it that light can be apparently without mass, but still behave as a particle? And how can it behave like a wave and pass through a vacuum, when all other waves require a medium to propagate?
  • This included rejecting Aristotle’s theory of light, which viewed it as being a disturbance in the air (one of his four “elements” that composed matter), and embracing the more mechanistic view that light was composed of indivisible atoms
  • In Young’s version of the experiment, he used a slip of paper with slits cut into it, and then pointed a light source at them to measure how light passed through it
  • According to classical (i.e. Newtonian) particle theory, the results of the experiment should have corresponded to the slits, the impacts on the screen appearing in two vertical lines. Instead, the results showed that the coherent beams of light were interfering, creating a pattern of bright and dark bands on the screen. This contradicted classical particle theory, in which particles do not interfere with each other, but merely collide.
  • The only possible explanation for this pattern of interference was that the light beams were in fact behaving as waves. [A brief sketch of the two-slit intensity formula appears after this list.]
  • By the late 19th century, James Clerk Maxwell proposed that light was an electromagnetic wave, and devised several equations (known as Maxwell’s equations) to describe how electric and magnetic fields are generated and altered by each other and by charges and currents. By conducting measurements of different types of radiation (magnetic fields, ultraviolet and infrared radiation), he was able to calculate the speed of light in a vacuum (represented as c).
  • For one, it introduced the idea that major changes occur when things move close to the speed of light, including the time-space frame of a moving body appearing to slow down and contract in the direction of motion when measured in the frame of the observer. After centuries of increasingly precise measurements, the speed of light was determined to be 299,792,458 m/s in 1975.
  • According to his theory, the wave function also evolves according to a differential equation (a.k.a. the Schrödinger equation). For particles with mass, this equation has solutions; but for particles with no mass, no solution existed. Further experiments involving the Double-Slit Experiment confirmed the dual nature of photons, where measuring devices were incorporated to observe the photons as they passed through the slits.
  • For instance, its interaction with gravity (along with weak and strong nuclear forces) remains a mystery. Unlocking this, and thus discovering a Theory of Everything (ToE) is something astronomers and physicists look forward to. Someday, we just might have it all figured out!
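The interference pattern described above has a simple quantitative form: for two narrow slits a distance d apart, the intensity on the screen varies as cos²(π·d·sinθ/λ), ignoring the single-slit envelope. The sketch below just prints whether a handful of angles land on bright or dark bands; the slit spacing and wavelength are illustrative placeholders, since the article gives no numbers:

```python
import math

# Idealised two-slit interference: I(theta) = I0 * cos^2(pi * d * sin(theta) / wavelength)
d = 50e-6            # slit separation, 50 micrometres (placeholder value)
wavelength = 550e-9  # green light, ~550 nanometres (placeholder value)

def intensity(theta_rad, i0=1.0):
    phase = math.pi * d * math.sin(theta_rad) / wavelength
    return i0 * math.cos(phase) ** 2

# Sample a small range of angles and mark bright vs. dark bands.
for milliradians in range(0, 40, 5):
    theta = milliradians / 1000
    band = "bright" if intensity(theta) > 0.5 else "dark"
    print(f"theta = {theta:.3f} rad   I = {intensity(theta):.2f}   ({band})")
```

The alternating bright and dark bands fall out of the cosine-squared term—exactly the pattern that classical particle theory could not produce.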
delgadool

Puerto Rico to pump $787 million into economy amid COVID-19 woes | Miami Herald - 0 views

  • Puerto Rico announced Monday it will begin mailing out checks this week to keep workers, businesses and first responders afloat during the crisis.
  • the rescue plan will pump $787 million into the local economy — one of the most generous incentive packages of any U.S. jurisdiction.
  • Monday, the government will give $500, in cash, to those who are self-employed, or about 170,000 people. In addition, the government will give $1,500 to small and medium businesses that have been forced to shutter during the COVID-19 crisis.
  • Also, the island’s 134,200 public-sector workers will continue to receive their salaries, and municipalities are being asked to keep paying their 51,500 employees.
tongoscar

Reason Vs. Emotion - 0 views

  • we are all guided by both reason and emotion, and both play important parts.
  • emotional intelligence can be a stronger predictor of many dimensions of life-success than IQ,
  • Emotions can be influenced by thought (the emphasis of Cognitive psychotherapies), and thoughts are influenced by emotion (an emphasis of Emotionally Focused therapies). A third element is behavior — which I believe also interplays similarly with thought and emotion.
  • Negative emotions are opportunities for learning and closeness.
  • Emotion and reason each have somewhat different, but complementary and interlaced roles. They both provide information and guide behavior.
  • Each emotion conveys its own message.
  • Again, what helps here is understanding that you each have different styles, that neither is right or wrong, and you can find ways to bridge a little bit.