
TOK Friends / Group items tagged self-driving


Javier E

Do you want to help build a happier city? BBC - 0 views

  • With colleagues at the University of Cambridge, I worked on a web game called urbangems.org. In it, you are shown 10 pairs of urban scenes of London, and for each pair you need to choose which one you consider to be more beautiful, quiet and happy. Based on user votes, one is able to rank all urban scenes by beauty, quiet and happiness. Those scenes have been studied at Yahoo Labs with image-processing tools that extract colour histograms. The amount of greenery is associated with all three qualities: green is often found in scenes considered to be beautiful, quiet and happy. We then ran more sophisticated image-analysis tools that extracted patches from our urban scenes and found that red-brick houses and public gardens also make people happy.
  • On the other hand, cars were the visual elements most strongly associated with sadness. In rich countries, car ownership is becoming unfashionable, and car-sharing and short-term hiring are becoming more popular. Self-driving cars such as those being prototyped by Google will become more common and will likely be ordered via mobile apps similar to the ones we use for ordering taxis today. This will result in optimised traffic flows, fewer cars, and more space for alternative modes of transportation and for people on foot.
  • Cities will experience transformations similar to those New York has experienced since 2007. During these few years, new pedestrian plazas and hundreds of miles of bike lanes were created in the five boroughs, creating spaces for public art installations and recreation. And it’s proved popular with local businesses too, boosting the local economy in areas where cyclists are freer to travel.
  • it is not clear whether the rise of post-war tower dwelling is a definite improvement on the modern city sprawl. Tall buildings (with the exception of glassed-office buildings and landmarks) are often found in sad scenes.
  • In recent years, the new mayor of the Colombian capital Bogota, Enrique Penalosa, has cancelled highways projects and poured the money instead into cycle lanes, parks and open spaces for locals – undoing decades of car-centric planning that had made the streets a no-go area for the capital’s children. On the day in February 2000 when Penalosa banned cars from the street for 24 hours, hospital admissions fell by a third, air pollution levels dropped and residents said it made them feel more optimistic about living in the city.
  • are the technologies we are designing really helping their users to be happy? Take the simple example of a web map. It usually gives us the shortest walking route to a destination. But what if it gave us the small street, full of trees, parallel to the shortest path, that would make us happier? As more and more of us share these city streets, what will keep us happy as they become more crowded?
  • the share of the world’s population living in cities has surpassed 50%. By 2025, we will see another 1.2 billion city residents. With more and more of us moving to urban centres, quality of life becomes ever-more important.
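The pairwise-voting scheme described above — rank scenes by votes in head-to-head comparisons — can be sketched in a few lines of Python. This is a hypothetical minimal version (win-fraction ranking); the article doesn't specify urbangems.org's actual scoring method, and the scene names are invented:

```python
from collections import defaultdict

def rank_scenes(votes):
    """Rank scenes by the fraction of pairwise comparisons they won.

    votes: iterable of (winner, loser) pairs from questions like
    "which scene is happier?". Returns scene names, best first.
    """
    wins = defaultdict(int)
    shown = defaultdict(int)
    for winner, loser in votes:
        wins[winner] += 1
        shown[winner] += 1
        shown[loser] += 1
    # Win fraction = wins / times shown; sort is stable, ties keep order.
    return sorted(shown, key=lambda s: wins[s] / shown[s], reverse=True)

votes = [("park", "highway"), ("park", "tower"),
         ("brick_terrace", "highway"), ("highway", "tower")]
print(rank_scenes(votes))  # greenery and brick do well, as in the article
```

A production system would need something more robust to unequal exposure counts (e.g. a Bradley-Terry or Elo-style model), but the win-fraction idea is the core of ranking from pairwise votes.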
Javier E

Is Amazon Creating a Cultural Monopoly? - The New Yorker - 0 views

  • “We are not experts in antitrust law, and this letter is not a legal brief. But we are authors with a deep, collective experience in this field, and we agree with the authorities in economics and law who have asserted that Amazon’s dominant position makes it a monopoly as a seller of books and a monopsony as a buyer of books.” (A monopoly is a company that has extraordinary control over supply as a seller of goods to consumers; a monopsony has extraordinary control over suppliers as a buyer of their goods.)
  • a highly unorthodox argument: that, even though Amazon’s activities tend to reduce book prices, which is considered good for consumers, they ultimately hurt consumers
  • U.S. courts evaluate antitrust issues very differently, nowadays, than they did a hundred years ago, just after antitrust laws were established to keep big corporations from abusing their power. Back then, judges tended to be largely concerned with protecting suppliers from being squeezed by retailers, which meant that, if a corporation exercised monopoly power to push prices down, hurting suppliers, the company could easily lose an antitrust case. But by the nineteen-eighties, the judiciary’s focus had shifted to protecting consumers, leading courts to become more prone to ruling in favor of the corporation, on the grounds that lower prices are good for consumers.
  • specific argument—that Amazon’s actions are bad for consumers because they make our world less intellectually active and diverse—is unorthodox in its resort to cultural and artistic grounds. But it can be read as the inverse of a case like Leegin v. PSKS: that lower prices for worse products could be bad for consumers—and perhaps constitute an antitrust violation.
  • if higher prices corresponded with better products, that could be good for consumers—and not necessarily an antitrust violation.
  • Their argument is this: Amazon has used its market power both to influence which books get attention (by featuring them more prominently on its Web site, a practice I’ve also written about) and, in some cases, to drive prices lower. These practices, the authors argue, squeeze publishers, which makes them more risk-averse in deciding which books to publish. As a result, they claim, publishers have been “dropping some midlist authors and not publishing certain riskier books, effectively silencing many voices.” And this is bad not only for the non-famous writers who go unpublished, but for their would-be readers, who are denied the ability to hear those voices.
  • While it may be attractive, on a philosophical level, to argue that Amazon is bad for us because it makes our culture poorer, measuring that effect would be difficult, if not impossible. How would one go about valuing an unpublished masterpiece by an unknown author? This is further complicated by the fact that Amazon makes it easy for authors to self-publish and have their work be seen, without having to go through such traditional gatekeepers as agents and publishers; Amazon might argue that this allows for more free flow of information and ideas
  • Furthermore, U.S. law is concerned with diversity in media, Crane said, but that tends to be regulated through the Federal Communications Commission, not the Justice Department.
  • it’s quite possible the Justice Department will read the Authors United letter and dismiss it as uninformed. But even if that happens, Preston said, it will have been worthwhile for the writers to have made their case.
  • Authors United’s larger mission, he told me, was this: “We hope to show the public that getting products faster and cheaper isn’t necessarily the greatest good. It comes at a human cost.”
Javier E

If It Feels Right - NYTimes.com - 3 views

  • What’s disheartening is how bad they are at thinking and talking about moral issues.
  • you see the young people groping to say anything sensible on these matters. But they just don’t have the categories or vocabulary to do so.
  • “Not many of them have previously given much or any thought to many of the kinds of questions about morality that we asked,” Smith and his co-authors write. When asked about wrong or evil, they could generally agree that rape and murder are wrong. But, aside from these extreme cases, moral thinking didn’t enter the picture, even when considering things like drunken driving, cheating in school or cheating on a partner.
  • The default position, which most of them came back to again and again, is that moral choices are just a matter of individual taste. “It’s personal,” the respondents typically said. “It’s up to the individual. Who am I to say?”
  • “I would do what I thought made me happy or how I felt. I have no other way of knowing what to do but how I internally feel.”
  • Many were quick to talk about their moral feelings but hesitant to link these feelings to any broader thinking about a shared moral framework or obligation. As one put it, “I mean, I guess what makes something right is how I feel about it. But different people feel different ways, so I couldn’t speak on behalf of anyone else as to what’s right and wrong.”
  • Smith and company found an atmosphere of extreme moral individualism — of relativism and nonjudgmentalism.
  • they have not been given the resources — by schools, institutions and families — to cultivate their moral intuitions, to think more broadly about moral obligations, to check behaviors that may be degrading.
  • the interviewees were so completely untroubled by rabid consumerism.
  • their attitudes at the start of their adult lives do reveal something about American culture. For decades, writers from different perspectives have been warning about the erosion of shared moral frameworks and the rise of an easygoing moral individualism. Allan Bloom and Gertrude Himmelfarb warned that sturdy virtues are being diluted into shallow values. Alasdair MacIntyre has written about emotivism, the idea that it’s impossible to secure moral agreement in our culture because all judgments are based on how we feel at the moment. Charles Taylor has argued that morals have become separated from moral sources. People are less likely to feel embedded on a moral landscape that transcends self. James Davison Hunter wrote a book called “The Death of Character.” Smith’s interviewees are living, breathing examples of the trends these writers have described.
  • In most times and in most places, the group was seen to be the essential moral unit. A shared religion defined rules and practices. Cultures structured people’s imaginations and imposed moral disciplines. But now more people are led to assume that the free-floating individual is the essential moral unit. Morality was once revealed, inherited and shared, but now it’s thought of as something that emerges in the privacy of your own heart.
  • Goodness, I went through a bit of emotion reading that. Whew. Gotta center. Anyhoo, I feel certainly conflicted over the author's idea of "shallow values." Personally, I don't necessarily see the need to have a shared moral framework to connect to. What is this framework if not a system to instill shame and obligation into its members? While I do think it's important to have an articulate moral opinion on relevant subjects, I also think the world cannot be divided into realms of right or wrong when we can barely see even an infinitely small part of it at one time. What's wrong with open-mindedness?
Javier E

On David Frum, The New York Times, and the Non-Faked 'Fake' Gaza Photos - The Atlantic - 0 views

  • Erik Wemple argues in a very tough critique of Frum's claims for the Washington Post that imbalanced, one-sided skepticism was the main problem with Frum's apology. He was willing to believe the worst about the motives and standards of the nation's leading news organization, while accepting at face value some Pallywood-style fantasies about all-fronts fakery.
  • For all their blind spots and flaws, reporters on the scene are trying to see, so they can tell, and the photographic and video reporters take greater risks than all the rest, since they must be closer to the action. For people on the other side of the world to casually assert that they're just making things up—this could and would drive them crazy. I'm sure that fakery has occurred. But the claim that it has is as serious as they come in journalism. It goes at our ultimate source of self-respect. As when saying that a doctor is deliberately mis-diagnosing patients, that a pilot is drunk in the cockpit, that a lifeguard is purposely letting people drown, you might be right, but you had better be very, very sure before making the claim.
kushnerha

What Drives Gun Sales: Terrorism, Obama and Calls for Restrictions - The New York Times - 0 views

  • On Sunday, President Obama called for making it harder to buy assault weapons after the terrorist attack in San Bernardino, Calif. On Monday, the stock prices of two top gun makers, Smith & Wesson and Ruger, soared.
  • “President Obama has actually been the best salesman for firearms,” said Brian W. Ruttenbur, an analyst with BB&T Capital Markets
  • Fear of gun-buying restrictions has been the main driver of spikes in gun sales, far surpassing the effects of mass shootings and terrorist attacks alone, according to federal background-check data
  • When a man shot and killed 26 people at Sandy Hook Elementary School in Newtown, Conn., gun sales did not set records until five days later, after President Obama called for banning assault rifles and high-capacity magazines.
  • “It would be like you’ve never owned a toaster, you don’t really want a toaster, but the federal government says they’re going to ban toasters,” Mr. Ruttenbur said. “So you go out and buy a toaster.”
  • Gun sales rose in New Jersey in 2013 after Gov. Chris Christie proposed measures that included expanding background checks and banning certain rifles. (Mr. Christie later vetoed one of the most stringent parts
  • Catch-22 for gun control proponents: Pushing for new restrictions can lead to an influx of new guns.
  • When Maryland approved one of the nation’s strictest gun-control measures in May 2013, gun sales jumped as buyers tried to beat the October deadline specified in the measure
  • after Hurricane Katrina, legally registered guns were confiscated from civilians. The confiscations outraged gun owners and prompted an increase in gun sales in the area. Conservatives responded by pushing for a federal law prohibiting the seizure of firearms from civilians during an emergency
  • Gun sales have more than doubled in a decade, to about 15 million in 2013 from about seven million in 2002. More firearms are sold to residents in the United States than in any other country
  • These estimates undercount total sales because they omit some purchases in states that do not require background checks for private sales. They also exclude permits that allow people in some states to buy multiple guns with a single background check.
  • The increase is mostly due to higher sales of handguns, which are typically bought for self-defense. Two of the fastest-growing segments of the market are women and gun owners with concealed carry permits.
  • When Missouri repealed a requirement that gun buyers obtain a permit to buy a handgun in 2007, estimated gun sales went up and stayed up, by roughly 9,000 additional guns per month. The influx shifted gun-trafficking patterns, reducing the number of guns used in crimes that had been brought in from neighboring states.
  • When the Supreme Court invalidated a ban on handguns in Washington, estimated handgun sales in the city went from near-zero to about 40 every month.
Javier E

Jordan Peterson Comes to Aspen - The Atlantic - 0 views

  • Peterson is traveling the English-speaking world in order to spread the message of this core conviction: that the way to fix what ails Western societies is a psychological project, targeted at helping individuals to get their lives in order, not a sociological project that seeks to improve society through politics, or popular culture, or by focusing on class, racial, or gender identity.
  • the Aspen Ideas Festival, which is co-sponsored by the Aspen Institute and The Atlantic, was an anomaly in this series of public appearances: a gathering largely populated by people—Democrats and centrist Republicans, corporate leaders, academics, millionaire philanthropists, journalists—invested in the contrary proposition, that the way to fix what ails society is a sociological project, one that effects change by focusing on politics, or changing popular culture, or spurring technological advances, or investing more in diversity and inclusiveness.
  • Many of its attendees, like many journalists, are most interested in Peterson as a political figure at the center of controversies
  • Peterson deserves a full, appropriately complex accounting of his best and worst arguments; I intend to give him one soon. For now, I can only tell you how the Peterson phenomenon manifested one night in Aspen
  • “For the first time in human history the spoken word has the same reach as the written word, and there are no barriers to entry. That’s a Gutenberg revolution,” he said. “That’s a big deal. This is a game changer. The podcast world is also a Gutenberg moment but it’s even more extensive. The problem with books is that you can’t do anything else while you’re reading. But if you’re listening to a podcast you can be driving a tractor or a long haul truck or doing the dishes. So podcasts free up two hours a day for people to engage in educational activity they otherwise wouldn’t be able to engage in. That’s one-eighth of people’s lives. You’re handing people a lot of time back to engage in high-level intellectual education.
  • that technological revolution has revealed something good that we didn’t know before: “The narrow bandwidth of TV has made us think that we are stupider than we are. And people have a real hunger for deep intellectual dialogue.”
  • I’ve known for years that the university underserved the community, because we assumed that university education is for 18- to 22-year-olds, which is a proposition that’s so absurd it is absolutely mind-boggling that anyone ever conceptualized it. Why wouldn’t you take university courses throughout your entire life? What, you stop searching for wisdom when you’re 22? I don’t think so. You don’t even start until you’re like in your mid 20s. So I knew universities were underserving the broader community a long time ago. But there wasn’t a mechanism whereby that could be rectified.
  • Universities are beyond forgiveness, he argued, because due to the growing ranks of administrators, there’s been a radical increase in tuition. “Unsuspecting students are given free access to student loans that will cripple them through their 30s and their 40s, and the universities are enticing them to extend their carefree adolescence for a four year period at the cost of mortgaging their future in a deal that does not allow for escape through bankruptcy,” he complained. “So it’s essentially a form of indentured servitude. There’s no excuse for that … That cripples the economy because the students become overlaid with debt that they’ll never pay off at the time when they should be at the peak of their ability to take entrepreneurial risks. That’s absolutely appalling.”
  • A critique I frequently hear from Peterson’s critics is that everything he says is either obvious or wrong. I think that critique fails insofar as I sometimes see some critics calling one of his statements obvious even as others insist it is obviously wrong.
  • a reliable difference among men and women cross-culturally is that men are more aggressive than women. Now what's the evidence for that? Here's one piece of evidence: There are 10 times as many men in prison. Now is that a sociocultural construct? It's like, no, it's not a sociocultural construct. Okay?
  • Here's another piece of data. Women try to commit suicide more than men by a lot, and that's because women are more prone to depression and anxiety than men are. And there are reasons for that, and that's cross-cultural as well. Now men are way more likely to actually commit suicide. Why? Because they're more aggressive so they use lethal means. So now the question is how much more aggressive are men than women? The answer is not very much. So the claim that men and women are more the same than different is actually true. This is where you have to know something about statistics to understand the way the world works, instead of just applying your a priori ideological presuppositions to things that are too complex to fit in that rubric.
  • So if you draw two people out of a crowd, one man and one woman, and you had to lay a bet on who was more aggressive, and you bet on the woman, you'd win 40 percent of the time. That's quite a lot. It isn't 50 percent of the time which would be no differences. But it’s a lot. There are lots of women who are more aggressive than lots of men. So the curves overlap a lot. There's way more similarity than difference. And this is along the dimension where there's the most difference. But here's the problem. You can take small differences at the average of a distribution. Then the distributions move off to the side. And then all the action is at the tail. So here's the situation. You don't care about how aggressive the average person is. It's not that relevant. What people care about is who is the most aggressive person out of 100, because that's the person you'd better watch out for.
  • Whenever I'm interviewed by journalists who have the scent of blood in their nose, let's say, they're very willing and able to characterize the situation I find myself in as political. But that's because they can't see the world in any other manner. The political is a tiny fraction of the world. And what I'm doing isn't political. It's psychological or theological. The political element is peripheral. And if people come to the live lectures, let's say, that's absolutely self-evident
  • In a New York Times article titled, “Jordan Peterson, Custodian of the Patriarchy,” the writer Nellie Bowles quoted her subject as follows:
  • Violent attacks are what happens when men do not have partners, Mr. Peterson says, and society needs to work to make sure those men are married. “He was angry at God because women were rejecting him,” Mr. Peterson says of the Toronto killer. “The cure for that is enforced monogamy. That’s actually why monogamy emerges.” Mr. Peterson does not pause when he says this. Enforced monogamy is, to him, simply a rational solution. Otherwise women will all only go for the most high-status men, he explains, and that couldn’t make either gender happy in the end.
  • Ever since, some Peterson critics have claimed that Peterson wants to force women to have sex with male incels, or something similarly dystopian.
  • ...it's an anthropological truism generated primarily through scholars on the left, just so everybody is clear about it, that societies that use monogamy as a social norm, which by the way is virtually every human society that ever existed, do that in an attempt to control the aggression that goes along with polygamy. It's like ‘Oh my God, how contentious can you get.’ Well, how many of you are in monogamous relationships? A majority. How is that enforced?...
  • If everyone you talk to is boring it’s not them! And so if you're rejected by the opposite sex, if you’re heterosexual, then you're wrong, they're not wrong, and you've got some work to do, man. You've got some difficult work to do. And there isn't anything I've been telling young men that's clearer than that … What I've been telling people is take the responsibility for failure onto yourself. That's a hint that you've got work to do. It could also be a hint that you're young and useless and why the hell would anybody have anything to do with you because you don't have anything to offer. And that's rectifiable. Maturity helps to rectify that.
  • And what's the gender? Men. Because if you go two standard deviations out from the mean on two curves that overlap but are disjointed, then you derive an overwhelming preponderance of the overrepresented group. That's why men are about 10 times more likely to be in prison.  
  • Weiss: You are often characterized, at least in the mainstream press, as being transphobic. If you had a student come to you and say, I was born female, I now identify as male, I want you to call me by male pronouns. Would you say yes to that?
  • Peterson: Well, it would depend on the student and the context and why I thought they were asking me and what I believe their demand actually characterized, and all of that. Because that can be done in a way that is genuine and acceptable, and a way that is manipulative and unacceptable. And if it was genuine and acceptable then I would have no problem with it. And if it was manipulative and unacceptable then not a chance. And you might think, ‘Well, who am I to judge?’ Well, first of all, I am a clinical psychologist, I've talked to people for about 25,000 hours. And I'm responsible for judging how I am going to use my words. I'd judge the same way I judge all my interactions with people, which is to the best of my ability, and characterized by all the errors that I'm prone to. I'm not saying that my judgment would be unerring. I live with the consequences and I'm willing to accept the responsibility.
  • But also to be clear about this, it never happened––I never refused to call anyone by anything they had asked me to call them by, although that's been reported multiple times. It's a complete falsehood. And it had nothing to do with the transgender issue as far as I'm concerned.
  • type one and type two error problem
  • note what his avowed position is: that he has never refused to call a transgender person by their preferred pronoun, that he has in fact honored such requests many times, that he would always try to err on the side of believing a request to be earnest, and that he reserves the right to decline a request he believes to be in bad faith. Whether one finds that to be reasonable or needlessly difficult, it seems irresponsible to tell trans people that a prominent intellectual hates them or is deeply antagonistic to them when the only seeming conflict is utterly hypothetical and ostensibly not even directed against people that Peterson believes to be trans, but only against people whom he does not believe to be trans
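The statistical argument in the aggression passages above — a small mean difference, heavily overlapping curves, and much larger ratios in the far tail — can be checked with Python's standard library. The effect size below is back-derived from the "40 percent" bet in the transcript, and the assumption of normal distributions with equal variance is mine, so treat the numbers as illustrative:

```python
from statistics import NormalDist

std = NormalDist()  # standard normal; both groups assumed sd = 1

# Effect size d (mean difference in sd units) implied by "bet on the
# woman and win 40% of the time":
#   P(woman > man) = Phi(-d / sqrt(2)) = 0.40  =>  d = -sqrt(2) * Phi^-1(0.40)
d = -(2 ** 0.5) * std.inv_cdf(0.40)
print(f"implied effect size d = {d:.3f}")

# Probability a randomly drawn woman is the more aggressive of the pair:
# the difference W - M is normal with mean -d and sd sqrt(2).
p_woman_wins = NormalDist(0, 2 ** 0.5).cdf(-d)
print(f"P(woman more aggressive) = {p_woman_wins:.2f}")  # 0.40 by construction

# Ratio of men to women beyond a far-tail cutoff, 2 sd above the female mean.
cutoff = 2.0
men_tail = 1 - std.cdf(cutoff - d)   # men's distribution sits d sd higher
women_tail = 1 - std.cdf(cutoff)
print(f"men per woman beyond {cutoff} sd: {men_tail / women_tail:.1f}")
```

With this modest effect size, men outnumber women only about 2-to-1 beyond two standard deviations; a larger effect size or a more extreme cutoff (the "most aggressive person out of 100") pushes the ratio far higher, which is the tail-dominance point being made in the transcript.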
Javier E

Opinion | What Do We Actually Know About the Economy? (Wonkish) - The New York Times - 0 views

  • Among economists more generally, a lot of the criticism seems to amount to the view that macroeconomics is bunk, and that we should stick to microeconomics, which is the real, solid stuff. As I’ll explain in a moment, that’s all wrong
  • in an important sense the past decade has been a huge validation for textbook macroeconomics; meanwhile, the exaltation of micro as the only “real” economics both gives microeconomics too much credit and is largely responsible for the ways macroeconomic theory has gone wrong.
  • Finally, many outsiders and some insiders have concluded from the crisis that economic theory in general is bunk, that we should take guidance from people immersed in the real world – say, business leaders — and/or concentrate on empirical results and skip the models
  • And while empirical evidence is important and we need more of it, the data almost never speak for themselves – a point amply illustrated by recent monetary events.
  • Schwinger, as I remember the story, was never seen to use a Feynman diagram. But he had a locked room in his house, and the rumor was that that room was where he kept the Feynman diagrams he used in secret.
  • What’s the equivalent of Feynman diagrams? Something like IS-LM, which is the simplest model you can write down of how interest rates and output are jointly determined, and is how most practicing macroeconomists actually think about short-run economic fluctuations. It’s also how they talk about macroeconomics to each other. But it’s not what they put in their papers, because the journals demand that your model have “microfoundations.”
  • The Bernanke Fed massively expanded the monetary base, by a factor of almost five. There were dire warnings that this would cause inflation and “debase the dollar.” But prices went nowhere, and not much happened to broader monetary aggregates (a result that, weirdly, some economists seemed to find deeply puzzling even though it was exactly what should have been expected.)
  • What about fiscal policy? Traditional macro said that at the zero lower bound there would be no crowding out – that deficits wouldn’t drive up interest rates, and that fiscal multipliers would be larger than under normal conditions. The first of these predictions was obviously borne out, as rates stayed low even when deficits were very large. The second prediction is a bit harder to test, for reasons I’ll get into when I talk about the limits of empiricism. But the evidence does indeed suggest large positive multipliers.
  • The overall story, then, is one of overwhelming predictive success. Basic, old-fashioned macroeconomics didn’t fail in the crisis – it worked extremely well
  • In fact, it’s hard to think of any other example of economic models working this well – making predictions that most non-economists (and some economists) refused to believe, indeed found implausible, but which came true. Where, for example, can you find any comparable successes in microeconomics?
  • Meanwhile, the demand that macro become ever more rigorous in the narrow, misguided sense that it look like micro led to useful approaches being locked up in Schwinger’s back room, and in all too many cases forgotten. When the crisis struck, it was amazing how many successful academics turned out not to know things every economist would have known in 1970, and indeed resurrected 1930-vintage fallacies in the belief that they were profound insights.
  • mainly I think it reflected the general unwillingness of human beings (a category that includes many though not necessarily all economists) to believe that so many people can be so wrong about something so big.
  • To normal human beings the study of international trade and that of international macroeconomics might sound like pretty much the same thing. In reality, however, the two fields used very different models, had very different intellectual cultures, and tended to look down on each other. Trade people tended to consider international macro people semi-charlatans, doing ad hoc stuff devoid of rigor. International macro people considered trade people boring, obsessed with proving theorems and offering little of real-world use.
  • does microeconomics really deserve its reputation of moral and intellectual superiority? No
  • Even before the rise of behavioral economics, any halfway self-aware economist realized that utility maximization – indeed, the very concept of utility — wasn’t a fact about the world; it was more of a thought experiment, whose conclusions should always have been stated in the subjunctive.
  • But, you say, we didn’t see the Great Recession coming. Well, what do you mean “we,” white man? OK, what’s true is that few economists realized that there was a huge housing bubble
  • True, a model doesn’t have to be perfect to provide hugely important insights. But here’s my question: where are the examples of microeconomic theory providing strong, counterintuitive, successful predictions on the same order as the success of IS-LM macroeconomics after 2008? Maybe there are some, but I can’t come up with any.
  • The point is not that micro theory is useless and we should stop doing it. But it doesn’t deserve to be seen as superior to macro modeling.
  • And the effort to make macro more and more like micro – to ground everything in rational behavior – has to be seen now as destructive. True, that effort did lead to some strong predictions: e.g., only unanticipated money should affect real output, transitory income changes shouldn’t affect consumer spending, government spending should crowd out private demand, etc. But all of those predictions have turned out to be wrong.
  • Kahneman and Tversky and Thaler and so on deserved all the honors they received for helping to document the specific ways in which utility maximization falls short, but even before their work we should never have expected perfect maximization to be a good description of reality.
  • But data never speak for themselves, for a couple of reasons. One, which is familiar, is that economists don’t get to do many experiments, and natural experiments are rare
  • The other problem is that even when we do get something like natural experiments, they often took place under economic regimes that aren’t relevant to current problems.
  • Both of these problems were extremely relevant in the years following the 2008 crisis.
  • you might be tempted to conclude that the empirical evidence is that monetary expansion is inflationary, indeed roughly one-for-one.
  • But the question, as the Fed embarked on quantitative easing, was what effect this would have on an economy at the zero lower bound. And while there were many historical examples of big monetary expansion, examples at the ZLB were much rarer – in fact, basically two: the U.S. in the 1930s and Japan in the early 2000s.
  • These examples told a very different story: that expansion would not, in fact, be inflationary – which is exactly how it worked out.
  • The point is that empirical evidence can only do certain things. It can certainly prove that your theory is wrong! And it can also make a theory much more persuasive in those cases where the theory makes surprising predictions, which the data bear out. But the data can never absolve you from the necessity of having theories.
  • Over this past decade, I’ve watched a number of economists try to argue from authority: I am a famous professor, therefore you should believe what I say. This never ends well. I’ve also seen a lot of nihilism: economists don’t know anything, and we should tear the field down and start over.
  • Obviously I differ with both views. Economists haven’t earned the right to be snooty and superior, especially if their reputation comes from the ability to do hard math: hard math has been remarkably little help lately, if ever.
  • On the other hand, economists do turn out to know quite a lot: they do have some extremely useful models, usually pretty simple ones, that have stood up well in the face of evidence and events. And they definitely shouldn’t defer to important and/or rich people on policy: compare Janet Yellen’s macroeconomic track record with that of the multiple billionaires who warned that Bernanke would debase the dollar. Or take my favorite Business Week headline from 2010: “Krugman or [John] Paulson: Who You Gonna Bet On?” Um. The important thing is to be aware of what we do know, and why.
Javier E

What Cookies and Meth Have in Common - The New York Times - 0 views

  • Why would anyone continue to use recreational drugs despite the medical consequences and social condemnation? What makes someone eat more and more in the face of poor health?
  • modern humans have designed the perfect environment to create both of these addictions.
  • Drug exposure also contributes to a loss of self-control. Dr. Volkow found that low D2 was linked with lower activity in the prefrontal cortex, which would impair one’s ability to think critically and exercise restraint
  • Now we have a body of research that makes the connection between stress and addiction definitive. More surprising, it shows that we can change the path to addiction by changing our environment.
  • Neuroscientists have found that food and recreational drugs have a common target in the “reward circuit” of the brain, and that the brains of humans and other animals who are stressed undergo biological changes that can make them more susceptible to addiction.
  • In a 2010 study, Diana Martinez and colleagues at Columbia scanned the brains of a group of healthy controls and found that lower social status and a lower degree of perceived social support — both presumed to be proxies for stress — were correlated with fewer dopamine receptors, called D2s, in the brain’s reward circuit
  • The reward circuit evolved to help us survive by driving us to locate food or sex in our environment
  • Today, the more D2 receptors you have, the higher your natural level of stimulation and pleasure — and the less likely you are to seek out recreational drugs or comfort food to compensate
  • people addicted to cocaine, heroin, alcohol and methamphetamines experience a significant reduction in their D2 receptor levels that persists long after drug use has stopped. These people are far less sensitive to rewards, are less motivated and may find the world dull, once again making them prone to seek a chemical means to enhance their everyday life.
  • the myth has persisted that addiction is either a moral failure or a hard-wired behavior — that addicts are either completely in command or literally out of their minds
  • The processed food industry has transformed our food into a quasi-drug, while the drug industry has synthesized ever more powerful drugs that have been diverted for recreational use.
  • At this point you may be wondering: What controls the reward circuit in the first place? Some of it is genetic. We know that certain gene variations elevate the risk of addiction to various drugs. But studies of monkeys suggest that our environment can trump genetics and rewire the brain.
  • simply by changing the environment, you can increase or decrease the likelihood of an animal becoming a drug addict.
  • The same appears true for humans. Even people who are not hard-wired for addiction can be made dependent on drugs if they are stressed
  • Is it any wonder, then, that the economically frightening situation that so many Americans experience could make them into addicts? You will literally have a different brain depending on your ZIP code, social circumstances and stress level.
  • In 1990, no state in our country had an adult obesity rate above 15 percent; by 2015, 44 states had obesity rates of 25 percent or higher. What changed?
  • What happened is that cheap, calorie-dense foods that are highly rewarding to your brain are now ubiquitous.
  • Nothing in our evolution has prepared us for the double whammy of caloric modern food and potent recreational drugs. Their power to activate our reward circuit, rewire our brain and nudge us in the direction of compulsive consumption is unprecedented.
  • Food, like drugs, stimulates the brain’s reward circuit. Chronic exposure to high-fat and sugary foods is similarly linked with lower D2 levels, and people with lower D2 levels are also more likely to crave such foods. It’s a vicious cycle in which more exposure begets more craving.
  • Fortunately, our brains are remarkably plastic and sensitive to experience. Although it’s far easier said than done, just limiting exposure to high-calorie foods and recreational drugs would naturally reset our brains to find pleasure in healthier foods and life without drugs.
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

The Navy's USS Gabrielle Giffords and the Future of Work - The Atlantic - 0 views

  • Minimal manning—and with it, the replacement of specialized workers with problem-solving generalists—isn’t a particularly nautical concept. Indeed, it will sound familiar to anyone in an organization who’s been asked to “do more with less”—which, these days, seems to be just about everyone.
  • Ten years from now, the Deloitte consultant Erica Volini projects, 70 to 90 percent of workers will be in so-called hybrid jobs or superjobs—that is, positions combining tasks once performed by people in two or more traditional roles.
  • If you ask Laszlo Bock, Google’s former culture chief and now the head of the HR start-up Humu, what he looks for in a new hire, he’ll tell you “mental agility.”
  • “What companies are looking for,” says Mary Jo King, the president of the National Résumé Writers’ Association, “is someone who can be all, do all, and pivot on a dime to solve any problem.”
  • The phenomenon is sped by automation, which usurps routine tasks, leaving employees to handle the nonroutine and unanticipated—and the continued advance of which throws the skills employers value into flux
  • Or, for that matter, on the relevance of the question “What do you want to be when you grow up?”
  • By 2020, a 2016 World Economic Forum report predicted, “more than one-third of the desired core skill sets of most occupations” will not have been seen as crucial to the job when the report was published
  • I asked John Sullivan, a prominent Silicon Valley talent adviser, why should anyone take the time to master anything at all? “You shouldn’t!” he replied.
  • Minimal manning—and the evolution of the economy more generally—requires a different kind of worker, with not only different acquired skills but different inherent abilities
  • It has implications for the nature and utility of a college education, for the path of careers, for inequality and employability—even for the generational divide.
  • Then, in 2001, Donald Rumsfeld arrived at the Pentagon. The new secretary of defense carried with him a briefcase full of ideas from the corporate world: downsizing, reengineering, “transformational” technologies. Almost immediately, what had been an experimental concept became an article of faith
  • But once cadets got into actual command environments, which tend to be fluid and full of surprises, a different picture emerged. “Psychological hardiness”—a construct that includes, among other things, a willingness to explore “multiple possible response alternatives,” a tendency to “see all experience as interesting and meaningful,” and a strong sense of self-confidence—was a better predictor of leadership ability in officers after three years in the field.
  • Because there really is no such thing as multitasking—just a rapid switching of attention—I began to feel overstrained, put upon, and finally irked by the impossible set of concurrent demands. Shouldn’t someone be giving me a hand here? This, Hambrick explained, meant I was hitting the limits of working memory—basically, raw processing power—which is an important aspect of “fluid intelligence” and peaks in your early 20s. This is distinct from “crystallized intelligence”—the accumulated facts and know-how on your hard drive—which peaks in your 50s.
  • Others noticed the change but continued to devote equal attention to all four tasks. Their scores fell. This group, Hambrick found, was high in “conscientiousness”—a trait that’s normally an overwhelming predictor of positive job performance. We like conscientious people because they can be trusted to show up early, double-check the math, fill the gap in the presentation, and return your car gassed up even though the tank was nowhere near empty to begin with. What struck Hambrick as counterintuitive and interesting was that conscientiousness here seemed to correlate with poor performance.
  • he discovered another correlation in his test: The people who did best tended to score high on “openness to new experience”—a personality trait that is normally not a major job-performance predictor and that, in certain contexts, roughly translates to “distractibility.”
  • To borrow the management expert Peter Drucker’s formulation, people with this trait are less focused on doing things right, and more likely to wonder whether they’re doing the right things.
  • High in fluid intelligence, low in experience, not terribly conscientious, open to potential distraction—this is not the classic profile of a winning job candidate. But what if it is the profile of the winning job candidate of the future?
  • One concerns “grit”—a mind-set, much vaunted these days in educational and professional circles, that allows people to commit tenaciously to doing one thing well
  • These ideas are inherently appealing; they suggest that dedication can be more important than raw talent, that the dogged and conscientious will be rewarded in the end.
  • he studied West Point students and graduates.
  • Traditional measures such as SAT scores and high-school class rank “predicted leader performance in the stable, highly regulated environment of West Point” itself.
  • It would be supremely ironic if the advance of the knowledge economy had the effect of devaluing knowledge. But that’s what I heard, recurrently.
  • “Fluid, learning-intensive environments are going to require different traits than classical business environments,” I was told by Frida Polli, a co-founder of an AI-powered hiring platform called Pymetrics. “And they’re going to be things like ability to learn quickly from mistakes, use of trial and error, and comfort with ambiguity.”
  • “We’re starting to see a big shift,” says Guy Halfteck, a people-analytics expert. “Employers are looking less at what you know and more and more at your hidden potential” to learn new things
  • advice to employers? Stop hiring people based on their work experience. Because in these environments, expertise can become an obstacle.
  • “The Curse of Expertise.” The more we invest in building and embellishing a system of knowledge, they found, the more averse we become to unbuilding it.
  • All too often experts, like the mechanic in LePine’s garage, fail to inspect their knowledge structure for signs of decay. “It just didn’t occur to him,” LePine said, “that he was repeating the same mistake over and over.”
  • The devaluation of expertise opens up ample room for different sorts of mistakes—and sometimes creates a kind of helplessness.
  • Aboard littoral combat ships, the crew lacks the expertise to carry out some important tasks, and instead has to rely on civilian help
  • Meanwhile, the modular “plug and fight” configuration was not panning out as hoped. Converting a ship from sub-hunter to minesweeper or minesweeper to surface combatant, it turned out, was a logistical nightmare
  • So in 2016 the concept of interchangeability was scuttled for a “one ship, one mission” approach, in which the extra 20-plus sailors became permanent crew members
  • “As equipment breaks, [sailors] are required to fix it without any training,” a Defense Department Test and Evaluation employee told Congress. “Those are not my words. Those are the words of the sailors who were doing the best they could to try to accomplish the missions we gave them in testing.”
  • These results were, perhaps, predictable given the Navy’s initial, full-throttle approach to minimal manning—and are an object lesson on the dangers of embracing any radical concept without thinking hard enough about the downsides
  • a world in which mental agility and raw cognitive speed eclipse hard-won expertise is a world of greater exclusion: of older workers, slower learners, and the less socially adept.
  • if you keep going down this road, you end up with one really expensive ship with just a few people on it who are geniuses … That’s not a future we want to see, because you need a large enough crew to conduct multiple tasks in combat.
  • What does all this mean for those of us in the workforce, and those of us planning to enter it? It would be wrong to say that the 10,000-hours-of-deliberate-practice idea doesn’t hold up at all. In some situations, it clearly does.
  • A spinal surgery will not be performed by a brilliant dermatologist. A criminal-defense team will not be headed by a tax attorney. And in tech, the demand for specialized skills will continue to reward expertise handsomely.
  • But in many fields, the path to success isn’t so clear. The rules keep changing, which means that highly focused practice has a much lower return
  • In uncertain environments, Hambrick told me, “specialization is no longer the coin of the realm.”
  • It leaves us with lifelong learning.
  • I found myself the target of career suggestions. “You need to be a video guy, an audio guy!” the Silicon Valley talent adviser John Sullivan told me, alluding to the demise of print media
  • I found the prospect of starting over just plain exhausting. Building a professional identity takes a lot of resources—money, time, energy. After it’s built, we expect to reap gains from our investment, and—let’s be honest—even do a bit of coasting. Are we equipped to continually return to apprentice mode? Will this burn us out?
  • Everybody I met on the Giffords seemed to share that mentality. They regarded every minute on board—even during a routine transit back to port in San Diego Harbor—as a chance to learn something new.
sissij

I know they've seen my message - so why haven't they replied? | Culture | The Guardian - 0 views

  • Ah, the tyranny of read receipts – enough to put you off digital communication for good.
  • It sounds straightforward enough, even perfunctory, and indeed it is if it’s only a blip in the back-and-forth. But when a message lingers on “seen” without explanation for anything beyond a few minutes, you’ve been “left on read”. It’s enough to make even the most self-assured individuals question their worth.
  • It works both ways, too: if you’ve read a message that you’re either unable or unwilling to respond to immediately, the countdown has already started.
  • You never picture them driving, or in the bath, or with relatives who don’t believe in phones at the table. In the moment, the likelihood of their secretly resenting you, or agonising over a reply that is certain to disappoint, seems far greater than it actually is.
  • The anxiety of being left on read is silly but it is real, and unique to this time. There is no analog equivalent.
  • but in that case I’d like to think you’d give them the benefit of the doubt, and assume they’d fallen over in the shower or something.
  • There’s no such goodwill in web 2.0, when everyone is assumed to be available at all times. And if not – it’s personal.
  • well, is it any wonder anxiety is so rife among Generation Y?
  • People will go to some lengths to avoid being seen to have “seen” a message – on Snapchat and Facebook, downloading the message then turning on flight mode and opening it can stop it from registering as opened.
  • Turning on “previews” that display on the lock screen will, in many cases, show enough to get the gist of a message (“I think we should break ... ”) without opening it.
  • But while some people contort themselves to avoid being seen to have “seen”, others manipulate that anxiety to their own ends.
  • But maybe read receipts and the games people play with them have just ruined my ability to trust.
  • When we’re used to good things happening instantly, time taken to craft a thoughtful reply is considered a bad thing.
  • I totally agree with the author that read receipts should be optional. I personally have some issues with read receipts because I don’t like to reply instantly unless it is an urgent message. I like to take some time to think about what I want to comment on or write back. Although society now likes to be fast and instant, I am still a slow person. I feel read receipts force me and pressure me to be fast and instant.
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
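The excerpt above describes fine-grained gameplay telemetry: hesitation before each action, the sequence of actions, and so on. A minimal sketch of that kind of logging might look like the following; the event names, fields, and summary statistics are invented for illustration and are not Knack's actual format.

```python
import time

class TelemetryLogger:
    """Hypothetical sketch of gameplay telemetry for a game like
    Dungeon Scrawl: records each action and how long the player
    hesitated before taking it."""
    def __init__(self):
        self.events = []
        self._last = time.monotonic()

    def log(self, action):
        now = time.monotonic()
        # The hesitation before each action is itself a data point.
        self.events.append({"action": action,
                            "hesitation_s": now - self._last})
        self._last = now

    def summary(self):
        gaps = [e["hesitation_s"] for e in self.events]
        return {"actions": len(self.events),
                "mean_hesitation_s": sum(gaps) / len(gaps) if gaps else 0.0}

log = TelemetryLogger()
for a in ["open_door", "solve_puzzle", "backtrack"]:
    log.log(a)
print(log.summary()["actions"])  # 3
```

Twenty minutes of play generating "several megabytes of data" is plausible at this granularity: dozens of fields per event, many events per second.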
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • Lauren Rivera, a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention
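The red/yellow/green rating described above can be sketched as a simple thresholded score. The features, weights, and cutoffs below are invented for illustration; the real Evolv/Xerox model is proprietary, and the only detail taken from the excerpt is the one-to-four social-networks signal.

```python
def rate_candidate(personality_score, cognitive_score, social_networks):
    """Toy red/yellow/green screen in the spirit of the Xerox
    evaluation described above. All thresholds are assumptions."""
    score = 0
    score += personality_score   # e.g. 0-50 from a personality test
    score += cognitive_score     # e.g. 0-50 from a cognitive-skill assessment
    # The excerpt notes that 1-4 social networks correlated with success.
    if 1 <= social_networks <= 4:
        score += 10
    if score >= 80:
        return "green"   # hire away
    if score >= 50:
        return "yellow"  # middling
    return "red"         # poor candidate

print(rate_candidate(40, 35, 2))  # green
```

The interesting property is that the model is opaque to the hiring manager in exactly the way the article describes: the manager sees only the color, not the weighting.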
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
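"Slice and dice by industry and location" is, mechanically, a group-by over outcome records. A toy version, with invented field names and data, might look like this:

```python
from collections import defaultdict
from statistics import mean

# Illustrative per-employee outcome records; in Evolv's case the
# data set reportedly covered 347,000 actual employees.
records = [
    {"industry": "retail", "location": "US", "tenure_months": 14},
    {"industry": "retail", "location": "US", "tenure_months": 9},
    {"industry": "call_center", "location": "US", "tenure_months": 6},
]

# Group outcomes by (industry, location) segment.
segments = defaultdict(list)
for r in records:
    segments[(r["industry"], r["location"])].append(r["tenure_months"])

for key, vals in sorted(segments.items()):
    print(key, round(mean(vals), 1))
```

With a complete data set of every applicant and every hire, the same aggregation can be run per attribute (decisiveness, spatial orientation, persuasiveness) to see which ones predict tenure in which segment.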
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
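Pentland's finding that "too many is as much of a problem as too few" describes an inverted-U relationship between face-to-face exchanges and team performance. One simple functional form for that shape is a Gaussian; the choice of curve and the parameter values below are assumptions for illustration, not Pentland's actual model.

```python
import math

def predicted_performance(exchanges, optimum=20.0, width=10.0):
    """Sketch of an inverted-U: performance peaks at a moderate
    number of face-to-face exchanges and falls off on both sides.
    'optimum' and 'width' are invented parameters."""
    return math.exp(-((exchanges - optimum) ** 2) / (2 * width ** 2))

# Performance at the optimum beats both too few and too many exchanges.
print(predicted_performance(20) > predicted_performance(5))   # True
print(predicted_performance(20) > predicted_performance(40))  # True
```

If a third of team performance is predictable from exchange counts alone, fitting even a crude curve like this to badge data would already be a usable management signal.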
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
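The excerpts above describe Gild combining several signal families: open-source code quality, adoption by other programmers, Q&A-forum reputation, and language use on social networks. A minimal sketch of such a blend follows; the weights, the normalization, and the linear form are all invented, since Gild's actual algorithms are not public.

```python
def coder_score(code_quality, adoption, qa_reputation, language_signal):
    """Illustrative weighted blend of the signal families described
    above. Each input is assumed to be pre-normalized to [0, 1];
    the weights are arbitrary assumptions."""
    weights = {"code": 0.4, "adoption": 0.25, "qa": 0.2, "language": 0.15}
    inputs = {"code": code_quality, "adoption": adoption,
              "qa": qa_reputation, "language": language_signal}
    return sum(weights[k] * inputs[k] for k in weights)

print(round(coder_score(0.9, 0.5, 0.8, 0.6), 3))  # 0.735
```

The key move the article describes is that once such a score is correlated against known-good programmers, the non-code signals (language use, forum behavior) can stand in for code quality when no open-source code exists.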
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
maxwellokolo

Here's what's driving North Korea's nuclear program - and it might be more than self-de... - 0 views

  •  
    In North Korea, missiles and nuclear bombs are more than a means of national defense - they are, for broad segments of the public, objects of near-religious devotion. In Pyongyang, the country's capital, missiles feature constantly in newspapers and on television.
Javier E

The Coming Software Apocalypse - The Atlantic - 1 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
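The Intrado failure described above can be sketched in a few lines: the code does exactly what it was told, but the threshold it was told to enforce was the wrong thing, and the branch that hits it performs no automated corrective action. The cap value and structure below are illustrative, not Intrado's actual code.

```python
class CallRouter:
    """Minimal sketch of the 911-outage failure mode: a running
    counter with an arbitrary cap, and no failover logic on the
    branch that rejects calls."""
    def __init__(self, max_calls=1_000_000):
        self.counter = 0
        self.max_calls = max_calls  # an arbitrary threshold

    def route(self, call):
        if self.counter >= self.max_calls:
            # No automated corrective action here: switching to the
            # backup router was never wired into this branch.
            return "rejected"
        self.counter += 1
        return "routed"

r = CallRouter(max_calls=2)
print([r.route(c) for c in ("a", "b", "c")])
# ['routed', 'routed', 'rejected']
```

Note that nothing here "breaks" in the mechanical sense: every line executes perfectly, which is exactly why rivet-style reliability engineering misses this class of failure.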
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • What made programming so difficult was that it required you to think like a computer.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated top of his class at the California Institute of Technology for electrical engineering,
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • Software experts spent 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it
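A single bit flip, a one becoming a zero or vice versa, is easy to demonstrate. The toy state encoding below is invented for illustration; it is emphatically not Toyota's actual software, only a sketch of the failure class Barr's team described.

```python
def flip_bit(value, bit):
    """Flip one bit of an integer, as a memory fault might."""
    return value ^ (1 << bit)

# Hypothetical two-bit state encoding: one corrupted bit is enough
# to jump the system from an idle state into an unintended one.
THROTTLE_IDLE, THROTTLE_OPEN = 0b00, 0b01

state = THROTTLE_IDLE
corrupted = flip_bit(state, 0)
print(corrupted == THROTTLE_OPEN)  # True
```

This is why counting "more than 10 million ways" to reach unintended acceleration is plausible: in a large, tangled state space, many distinct corruptions land in states the fail-safe code never anticipated.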
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
  • In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop.
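The elevator rules described above translate almost line for line into code. A minimal Python sketch of that state machine follows; the state names, event names, and transition table are invented for illustration and are not the output of any real model-based design tool:

```python
# Illustrative sketch of the elevator state machine described above.
# States, events, and transitions are assumptions for illustration,
# not generated by an actual model-based design tool.

TRANSITIONS = {
    "door_open": {"close_door": "door_closed"},
    "door_closed": {"open_door": "door_open", "start": "moving"},
    "moving": {"stop": "door_closed"},
}

def step(state, event):
    """Apply an event to a state; refuse anything the rules forbid."""
    allowed = TRANSITIONS[state]
    if event not in allowed:
        raise ValueError(f"cannot {event!r} while in state {state!r}")
    return allowed[event]

# The rules are visible at a glance: the only way to get the
# elevator moving is to close the door first.
state = "door_open"
state = step(state, "close_door")  # -> "door_closed"
state = step(state, "start")       # -> "moving"
```

The transition table plays the role of the whiteboard diagram: every legal path through the system is listed in one place, so a reader can check the rules without tracing control flow.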
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • The practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months was spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy
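The exhaustive checking that TLA+ performs can be sketched without its syntax. Below is a toy model checker in Python: it enumerates every reachable state of a tiny specification and tests an invariant in each one. The spec itself (two processes incrementing a shared counter up to a bound) is an invented example, not TLA+ code or anything from Lamport's work:

```python
from collections import deque

# Toy illustration of exhaustive model checking in the spirit of
# TLA+: enumerate every reachable state of a small spec and check
# an invariant in each. The spec (two processes incrementing a
# shared counter up to a bound) is an invented example.

BOUND = 3

def next_states(state):
    """All states reachable in one step: either process increments."""
    a, b = state
    if a + b < BOUND:
        return [(a + 1, b), (a, b + 1)]
    return []

def invariant(state):
    """The property every reachable state must satisfy."""
    return sum(state) <= BOUND

def check(initial):
    """Breadth-first exploration of the complete state space."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return False, state  # counterexample found
        for nxt in next_states(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True, None  # invariant holds in every reachable state

ok, counterexample = check((0, 0))
```

Unlike testing, which samples a few runs, this explores every reachable state, which is why a checked specification can catch the “rare” combinations of events that human intuition misses; real tools like TLA+'s TLC checker do this at far larger scale.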
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
anonymous

Opinion | The Kind Workers at King Soopers Who Helped a Confused College Kid - The New ... - 0 views

  • The Kind Workers at King Soopers Who Helped a Confused College Kid
  • A University of Colorado Boulder graduate recalls two victims of the shooting who helped a grocery shopping novice.
  • To the Editor: In the fall of 2016 I had just moved into my first apartment and was about to start my sophomore year at the University of Colorado Boulder.
  • Eager to finally cook food of my own, I decided to pick up groceries at the King Soopers on Table Mesa Drive.
  • I spent most of my first solo supermarket trip asking the store attendants where various items were.
  • Right away I was struck by how welcoming and friendly everyone working there was.
  • Teri Leiker, killed in Monday’s massacre, was the employee who packed groceries into my large duffle bag. “Find everything all right?” she asked with a grin. “You could pack this thing three times over,” she added, when zipping up my bag.
  • My interaction with Teri was representative of the warm vibe and casual relationship I developed with many of the store employees.
  • Rikki Olds, also murdered in the shooting, came to the rescue several times when my self-checkout machine froze, and assured me it was fine when I realized I forgot things on my shopping list halfway through scanning items.
  • Shootings have become an all-too-common occurrence in America.
  • serving as a stark reminder that at any moment Boulder could be the next target of domestic terrorism.
  • As much as a mass shooting remained a lingering possibility, never did anyone expect something of Monday’s magnitude to occur in a peaceful, quirky, happy-go-lucky college town that’s proudly advertised itself as one of the best places to live in the country.
  • I was hit with a sense of guilt knowing I had never fully expressed my gratitude for Teri or Rikki’s hard work and compassion.
  • Never did I take a second to stop and think about the role they and their co-workers played in making me comfortable in a new place and transitioning to adulthood.
  • Going forward, we owe it to ourselves to have an enhanced appreciation of life and treat one another more kindly
  • In honor of the 10 people caught in the middle of another senseless shooting, let’s all strive to be more appreciative.
  • Jack Stern
anonymous

VHS Tapes Are Worth Money - The New York Times - 0 views

  • Who Is Still Buying VHS Tapes?
  • Despite the rise of streaming, there is still a vast library of moving images that are categorically unavailable anywhere else. Also a big nostalgia factor.
  • The last VCR, according to Dave Rodriguez, 33, a digital repository librarian at Florida State University in Tallahassee, Fla., was produced in 2016
  • But the VHS tape itself may be immortal.
  • Today, a robust marketplace exists, both virtually and in real life, for this ephemera.
  • “Hold steady. Price seems fair. It is a Classic.”
  • Driving the passionate collection of this form of media is the belief that VHS offers something that other types of media cannot.
  • “The general perception that people can essentially order whatever movie they want from home is flat-out wrong,”
  • “promised as a giant video store on the internet, where a customer was only one click away from the exact film they were looking for.”
  • “Anything that you can think of is on VHS tape, because, you’ve got to think, it was a revolutionary piece of the media,”
  • “It was a way for everyone to capture something and then put it out there.”
  • preservation
  • “just so much culture packed into VHS,”
  • a movie studio, an independent filmmaker, a parent shooting their kid’s first steps, etc.
  • finds the medium inspirational
  • “some weird, obscure movie on VHS I would have seen at my friend’s house, late at night, after his parents were asleep.
  • “The quality feels raw but warm and full of flavor,” he said of VHS.
  • views them as a byway connecting her with the past
  • from reels depicting family gatherings to movies that just never made the jump to DVD
  • “I think we were the last to grow up without the internet, cellphones or social media,” and clinging to the “old analog ways,” she said, feels “very natural.”
  • “I think that people are nostalgic for the aura of the VHS era,”
  • “So many cultural touch points are rooted there,” Mr. Harris said of the 1980s.
  • It was, he believes, “a time when, in some ways, Americans knew who we were.”
  • Not only could film connoisseurs peruse the aisles of video stores on Friday nights, but they could also compose home movies, from the artful to the inane
  • “In its heyday, it was mass-produced and widely adopted,”
  • She inherited some of them from her grandmother, a children’s librarian with a vast collection.
  • Historical Journal of Film, Radio and Television
  • the first technology that allowed mass, large-scale home media access to films.”
  • Mr. Arrow said that home videos captured on VHS, or taped television programs that contain old commercials and snippets from the news, are particularly insightful in diving into cultural history.
  • “There’ll be a news break, and you’ll see, like: Oh my god, O.J.’s still in the Bronco, and it’s on the news, and then it’ll cut back to ‘Mission Impossible’ or something.”
  • Marginalized communities, Mr. Harris said, who were not well represented in media in the 1980s, benefited from VHS technology, which allowed them to create an archival system that now brings to life people and communities that were otherwise absent from the screen.
  • The nature of VHS, Mr. Harris said, made self-documentation “readily available,
  • people who lacked representation could “begin to build a library, an archive, to affirm their existence and that of their community.”
  • VHS enthusiasts agree that these tapes occupy an irreplaceable place in culture.
  • “It’s like a time capsule,”
  • “The medium is like no other.”
ilanaprincilus06

Meet the neuroscientist shattering the myth of the gendered brain | Science | The Guardian - 0 views

  • Whatever its sex, this baby’s future is predetermined by the entrenched belief that males and females do all kinds of things differently, better or worse, because they have different brains.
  • how vital it is, how life-changing, that we finally unpack – and discard – the sexist stereotypes and binary coding that limit and harm us.
  • she is out in the world, debunking the “pernicious” sex differences myth: the idea that you can “sex” a brain or that there is such a thing as a male brain and a female brain.
  • since the 18th century “when people were happy to spout off about what men and women’s brains were like – before you could even look at them. They came up with these nice ideas and metaphors that fitted the status quo and society, and gave rise to different education for men and women.”
  • she couldn’t find any beyond the negligible, and other research was also starting to question the very existence of such differences. For example, once any differences in brain size were accounted for, “well-known” sex differences in key structures disappeared.
  • Are there any significant differences based on sex alone? The answer, she says, is no.
  • “The idea of the male brain and the female brain suggests that each is a characteristically homogenous thing and that whoever has got a male brain, say, will have the same kind of aptitudes, preferences and personalities as everyone else with that ‘type’ of brain. We now know that is not the case.
  • ‘Forget the male and female brain; it’s a distraction, it’s inaccurate.’ It’s possibly harmful, too, because it’s used as a hook to say, well, there’s no point girls doing science because they haven’t got a science brain, or boys shouldn’t be emotional or should want to lead.”
  • The next question was, what then is driving the differences in behaviour between girls and boys, men and women?
  • “that the brain is moulded from birth onwards and continues to be moulded through to the ‘cognitive cliff’ in old age when our grey cells start disappearing.
  • the brain is much more a function of experiences. If you learn a skill your brain will change, and it will carry on changing.”
  • The brain is also predictive and forward-thinking in a way we had never previously realised.
  • The rules will change how the brain works and how someone behaves.” The upshot of gendered rules? “The ‘gender gap’ becomes a self-fulfilling prophecy.”
  • The brain is a biological organ. Sex is a biological factor. But it is not the sole factor; it intersects with so many variables.”
  • Letting go of age-old certainties is frightening, concedes Rippon, who is both optimistic about the future, and fearful for it.
  • On the plus side, our plastic brains are good learners. All we need to do is change the life lessons.
  • One major breakthrough in recent years has been the realisation that, even in adulthood, our brains are continually being changed, not just by the education we receive, but also by the jobs we do, the hobbies we have, the sports we play.
  • Once we acknowledge that our brains are plastic and mouldable, then the power of gender stereotypes becomes evident.
  • Beliefs about sex differences (even if ill-founded) inform stereotypes, which commonly provide just two labels – girl or boy, female or male – which, in turn, historically carry with them huge amounts of “contents assured” information and save us having to judge each individual on their own merits
  • With input from exciting breakthroughs in neuroscience, the neat, binary distinctiveness of these labels is being challenged – we are coming to realise that nature is inextricably entangled with nurture.
  • The 21st century is not just challenging the old answers – it is challenging the question itself.
Javier E

Pandemic Advice From Athletes - The New York Times - 0 views

  • There’s a special kind of exhaustion that the world’s best endurance athletes embrace. Some call it masochistic, others may call it brave. When fatigue sends legs and lungs to their limits, they are able to push through to a gear beyond their pain threshold. These athletes approach fatigue not with fear but as a challenge, an opportunity.
  • It’s a quality that allows an ultramarathoner to endure what could be an unexpected rough segment of a 100-mile race, or a sailor to push ahead when she’s in the middle of the ocean, racing through hurricane winds alone.
  • The drive to persevere is something some are born with, but it’s also a muscle everyone can learn to flex. In a way, everyone has become an endurance athlete of sorts during this pandemic, running a race with no finish line that tests the limits of their exhaustion.
  • One message they all had: You are stronger than you think you are, and everyone is able to adapt in ways they didn’t think possible.
  • there are a few techniques to help you along — 100-mile race not required.
  • Pace Yourself
  • Training to become an elite endurance athlete means learning to embrace discomfort. Instead of hiding from pain, athletes must learn to work with it. A lot of that comes down to pacing
  • Similarly, as you muscle through an ongoing pandemic, you must look for ways to make peace with unknowns and new, uncomfortable realities. “When we think about the coronavirus, we are in it for the long run; so how do you pace yourself?”
  • She recommends thinking about your routines, practicing positive self-talk and focusing on processes instead of outcomes
  • You don’t know when the pandemic will end, but you can take control of your daily habits
  • “always have a little in reserve.”
  • Deplete your resources early and you’ll be in trouble. Focusing on day-to-day activities will pay off in the long run.
  • If you burn out all your mental energy in one day or week, you may find it more difficult to adapt when things don’t return to normal as quickly as you would hope.
  • There’s a pacing in living day to day, just as there’s pacing in climbing.
  • “Don’t play all your cards at once and keep a little something in reserve.”
  • Create Mini-Goals
  • Sports psychologists frequently recommend creating mini milestones en route to a big goal. There are many steps on the path from base camp to a mountain’s summit. Likewise, there are smaller, more achievable milestones to reach and celebrate as you venture ahead into the unknown.
  • “Setting goals that are controllable makes it easier to adapt,” Dr. Meijen said. “If you set goals that are controlled by other people, goals that aren’t realistic or are tough or boring, those are much harder to adapt to.”
  • “I’m really good at breaking things down into small increments and setting micro-goals,” he said. How micro?
  • “I break things down to 10 seconds at a time,” Mr. Woltering continued. “You just have to be present in what you are doing and you have to know that it may not be the most fun — or super painful — now, but that could change in 10 seconds down the road.”
  • And it may not change quickly. Mr. Woltering said he has spent six-hour stretches counting to 10 over and over again. “You just keep moving and keep counting,” he said. “And you have to have faith that it will change at some point.”
  • Create Structure
  • “Part of expedition life is having a routine that you’re comfortable with. When I’m on expedition, I always start the day with a basin of warm water and soap. I wash my hands, face, neck and ears and get the sand out of my eyes,” he said. “It’s something that’s repeated that gets you a sense of comfort and normalcy.”
  • During the pandemic, he has found comfort and normalcy by getting outdoors, and climbing whenever possible to “run the engine.”
  • Dee Caffari, a British sailor and the first woman to sail solo, nonstop, around the world in both directions, said structure is imperative to fight back loneliness and monotony.
  • “In your day you need structure,” Ms. Caffari said. “You need to get up in the morning knowing you’re going to make something happen.”
  • Focus on Something New
  • When all else fails, look to something new: a new hobby, a new goal, a new experience
  • During a particularly hard patch of a competition, some athletes say they focus on a different sense, one that perhaps is not at the forefront of their mind when the pain sets in. A runner could note the smells around her and a climber could note the way his hair is blowing in the wind.
  • When athletes are injured, sports psychologists and coaches frequently encourage them to find a new activity to engage their mind and body. The key is to adapt, adapt and then adapt again.
  • “We all want mental toughness, it’s an important part of dealing with difficult things,”
  • “The current definition of mental toughness is the ability to pivot and to be nimble and flexible.”
  • “The next moment is always completely uncertain, and it’s always been that way,” Dr. Gervais said. But adapting, adjusting expectations and discovering new goals or hobbies can allow you to continue to build the muscle that is mental toughness.
  • Bottom line? “Optimism is an antidote to anxiety,”
runlai_jiang

How Cellphone Chips Became a National-Security Concern - WSJ - 0 views

  • The U.S. made clear this week that containing China’s growing clout in wireless technology is now a national-security priority. Telecommunications-industry leaders say such fears are justified—but question whether the government’s extraordinary intervention in a corporate takeover battle that doesn’t even involve a Chinese company will make a difference.
  • Those worries are rooted in how modern communication works. Cellular-tower radios, internet routers and related electronics use increasingly complex hardware and software, with millions of lines of code
  • Hackers can potentially control the equipment through intentional or inadvertent security flaws, such as the recently disclosed “Meltdown” and “Spectre” flaws that could have affected most of the world’s computer chips.
  • Qualcomm is one of the few American leaders in developing standards and patents for 5G, the next generation of wireless technology that should be fast enough to enable self-driving cars and other innovations. The CFIUS letter said a weakened Qualcomm could strengthen Chinese rivals, specifically Huawei Technologies Co., the world’s top cellular-equipment maker and a leading smartphone brand.
  • Washington has taken unusual steps to hinder Huawei’s business in the U.S., concerned that Beijing could force the company to exploit its understanding of the equipment to spy or disable telecom networks.
  • Many European wireless carriers, including British-based Vodafone Group PLC, praise Huawei’s equipment, saying it is often cheaper and more advanced than that of its competitors. That is another big worry for Washington.
  • Qualcomm’s board and senior management team are American. “It’s barely a foreign company now, but politics and logic aren’t often friends,” said Stacy Rasgon, a Bernstein Research analyst. “I’m just not convinced that Qualcomm’s going to slash and burn the 5G roadmap and leave it open to Huawei” if Broadcom buys it.
marleen_ueberall

How Emotions Guide Our Lives | Psychology Today - 0 views

  • Emotions guide our lives in a million ways
  • most of us don’t realize the extent to which they are driving our thoughts and behavior.
  • Our emotions can offer us clues into who we are as well as how we’ve been affected by our history. Many of our actions are initiated by emotion,
  • Primary emotions are our first emotional reaction. They’re often followed by a more defended secondary emotion.
  • However, if we look at our initial reaction, our primary emotion, we may recognize that we had more vulnerable feelings, such as feeling hurt, unwanted, or ashamed
  • “It is a vital feeling that often leaves the [person] feeling very open and perhaps vulnerable.”
  • If we imagine a moment of feeling tense, frustrated or stuck in a bad feeling, driven to react without a sense of relief, we were probably caught in a secondary emotion
  • able to access the deeper, more vulnerable feeling, perhaps a want or a need, or a core feeling of sadness or shame, we were then experiencing a primary emotion
  • Primary emotions can be either adaptive reactions to the moment or maladaptive reactions based on schemas from our past.
  • We may experience what I often refer to as a “critical inner voice,” a negative internal commentary that tells us things like, “You made such a fool of yourself. Look at how they’re looking at you. They all think you’re an idiot. You should just get out of here.”
  • The maladaptive secondary emotions can also lead us to react in ways that are not in our best interest: lashing out to defend ourselves, acting resentful or enraged, driven by thoughts
  • The good news is we can transform our emotions to become adaptive.
  • We can take our side by challenging our critical self-attacks and, thereby, offering ourselves compassion and love.
  • When we live in harmony with our emotions, we become more in touch with who we are.