
TOK Friends: Group items tagged behavioral economics


Javier E

It's Not Just About Bad Choices - NYTimes.com - 0 views

  • WHENEVER I write about people who are struggling, I hear from readers who say something like: Folks need to stop whining and get a job. It’s all about personal responsibility.
  • In a 2014 poll, Republicans were twice as likely to say that people are poor because of individual failings as to say the reason is lack of opportunity (Democrats thought the opposite). I decided to ask some of the poor …
  • Too often, I believe, liberals deny that poverty is linked to bad choices. As Phillips and many other poor people acknowledge, of course, it is.
  • Self-destructive behaviors — dropping out of school, joining a gang, taking drugs, bearing children when one isn’t ready — compound poverty.
  • Yet scholars are also learning to understand the roots of these behaviors, and they’re far more complicated than the conservative narrative of human weakness.
  • For starters, there is growing evidence that poverty and mental health problems are linked in complex, reinforcing ways
  • If you’re battling mental health problems, or grow up with traumas like domestic violence (or seeing your brother shot dead), you’re more likely to have trouble in school, to self-medicate with drugs or alcohol, to have trouble in relationships.
  • A second line of research has shown that economic stress robs us of cognitive bandwidth.
  • Worrying about bills, food or other problems leaves less capacity to think ahead or to exert self-discipline. So poverty imposes a mental tax.
  • It turns out that when people have elevated levels of cortisol, a stress hormone, they are less willing to delay gratification.
  • “It’s circumstances that can land you in a situation where it’s really hard to make a good decision because you’re so stressed out. And the ones you get wrong matter much more, because there’s less slack to play with.”
  • That emphasis on personal responsibility is part of the 12-step program to confront alcoholism or drug addiction, and it may be useful for people like Jackson. But for society to place the blame entirely on the individual seems to me a cop-out.
  • Let’s also remember, though, that today we have randomized trials — the gold standard of evidence — showing that certain social programs make self-destructive behaviors less common.
  • as long as we’re talking about personal irresponsibility, let’s also examine our own. Don’t we have a collective responsibility to provide more of a fair start in life to all, so that children aren’t propelled toward bad choices?
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • Lauren Rivera, a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention.) (A hedged sketch of this kind of color-banded scoring appears after this list.)
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site. (A toy sketch of this correlate-then-score step appears after this list.)
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
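
The color-coded rating described in the Xerox passage above is, at bottom, a predictive score bucketed into bands. Here is a minimal sketch of that idea; the feature names, weights, and cutoffs are invented for illustration and are not Xerox's or Evolv's actual model.

```python
import math

# Hypothetical feature weights; NOT Xerox's or Evolv's real model.
WEIGHTS = {
    "scenario_judgment": 1.2,   # situational-judgment questions, scaled 0-1
    "cognitive_skill": 0.9,     # cognitive assessment, scaled 0-1
    "creativity": 0.6,          # personality scale, scaled 0-1
    "social_networks": -0.15,   # raw count; the article says 1-4 is the sweet spot,
                                # so a linear penalty is only a crude stand-in
}
BIAS = -1.0

def hire_score(applicant):
    """Logistic score in (0, 1) from weighted applicant features."""
    z = BIAS + sum(w * applicant.get(name, 0.0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def color_band(score):
    """Bucket the score into the red/yellow/green bands the article describes."""
    if score >= 0.65:
        return "green"   # hire away
    if score >= 0.45:
        return "yellow"  # middling
    return "red"         # poor candidate

applicant = {"scenario_judgment": 0.8, "cognitive_skill": 0.7,
             "creativity": 0.6, "social_networks": 2}
s = hire_score(applicant)
print(f"score={s:.2f} band={color_band(s)}")   # e.g. score=0.66 band=green
```

In a real system the weights would be fitted on outcome data (retention, productivity) rather than set by hand; the point here is only the shape of the pipeline: features in, score out, score bucketed into a hiring band.
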
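Gild's approach, as described above, hinges on first correlating online cues with an independent measure of code quality, then using those cues to score coders who have no public code. A toy version of that validation step, on synthetic numbers (requires Python 3.10+ for statistics.correlation):

```python
from statistics import correlation  # Python 3.10+

# Synthetic (cue_frequency, code_quality) pairs for coders who DO have
# public open-source code; both values are on a 0-1 scale.
observations = [
    (0.02, 0.55), (0.10, 0.71), (0.04, 0.58), (0.15, 0.80),
    (0.08, 0.66), (0.01, 0.50), (0.12, 0.77), (0.06, 0.62),
]
cue = [c for c, _ in observations]
quality = [q for _, q in observations]

r = correlation(cue, quality)
print(f"Pearson r = {r:.2f}")

# If the correlation is strong, the cue can stand in for code quality when
# ranking coders whose only footprint is their online language and behavior.
```
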
caelengrubb

Language Shapes the Way People Think and Behave - 0 views

  • Language is part of culture and culture has an effect on the way a person thinks, which initiates behaviors
  • The researcher found that this linguistic discrepancy reflects economic differences as well.
  • Several languages have grammatical gender systems, which the English language does not have. For example, inanimate objects have genders in German, Russian, Spanish or French.
  • Colors are distinguished differently in other languages. In some languages, there are no separate names for orange and yellow even if the people know that there are differences between these two colors
  • The difference in the way languages define colors directly affects the way speakers assign meaning to colors
  • The way speakers interpret the things they feel, hear and see can be complicated because it is influenced by personal experiences, norms, cultural rules, traditions and languages. Thoughts come from words, and these thoughts initiate behaviors.
  • International communication and global business are also affected by languages, hence the pressing need for localization. To do business effectively in other countries, a company must be able to deliver messages to its employees and target audiences in a language that can be correctly and clearly understood
  • If you look at the similarities and differences between languages, you’ll be able to discover clues on what constitute proper and improper behaviors.
  • People use language daily in order to celebrate, communicate, negotiate, learn, legislate, document and argue. You use language each time you need to express something
  • The study of linguistics opens a way to better understand languages – how they are spoken and the people who speak them – which leads to an understanding of how society operates. Linguistics also helps to improve society.
  • Linguists combine different methods from several scientific fields of study, such as computational, biological and psychological techniques, in addition to theoretical and documentary approaches.
tongoscar

Why the US needs Russia and China to help change Iran's behavior | TheHill - 0 views

  • Predicting the future behavior of any country in the Middle East is a dangerous undertaking. Some might suggest it’s a lot cheaper and more effective to rely on a pack of tarot cards than a report from the U.S. intelligence community.
  • Unfortunately, in America, we seem to have little memory of this region’s history, and the misplaced illation made by many over Iranian General Qassem Soleimani’s death soon will fade.
  • In the process of deciding how they will exact this price, Iran will weigh its options against our domestic condition, whether these are set by the U.S. election cycle or Iranian perceptions of who, exactly, should pay the highest price. What Iran’s leadership does know is that a majority of Americans do not want war, nor do most Americans support the seemingly unarticulated reason for keeping U.S. troops in the region. 
  • Asymmetric responses by the U.S. are a real option, but this comes with a high price. While we could pay this price, Washington would be unable to sustain such an effort indefinitely because of domestic and global political reasons. Israel has been undertaking such operations for many years, with some measurable impact, but the Israelis arguably have the political support at home and the same elements needed for asymmetric warfare that Iran has. Furthermore, the threat of large-scale U.S. military retaliation could quickly broaden the scope of the conflict, with unintended regional economic and political consequences, and still not diminish Iran’s capability to carry out covert attacks on American officials, interests and regional allies. 
  • Pursuing such superpower diplomacy, along with asymmetric pressure on Iran, will not come without some price. Washington may need to compromise with Moscow and Beijing on other matters of considerable geopolitical significance. However, Iran is one area where all three superpowers might find a workable agreement that brings the country back into the fold. Iran is an ancient, formidable regional player and the actions taken by all concerned, across a broad spectrum of issues, will have long-term repercussions for each stakeholder’s critical geopolitical goals in the region and beyond.
Javier E

Michael Chwe, Author, Sees Jane Austen as Game Theorist - NYTimes.com - 0 views

  • It’s not every day that someone stumbles upon a major new strategic thinker during family movie night. But that’s what happened to Michael Chwe, an associate professor of political science at the University of California, Los Angeles, when he sat down with his children some eight years ago to watch “Clueless,” the 1995 romantic comedy based on Jane Austen’s “Emma.”
  • In 230 diagram-heavy pages, Mr. Chwe argues that Austen isn’t merely fodder for game-theoretical analysis, but an unacknowledged founder of the discipline itself: a kind of Empire-waisted version of the mathematician and cold war thinker John von Neumann, ruthlessly breaking down the stratagems of 18th-century social warfare.
  • Or, as Mr. Chwe puts it in the book, “Anyone interested in human behavior should read Austen because her research program has results.”
  • Modern game theory is generally dated to 1944, with the publication of von Neumann’s “Theory of Games and Economic Behavior,” which imagined human interactions as a series of moves and countermoves aimed at maximizing “payoff.” Since then the discipline has thrived, often dominating political science, economics and biology. (A toy payoff-matrix example appears after this list.)
  • But a century and a half earlier, Mr. Chwe argues, Austen was very deliberately trying to lay philosophical groundwork for a new theory of strategic action, sometimes charting territory that today’s theoreticians have themselves failed to reach.
  • Game theory, he argues, isn’t just part of “hegemonic cold war discourse,” but what the political scientist James Scott called a subversive “weapon of the weak.”
  • many situations, Mr. Chwe points out, involve parties with unequal levels of strategic thinking. Sometimes a party may simply lack ability. But sometimes a powerful party faced with a weaker one may not realize it even needs to think strategically.
  • Mr. Chwe, who identifies some 50 “strategic manipulations” in Austen
  • First among her as yet unequaled concepts is “cluelessness”
  • Even some humanists who admire Mr. Chwe’s work suggest that when it comes to appreciating Austen, social scientists may be the clueless ones. Austen scholars “will not be surprised at all to see the depths of her grasp of strategic thinking and the way she anticipated a 20th-century field of inquiry,”
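
For readers unfamiliar with the formalism von Neumann introduced, here is a toy two-player game in code: each combination of moves carries a payoff for each player, and an outcome is an equilibrium when neither player gains by switching unilaterally. The moves and payoffs below are invented for illustration; they are not drawn from Chwe's book.

```python
from itertools import product

# payoffs[(row_move, col_move)] = (row player's payoff, column player's payoff)
payoffs = {
    ("reveal", "reveal"): (2, 2),
    ("reveal", "conceal"): (0, 3),
    ("conceal", "reveal"): (3, 0),
    ("conceal", "conceal"): (1, 1),
}
moves = ["reveal", "conceal"]

def is_equilibrium(r, c):
    """True if neither player gains by unilaterally switching moves."""
    r_pay, c_pay = payoffs[(r, c)]
    row_ok = all(payoffs[(alt, c)][0] <= r_pay for alt in moves)
    col_ok = all(payoffs[(r, alt)][1] <= c_pay for alt in moves)
    return row_ok and col_ok

for r, c in product(moves, moves):
    if is_equilibrium(r, c):
        print("equilibrium:", r, c, payoffs[(r, c)])
# prints: equilibrium: conceal conceal (1, 1)
```
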
Javier E

Scholarship and Politics - The Case of Noam Chomsky - NYTimes.com - 0 views

  • (1) The academy is a world of its own, complete with rules, protocols, systems of evaluation, recognized achievements, agreed-on goals, a roster of heroes and a list of tasks yet to be done.
  • (2) Academic work proceeds within the confines of that world, within, that is, a professional, not a public, space, although its performance may be, and often is, public.
  • (3) Academic work is only tangentially, not essentially, political; politics may attend the formation of academic units and the selection of academic personnel, but political concerns and pressures have no place in the unfolding of academic argument, except as objects of its distinctive forms of attention
  • (4) The academic views of a professor are independent of his or her real-world political views; academic disputes don’t track partisan disputes or vice versa; you can’t reason from an academic’s disciplinary views to the positions he or she would take in the public sphere; they are independent variables.
  • The answer given in the first lecture — “What is Language?” — is that we are creatures with language, and that language as a uniquely human biological capacity appeared suddenly and quite late in the evolutionary story, perhaps 75,000 years ago.
  • Chomsky gave three lectures under the general title “What Kind of Creatures are We?”
  • Language, then, does not arise from the social/cultural environment, although the environment provides the stuff or input it works on. That input is “impoverished”; it can’t account for the creativity of language performance, which has its source not in the empirical world, but in an innate ability that is more powerful than the stimuli it utilizes and plays with. It follows that if you want to understand language, you shouldn’t look to linguistic behavior but to the internal mechanism — the Universal Grammar — of which particular linguistic behaviors are a non-exhaustive expression. (The capacity exceeds the empirical resources it might deploy.)
  • In his second lecture (“What Can We Understand?”), Chomsky took up the question of what humans are capable of understanding and his answer, generally, was that we can understand what we can understand, and that means that we can’t understand what is beyond our innate mental capacities
  • This does not mean, he said, that what we can’t understand is not real: “What is mysterious to me is not an argument that it does not exist.” It’s just that while language is powerful and creative, its power and creativity have limits; and since language is thought rather than an addition to or clothing of thought, the limits of language are the limits of what we can fruitfully think about
  • This is as good as it gets. There is “no evolution in our capacity for language.”
  • These assertions are offered as a counter to what Chomsky sees as the over-optimistic Enlightenment belief — common to many empiricist philosophies — that ours is a “limitless explanatory power” and that “we can do anything.”
  • In the third lecture (“What is the Common Good?”) Chomsky turned from the philosophy of mind and language to political philosophy and the question of what constitutes a truly democratic society
  • He likened dogmatic intellectual structures that interfere with free inquiry to coercive political structures that stifle the individual’s creative independence and fail to encourage humanity’s “richest diversity”
  • He asserted that any institution marked by domination and hierarchy must rise to the challenge of justifying itself, and if it cannot meet the challenge, it should be dismantled.
  • He contrasted two accounts of democracy: one — associated by him with James Madison — distrusts the “unwashed” populace and puts its faith in representative government where those doing the representing (and the voting and the distributing of goods) constitute a moneyed and propertied elite
  • the other — associated by him with Adam Smith (in one of his moods), J. S. Mill, the 1960s and a tradition of anarchist writing — seeks to expand the franchise and multiply choices in the realms of thought, politics and economics. The impulse of this second, libertarian, strain of democracy, is “to free society from economic or theological guardianship,” and by “theological” Chomsky meant not formal religion as such but any assumed and frozen ideology that blocked inquiry and limited participation. There can’t, in short, be “too much democracy.”
  • It was thought of the highest order performed by a thinker, now 85 years old, who by and large eschewed rhetorical flourishes (he has called his own speaking style “boring” and says he likes it that way) and just did it, where “it” was the patient exploration of deep issues that had been explored before him by a succession of predecessors, fully acknowledged, in a conversation that is forever being continued and forever being replenished.
  • Yes, I said to myself, this is what we — those of us who bought a ticket on this particular train — do; we think about problems and puzzles and try to advance the understanding of them; and we do that kind of thinking because its pleasures are, in a strong sense, athletic and provide for us, at least on occasion, the experience of fully realizing whatever capabilities we might have. And we do it in order to have that experience, and to share it with colleagues and students of like mind, and not to make a moral or political point.
  • The term “master class” is a bit overused, but I feel no hesitation in using it here. It was a master class taught by a master, and if someone were to ask me what exactly is it that academics do, I would point to these lectures and say, simply, here it is, the thing itself.
kushnerha

Philosophy's True Home - The New York Times - 0 views

  • We’ve all heard the argument that philosophy is isolated, an “ivory tower” discipline cut off from virtually every other progress-making pursuit of knowledge, including math and the sciences, as well as from the actual concerns of daily life. The reasons given for this are many. In a widely read essay in this series, “When Philosophy Lost Its Way,” Robert Frodeman and Adam Briggle claim that it was philosophy’s institutionalization in the university in the late 19th century that separated it from the study of humanity and nature, now the province of social and natural sciences.
  • This institutionalization, the authors claim, led it to betray its central aim of articulating the knowledge needed to live virtuous and rewarding lives. I have a different view: Philosophy isn’t separated from the social, natural or mathematical sciences, nor is it neglecting the study of goodness, justice and virtue, which was never its central aim.
  • identified philosophy with informal linguistic analysis. Fortunately, this narrow view didn’t stop them from contributing to the science of language and the study of law. Now long gone, neither movement defined the philosophy of its day and neither arose from locating it in universities.
  • The authors claim that philosophy abandoned its relationship to other disciplines by creating its own purified domain, accessible only to credentialed professionals. It is true that from roughly 1930 to 1950, some philosophers — logical empiricists, in particular — did speak of philosophy having its own exclusive subject matter. But since that subject matter was logical analysis aimed at unifying all of science, interdisciplinarity was front and center.
  • Philosophy also played a role in 20th-century physics, influencing the great physicists Albert Einstein, Niels Bohr and Werner Heisenberg. The philosophers Moritz Schlick and Hans Reichenbach reciprocated that interest by assimilating the new physics into their philosophies.
  • developed ideas relating logic to linguistic meaning that provided a framework for studying meaning in all human languages. Others, including Paul Grice and J.L. Austin, explained how linguistic meaning mixes with contextual information to enrich communicative contents and how certain linguistic performances change social facts. Today a new philosophical conception of the relationship between meaning and cognition adds a further dimension to linguistic science.
  • Decision theory — the science of rational norms governing action, belief and decision under uncertainty — was developed by the 20th-century philosophers Frank Ramsey, Rudolph Carnap, Richard Jeffrey and others. It plays a foundational role in political science and economics by telling us what rationality requires, given our evidence, priorities and the strength of our beliefs. Today, no area of philosophy is more successful in attracting top young minds. (A toy expected-utility calculation appears after this list.)
  • Philosophy also assisted psychology in its long march away from narrow behaviorism and speculative Freudianism. The mid-20th-century functionalist perspective pioneered by Hilary Putnam was particularly important. According to it, pain, pleasure and belief are neither behavioral dispositions nor bare neurological states. They are interacting internal causes, capable of very different physical realizations, that serve the goals of individuals in specific ways. This view is now embedded in cognitive psychology and neuroscience.
  • philosopher-mathematicians Gottlob Frege, Bertrand Russell, Kurt Gödel, Alonzo Church and Alan Turing invented symbolic logic, helped establish the set-theoretic foundations of mathematics, and gave us the formal theory of computation that ushered in the digital age
  • Philosophy of biology is following a similar path. Today’s philosophy of science is less accessible than Aristotle’s natural philosophy chiefly because it systematizes a larger, more technically sophisticated body of knowledge.
  • Philosophy’s interaction with mathematics, linguistics, economics, political science, psychology and physics requires specialization. Far from fostering isolation, this specialization makes communication and cooperation among disciplines possible. This has always been so.
  • Nor did scientific progress rob philosophy of its former scientific subject matter, leaving it to concentrate on the broadly moral. In fact, philosophy thrives when enough is known to make progress conceivable, but it remains unachieved because of methodological confusion. Philosophy helps break the impasse by articulating new questions, posing possible solutions and forging new conceptual tools.
  • Our knowledge of the universe and ourselves expands like a ripple surrounding a pebble dropped in a pool. As we move away from the center of the spreading circle, its area, representing our secure knowledge, grows. But so does its circumference, representing the border where knowledge blurs into uncertainty and speculation, and methodological confusion returns. Philosophy patrols the border, trying to understand how we got there and to conceptualize our next move.  Its job is unending.
  • Although progress in ethics, political philosophy and the illumination of life’s meaning has been less impressive than advances in some other areas, it is accelerating.
  • the advances in our understanding because of careful formulation and critical evaluation of theories of goodness, rightness, justice and human flourishing by philosophers since 1970 compare well to the advances made by philosophers from Aristotle to 1970
  • The knowledge required to maintain philosophy’s continuing task, including its vital connection to other disciplines, is too vast to be held in one mind. Despite the often-repeated idea that philosophy’s true calling can only be fulfilled in the public square, philosophers actually function best in universities, where they acquire and share knowledge with their colleagues in other disciplines. It is also vital for philosophers to engage students — both those who major in the subject, and those who do not. Although philosophy has never had a mass audience, it remains remarkably accessible to the average student; unlike the natural sciences, its frontiers can be reached in a few undergraduate courses.
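
As a concrete illustration of the rational norm that decision theory studies, here is a toy expected-utility calculation; the actions, states, probabilities and utilities are all invented for illustration.

```python
# Beliefs: probability the agent assigns to each state of the world.
beliefs = {"rain": 0.4, "dry": 0.6}

# utilities[action][state]: how good each outcome is for the agent.
utilities = {
    "take umbrella": {"rain": 8, "dry": 6},
    "leave umbrella": {"rain": 1, "dry": 9},
}

def expected_utility(action):
    """Probability-weighted utility of an action across states."""
    return sum(beliefs[state] * u for state, u in utilities[action].items())

for action in utilities:
    print(f"{action}: EU = {expected_utility(action):.2f}")
print("rational choice:", max(utilities, key=expected_utility))
# take umbrella: EU = 6.80, leave umbrella: EU = 5.80 -> take umbrella
```
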
Javier E

To Cut My Spending, I Used Behavioral Economics on Myself - The Atlantic - 0 views

  • “The average person, in my view, a lot of the overspending they do isn’t in the small things, which your system is likely to deal with,” he said. “But it’s large things that are often quite invisible, and wouldn’t be picked up by your system.” There are usually more savings to be had from revisiting one’s auto- or home-insurance policy, or one’s phone bill, than from skipping the marginal cup of coffee. Loewenstein said it’s more effective to make changes with larger “one-time decisions,” instead of regularly having to make “all these micro-decisions.” (A toy comparison with made-up numbers follows this list.)
  • the dynamics that shape spending. On one side of each credit-card swipe are multiple financial corporations—a phalanx of marketers, programmers, and data analysts who have perfect visibility into countless transactions, and who are thus armed with plentiful information about people’s purchases. On the other is the individual, who lacks this bird’s-eye view and is effectively on their own as they weigh whether and how much to spend at any given time. This arrangement seems lopsided and unfair
  • “A lot of the problem is us … We tend to blame the credit-card industry for our own desire to have a standard of living that is beyond what our income is. You can’t blame Visa for that.” He said the focus should be on norms, and how individual action can alter them—maybe two friends cook dinner together instead of going out. The goal, Pollack says, would be a culture that prizes restraint without being puritanical.
  • What would create such a culture? There is the Consumer Financial Protection Bureau, which (in theory) provides high-level government oversight, and there are small individual actions (like, say, meticulously tracking one’s purchases), but there isn’t something in between—a powerful advocacy group, a mainstream cultural movement, or something else not yet built or imagined—that serves as a counterweight to the pressure on Americans to spend.
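
To make Loewenstein's point concrete, here is a back-of-the-envelope comparison of many micro-decisions versus a single one-time decision; every dollar figure is made up for illustration.

```python
# Micro-decisions: skipping two $4 coffees a week, all year long.
skipped_per_week = 2
coffee_price = 4.00
micro_per_year = skipped_per_week * coffee_price * 52        # $416, ~104 choices

# One-time decision: re-shopping an auto-insurance policy once.
monthly_saving = 50.00
one_time_per_year = monthly_saving * 12                      # $600, 1 choice

print(f"micro-decisions:   ${micro_per_year:.0f}/yr over {skipped_per_week * 52} separate choices")
print(f"one-time decision: ${one_time_per_year:.0f}/yr from a single choice")
```
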
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

No matter who wins the presidential election, Nate Silver was right - The Washington Post - 1 views

  • I don’t fault Silver for his caution. It’s honest. What it really says is he doesn’t know with much confidence what’s going to happen
  • That’s because there’s a lot of human caprice and whim in electoral behavior that can’t always be explained or predicted with scientific precision. Politics ain’t moneyball. Good-quality polls give an accurate sense of where a political race is at a point in time, but they don’t predict the future.
  • Predictive models, generally based on historical patterns, work until they don’t.
  • ...2 more annotations...
  • In his hedged forecasts this time, Silver appears to be acknowledging that polling and historical patterns don’t always capture what John Maynard Keynes, in his classic 1936 General Theory, described as “animal spirits.”
  • There is, Keynes wrote, “the instability due to the characteristic of human nature that a large proportion of our positive activities depend on spontaneous optimism rather than on a mathematical expectation, whether moral or hedonistic or economic. Most, probably, of our decisions to do something positive, the full consequences of which will be drawn out over many days to come, can only be taken as a result of animal spirits — of a spontaneous urge to action rather than inaction, and not as the outcome of a weighted average of quantitative benefits multiplied by quantitative probabilities.”
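Keynes’s phrase “a weighted average of quantitative benefits multiplied by quantitative probabilities” is, in modern terms, the expected-value calculation of decision theory. A minimal rendering follows; the symbols b_i and p_i are my own illustrative notation, not anything that appears in Keynes or in Silver’s forecasts.

```latex
% Expected value of a decision whose possible benefits b_i occur with
% probabilities p_i. Keynes's point is that real decisions are often
% NOT the outcome of this calculation.
E[V] \;=\; \sum_{i} p_i \, b_i, \qquad \text{where } \sum_{i} p_i = 1
```

Silver’s hedging concedes the same thing: the arithmetic is straightforward, but the “animal spirits” behind electoral behavior are not captured by it.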
Duncan H

The Materialist Fallacy - NYTimes.com - 0 views

  • The half-century between 1912 and 1962 was a period of great wars and economic tumult but also of impressive social cohesion. Marriage rates were high. Community groups connected people across class.
  • In the half-century between 1962 and the present, America has become more prosperous, peaceful and fair, but the social fabric has deteriorated. Social trust has plummeted. Society has segmented. The share of Americans born out of wedlock is now at 40 percent and rising.
  • Liberals congregated around an economically determinist theory. The loss of good working-class jobs undermined communities and led to the social deterioration.
  • ...7 more annotations...
  • Libertarians congregated around a government-centric theory. Great Society programs enabled people to avoid work and gave young women an incentive to have children without marrying.
  • Neo-conservatives had a more culturally deterministic theory. Many of them had been poor during the Depression. Economic stress had not undermined the family then. Moreover, social breakdown began in the 1960s, a time of unprecedented prosperity. They argued that the abandonment of traditional bourgeois norms led to social disruption, especially for those in fragile circumstances.
  • a new body of research has emerged, which should lead to new theories. This research
  • tends to support a few common themes. First, no matter how social disorganization got started, once it starts, it takes on a momentum of its own. People who grow up in disrupted communities are more likely to lead disrupted lives as adults, magnifying disorder from one generation to the next.
  • Second, it’s not true that people in disorganized neighborhoods have bad values. Their goals are not different from everybody else’s. It’s that they lack the social capital to enact those values. Third, while individuals are to be held responsible for their behavior, social context is more powerful than we thought. If any of us grew up in a neighborhood where a third of the men dropped out of school, we’d be much worse off, too.
  • disruption breeds disruption
  • children who can’t form secure attachments by 18 months face a much worse set of chances for the rest of their lives because they find it harder to build stable relationships.
Javier E

History News Network | An Open Letter to the Harvard Business School Dean Who Gave Hist... - 0 views

  • I would like to make some gratuitous curricular and pedagogical suggestions for business schools.
  • Foremost, business schools, at least those that purport to mold leaders, should stop training and start educating. Their graduates should be able to think and problem-solve for themselves, not just copy the latest fad.
  • Business schools generally do not cultivate or even select for general intelligence and breadth of understanding but instead breed shrewdness and encourage narrow technical knowledge.
  • ...8 more annotations...
  • To try to cover up the obvious shortcomings of their profoundly antisocial pedagogical model, many business schools tack on courses in ethics, corporate social responsibility, and the like, then shrug their collective shoulders when their graduates behave in ways that would make Vikings and pirates blush.
  • The only truly socially responsible management curriculum would be one built from the ground up out of the liberal arts – economics, of course, but also history, philosophy, political science, psychology, and sociology – because those are the core disciplines of social scientific and humanistic inquiry.
  • Properly understood, they are not “subjects” but ways of thinking about human beings, their behaviors, their institutions (of which businesses are just a small subset), and the ways they interact with the natural world. Only intelligent people with broad and deep backgrounds in the liberal arts can consistently make ethical decisions that are good for stakeholders, consumers, and the world they share.
  • Precisely because they are not deeply rooted in the liberal arts, many business schools try to inculcate messages into the brains of their students that are unscientific, mere fashions that cycle into and then out of popularity.
  • No one can seriously purport to understand corporate X (finance, formation, governance, social responsibility, etc.) today who does not understand X’s origins and subsequent development. Often, then, the historian of corporate X is the real expert, not the business school professor who did a little consulting, a few interviews, and a survey.
  • Lurking somewhere in the background of most great business leaders, ones who helped their companies, their customers, and the world, is a liberal arts education.
  • Instead of forcing students to choose between a broad liberal arts degree or a business career, business schools and liberal arts departments ought to work together to integrate all methods of knowing into a seamless whole focused on key questions and problems important to us all
  • There is not a single question of importance in the business world that does not have economic, historical, philosophical, political, psychological, and sociological components that are absolutely crucial to making good (right and moral) decisions. So why continue to divide understanding of the real world into hoary compartments
caelengrubb

Insider Trading - Econlib - 0 views

  • “Insider trading” refers to transactions in a company’s securities, such as stocks or options, by corporate insiders or their associates based on information originating within the firm that would, once publicly disclosed, affect the prices of such securities.
  • Corporate insiders are individuals whose employment with the firm (as executives, directors, or sometimes rank-and-file employees) or whose privileged access to the firm’s internal affairs (as large shareholders, consultants, accountants, lawyers, etc.) gives them valuable information.
  • Famous examples of insider trading include transacting on the advance knowledge of a company’s discovery of a rich mineral ore (Securities and Exchange Commission v. Texas Gulf Sulphur Co.), on a forthcoming cut in dividends by the board of directors (Cady, Roberts & Co.), and on an unanticipated increase in corporate expenses (Diamond v. Oreamuno).
  • ...18 more annotations...
  • Such trading on information originating outside the company is generally not covered by insider trading regulation.
  • Insider trading is quite different from market manipulation, disclosure of false or misleading information to the market, or direct expropriation of the corporation’s wealth by insiders.
  • Regulation of insider trading began in the United States at the turn of the twentieth century, when judges in several states became willing to rescind corporate insiders’ transactions with uninformed shareholders.
  • One of the earliest (and unsuccessful) federal attempts to regulate insider trading occurred after the 1912–1913 congressional hearings before the Pujo Committee, which concluded that “the scandalous practices of officers and directors in speculating upon inside and advance information as to the action of their corporations may be curtailed if not stopped.”
  • The Securities Acts of 1933–1934, passed by the U.S. Congress in the aftermath of the stock market crash, though aimed primarily at prohibiting fraud and market manipulation, also targeted insider trading.
  • As of 2004, at least ninety-three countries, the vast majority of nations that possess organized securities markets, had laws regulating insider trading
  • Several factors explain the rapid emergence of such regulation, particularly during the last twenty years: namely, the growth of the securities industry worldwide, pressures to make national securities markets look more attractive in the eyes of outside investors, and the pressure the SEC exerted on foreign lawmakers and regulators to increase the effectiveness of domestic enforcement by identifying and punishing offenders and their associates operating outside the United States.
  • Many researchers argue that trading on inside information is a zero-sum game, benefiting insiders at the expense of outsiders. But most outsiders who bought from or sold to insiders would have traded anyway, and possibly at a worse price (Manne 1970). So, for example, if the insider sells stock because he expects the price to fall, the very act of selling may bring the price down to the buyer.
  • A controversial case is that of abstaining from trading on the basis of inside information (Fried 2003).
  • There is little disagreement that insider trading makes securities markets more efficient by moving the current market price closer to the future postdisclosure price. In other words, insiders’ transactions, even if they are anonymous, signal future price trends to others and make the current stock price reflect relevant information sooner (a toy simulation of this convergence appears after this list).
  • Accurately priced stocks give valuable signals to investors and ensure more efficient allocation of capital.
  • The controversial question is whether insider trading is more or less effective than public disclosure.
  • Insider trading’s advantage is that it introduces individual profit motives, does not directly reveal sensitive intercorporate information, and mitigates the management’s aversion to disclosing negative information
  • Probably the most controversial issue in the economic analysis of insider trading is whether it is an efficient way to pay managers for their entrepreneurial services to the corporation. Some researchers believe that insider trading gives managers a monetary incentive to innovate, search for, and produce valuable information, as well as to take risks that increase the firm’s value (Carlton and Fischel 1983; Manne 1966).
  • Another economic argument for insider trading is that it provides efficient compensation to holders of large blocks of stock
  • A common contention is that the presence of insider trading decreases public confidence in, and deters many potential investors from, equity markets, making them less liquid (Loss 1970).
  • Empirical research generally supports skepticism that regulation of insider trading has been effective in either the United States or internationally, as evidenced by the persistent trading profits of insiders, behavior of stock prices around corporate announcements, and relatively infrequent prosecution rates (Bhattacharya and Daouk 2002; Bris 2005).
  • Despite numerous and extensive debates, economists and legal scholars do not agree on a desirable government policy toward insider trading. On the one hand, absolute information parity is clearly infeasible, and information-based trading generally increases the pricing efficiency of financial markets. Information, after all, is a scarce economic good that is costly to produce or acquire, and its subsequent use and dissemination are difficult to control. On the other hand, insider trading, as opposed to other forms of informed trading, may produce unintended adverse consequences for the functioning of the corporate enterprise, the market-wide system of publicly mandated disclosure, or the market for information.
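The price-efficiency point a few bullets above (insider trades pulling the current price toward the future post-disclosure price) is easy to see in a toy model. The sketch below is only an illustration under assumed numbers; the 0.2 adjustment rate, the noise scale, and the function name are hypothetical choices of mine, not anything from the Econlib entry.

```python
import random

def simulate_price_path(true_value, start_price, days, insider_trading=False, seed=0):
    """Toy illustration: insiders who know the post-disclosure value trade each
    day, nudging the market price toward that value before the news is public.
    All parameter values here are hypothetical."""
    rng = random.Random(seed)
    price = start_price
    path = [price]
    for _ in range(days):
        noise = rng.gauss(0, 0.5)  # ordinary, uninformed order flow
        informed = 0.0
        if insider_trading:
            # Insiders buy below true value and sell above it, closing a
            # fraction of the gap each day (assumed adjustment rate of 0.2).
            informed = 0.2 * (true_value - price)
        price += informed + noise
        path.append(price)
    return path

# The stock's post-disclosure value is 60; the market currently prices it at 50.
with_insiders = simulate_price_path(true_value=60, start_price=50, days=30,
                                    insider_trading=True)
without_insiders = simulate_price_path(true_value=60, start_price=50, days=30,
                                       insider_trading=False)
print(f"final price with insider trading:    {with_insiders[-1]:.2f}")
print(f"final price without insider trading: {without_insiders[-1]:.2f}")
```

Run with different seeds, the informed path ends near 60 while the uninformed path drifts around 50, which is the sense in which insiders’ trades make the price “reflect relevant information sooner.”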
katedriscoll

Understanding decisions: The power of combining psychology and economics - 0 views

  • "Psychology and economics are both interested in how people make decisions, but have different theories and methods. In our work with economists at Northwestern, Michigan, the Federal Reserve and elsewhere, we have found ways to complement each other's expertise," said Wändi Bruine de Bruin, professor of behavioral decision making at Leeds' University Business School, who received her Ph.D. from Carnegie Mellon University, where she is collaborating professor in the Department of Engineering and Public Policy.
Javier E

Opinion | Lower fertility rates are the new cultural norm - The Washington Post - 0 views

  • The percentage who say that having children is very important to them has dropped from 43 percent to 30 percent since 2019. This fits with data showing that, since 2007, the total fertility rate in the United States has fallen from 2.1 lifetime births per woman, the “replacement rate” necessary to sustain population levels, to just 1.64 in 2020.
  • The U.S. economy is losing an edge that robust population dynamics gave it relative to low-birth-rate peer nations in Japan and Western Europe; this country, too, faces chronic labor-supply constraints as well as an even less favorable “dependency ratio” between workers and retirees than it already expected.
  • the timing and the magnitude of such a demographic sea-change cry out for explanation. What happened in 2007?
  • ...12 more annotations...
  • New financial constraints on family formation are a potential cause, as implied by another striking finding in the Journal poll — 78 percent of adults lack confidence this generation of children will enjoy a better life than they do.
  • Yet a recent analysis for the Aspen Economic Strategy Group by Melissa S. Kearney and Phillip B. Levine, economics professors at the University of Maryland and Wellesley College, respectively, determined that “beyond the temporary effects of the Great Recession, no recent economic or policy change is responsible for a meaningful share of the decline in the US fertility rate since 2007.”
  • Their study took account of such factors as the high cost of child care, student debt service and housing as well as Medicaid coverage and the wider availability of long-acting reversible contraception. Yet they had “no success finding evidence” that any of these were decisive.
  • Kearney and Levine speculated instead that the answers lie in the cultural zeitgeist — “shifting priorities across cohorts of young adults,”
  • A possibility worth considering, they suggested, is that young adults who experienced “intensive parenting” as children now balk at the heavy investment of time and resources needed to raise their own kids that way: It would clash with their career and leisure goals.
  • another event that year: Apple released the first iPhone, a revolutionary cultural moment if there ever was one. The ensuing smartphone-enabled social media boom — Facebook had opened membership to anyone older than 13 in 2006 — forever changed how human beings relate with one another.
  • We are just beginning to understand this development’s effect on mental health, education, religious observance, community cohesion — everything. Why wouldn’t it also affect people’s willingness to have children?
  • one indirect way new media affect childbearing rates is through “time competition effects” — essentially, hours spent watching the tube cannot be spent forming romantic partnerships.
  • According to a 2021 review of survey data on young adults and adolescents in the United States and other countries, the years between 2009 and 2018 saw a marked decline in reported sexual activity.
  • the authors hypothesized that people are distracted from the search for partners by “increasing use of computer games and social media.”
  • during the late 20th century, Brazil’s fertility rates fell after women who watched soap operas depicting smaller families sought to emulate them by having fewer children themselves.
  • This may be an area where incentives do not influence behavior, at least not enough. Whether the cultural shift to lower birthrates occurs on an accelerated basis, as in the United States after 2007, or gradually, as it did in Japan, it appears permanent — “sticky,” as policy wonks say.
Javier E

Book Review: Models Behaving Badly - WSJ.com - 1 views

  • Mr. Derman is perhaps a bit too harsh when he describes EMM—the so-called Efficient Market Model. EMM does not, as he claims, imply that prices are always correct and that price always equals value. Prices are always wrong. What EMM says is that we can never be sure if prices are too high or too low.
  • The Efficient Market Model does not suggest that any particular model of valuation—such as the Capital Asset Pricing Model (stated after this list for reference)—fully accounts for risk and uncertainty or that we should rely on it to predict security returns. EMM does not, as Mr. Derman says, "stubbornly assume that all uncertainty about the future is quantifiable." The basic lesson of EMM is that it is very difficult—well nigh impossible—to beat the market consistently.
  • Mr. Derman gives an eloquent description of James Clerk Maxwell's electromagnetic theory in a chapter titled "The Sublime." He writes: "The electromagnetic field is not like Maxwell's equations; it is Maxwell's equations."
  • ...4 more annotations...
  • He sums up his key points about how to keep models from going bad by quoting excerpts from his "Financial Modeler's Manifesto" (written with Paul Wilmott), a paper he published a couple of years ago. Among its admonitions: "I will always look over my shoulder and never forget that the model is not the world"; "I will not be overly impressed with mathematics"; "I will never sacrifice reality for elegance"; "I will not give the people who use my models false comfort about their accuracy"; "I understand that my work may have enormous effects on society and the economy, many beyond my apprehension."
  • As the collapse of the subprime collateralized debt market in 2008 made clear, it is a terrible mistake to put too much faith in models purporting to value financial instruments. "In crises," Mr. Derman writes, "the behavior of people changes and normal models fail. While quantum electrodynamics is a genuine theory of all reality, financial models are only mediocre metaphors for a part of it."
  • Although financial models employ the mathematics and style of physics, they are fundamentally different from the models that science produces. Physical models can provide an accurate description of reality. Financial models, despite their mathematical sophistication, can at best provide a vast oversimplification of reality. In the universe of finance, the behavior of individuals determines value—and, as he says, "people change their minds."
  • Bringing ethics into his analysis, Mr. Derman has no patience for coddling the folly of individuals and institutions who over-rely on faulty models and then seek to escape the consequences. He laments the aftermath of the 2008 financial meltdown, when banks rebounded "to record profits and bonuses" thanks to taxpayer bailouts. If you want to benefit from the seven fat years, he writes, "you must suffer the seven lean years too, even the catastrophically lean ones. We need free markets, but we need them to be principled."
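Since the review invokes the Capital Asset Pricing Model only by name, here is the textbook statement in standard notation; this formula does not appear in the review itself and is included purely for reference.

```latex
% CAPM: expected return of asset i
% R_f    = risk-free rate
% E[R_m] = expected return of the market portfolio
% beta_i = the asset's sensitivity to market moves
E[R_i] = R_f + \beta_i \bigl( E[R_m] - R_f \bigr),
\qquad
\beta_i = \frac{\operatorname{Cov}(R_i, R_m)}{\operatorname{Var}(R_m)}
```

Derman’s caution applies directly: the formula is clean, but the beta it rests on is an estimate of human behavior, not a physical constant.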
Javier E

Let's Shake Up the Social Sciences - NYTimes.com - 1 views

  • everyone knows that monopoly power is bad for markets, that people are racially biased and that illness is unequally distributed by social class. There are diminishing returns from the continuing study of many such topics. And repeatedly observing these phenomena does not help us fix them.
  • social scientists should devote a small palace guard to settled subjects and redeploy most of their forces to new fields like social neuroscience, behavioral economics, evolutionary psychology and social epigenetics, most of which, not coincidentally, lie at the intersection of the natural and social sciences
  • It is time to create new social science departments that reflect the breadth and complexity of the problems we face as well as the novelty of 21st-century science. These would include departments of biosocial science, network science, neuroeconomics, behavioral genetics and computational social science.
  • ...1 more annotation...
  • Nicholas A. Christakis, a physician and sociologist at Yale University, is a co-director of the Yale Institute for Network Science.
Javier E

The Cost of Relativism - NYTimes.com - 0 views

  • One of America’s leading political scientists, Robert Putnam, has just come out with a book called “Our Kids” about the growing chasm between those who live in college-educated America and those who live in high-school-educated America
  • Reintroducing norms will require, first, a moral vocabulary. These norms weren’t destroyed because of people with bad values. They were destroyed by a plague of nonjudgmentalism, which refused to assert that one way of behaving was better than another. People got out of the habit of setting standards or understanding how they were set.
  • sympathy is not enough. It’s not only money and better policy that are missing in these circles; it’s norms.
  • ...7 more annotations...
  • The health of society is primarily determined by the habits and virtues of its citizens.
  • In many parts of America there are no minimally agreed upon standards for what it means to be a father. There are no basic codes and rules woven into daily life, which people can absorb unconsciously and follow automatically.
  • Roughly 10 percent of the children born to college grads grow up in single-parent households. Nearly 70 percent of children born to high school grads do. There are a bunch of charts that look like open scissors. In the 1960s or 1970s, college-educated and noncollege-educated families behaved roughly the same. But since then, behavior patterns have ever more sharply diverged. High-school-educated parents dine with their children less than college-educated parents, read to them less, talk to them less, take them to church less, encourage them less and spend less time engaging in developmental activity.
  • Next it will require holding people responsible. People born into the most chaotic situations can still be asked the same questions: Are you living for short-term pleasure or long-term good? Are you living for yourself or for your children? Do you have the freedom of self-control or are you in bondage to your desires?
  • Next it will require holding everybody responsible. America is obviously not a country in which the less educated are behaving irresponsibly and the more educated are beacons of virtue. America is a country in which privileged people suffer from their own characteristic forms of self-indulgence: the tendency to self-segregate, the comprehensive failures of leadership in government and industry.
  • People sometimes wonder why I’ve taken this column in a spiritual and moral direction of late. It’s in part because we won’t have social repair unless we are more morally articulate, unless we have clearer definitions of how we should be behaving at all levels.
  • History is full of examples of moral revival, when social chaos was reversed, when behavior was tightened and norms reasserted. It happened in England in the 1830s and in the U.S. amid economic stress in the 1930s.
Javier E

Sex, Morality, and Modernity: Can Immanuel Kant Unite Us? - The Atlantic - 1 views

  • Before I jump back into the conversation about sexual ethics that has unfolded on the Web in recent days, inspired by Emily Witt's n+1 essay "What Do You Desire?" and featuring a fair number of my favorite writers, it's worth saying a few words about why I so value debate on this subject, and my reasons for running through some sex-life hypotheticals near the end of this article.
  • As we think and live, the investment required to understand one another increases. So do the stakes of disagreeing. 18-year-olds on the cusp of leaving home for the first time may disagree profoundly about how best to live and flourish, but the disagreements are abstract. It is easy, at 18, to express profound disagreement with, say, a friend's notions of child-rearing. To do so when he's 28, married, and raising a son or daughter is delicate, and perhaps best avoided
  • I have been speaking of friends. The gulfs that separate strangers can be wider and more difficult to navigate because there is no history of love and mutual goodwill as a foundation for trust. Less investment has been made, so there is less incentive to persevere through the hard parts.
  • ...27 more annotations...
  • I've grown very close to new people whose perspectives are radically different than mine.
  • It floors me: These individuals are all repositories of wisdom. They've gleaned it from experiences I'll never have, assumptions I don't share, and brains wired different than mine. I want to learn what they know.
  • Does that get us anywhere? A little ways, I think.
  • "Are we stuck with a passé traditionalism on one hand, and total laissez-faire on the other?" Is there common ground shared by the orthodox-Christian sexual ethics of a Rod Dreher and those who treat consent as their lodestar?
  • Gobry suggests that Immanuel Kant provides a framework everyone can and should embrace, wherein consent isn't nearly enough to make a sexual act moral--we must, in addition, treat the people in our sex lives as ends, not means.
  • Here's how Kant put it: "Act in such a way that you treat humanity, whether in your own person or in the person of any other, never merely as a means to an end, but always at the same time as an end."
  • the disappearance of a default sexual ethic in America and the divergence of our lived experiences means we have more to learn from one another than ever, even as our different choices raise the emotional stakes.
  • Nor does it seem intuitively obvious that a suffering, terminally ill 90-year-old is regarding himself as a means, or an object, if he prefers to end his life with a lethal injection rather than waiting three months in semi-lucid agony for his lungs to slowly shut down and suffocate him. (Kant thought suicide impermissible.) The terminally ill man isn't denigrating his own worth or the preciousness of life or saying it's permissible "any time" it is difficult. He believes ending his life is permissible only because the end is nigh, and the interim affords no opportunity for "living" in anything except a narrow biological sense.
  • It seems to me that, whether we're talking about a three-week college relationship or a 60-year marriage, it is equally possible to treat one's partner as a means or as an end (though I would agree that "treating as means" is more common in hookups than marriage)
  • my simple definition is this: It is wrong to treat human persons in such a way that they are reduced to objects. This says nothing about consent: a person may consent to be used as an object, but it is still wrong to use them that way. It says nothing about utility: society may approve of using some people as objects; whether those people are actual slaves or economically oppressed wage-slaves it is still wrong to treat them like objects. What it says, in fact, is that human beings have intrinsic worth and dignity such that treating them like objects is wrong.
  • what it means to treat someone as a means, or as an object, turns out to be in dispute.
  • Years ago, I interviewed a sister who was acting as a surrogate for a sibling who couldn't carry her own child. The notion that either regarded the other (or themselves) as an object seems preposterous to me. Neither was treating the other as a means, because they both freely chose, desired and worked in concert to achieve the same end.
  • It seems to me that the Kantian insight is exactly the sort of challenge traditionalist Christians should make to college students as they try to persuade them to look more critically at hookup culture. I think a lot of college students casually mislead one another about their intentions and degree of investment, feigning romantic interest when actually they just want to have sex. Some would say they're transgressing against consent. I think Kant has a more powerful challenge. 
  • Ultimately, Kant only gets us a little way in this conversation because, outside the realm of sex, he thinks consent goes a long way toward mitigating the means problem, whereas in the realm of sex, not so much. This is inseparable from notions he has about sex that many of us just don't share.
  • two Biblical passages fit my moral intuition even better than Kant. "Love your neighbor as yourself." And "therefore all things whatsoever ye would that men should do to you, do ye even so to them."
  • "do unto others..." is extremely demanding, hard to live up to, and a very close fit with my moral intuitions.
  • "Do unto others" is also enough to condemn all sorts of porn, and to share all sorts of common ground with Dreher beyond consent. Interesting that it leaves us with so many disagreements too. "Do unto others" is core to my support for gay marriage.
  • Are our bones always to be trusted? The sexual behavior parents would be mortified by is highly variable across time and cultures. So how can I regard it as a credible guide of inherent wrong? Professional football and championship boxing are every bit as violent and far more physically damaging to their participants than that basement scene, yet their cultural familiarity is such that most people don't feel them to be morally suspect. Lots of parents are proud, not mortified, when a son makes the NFL.
  • "Porn operates in fantasy the way boxing and football operate in fantasy. The injuries are quite real." He is, as you can see, uncomfortable with both. Forced at gunpoint to choose which of two events could proceed on a given night, an exact replica of the San Francisco porn shoot or an Ultimate Fighting Championship tournament--if I had to shut one down and grant the other permission to proceed--what would the correct choice be?
  • insofar as there is something morally objectionable here, it's that the audience is taking pleasure in the spectacle of someone being abused, whether that abuse is fact or convincing illusion. Violent sports and violent porn interact with dark impulses in humanity, as their producers well know.
  • If Princess Donna was failing to "do unto others" at all, the audience was arguably who she failed. Would she want others to entertain her by stoking her dark human impulses? Then again, perhaps she is helping to neuter and dissipate them in a harmless way. That's one theory of sports, isn't it? We go to war on the gridiron as a replacement for going to war? And the rise in violent porn has seemed to coincide with falling, not rising, incidence of sexual violence. 
  • On all sorts of moral questions I can articulate confident judgments. But I am confident in neither my intellect nor my gut when it comes to judging Princess Donna, or whether others are transgressing against themselves or "nature" when doing things that I myself wouldn't want to do. Without understanding their mindset, why they find that thing desirable, or what it costs them, if anything, I am loath to declare that it's grounded in depravity or inherently immoral just because it triggers my disgust instinct, especially if the people involved articulate a plausible moral code that they are following, and it even passes a widely held standard like "do unto others."
  • Here's another way to put it. Asked to render moral judgments about sexual behaviors, there are some I would readily label as immoral. (Rape is an extreme example. Showing the topless photo your girlfriend sent to your best friend is a milder one.) But I often choose to hold back and err on the side of not rendering a definitive judgment, knowing that occasionally means I'll fail to label as unethical some things that actually turn out to be morally suspect.
  • Partly I take that approach because, unlike Dreher, I don't see any great value or urgency in the condemnations, and unlike Douthat, I worry more about wrongful stigma than lack of rightful stigmas
  • In a society where notions of sexual morality aren't coercively enforced by the church or the state, what purpose is condemnation serving?
  • People are great! Erring on the side of failing to condemn permits at least the possibility of people from all of these world views engaging in conversation with one another.
  • Dreher worries about the fact that, despite our discomfort, neither Witt nor I can bring ourselves to say that the sexual acts performed during the S.F. porn shoot were definitely wrong. Does that really matter? My interlocutors perhaps see a cost more clearly than me, as well they might. My bias is that just arguing around the fire is elevating.
anonymous

Weight Gain and Stress Eating Are Downside of Pandemic Life - The New York Times - 0 views

  • Yes, Many of Us Are Stress-Eating and Gaining Weight in the Pandemic
  • A global study confirms that during the pandemic, many of us ate more junk food, exercised less, were more anxious and got less sleep.
  • Not long ago, Stephen Loy had a lot of healthy habits. He went to exercise classes three or four times a week, cooked nutritious dinners for his family, and snacked on healthy foods like hummus and bell peppers.
  • ...20 more annotations...
  • But that all changed when the pandemic struck. During the lockdowns, when he was stuck at home, his anxiety levels went up. He stopped exercising and started stress eating. Gone were the hummus and vegetables; instead, he snacked on cookies, sweets and Lay’s potato chips. He ate more fried foods and ordered takeout from local restaurants.
  • “We were feeding the soul more than feeding the stomach,”
  • “We were making sure to eat things that made us feel better — not just nutritional items.
  • Now a global survey conducted earlier this year confirms what Mr. Loy and many others experienced firsthand: The coronavirus pandemic and resulting lockdowns led to dramatic changes in health behaviors, prompting people around the world to cut back on physical activity and eat more junk foods.
  • While they tended to experience improvements in some aspects of their diets, such as cooking at home more and eating out less, they were also the most likely to report struggling with their weight and mental health.
  • With months to go before a vaccine becomes widely available and we can safely resume our pre-pandemic routines, now might be a good time to assess the healthy habits we may have let slip and to find new ways to be proactive about our physical and mental health.
  • The researchers found that the decline in healthy behaviors during the pandemic and widespread lockdowns was fairly common regardless of geography.
  • “Individuals with obesity were impacted the most — and that’s what we were afraid of,”
  • “They not only started off with higher anxiety levels before the pandemic, but they also had the largest increase in anxiety levels throughout the pandemic.”
  • The pandemic disrupted everyday life, isolated people from friends and family, and spawned an economic crisis, with tens of millions of people losing jobs or finding their incomes sharply reduced.
  • despite snacking on more junk foods, many people showed an increase in their “healthy eating scores,” a measure of their overall diet quality, which includes things like eating more fruits and fewer fried foods.
  • The researchers said that the overall improvements in diet appeared to be driven by the fact that the lockdowns prompted people to cook, bake and prepare more food at home.
  • Other recent surveys have also shown a sharp rise in home cooking and baking this year, with many people saying they are discovering new ingredients and looking for ways to make healthier foods.
  • But social isolation can take a toll on mental wellness, and that was evident in the findings.
  • About 20 percent said that their symptoms, such as experiencing dread and not being able to control or stop their worrying, were severe enough to interfere with their daily activities.
  • Dr. Flanagan said it was perhaps not surprising that people tended to engage in less healthful habits during the pandemic, as so many aspects of health are intertwined.
  • Stress can lead to poor sleep, which can cause people to exercise less, consume more junk foods, and then gain weight, and so on.
  • But she said she hoped that the findings might inspire people to take steps to be more proactive about their health, such as seeking out mental health specialists, prioritizing sleep and finding ways to exercise at home and cook more, in the event of future lockdowns.
  • “Being aware is really the No. 1 thing here.”