TOK Friends: Group items tagged "ironic"

The Navy's USS Gabrielle Giffords and the Future of Work - The Atlantic

  • Minimal manning—and with it, the replacement of specialized workers with problem-solving generalists—isn’t a particularly nautical concept. Indeed, it will sound familiar to anyone in an organization who’s been asked to “do more with less”—which, these days, seems to be just about everyone.
  • Ten years from now, the Deloitte consultant Erica Volini projects, 70 to 90 percent of workers will be in so-called hybrid jobs or superjobs—that is, positions combining tasks once performed by people in two or more traditional roles.
  • If you ask Laszlo Bock, Google’s former culture chief and now the head of the HR start-up Humu, what he looks for in a new hire, he’ll tell you “mental agility.”
  • “What companies are looking for,” says Mary Jo King, the president of the National Résumé Writers’ Association, “is someone who can be all, do all, and pivot on a dime to solve any problem.”
  • The phenomenon is sped by automation, which usurps routine tasks, leaving employees to handle the nonroutine and unanticipated—and the continued advance of which throws the skills employers value into flux
  • Or, for that matter, on the relevance of the question What do you want to be when you grow up?
  • By 2020, a 2016 World Economic Forum report predicted, “more than one-third of the desired core skill sets of most occupations” will not have been seen as crucial to the job when the report was published
  • I asked John Sullivan, a prominent Silicon Valley talent adviser, why should anyone take the time to master anything at all? “You shouldn’t!” he replied.
  • Minimal manning—and the evolution of the economy more generally—requires a different kind of worker, with not only different acquired skills but different inherent abilities
  • It has implications for the nature and utility of a college education, for the path of careers, for inequality and employability—even for the generational divide.
  • Then, in 2001, Donald Rumsfeld arrived at the Pentagon. The new secretary of defense carried with him a briefcase full of ideas from the corporate world: downsizing, reengineering, “transformational” technologies. Almost immediately, what had been an experimental concept became an article of faith
  • But once cadets got into actual command environments, which tend to be fluid and full of surprises, a different picture emerged. “Psychological hardiness”—a construct that includes, among other things, a willingness to explore “multiple possible response alternatives,” a tendency to “see all experience as interesting and meaningful,” and a strong sense of self-confidence—was a better predictor of leadership ability in officers after three years in the field.
  • Because there really is no such thing as multitasking—just a rapid switching of attention—I began to feel overstrained, put upon, and finally irked by the impossible set of concurrent demands. Shouldn’t someone be giving me a hand here? This, Hambrick explained, meant I was hitting the limits of working memory—basically, raw processing power—which is an important aspect of “fluid intelligence” and peaks in your early 20s. This is distinct from “crystallized intelligence”—the accumulated facts and know-how on your hard drive—which peaks in your 50s.
  • Others noticed the change but continued to devote equal attention to all four tasks. Their scores fell. This group, Hambrick found, was high in “conscientiousness”—a trait that’s normally an overwhelming predictor of positive job performance. We like conscientious people because they can be trusted to show up early, double-check the math, fill the gap in the presentation, and return your car gassed up even though the tank was nowhere near empty to begin with. What struck Hambrick as counterintuitive and interesting was that conscientiousness here seemed to correlate with poor performance.
  • he discovered another correlation in his test: The people who did best tended to score high on “openness to new experience”—a personality trait that is normally not a major job-performance predictor and that, in certain contexts, roughly translates to “distractibility.”
  • To borrow the management expert Peter Drucker’s formulation, people with this trait are less focused on doing things right, and more likely to wonder whether they’re doing the right things.
  • High in fluid intelligence, low in experience, not terribly conscientious, open to potential distraction—this is not the classic profile of a winning job candidate. But what if it is the profile of the winning job candidate of the future?
  • One concerns “grit”—a mind-set, much vaunted these days in educational and professional circles, that allows people to commit tenaciously to doing one thing well
  • These ideas are inherently appealing; they suggest that dedication can be more important than raw talent, that the dogged and conscientious will be rewarded in the end.
  • he studied West Point students and graduates.
  • Traditional measures such as SAT scores and high-school class rank “predicted leader performance in the stable, highly regulated environment of West Point” itself.
  • It would be supremely ironic if the advance of the knowledge economy had the effect of devaluing knowledge. But that’s what I heard, recurrently.
  • “Fluid, learning-intensive environments are going to require different traits than classical business environments,” I was told by Frida Polli, a co-founder of an AI-powered hiring platform called Pymetrics. “And they’re going to be things like ability to learn quickly from mistakes, use of trial and error, and comfort with ambiguity.”
  • “We’re starting to see a big shift,” says Guy Halfteck, a people-analytics expert. “Employers are looking less at what you know and more and more at your hidden potential” to learn new things
  • advice to employers? Stop hiring people based on their work experience. Because in these environments, expertise can become an obstacle.
  • “The Curse of Expertise.” The more we invest in building and embellishing a system of knowledge, they found, the more averse we become to unbuilding it.
  • All too often experts, like the mechanic in LePine’s garage, fail to inspect their knowledge structure for signs of decay. “It just didn’t occur to him,” LePine said, “that he was repeating the same mistake over and over.”
  • The devaluation of expertise opens up ample room for different sorts of mistakes—and sometimes creates a kind of helplessness.
  • Aboard littoral combat ships, the crew lacks the expertise to carry out some important tasks, and instead has to rely on civilian help
  • Meanwhile, the modular “plug and fight” configuration was not panning out as hoped. Converting a ship from sub-hunter to minesweeper or minesweeper to surface combatant, it turned out, was a logistical nightmare
  • So in 2016 the concept of interchangeability was scuttled for a “one ship, one mission” approach, in which the extra 20-plus sailors became permanent crew members
  • “As equipment breaks, [sailors] are required to fix it without any training,” a Defense Department Test and Evaluation employee told Congress. “Those are not my words. Those are the words of the sailors who were doing the best they could to try to accomplish the missions we gave them in testing.”
  • These results were, perhaps, predictable given the Navy’s initial, full-throttle approach to minimal manning—and are an object lesson on the dangers of embracing any radical concept without thinking hard enough about the downsides
  • a world in which mental agility and raw cognitive speed eclipse hard-won expertise is a world of greater exclusion: of older workers, slower learners, and the less socially adept.
  • if you keep going down this road, you end up with one really expensive ship with just a few people on it who are geniuses … That’s not a future we want to see, because you need a large enough crew to conduct multiple tasks in combat.
  • What does all this mean for those of us in the workforce, and those of us planning to enter it? It would be wrong to say that the 10,000-hours-of-deliberate-practice idea doesn’t hold up at all. In some situations, it clearly does.
  • A spinal surgery will not be performed by a brilliant dermatologist. A criminal-defense team will not be headed by a tax attorney. And in tech, the demand for specialized skills will continue to reward expertise handsomely.
  • But in many fields, the path to success isn’t so clear. The rules keep changing, which means that highly focused practice has a much lower return
  • In uncertain environments, Hambrick told me, “specialization is no longer the coin of the realm.”
  • It leaves us with lifelong learning,
  • I found myself the target of career suggestions. “You need to be a video guy, an audio guy!” the Silicon Valley talent adviser John Sullivan told me, alluding to the demise of print media
  • I found the prospect of starting over just plain exhausting. Building a professional identity takes a lot of resources—money, time, energy. After it’s built, we expect to reap gains from our investment, and—let’s be honest—even do a bit of coasting. Are we equipped to continually return to apprentice mode? Will this burn us out?
  • Everybody I met on the Giffords seemed to share that mentality. They regarded every minute on board—even during a routine transit back to port in San Diego Harbor—as a chance to learn something new.

Jared Diamond: We Could Be Living in a New Stone Age by 2114 - Mother Jones

  • It all depends, he says, on where we are in 2050:
  • Either by the year 2050 we’ve succeeded in developing a sustainable economy, in which case we can then ask your question about 100 years from now, because there will be 100 years from now; or by 2050 we’ve failed to develop a sustainable economy, which means that there will no longer be first world living conditions, and there either won’t be humans 100 years from now, or those humans 100 years from now will have lifestyles similar to those of Cro-Magnons 40,000 years ago, because we’ve already stripped away the surface copper and the surface iron. If we knock ourselves out of the first world, we’re not going to be able to rebuild a first world.
  • Not everybody agrees with Diamond that we’re in such a perilous state, of course. But there is perhaps no more celebrated chronicler of why civilizations rise, and why they fall. That is, after all, why we read him. So when Diamond says we’ve got maybe 50 years to turn it around, we should at least consider the possibility that he might actually be right. For if he is, the consequences are so intolerable that anything possible should be done to avert them.

Consciousness Isn't a Mystery. It's Matter. - The New York Times

  • Every day, it seems, some verifiably intelligent person tells us that we don’t know what consciousness is. The nature of consciousness, they say, is an awesome mystery. It’s the ultimate hard problem. The current Wikipedia entry is typical: Consciousness “is the most mysterious aspect of our lives”; philosophers “have struggled to comprehend the nature of consciousness.”
  • I find this odd because we know exactly what consciousness is — where by “consciousness” I mean what most people mean in this debate: experience of any kind whatever. It’s the most familiar thing there is, whether it’s experience of emotion, pain, understanding what someone is saying, seeing, hearing, touching, tasting or feeling. It is in fact the only thing in the universe whose ultimate intrinsic nature we can claim to know. It is utterly unmysterious.
  • The nature of physical stuff, by contrast, is deeply mysterious, and physics grows stranger by the hour. (Richard Feynman’s remark about quantum theory — “I think I can safely say that nobody understands quantum mechanics” — seems as true as ever.) Or rather, more carefully: The nature of physical stuff is mysterious except insofar as consciousness is itself a form of physical stuff.
  • “We know nothing about the intrinsic quality of physical events,” he wrote, “except when these are mental events that we directly experience.”
  • I think Russell is right: Human conscious experience is wholly a matter of physical goings-on in the body and in particular the brain. But why does he say that we know nothing about the intrinsic quality of physical events except when these are mental events we directly experience? Isn’t he exaggerating? I don’t think so
  • I need to try to reply to those (they’re probably philosophers) who doubt that we really know what conscious experience is. The reply is simple. We know what conscious experience is because the having is the knowing: Having conscious experience is knowing what it is. You don’t have to think about it (it’s really much better not to). You just have to have it. It’s true that people can make all sorts of mistakes about what is going on when they have experience, but none of them threaten the fundamental sense in which we know exactly what experience is just in having it.
  • If someone continues to ask what it is, one good reply (although Wittgenstein disapproved of it) is “you know what it is like from your own case.” Ned Block replies by adapting the response Louis Armstrong reportedly gave to someone who asked him what jazz was: “If you gotta ask, you ain’t never going to know.”
  • So we all know what consciousness is. Once we’re clear on this we can try to go further, for consciousness does of course raise a hard problem. The problem arises from the fact that we accept that consciousness is wholly a matter of physical goings-on, but can’t see how this can be so. We examine the brain in ever greater detail, using increasingly powerful techniques like fMRI, and we observe extraordinarily complex neuroelectrochemical goings-on, but we can’t even begin to understand how these goings-on can be (or give rise to) conscious experiences.
  • 1966 movie “Fantastic Voyage,” or imagine the ultimate brain scanner. Leibniz continued, “Suppose we do: visiting its insides, we will never find anything but parts pushing each other — never anything that could explain a conscious state.”
  • His mistake is to go further, and conclude that physical goings-on can’t possibly be conscious goings-on. Many make the same mistake today — the Very Large Mistake (as Winnie-the-Pooh might put it) of thinking that we know enough about the nature of physical stuff to know that conscious experience can’t be physical. We don’t. We don’t know the intrinsic nature of physical stuff, except — Russell again — insofar as we know it simply through having a conscious experience.
  • We find this idea extremely difficult because we’re so very deeply committed to the belief that we know more about the physical than we do, and (in particular) know enough to know that consciousness can’t be physical. We don’t see that the hard problem is not what consciousness is, it’s what matter is — what the physical is.
  • This point about the limits on what physics can tell us is rock solid, and it arises before we begin to consider any of the deep problems of understanding that arise within physics — problems with “dark matter” or “dark energy,” for example — or with reconciling quantum mechanics and general relativity theory.
  • Those who make the Very Large Mistake (of thinking they know enough about the nature of the physical to know that consciousness can’t be physical) tend to split into two groups. Members of the first group remain unshaken in their belief that consciousness exists, and conclude that there must be some sort of nonphysical stuff: They tend to become “dualists.” Members of the second group, passionately committed to the idea that everything is physical, make the most extraordinary move that has ever been made in the history of human thought. They deny the existence of consciousness: They become “eliminativists.”
  • no one has to react in either of these ways. All they have to do is grasp the fundamental respect in which we don’t know the intrinsic nature of physical stuff in spite of all that physics tells us. In particular, we don’t know anything about the physical that gives us good reason to think that consciousness can’t be wholly physical. It’s worth adding that one can fully accept this even if one is unwilling to agree with Russell that in having conscious experience we thereby know something about the intrinsic nature of physical reality.
  • It’s not the physics picture of matter that’s the problem; it’s the ordinary everyday picture of matter. It’s ironic that the people who are most likely to doubt or deny the existence of consciousness (on the ground that everything is physical, and that consciousness can’t possibly be physical) are also those who are most insistent on the primacy of science, because it is precisely science that makes the key point shine most brightly: the point that there is a fundamental respect in which the ultimate intrinsic nature of the stuff of the universe is unknown to us — except insofar as it is consciousness.

They're Watching You at Work - Don Peck - The Atlantic

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention.)
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.

History News Network | We Traded in One of the Most Self-Disciplined Presidents for the...

  • How ironic it is then that President Obama, the bane of conservatives, possessed an abundance of self-discipline, and President Trump, whom most conservatives (including Bennett) favored over Hillary Clinton, possesses almost none.
  • The ancient Greek philosopher Aristotle thought that political leaders should exercise practical wisdom (phronesis) or prudence. He considered temperance (i.e., moderation, self-restraint) and self-discipline two of the most important virtues required for such wisdom. He believed that the two virtues should help us regulate what he called “the appetitive faculty,” which deals with our emotions and desires
  • One of the twentieth-century’s most prominent commentators on political wisdom, Britain’s Isaiah Berlin (1909-1997), viewed temperance as an important political virtue, and he connected it to humility and tolerance—neither of which Trump displays. And in his “Two Concepts of Liberty,” Berlin wrote, “Freedom is self-mastery.”
  • Barry Schwartz and Kenneth Sharpe in their Practical Wisdom: The Right Way to Do the Right Thing (2010) state that such wisdom is greatly needed
  • The authors quote Aristotle and give the example of a man practicing practical wisdom and mention that “he had the self-control—the emotion-regulating skills—to choose rightly.”
  • Psychologist, futurist, and editor of The Wisdom Page Tom Lombardo also stresses the importance of temperance and self-control. In his new book on Future Consciousness, he includes a whole chapter (of 45 pages) on “Self-Control and Self-Responsibility.” In it he cites favorably two authors who claim that “most human problems are due to a lack of self-control.” He also states that “we cannot flourish without self-responsibility, self-control, and . . . . one of the most unethical forms of thinking and behavior in life . . . is to abdicate self-responsibility and self-control in ourselves.”
  • In Inside Obama’s Brain (2009), journalist Sasha Abramsky talked to over a hundred people who knew Obama and reported that “during the election campaign Obama almost never got upset, or panicked, by day-to-day shifts in momentum, by the ups and downs of opinion polls.” Almost a year into his presidency, Abramsky referred to the president as “a voice of moderation in a corrosively shrill, partisan political milieu.”
  • Up until the end of his presidency, Obama maintained his self-control and temperance. As a Huffington Post piece noted in 2016, he “has been the model of temperance in office on all fronts.”
  • Just as many individuals have commented on Obama’s self-discipline and temperance, so too have many remarked on Trump’s lack of these virtues
  • In May 2017, Brooks stated: “At base, Trump is an infantalist. There are three tasks that most mature adults have sort of figured out by the time they hit 25. Trump has mastered none of them. Immaturity is becoming the dominant note of his presidency, lack of self-control his leitmotif.”
  • Two months later, Douthat opined about Trump: “He is nonetheless clearly impaired, gravely deficient somewhere at the intersection of reason and judgment and conscience and self-control. . . . This president should not be the president, and the sooner he is not, the better.”
  • Karl Rove, a former senior adviser to President George W. Bush, insisted that Trump “lacks the focus or self-discipline to do the basic work required of a president.”
  • At about the same time former Republican senator Tom Coburn (R-OK) declared, “The question is, does he have the self-discipline and some control over his ego to be able to say ‘I’m wrong’ every now and then? I haven’t seen that.”
  • it is Trump’s narcissism and lack of humility that are his chief faults and hinder him most from being even a mediocre president.

Have Smartphones Destroyed a Generation? - The Atlantic

  • She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
  • The arrival of the smartphone has radically changed every aspect of teenagers’ lives, from the nature of their social interactions to their mental health. These changes have affected young people in every corner of the nation and in every type of household
  • Around 2012, I noticed abrupt shifts in teen behaviors and emotional states. The gentle slopes of the line graphs became steep mountains and sheer cliffs, and many of the distinctive characteristics of the Millennial generation began to disappear. In all my analyses of generational data—some reaching back to the 1930s—I had never seen anything like it.
  • the trends persisted, across several years and a series of national surveys. The changes weren’t just in degree, but in kind.
  • The biggest difference between the Millennials and their predecessors was in how they viewed the world; teens today differ from the Millennials not just in their views but in how they spend their time. The experiences they have every day are radically different from those of the generation that came of age just a few years before them.
  • it was exactly the moment when the proportion of Americans who owned a smartphone surpassed 50 percent.
  • theirs is a generation shaped by the smartphone and by the concomitant rise of social media. I call them iGen
  • Born between 1995 and 2012, members of this generation are growing up with smartphones, have an Instagram account before they start high school, and do not remember a time before the internet.
  • iGen’s oldest members were early adolescents when the iPhone was introduced, in 2007, and high-school students when the iPad entered the scene, in 2010. A 2017 survey of more than 5,000 American teens found that three out of four owned an iPhone.
  • I had grown accustomed to line graphs of trends that looked like modest hills and valleys. Then I began studying Athena’s generation.
  • More comfortable in their bedrooms than in a car or at a party, today’s teens are physically safer than teens have ever been. They’re markedly less likely to get into a car accident and, having less of a taste for alcohol than their predecessors, are less susceptible to drinking’s attendant ills.
  • Psychologically, however, they are more vulnerable than Millennials were: Rates of teen depression and suicide have skyrocketed since 2011. It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades. Much of this deterioration can be traced to their phones.
  • the twin rise of the smartphone and social media has caused an earthquake of a magnitude we’ve not seen in a very long time, if ever. There is compelling evidence that the devices we’ve placed in young people’s hands are having profound effects on their lives—and making them seriously unhappy.
  • But the allure of independence, so powerful to previous generations, holds less sway over today’s teens, who are less likely to leave the house without their parents. The shift is stunning: 12th-graders in 2015 were going out less often than eighth-graders did as recently as 2009.
  • Today’s teens are also less likely to date. The initial stage of courtship, which Gen Xers called “liking” (as in “Ooh, he likes you!”), kids now call “talking”—an ironic choice for a generation that prefers texting to actual conversation. After two teens have “talked” for a while, they might start dating.
  • only about 56 percent of high-school seniors in 2015 went out on dates; for Boomers and Gen Xers, the number was about 85 percent.
  • The decline in dating tracks with a decline in sexual activity. The drop is the sharpest for ninth-graders, among whom the number of sexually active teens has been cut by almost 40 percent since 1991. The average teen now has had sex for the first time by the spring of 11th grade, a full year later than the average Gen Xer
  • The teen birth rate hit an all-time low in 2016, down 67 percent since its modern peak, in 1991.
  • Nearly all Boomer high-school students had their driver’s license by the spring of their senior year; more than one in four teens today still lack one at the end of high school.
  • In conversation after conversation, teens described getting their license as something to be nagged into by their parents—a notion that would have been unthinkable to previous generations.
  • In the late 1970s, 77 percent of high-school seniors worked for pay during the school year; by the mid-2010s, only 55 percent did. The number of eighth-graders who work for pay has been cut in half.
  • Beginning with Millennials and continuing with iGen, adolescence is contracting again—but only because its onset is being delayed. Across a range of behaviors—drinking, dating, spending time unsupervised—18-year-olds now act more like 15-year-olds used to, and 15-year-olds more like 13-year-olds. Childhood now stretches well into high school.
  • In an information economy that rewards higher education more than early work history, parents may be inclined to encourage their kids to stay home and study rather than to get a part-time job. Teens, in turn, seem to be content with this homebody arrangement—not because they’re so studious, but because their social life is lived on their phone. They don’t need to leave home to spend time with their friends.
  • eighth-, 10th-, and 12th-graders in the 2010s actually spend less time on homework than Gen X teens did in the early 1990s.
  • The time that seniors spend on activities such as student clubs and sports and exercise has changed little in recent years. Combined with the decline in working for pay, this means iGen teens have more leisure time than Gen X teens did, not less.
  • So what are they doing with all that time? They are on their phone, in their room, alone and often distressed.
  • despite spending far more time under the same roof as their parents, today’s teens can hardly be said to be closer to their mothers and fathers than their predecessors were. “I’ve seen my friends with their families—they don’t talk to them,” Athena told me. “They just say ‘Okay, okay, whatever’ while they’re on their phones. They don’t pay attention to their family.” Like her peers, Athena is an expert at tuning out her parents so she can focus on her phone.
  • The number of teens who get together with their friends nearly every day dropped by more than 40 percent from 2000 to 2015; the decline has been especially steep recently.
  • Eighth-graders who are heavy users of social media increase their risk of depression by 27 percent, while those who play sports, go to religious services, or even do homework more than the average teen cut their risk significantly.
  • The roller rink, the basketball court, the town pool, the local necking spot—they’ve all been replaced by virtual spaces accessed through apps and the web.
  • The results could not be clearer: Teens who spend more time than average on screen activities are more likely to be unhappy, and those who spend more time than average on nonscreen activities are more likely to be happy.
  • There’s not a single exception. All screen activities are linked to less happiness, and all nonscreen activities are linked to more happiness
  • Eighth-graders who spend 10 or more hours a week on social media are 56 percent more likely to say they’re unhappy than those who devote less time to social media
  • If you were going to give advice for a happy adolescence based on this survey, it would be straightforward: Put down the phone, turn off the laptop, and do something—anything—that does not involve a screen
  • Social-networking sites like Facebook promise to connect us to friends. But the portrait of iGen teens emerging from the data is one of a lonely, dislocated generation. Teens who visit social-networking sites every day but see their friends in person less frequently are the most likely to agree with the statements “A lot of times I feel lonely,” “I often feel left out of things,” and “I often wish I had more good friends.” Teens’ feelings of loneliness spiked in 2013 and have remained high since.
  • This doesn’t always mean that, on an individual level, kids who spend more time online are lonelier than kids who spend less time online.
  • Teens who spend more time on social media also spend more time with their friends in person, on average—highly social teens are more social in both venues, and less social teens are less so.
  • The more time teens spend looking at screens, the more likely they are to report symptoms of depression.
  • It’s not only a matter of fewer kids partying; fewer kids are spending time simply hanging out
  • Teens who spend three hours a day or more on electronic devices are 35 percent more likely to have a risk factor for suicide, such as making a suicide plan. (That’s much more than the risk related to, say, watching TV.)
  • Since 2007, the homicide rate among teens has declined, but the suicide rate has increased. As teens have started spending less time together, they have become less likely to kill one another, and more likely to kill themselves. In 2011, for the first time in 24 years, the teen suicide rate was higher than the teen homicide rate.
  • For all their power to link kids day and night, social media also exacerbate the age-old teen concern about being left out.
  • Today’s teens may go to fewer parties and spend less time together in person, but when they do congregate, they document their hangouts relentlessly—on Snapchat, Instagram, Facebook. Those not invited to come along are keenly aware of it. Accordingly, the number of teens who feel left out has reached all-time highs across age groups.
  • Forty-eight percent more girls said they often felt left out in 2015 than in 2010, compared with 27 percent more boys. Girls use social media more often, giving them additional opportunities to feel excluded and lonely when they see their friends or classmates getting together without them.
  • Social media levy a psychic tax on the teen doing the posting as well, as she anxiously awaits the affirmation of comments and likes. When Athena posts pictures to Instagram, she told me, “I’m nervous about what people think and are going to say. It sometimes bugs me when I don’t get a certain amount of likes on a picture.”
  • Girls have also borne the brunt of the rise in depressive symptoms among today’s teens. Boys’ depressive symptoms increased by 21 percent from 2012 to 2015, while girls’ increased by 50 percent—more than twice as much
  • The rise in suicide, too, is more pronounced among girls. Although the rate increased for both sexes, three times as many 12-to-14-year-old girls killed themselves in 2015 as in 2007, compared with twice as many boys
  • Social media give middle- and high-school girls a platform on which to carry out the style of aggression they favor, ostracizing and excluding other girls around the clock.
  • I asked my undergraduate students at San Diego State University what they do with their phone while they sleep. Their answers were a profile in obsession. Nearly all slept with their phone, putting it under their pillow, on the mattress, or at the very least within arm’s reach of the bed. They checked social media right before they went to sleep, and reached for their phone as soon as they woke up in the morning
  • the smartphone is cutting into teens’ sleep: Many now sleep less than seven hours most nights. Sleep experts say that teens should get about nine hours of sleep a night; a teen who is getting less than seven hours a night is significantly sleep deprived
  • Fifty-seven percent more teens were sleep deprived in 2015 than in 1991. In just the four years from 2012 to 2015, 22 percent more teens failed to get seven hours of sleep.
  • Two national surveys show that teens who spend three or more hours a day on electronic devices are 28 percent more likely to get less than seven hours of sleep than those who spend fewer than three hours, and teens who visit social-media sites every day are 19 percent more likely to be sleep deprived. (The relative-risk arithmetic behind figures like these is sketched after this list.)
  • Teens who read books and magazines more often than the average are actually slightly less likely to be sleep deprived—either reading lulls them to sleep, or they can put the book down at bedtime.
  • Sleep deprivation is linked to myriad issues, including compromised thinking and reasoning, susceptibility to illness, weight gain, and high blood pressure. It also affects mood: People who don’t sleep enough are prone to depression and anxiety.
  • correlations between depression and smartphone use are strong enough to suggest that more parents should be telling their kids to put down their phone.
  • What’s at stake isn’t just how kids experience adolescence. The constant presence of smartphones is likely to affect them well into adulthood. Among people who suffer an episode of depression, at least half become depressed again later in life. Adolescence is a key time for developing social skills; as teens spend less time with their friends face-to-face, they have fewer opportunities to practice them
  • Significant effects on both mental health and sleep time appear after two or more hours a day on electronic devices. The average teen spends about two and a half hours a day on electronic devices. Some mild boundary-setting could keep kids from falling into harmful habits.
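Figures like “28 percent more likely” in the annotations above are relative-risk comparisons between two group rates. A minimal sketch of that arithmetic, assuming hypothetical group rates (the excerpts don't give the surveys' actual numbers):

```python
# Minimal sketch of the relative-risk arithmetic behind claims like
# "28 percent more likely to get less than seven hours of sleep."
# The group rates below are hypothetical placeholders, not survey data.

def percent_more_likely(rate_exposed: float, rate_unexposed: float) -> float:
    """How much more likely (in percent) the exposed group is."""
    relative_risk = rate_exposed / rate_unexposed
    return (relative_risk - 1.0) * 100.0

# Hypothetical: 41% of heavy device users vs. 32% of light users
# sleep less than seven hours a night.
print(round(percent_more_likely(0.41, 0.32)))  # -> 28
```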

Scientists Discover Some of the Oldest Signs of Life on Earth - The Atlantic - 0 views

  • The Earth was formed around 4.54 billion years ago. If you condense that huge swath of prehistory into a single calendar year, then the 3.95-billion-year-old graphite that the Tokyo team analyzed was created in the third week of February. By contrast, the earliest fossils ever found are 3.7 billion years old; they were created in the second week of March. (The calendar arithmetic is sketched after this list.)
  • Those fossils, from the Isua Belt in southwest Greenland, are stromatolites—layered structures created by communities of bacteria. And as I reported last year, their presence suggests that life already existed in a sophisticated form at the 3.7-billion-year mark, and so must have arisen much earlier. And indeed, scientists have found traces of biologically produced graphite throughout the region, in other Isua Belt rocks that are 3.8 billion years old, and in hydrothermal vents off the coast of Quebec that are at least as old, and possibly even older.
  • “As far back as the rock record extends—that is, as far back as we can look for direct evidence of early life, we are finding it. Earth has been a biotic, life-sustaining planet since close to its beginning.”
  • living organisms concentrate carbon-12 in their cells—and when they die, that signature persists. When scientists find graphite that’s especially enriched in carbon-12, relative to carbon-13, they can deduce that living things were around when that graphite was first formed. And that’s exactly what the Tokyo team found in the Saglek Block—grains of graphite, enriched in carbon-12, encased within 3.95-billion-year-old rock.
  • the team calculated the graphite was created at temperatures between 536 and 622 degrees Celsius—a range that’s consistent with the temperatures at which the surrounding metamorphic rocks were transformed. This suggests that the graphite was already there when the rocks were heated and warped, and didn’t sneak in later. It was truly OG—original graphite.
  • Still, all of this evidence suggests Earth was home to life during its hellish infancy, and that such life abounded in a variety of habitats. Those pioneering organisms—bacteria, probably—haven’t left any fossils behind. But Sano and Komiya hope to find some clues about them by analyzing the Saglek Block rocks. The levels of nitrogen, iron, and sulfur in the rocks could reveal which energy sources those organisms exploited, and which environments they inhabited. They could tell us how life first lived.
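The calendar-year analogy in the first annotation above is simple proportional arithmetic. A minimal sketch, assuming an idealized 365-day year with Earth's formation on January 1 (dates are approximate):

```python
from datetime import date, timedelta

# Sketch of the calendar-year analogy above: compress Earth's 4.54-billion-
# year history into one year, with the planet's formation on January 1.
# Dates are approximate (months and leap days are idealized).

EARTH_AGE_GYA = 4.54  # age of the Earth, in billions of years

def calendar_date(age_gya: float) -> date:
    """Map an age in billions of years ago to a date in the analogy year."""
    fraction_elapsed = (EARTH_AGE_GYA - age_gya) / EARTH_AGE_GYA
    return date(2023, 1, 1) + timedelta(days=fraction_elapsed * 365)

print(calendar_date(3.95))  # ~Feb 17: third week of February (oldest graphite)
print(calendar_date(3.7))   # ~Mar 9: second week of March (earliest fossils)
```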

Why Did the Dean of the Most Diverse Law School in the Country Cancel Herself? - The New York Times - 0 views

  • Why Did the Dean of the Most Diverse Law School in the Country Cancel Herself?
  • Was it the unfortunate use of a single word? Or something far more complicated?
  • Mary Lu Bilek, who has spent 32 years at the law school at the City University of New York, the past five of them as dean, sent an email to students and faculty with the subject line: “Apology.”
  • Discussing a contentious issue of race and tenure in a committee meeting last fall, she had likened herself to a “slaveholder.”
  • It was a strange, deeply jarring thing to say, but she had been trying to make the point that her position left her responsible for whatever racial inequities might exist institutionally
  • What the dean might have regarded as an admission of culpability, some of her colleagues viewed as an expression of the buried prejudices well-intentioned liberals never think they have.
  • Ms. Bilek quickly realized that she had drawn a terrible — perhaps unforgivable — analogy
  • “begun education and counseling to uncover and overcome my biases.”
  • To colleagues in the field, the circumstances of Ms. Bilek’s departure struck a note that was both ironic and painful.
  • Decades ago, long before it became commonplace, Ms. Bilek railed against the bar exam and other standardized tests for their disparate impact on low-income students
  • She had presented herself and the institution as “anti-racist,” they wrote, while ignoring how her own decisions perpetuated “institutional racism.”
  • On the face of things, it seemed as though Ms. Bilek had been lost to the maw of cancel culture and its relentless appetite for hapless boomer prey.
  • “I regret that my mistake means that I will not be doing that work” — the work of fighting racism — “with my CUNY colleagues,”
  • “Her reputation in the world of deans is that of someone who cares deeply about racial justice,”
  • Prestige in academia begins, of course, with tenure. Ms. Bilek’s troubles started last spring when she argued for granting early tenure, an extremely precious commodity, to someone about to become an administrator — a young white woman named Allie Robbins
  • Without tenure, administrative work in a university is an especially oppressive time suck, robbing an academic of the hours that could be spent on research and writing and conference-going — essentially, what is required for tenure.
  • Beyond that, the risk of alienating people who someday might weigh in on your own tenure case remained high.
  • As the fall progressed, anger continued to ferment around Ms. Bilek.
  • The day after Christmas, 22 faculty members wrote a letter denouncing her wish to leapfrog a white junior academic in the promotion process, her “slaveholder” reference, and what they viewed as her resistance to listen to faculty members of color on the personnel committee “as they pointed out the disparate racial impacts” of her conduct.
  • “But I am certain that the work they do within the Law School and in the world will bring us to a more equal, anti-racist society.”
  • Next came a list of demands that included a public apology for her misdeeds, changes to practices in governance and a retreat from any outside roles furthering the perception that she was “an anti-racist dean.”
  • “We intentionally chose not to ask her to step down but to demand instead that she commit to the systemic work that her stated anti-racist principles required,”
  • “Dean Bilek chose to ignore that outstretched hand.”
  • “We said, ‘We don’t want to make a scene — no single action should define any of us. We don’t want to take away from all the work you’ve done at the law school, but we want the accountability,’”
  • “I thought there was a chance for redemption — we do not want to cancel folks; we are not people who think in carceral ways.”
  • Kept under wraps, news of all this turmoil reached the student body only last week, and when they discovered what Ms. Bilek had said and done and how long they had been left oblivious, a large and vocal faction did not feel so generous

History Is About Stories. Here's Why We Get Them Wrong | Time - 1 views

  • Science comes hard to most of us because it can’t really take that form. Instead it’s equations, models, theories and the data that support them. But ironically, science offers an explanation of why we love stories.
  • It starts with a challenge posed in human evolution — but the more we come to understand about that subject, the more we see that our storytelling instinct can lead us astray, especially when it comes to how most of us understand history.
  • Many animals have a highly developed mind-reading instinct, a sort of tracking-device technique shared with creatures that have no language, not even a language of thought.
  • It’s what they use to track prey and avoid predation.
  • The theory of mind is so obvious it’s nearly invisible: it tells us that behavior is the result of the joint operation of pairs of beliefs and desires.
  • The desires are about the ways we want things to turn out in the future. The beliefs are about the way things are now.
  • The theory of mind turns what people do into a story with a plot by pairing up the content of beliefs and desires, what they are about. (A toy version of this belief-desire pairing is sketched in code after this list.)
  • Psycholinguistics has shown that the theory of mind is necessary for learning language and almost anything else our parents teach us.
  • Imitating others requires using the theory to figure out what they want us to do and in what order. Without it, you can’t learn much beyond what other higher primates can.
  • The theory of mind makes us construct stories obsessively, and thus encourages us to see the past as a set of them.
  • When popular historians seek to know why Hitler declared war on the U.S. (when he didn’t have to), they put the theory of mind to work: What did he believe and what was it that he wanted that made him do such a foolish thing?
  • The trouble is that the theory of mind is completely wrong about the way the mind, i.e. the brain, actually works. We can’t help but use it to guess what is going on in other people’s minds, and historians rely on it, but the evidence from neuroscience shows that in fact what’s “going on” in anyone’s mind is not a decision about what to do in the light of beliefs and desires, but rather a series of neural circuitry firings.
  • The wrongness of the theory of mind is so profound it makes false all the stories we know and love, in narrative history (and in historical novels).
  • Neuroscience reveals that the brain is not organized even remotely to work the way the theory of mind says it does. The fact that narrative histories give persistently different answers to questions historians have been asking for centuries should be evidence that storytelling is not where the real answers can be found.
  • Crucially, they discovered that while different parts of the brain control different things, the neurons’ electrical signals don’t differ in “content”; they are not about different subjects. They are not about anything at all. Each neuron is just in a different part of the mid-brain, doing its job in exactly the same way all other neurons do, sending the same electrochemical oscillations.
  • There is nothing in our brains to vindicate the theory’s description of how anyone ever makes up his or her mind. And that explains a lot about how bad the theory of mind is at predicting anything much about the future, or explaining anything much about the past.
  • If we really want historical knowledge we’ll need to use the same tools scientists use — models and theories we can quantify and test. Guessing what was going through Hitler’s mind, and weaving it into a story is no substitute for empirical science.
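The belief-desire schema the annotations above describe can be made concrete as a toy data structure. A minimal sketch; the class name and example content are invented for illustration, and the article's own point is that this schema misdescribes how brains actually work:

```python
from dataclasses import dataclass

# Toy version of the belief-desire schema described above: the "theory of
# mind" explains an action by pairing what the agent wants (desire) with
# how the agent takes the world to be (belief). All names and content here
# are invented for illustration.

@dataclass
class BeliefDesirePair:
    belief: str  # how things are now, as the agent sees them
    desire: str  # how the agent wants things to turn out

    def explain(self, action: str) -> str:
        return (f"They did '{action}' because they wanted '{self.desire}' "
                f"and believed '{self.belief}'.")

story = BeliefDesirePair(
    belief="Japan's entry will stretch American forces across two oceans",
    desire="a free hand to win the war in Europe",
)
print(story.explain("declare war on the U.S."))
```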

The Moral Ill Effects of Teaching Economics | HuffPost - 0 views

  • Other studies have found economics students to exhibit a stronger tendency towards anti-social positions compared to their peers.
  • Carter and Irons found that, relative to non-economics students, economics students were much more likely to offer their partners small sums, and, thus, deviate from a “fair” 50/50 split. (The ultimatum-game setup behind this finding is sketched after this list.)
  • The authors found that, after taking an economics class, students’ responses to the end-of-the-semester survey were more likely to reflect a decline in honest behavior than students who studied astronomy.
  • They found that economics students are less likely to consider a vendor who increases the price of bottled water on a hot day to be acting “unfairly.”
  • They found ideological differences between lower-level economics students and upper-level economics students that are similar in kind to the measured differences between the ideology of economics students as a whole and their peers. Upper-level students are even less likely to support egalitarian solutions to distribution problems than lower-level students, suggesting that time spent studying economics does have an indoctrination effect.
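The Carter and Irons result comes from the ultimatum game. A minimal sketch of the game's payoff logic, with an illustrative stake and acceptance threshold rather than the study's actual parameters:

```python
# Minimal sketch of the ultimatum game behind the Carter and Irons result:
# a proposer offers a split of a fixed stake; the responder either accepts
# (both players are paid) or rejects (neither is paid). The stake and the
# responder's threshold below are illustrative, not the study's parameters.

def ultimatum(stake: float, offer: float, min_acceptable: float):
    """Return (proposer_payoff, responder_payoff) for one round."""
    if offer >= min_acceptable:
        return stake - offer, offer  # accepted: both are paid
    return 0.0, 0.0                  # rejected: neither is paid

print(ultimatum(10.0, 5.0, 3.0))  # "fair" 50/50 split -> (5.0, 5.0)
print(ultimatum(10.0, 1.0, 3.0))  # lowball offer rejected -> (0.0, 0.0)
```

On a textbook self-interest analysis the proposer should offer only the minimum the responder will accept; the studies cited above ask how closely students' offers track that prediction rather than the "fair" split.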

Trump's disastrous end to his shocking presidency - CNNPolitics - 0 views

  • President Donald Trump is leaving America in a vortex of violence, sickness and death and more internally estranged than it has been for 150 years.
  • Hospitals are swamped and medical workers are shattered amid a faltering rollout of the vaccine that is supposed to end the crisis.
  • It took 200 years for the country to rack up its first two presidential impeachments.
    • edencottone
       
      made history but in a bad way. This president is deserving of the 2 impeachments
  • Trump's malfeasance has led the country down that awful, divisive path twice in just more than a year.
    • edencottone
       
      though this line is opinionated I agree
  • The city Trump has called home for four years is being turned into an armed camp incongruous with the mood of joy and renewal that pulsates through most inaugurations.
  • In a symbol of a democracy under siege, the people's buildings -- the White House and the US Capitol -- are caged behind ugly iron and cement barriers.
    • edencottone
       
      a threat to our democracy
  • eight days
  • unintended irony, Biden's team has picked "America United"
  • It is becoming ever more obvious that the horrific scenes on Capitol Hill on Wednesday were not a one-off.
  • In a chilling new warning, the FBI revealed the possible next stage in this now nationwide wave of radicalization, saying armed protests were planned at state Capitols in all 50 states between January 16 and Inauguration Day, January 20.
  • Former FBI Deputy Director Andrew McCabe was shocked by the magnitude of the bureau's intelligence on possible new violence.
  • "I don't think in the entire scope of my career working counter terrorism issues for many, many years, I don't think I ever saw a bulletin go out that concerned armed protest activity in 50 states in a three- or four-day period,"
    • edencottone
       
      we are in uncharted territory
  • he was not afraid of taking the oath of office outside next week
  • So far, after a massive domestic terror attack on the citadel of US democracy, there has been no major public briefing by any major federal law enforcement agency or the White House, an omission that fosters a sense of an absent government
  • By contrast, senior officials from the outgoing Bush administration and the incoming Obama administration worked closely together in the Situation Room on January 20, 2009, when there was concern about the authenticity of a terror threat to the inauguration.
  • current atmosphere of fear and wild political insurrection
  • Momentum towards impeachment is now all but unstoppable
  • hinted at the insincerity of the Republican approach.
  • With a few exceptions, Republicans -- who indulged and in many cases supported Trump's blatantly false claims of electoral fraud for weeks -- have responded to the uproar over last week's Capitol attack by complaining that by pushing impeachment, Democrats are fracturing national unity.
    • edencottone
       
      good that they now acknowledge however should have been done much earlier
  • His comment eerily recalled the rationalizations of Republicans who declined to convict Trump in his first impeachment trial after he tried to get Ukraine to interfere in the election to damage Biden.
  • "Face the Nation."
  • has emerged from many dark periods since the Civil War
    • edencottone
       
      we can do it again
  • Trump has not appeared in public for days.
  • The virus is meanwhile running rampant. Eleven states and Washington, DC, just recorded their highest 7-day average of new cases of Covid-19 since the pandemic began. For the first time, the country is averaging over 3,000 deaths from the pandemic per day.
  • hopes that the nation could soon turn a corner are being tempered by the glitches in the vaccine rollout.

Opinion | An Old Flaw in U.S. Thinking - The New York Times - 1 views

    • melnikju
       
      Ironically, this is still very relevant to how the US likes to stick its nose in other countries' business for its own gain.
  • the Great Satan.

Modern Science Didn't Appear Until the 17th Century. What Took So Long? - The New York Times - 0 views

  • While modern science is built on the primacy of empirical data — appealing to the objectivity of facts — actual progress requires determined partisans to move it along.
  • human civilization has existed for millenniums, and modern science — as distinct from ancient and medieval science, or so-called natural philosophy — has only been around for a few hundred years. What took so long?
  • Strevens gives the example of a biologist couple who have spent every summer since 1973 on the Galápagos, measuring finches; it took them four decades before they had enough data to conclude that they had observed a new species of finch.
  • focusing so narrowly, for so long, on tedious work that may not come to anything is inherently unappealing for most people. Rich and learned cultures across the world pursued all kinds of erudition and scholarly traditions, but didn’t develop this “knowledge machine” until relatively recently
  • it took a cataclysm to disrupt the longstanding way of looking at the world in terms of an integrated whole.
  • Even though Newton was an ardent alchemist with a side interest in biblical prophecy, he supported his scientific findings with empirical inquiry; he was, Strevens argues, “a natural intellectual compartmentalizer” who arrived at a fortuitous time.
  • “the iron rule of explanation,” requiring scientists to settle arguments by empirical testing, imposing on them a common language “regardless of their intellectual predilections, cultural biases or narrow ambitions.”
  • Climate change, pandemics — he comes up to the present day, ending on a grim but resolute note, hopeful that scientists will adapt and find a better way to communicate with a suspicious public. “We’ve pampered and praised the knowledge machine, given it the autonomy it has needed to grow,” he writes. “Now we desperately need its advice.”

9 Things You May Not Know About Isaac Newton - HISTORY - 0 views

  • The experience of being abandoned by his mother scarred Newton and likely played a role in shaping his solitary, untrusting nature.
  • As an adult, Newton immersed himself in his work, had no hobbies and never married. He even remained silent about some of his scientific and mathematical discoveries for years, if he published them at all.
  • However, at age 15 or 16, he was ordered to quit school by his mother (then widowed for a second time) and return to Woolsthorpe Manor to become a farmer
  • In 1665, following an outbreak of the bubonic plague in England, Cambridge University closed its doors, forcing Newton to return home to Woolsthorpe Manor.
  • While sitting in the garden there one day, he saw an apple fall from a tree, providing him with the inspiration to eventually formulate his law of universal gravitation.
  • During his tenure at the mint, Newton supervised a major initiative to take all of the country’s old coins out of circulation and replace them with more reliable currency.
  • Although he remained at Cambridge for nearly 30 years, Newton showed little interest in teaching or in his students, and his lectures were sparsely attended; frequently, no one showed up at all
  • In 1669, Newton, then 26, was appointed the Lucasian professor of mathematics at Cambridge, one of the world’s oldest universities, whose origins date to 1209.
  • He also was focused on investigating counterfeiters, and as a result became acquainted with the city’s seedy underbelly as he personally tracked down and interviewed suspected criminals, receiving death threats along the way
  • In addition to the scientific endeavors for which he’s best known, Newton spent much of his adult life pursuing another interest, alchemy, whose goals included finding the philosopher’s stone, a substance that allegedly could turn ordinary metals like lead and iron into gold
  • From 1689 to 1690, Newton was a member of Parliament, representing Cambridge University. During this time, the legislative body enacted the Bill of Rights, which limited the power of the monarchy and laid out the rights of Parliament along with certain individual rights
  • Newton served a second brief term in Parliament, from 1701 to 1702, and again seems to have contributed little.
  • When it came to his intellectual rivals, Newton could be jealous and vindictive. Among those with whom he feuded was German mathematician and philosopher Gottfried Leibniz; the two men had a bitter battle over who invented calculus
  • In 1705, Newton was knighted by Queen Anne.

Imagine a World Without Apps - The New York Times - 0 views

  • Allow me to ask a wild question: What if we played games, shopped, watched Netflix and read news on our smartphones — without using apps?
  • the downsides of our app system — principally the control that Apple and Google, the dominant app store owners in much of the world, exert over our digital lives — are onerous enough to contemplate another path.
  • in recent months, Microsoft’s Xbox video gaming console, the popular game Fortnite and other game companies have moved ahead with technology that makes it possible to play video games on smartphone web browsers.
  • if apps weren’t dominant, would we have a richer variety of digital services from a broader array of companies?
  • In the early smartphone era, there was a tug of war between technologies that were more like websites and the apps we know today. Apps won, mostly because they were technically superior.
  • control. Apple and Google dictate much of what is allowed on the world’s phones. There are good outcomes from this, including those companies weeding out bad or dangerous apps and giving us one place to find them.
  • with unhappy side effects. Apple and Google charge a significant fee on many in-app purchases, and they’ve forced app makers into awkward workarounds.
  • You know what’s free from Apple and Google’s iron grip? The web. Smartphones could lean on the web instead.
  • This is about imagining an alternate reality where companies don’t need to devote money to creating apps that are tailored to iPhones and Android phones, can’t work on any other devices and obligate app makers to hand over a cut of each sale.
  • Maybe more smaller digital companies could thrive. Maybe our digital services would be cheaper and better. Maybe we’d have more than two dominant smartphone systems

The Case for Teaching Ignorance - The New York Times - 1 views

  • Far too often, she believed, teachers fail to emphasize how much about a given topic is unknown.
  • She wanted her students to recognize the limits of knowledge and to appreciate that questions often deserve as much attention as answers
  • in recent years scholars have made a convincing case that focusing on uncertainty can foster latent curiosity, while emphasizing clarity can convey a warped understanding of knowledge.
  • Presenting ignorance as less extensive than it is, knowledge as more solid and more stable, and discovery as neater also leads students to misunderstand the interplay between answers and questions.
  • Discovery is not the neat and linear process many students imagine, but usually involves, in Dr. Firestein’s phrasing, “feeling around in dark rooms, bumping into unidentifiable things, looking for barely perceptible phantoms.”
  • As he argued in his 2012 book “Ignorance: How It Drives Science,” many scientific facts simply aren’t solid and immutable, but are instead destined to be vigorously challenged and revised by successive generations.
  • People tend to think of not knowing as something to be wiped out or overcome, as if ignorance were simply the absence of knowledge. But answers don’t merely resolve questions; they provoke new ones.
  • The larger the island of knowledge grows, the longer the shoreline — where knowledge meets ignorance — extends. The more we know, the more we can ask. (A geometric sketch of this shoreline metaphor follows this list.)
  • Questions don’t give way to answers so much as the two proliferate together. Answers breed questions. Curiosity isn’t merely a static disposition but rather a passion of the mind that is ceaselessly earned and nurtured.
  • Mapping the coast of the island of knowledge, to continue the metaphor, requires a grasp of the psychology of ambiguity. The ever-expanding shoreline, where questions are born of answers, is terrain characterized by vague and conflicting information. The resulting state of uncertainty, psychologists have shown, intensifies our emotions: not only exhilaration and surprise, but also confusion and frustration.
  • The borderland between known and unknown is also where we strive against our preconceptions to acknowledge and investigate anomalous data, a struggle Thomas S. Kuhn described in his 1962 classic, “The Structure of Scientific Revolutions.”
  • giving due emphasis to unknowns, highlighting case studies that illustrate the fertile interplay between questions and answers, and exploring the psychology of ambiguity are essential.
  • The time has come to “view ignorance as ‘regular’ rather than deviant,” the sociologists Matthias Gross and Linsey McGoey have boldly argued. Our students will be more curious — and more intelligently so — if, in addition to facts, they were equipped with theories of ignorance as well as theories of knowledge.
  • This article makes an excellent case for "ignorance studies" but, ironically, treats "agnotology" as a cutting-edge concept when in fact it is an old notion with a long history and a better name: "nescience." Nescience can refer to the Socratic idea of learned ignorance, but the concept was also developed in the eighteenth century, chiefly by Scottish commonsense philosopher Thomas Reid
  • "nescience" has considerable advantages over "agnotology". Linguistically, nescience is the precise opposite of "science" (scio: "I know"; nescio: "I don't know"). So why not stick with the Latin root word--and avoid any of the unfortunate connotations "ignorance" has in contemporary parlance? Historically, the word reminds us of the concept's provenance in Socratic philosophy and the Scottish enlightenment. Finally, unearthing the history of "nescience" might yield an unexpected insight into American intellectual history, since many of our founders read Thomas Reid's philosophy in college.

Scientists are baffled: What's up with the universe? - The Washington Post - 0 views

  • The universe is unimaginably big, and it keeps getting bigger. But astronomers cannot agree on how quickly it is growing — and the more they study the problem, the more they disagree.
  • Some scientists call this a “crisis” in cosmology. A less dramatic term in circulation is “the Hubble Constant tension.”
  • Nine decades ago, the astronomer Edwin Hubble showed that the universe is orders of magnitude vaster than previously imagined — and the whole kit and kaboodle is expanding. The rate of that expansion is a number called the Hubble Constant.
  • It’s a slippery number, however. Measurements using different techniques have produced different results, and the numbers show no sign of converging even as researchers refine their observations
  • the theorists are intrigued. They hope the Hubble Constant confusion is the harbinger of a potential major discovery — some "new physics."
  • “Any time there’s a discrepancy, some kind of anomaly, we all get very excited,”
  • “Where’s it all going to go? How’s it all going to end? That’s a big question,”
  • One idea floating around is that there could have been something called Early Dark Energy that skewed the appearance of the background radiation
  • “New physics might be that there’s some form of energy that acted in the earliest moments of the evolution of the universe. You’d get an injection of energy that’d then have to disappear,”
  • Leavitt, a then-obscure employee of the Harvard College Observatory, discovered that the intrinsically brighter stars have longer periods. This insight — Leavitt’s law — allows astronomers to know the Cepheid’s absolute luminosity, then gauge the distance to the star based on how bright or faint it appears. (A sketch of this distance arithmetic, and of the Hubble Constant estimate it feeds, follows this list.)
  • (Just to be clear: The Hubble Constant in question is the rate of expansion in our “local” universe, not the rate of expansion when the background radiation was first emitted billions of years ago. Over time, the Hubble Constant isn’t constant.)
  • At the dawn of the 21st century, this Standard Model seemed to pass every observational test. And any disparities in the measurement of the Hubble Constant would surely be ironed out with further observations, scientists assumed. They had even nailed down the age of the universe precisely: 13.8 billion years.
  • “We felt really good,”
  • He added, jokingly, “We should have stopped taking data.”
  • “We are wired to use our intuition to understand things around us,” Riess said. “Most of the universe is made out of stuff that’s completely different than us. This adherence to intuition is often wildly unsuccessful in the universe.”
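The measurement chain in the annotations above (Leavitt's law gives absolute brightness, brightness gives distance, distance plus recession velocity gives an expansion rate) can be sketched end to end. A minimal sketch, assuming illustrative Leavitt-law coefficients and a hypothetical Cepheid; these are placeholders, not the calibrations the research teams actually use:

```python
import math

# Sketch of the distance-ladder chain described above: a Cepheid's pulsation
# period gives its absolute magnitude via Leavitt's law; comparing that with
# its apparent magnitude gives a distance; distance plus recession velocity
# gives a Hubble Constant estimate. The coefficients, star, and velocity
# below are illustrative placeholders, not real calibrations.

def leavitt_absolute_magnitude(period_days: float) -> float:
    # Illustrative period-luminosity relation: M = a * (log10(P) - 1) + b.
    return -2.43 * (math.log10(period_days) - 1.0) - 4.05

def distance_mpc(apparent_mag: float, absolute_mag: float) -> float:
    # Distance modulus: m - M = 5 * log10(d_parsecs) - 5.
    d_parsecs = 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)
    return d_parsecs / 1.0e6

m_apparent, period = 25.0, 30.0  # hypothetical Cepheid in a distant galaxy
d = distance_mpc(m_apparent, leavitt_absolute_magnitude(period))
velocity = 770.0                 # hypothetical recession velocity, km/s
print(f"distance ~{d:.1f} Mpc, H0 ~ {velocity / d:.0f} km/s/Mpc")
```

The "tension" the article describes is that running this kind of chain on Cepheids and doing the analogous inference from the cosmic background radiation yield expansion rates that refuse to converge.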

MacIntyre | Internet Encyclopedia of Philosophy - 0 views

  • For MacIntyre, “rationality” comprises all the intellectual resources, both formal and substantive, that we use to judge truth and falsity in propositions, and to determine choice-worthiness in courses of action
  • Rationality in this sense is not universal; it differs from community to community and from person to person, and may both develop and regress over the course of a person’s life or a community’s history.
  • So rationality itself, whether theoretical or practical, is a concept with a history: indeed, since there are also a diversity of traditions of enquiry, with histories, there are, so it will turn out, rationalities rather than rationality, just as it will also turn out that there are justices rather than justice
  • Rationality is the collection of theories, beliefs, principles, and facts that the human subject uses to judge the world, and a person’s rationality is, to a large extent, the product of that person’s education and moral formation.
  • To the extent that a person accepts what is handed down from the moral and intellectual traditions of her or his community in learning to judge truth and falsity, good and evil, that person’s rationality is “tradition-constituted.” Tradition-constituted rationality provides the schemata by which we interpret, understand, and judge the world we live in
  • The apparent problem of relativism in MacIntyre’s theory of rationality is much like the problem of relativism in the philosophy of science. Scientific claims develop within larger theoretical frameworks, so that the apparent truth of a scientific claim depends on one’s judgment of the larger framework. The resolution of the problem of relativism therefore appears to hang on the possibility of judging frameworks or rationalities, or judging between frameworks or rationalities from a position that does not presuppose the truth of the framework or rationality, but no such theoretical standpoint is humanly possible.
  • MacIntyre finds that the world itself provides the criterion for the testing of rationalities, and he finds that there is no criterion except the world itself that can stand as the measure of the truth of any philosophical theory.
  • MacIntyre’s philosophy is indebted to the philosophy of science, which recognizes the historicism of scientific enquiry even as it seeks a truthful understanding of the world. MacIntyre’s philosophy does not offer a priori certainty about any theory or principle; it examines the ways in which reflection upon experience supports, challenges, or falsifies theories that have appeared to be the best theories so far to the people who have accepted them so far. MacIntyre’s ideal enquirers remain Hamlets, not Emmas.
  • history shows us that individuals, communities, and even whole nations may commit themselves militantly over long periods of their histories to doctrines that their ideological adversaries find irrational. This qualified relativism of appearances has troublesome implications for anyone who believes that philosophical enquiry can easily provide certain knowledge of the world
  • According to MacIntyre, theories govern the ways that we interpret the world and no theory is ever more than “the best standards so far” (3RV, p. 65). Our theories always remain open to improvement, and when our theories change, the appearances of our world—the apparent truths of claims judged within those theoretical frameworks—change with them.
  • From the subjective standpoint of the human enquirer, MacIntyre finds that theories, concepts, and facts all have histories, and they are all liable to change—for better or for worse.
  • MacIntyre holds that the rationality of individuals is not only tradition-constituted, it is also tradition constitutive, as individuals make their own contributions to their own rationality, and to the rationalities of their communities. Rationality is not fixed, within either the history of a community or the life of a person
  • The modern account of first principles justifies an approach to philosophy that rejects tradition. The modern liberal individualist approach is anti-traditional. It denies that our understanding is tradition-constituted and it denies that different cultures may differ in their standards of rationality and justice:
  • Modernity does not see tradition as the key that unlocks moral and political understanding, but as a superfluous accumulation of opinions that tend to prejudice moral and political reasoning.
  • Although modernity rejects tradition as a method of moral and political enquiry, MacIntyre finds that it nevertheless bears all the characteristics of a moral and political tradition.
  • If historical narratives are only projections of the interests of historians, then it is difficult to see how this historical narrative can claim to be truthful
  • For these post-modern theorists, “if the Enlightenment conceptions of truth and rationality cannot be sustained,” either relativism or perspectivism “is the only possible alternative” (p. 353). MacIntyre rejects both challenges by developing his theory of tradition-constituted and tradition-constitutive rationality on pp. 354-369
  • How, then, is one to settle challenges between two traditions? It depends on whether the adherents of either take the challenges of the other tradition seriously. It depends on whether the adherents of either tradition, on seeing a failure in their own tradition, are willing to consider an answer offered by their rival (p. 355)
  • how a person with no traditional affiliation is to deal with the conflicting claims of rival traditions: “The initial answer is: that will depend upon who you are and how you understand yourself. This is not the kind of answer which we have been educated to expect in philosophy”
  • MacIntyre focuses the critique of modernity on the question of rational justification. Modern epistemology stands or falls on the possibility of Cartesian epistemological first principles. MacIntyre’s history exposes that notion of first principle as a fiction, and at the same time demonstrates that rational enquiry advances (or declines) only through tradition
  • MacIntyre cites Foucault’s 1966 book, Les Mots et les choses (The Order of Things, 1970) as an example of the self-subverting character of Genealogical enquiry
  • Foucault’s book reduces history to a procession of “incommensurable ordered schemes of classification and representation” none of which has any greater claim to truth than any other, yet this book “is itself organized as a scheme of classification and representation.”
  • From MacIntyre’s perspective, there is no question of deciding whether or not to work within a tradition; everyone who struggles with practical, moral, and political questions simply does. “There is no standing ground, no place for enquiry . . . apart from that which is provided by some particular tradition or other”
  • Three Rival Versions of Moral Enquiry (1990). The central idea of the Gifford Lectures is that philosophers make progress by addressing the shortcomings of traditional narratives about the world, shortcomings that become visible either through the failure of traditional narratives to make sense of experience, or through the introduction of contradictory narratives that prove impossible to dismiss
  • MacIntyre compares three traditions exemplified by three literary works published near the end of Adam Gifford’s life (1820–1887)
  • The Ninth Edition of the Encyclopaedia Britannica (1875–1889) represents the modern tradition of trying to understand the world objectively without the influence of tradition.
  • The Genealogy of Morals (1887), by Friedrich Nietzsche embodies the post-modern tradition of interpreting all traditions as arbitrary impositions of power.
  • The encyclical letter Aeterni Patris (1879) of Pope Leo XIII exemplifies the approach of acknowledging one’s predecessors within one’s own tradition of enquiry and working to advance or improve that tradition in the pursuit of objective truth. 
  • Of the three versions of moral enquiry treated in 3RV, only tradition, exemplified in 3RV by the Aristotelian, Thomistic tradition, understands itself as a tradition that looks backward to predecessors in order to understand present questions and move forward
  • Encyclopaedia obscures the role of tradition by presenting the most current conclusions and convictions of a tradition as if they had no history, and as if they represented the final discovery of unalterable truth
  • Encyclopaedists focus on the present and ignore the past.
  • Genealogists, on the other hand, focus on the past in order to undermine the claims of the present.
  • In short, Genealogy denies the teleology of human enquiry by denying (1) that historical enquiry has been fruitful, (2) that the enquiring person has a real identity, and (3) that enquiry has a real goal. MacIntyre finds this mode of enquiry incoherent.
  • Genealogy is self-deceiving insofar as it ignores the traditional and teleological character of its enquiry.
  • Genealogical moral enquiry must make similar exceptions to its treatments of the unity of the enquiring subject and the teleology of moral enquiry; thus “it seems to be the case that the intelligibility of genealogy requires beliefs and allegiances of a kind precluded by the genealogical stance” (3RV, p. 54-55)
  • MacIntyre uses Thomism because it applies the traditional mode of enquiry in a self-conscious manner. Thomistic students learn the work of philosophical enquiry as apprentices in a craft (3RV, p. 61), and maintain the principles of the tradition in their work to extend the understanding of the tradition, even as they remain open to the criticism of those principles.
  • 3RV uses Thomism as its example of tradition, but this use should not suggest that MacIntyre identifies “tradition” with Thomism or Thomism-as-a-name-for-the-Western-tradition. As noted above, WJWR distinguished four traditions of enquiry within the Western European world alone
  • MacIntyre’s emphasis on the temporality of rationality in traditional enquiry makes tradition incompatible with the epistemological projects of modern philosophy
  • Tradition is not merely conservative; it remains open to improvement,
  • Tradition differs from both encyclopaedia and genealogy in the way it understands the place of its theories in the history of human enquiry. The adherent of a tradition must understand that “the rationality of a craft is justified by its history so far,” thus it “is inseparable from the tradition through which it was achieved”
  • MacIntyre uses Thomas Aquinas to illustrate the revolutionary potential of traditional enquiry. Thomas was educated in Augustinian theology and Aristotelian philosophy, and through this education he began to see not only the contradictions between the two traditions, but also the strengths and weaknesses that each tradition revealed in the other. His education also helped him to discover a host of questions and problems that had to be answered and solved. Many of Thomas Aquinas’ responses to these concerns took the form of disputed questions. “Yet to each question the answer produced by Aquinas as a conclusion is no more than and, given Aquinas’s method, cannot but be no more than, the best answer reached so far. And hence derives the essential incompleteness”
  • argue that the virtues are essential to the practice of independent practical reason. The book is relentlessly practical; its arguments appeal only to experience and to purposes, and to the logic of practical reasoning.
  • Like other intelligent animals, human beings enter life vulnerable, weak, untrained, and unknowing, and face the likelihood of infirmity in sickness and in old age. Like other social animals, humans flourish in groups. We learn to regulate our passions, and to act effectively alone and in concert with others through an education provided within a community. MacIntyre’s position allows him to look to the animal world to find analogies to the role of social relationships in the moral formation of human beings
  • The task for the human child is to make “the transition from the infantile exercise of animal intelligence to the exercise of independent practical reasoning” (DRA, p. 87). For a child to make this transition is “to redirect and transform her or his desires, and subsequently to direct them consistently towards the goods of different stages of her or his life” (DRA, p. 87). The development of independent practical reason in the human agent requires the moral virtues in at least three ways.
  • DRA presents moral knowledge as a “knowing how,” rather than as a “knowing that.” Knowledge of moral rules is not sufficient for a moral life; prudence is required to enable the agent to apply the rules well.
  • “Knowing how to act virtuously always involves more than rule-following” (DRA, p. 93). The prudent person can judge what must be done in the absence of a rule and can also judge when general norms cannot be applied to particular cases.
  • Flourishing as an independent practical reasoner requires the virtues in a second way, simply because sometimes we need our friends to tell us who we really are. Independent practical reasoning also requires self-knowledge, but self-knowledge is impossible without the input of others whose judgment provides a reliable touchstone to test our beliefs about ourselves. Self-knowledge therefore requires the virtues that enable an agent to sustain formative relationships and to accept the criticism of trusted friends
  • Human flourishing requires the virtues in a third way, by making it possible to participate in social and political action. They enable us to “protect ourselves and others against neglect, defective sympathies, stupidity, acquisitiveness, and malice” (DRA, p. 98) by enabling us to form and sustain social relationships through which we may care for one another in our infirmities, and pursue common goods with and for the other members of our societies.
  • MacIntyre argues that it is impossible to find an external standpoint, because rational enquiry is an essentially social work (DRA, p. 156-7). Because it is social, shared rational enquiry requires moral commitment to, and practice of, the virtues to prevent the more complacent members of communities from closing off critical reflection upon “shared politically effective beliefs and concepts”
  • MacIntyre finds himself compelled to answer what may be called the question of moral provincialism: If one is to seek the truth about morality and justice, it seems necessary to “find a standpoint that is sufficiently external to the evaluative attitudes and practices that are to be put to the question.” If it is impossible for the agent to take such an external standpoint, if the agent’s commitments preclude radical criticism of the virtues of the community, does that leave the agent “a prisoner of shared prejudices” (DRA, p. 154)?
  • The book moves from MacIntyre’s assessment of human needs for the virtues to the political implications of that assessment. Social and political institutions that form and enable independent practical reasoning must “satisfy three conditions.” (1) They must enable their members to participate in shared deliberations about the communities’ actions. (2) They must establish norms of justice “consistent with exercise of” the virtue of justice. (3) They must enable the strong “to stand proxy” as advocates for the needs of the weak and the disabled.
  • The social and political institutions that MacIntyre recommends cannot be identified with the modern nation state or the modern nuclear family
  • The political structures necessary for human flourishing are essentially local
  • Yet local communities support human flourishing only when they actively support “the virtues of just generosity and shared deliberation”
  • MacIntyre rejects individualism and insists that we view human beings as members of communities who bear specific debts and responsibilities because of our social identities. The responsibilities one may inherit as a member of a community include debts to one’s forbearers that one can only repay to people in the present and future
  • The constructive argument of the second half of the book begins with traditional accounts of the excellences or virtues of practical reasoning and practical rationality rather than virtues of moral reasoning or morality. These traditional accounts define virtue as aretē, as excellence
  • Practices are supported by institutions like chess clubs, hospitals, universities, industrial corporations, sports leagues, and political organizations.
  • Practices exist in tension with these institutions, since the institutions tend to be oriented to goods external to practices. Universities, hospitals, and scholarly societies may value prestige, profitability, or relations with political interest groups above excellence in the practices they are said to support.
  • Personal desires and institutional pressures to pursue external goods may threaten to derail practitioners’ pursuits of the goods internal to practices. MacIntyre defines virtue initially as the quality of character that enables an agent to overcome these temptations:
  • “A virtue is an acquired human quality the possession and exercise of which tends to enable us to achieve those goods which are internal to practices
  • Excellence as a human agent cannot be reduced to excellence in a particular practice (See AV, pp. 204–
  • The virtues therefore are to be understood as those dispositions which will not only sustain practices and enable us to achieve the goods internal to practices, but which will also sustain us in the relevant kind of quest for the good, by enabling us to overcome the harms, dangers, temptations, and distractions which we encounter, and which will furnish us with increasing self-knowledge and increasing knowledge of the good (AV, p. 219).
  • The excellent human agent has the moral qualities to seek what is good and best both in practices and in life as a whole.
  • The virtues find their point and purpose not only in sustaining those relationships necessary if the variety of goods internal to practices are to be achieved and not only in sustaining the form of an individual life in which that individual may seek out his or her good as the good of his or her whole life, but also in sustaining those traditions which provide both practices and individual lives with their necessary historical context (AV, p. 223)
  • Since “goods, and with them the only grounds for the authority of laws and virtues, can only be discovered by entering into those relationships which constitute communities whose central bond is a shared vision of and understanding of goods” (AV, p. 258), any hope for the transformation and renewal of society depends on the development and maintenance of such communities.
  • MacIntyre’s Aristotelian approach to ethics as a study of human action distinguishes him from post-Kantian moral philosophers who approach ethics as a means of determining the demands of objective, impersonal, universal morality
  • This modern approach may be described as moral epistemology. Modern moral philosophy pretends to free the individual to determine for her- or himself what she or he must do in a given situation, irrespective of her or his own desires; it pretends to give knowledge of universal moral laws
  • Aristotelian metaphysicians, particularly Thomists who define virtue in terms of the perfection of nature, rejected MacIntyre’s contention that an adequate Aristotelian account of virtue as excellence in practical reasoning and human action need not appeal to Aristotelian metaphysics
  • one group of critics rejects MacIntyre’s Aristotelianism because they hold that any Aristotelian account of the virtues must first account for the truth about virtue in terms of Aristotle’s philosophy of nature, which MacIntyre had dismissed in AV as “metaphysical biology”
  • Many of those who rejected MacIntyre’s turn to Aristotle define “virtue” primarily along moral lines, as obedience to law or adherence to some kind of natural norm. For these critics, “virtuous” appears synonymous with “morally correct;” their resistance to MacIntyre’s appeal to virtue stems from their difficulties either with what they take to be the shortcomings of MacIntyre’s account of moral correctness or with the notion of moral correctness altogether
  • MacIntyre continues to argue from the experience of practical reasoning to the demands of moral education.
  • Descartes and his successors, by contrast, along with certain “notable Thomists of the last hundred years” (p. 175), have proposed that philosophy begins from knowledge of some “set of necessarily true first principles which any truly rational person is able to evaluate as true” (p. 175). Thus for the moderns, philosophy is a technical rather than moral endeavor
  • MacIntyre distinguishes two related challenges to his position, the “relativist challenge” and the “perspectivist challenge.” These two challenges both acknowledge that the goals of the Enlightenment cannot be met and that, “the only available standards of rationality are those made available by and within traditions” (p. 252); they conclude that nothing can be known to be true or false
  • MacIntyre follows the progress of the Western tradition through “three distinct traditions:” from Homer and Aristotle to Thomas Aquinas, from Augustine to Thomas Aquinas and from Augustine through Calvin to Hume
  • Chapter 17 examines the modern liberal denial of tradition, and the ironic transformation of liberalism into the fourth tradition to be treated in the book.
  • MacIntyre credits John Stuart Mill and Thomas Aquinas as “two philosophers of the kind who by their writing send us beyond philosophy into immediate encounter with the ends of life
  • First, both were engaged by questions about the ends of life as questioning human beings and not just as philosophers. . . .
  • Secondly, both Mill and Aquinas understood their speaking and writing as contributing to an ongoing philosophical conversation. . . .
  • Thirdly, it matters that both the end of the conversation and the good of those who participate in it is truth and that the nature of truth, of good, of rational justification, and of meaning therefore have to be central topics of that conversation (Tasks, pp. 130-1).
  • Without these three characteristics, philosophy is first reduced to “the exercise of a set of analytic and argumentative skills. . . . Secondly, philosophy may thereby become a diversion from asking questions about the ends of life with any seriousness”
  • Neither Rosenzweig nor Lukács made philosophical progress because both failed to relate “their questions about the ends of life to the ends of their philosophical writing”
  • First, any adequate philosophical history or biography must determine whether the authors studied remain engaged with the questions that philosophy studies, or set the questions aside in favor of the answers. Second, any adequate philosophical history or biography must determine whether the authors studied insulated themselves from contact with conflicting worldviews or remained open to learning from every available philosophical approach. Third, any adequate philosophical history or biography must place the authors studied into a broader context that shows what traditions they come from and “whose projects” they are “carrying forward
  • MacIntyre’s recognition of the connection between an author’s pursuit of the ends of life and the same author’s work as a philosophical writer prompts him to finish the essay by demanding three things of philosophical historians and biographers
  • Philosophy is not just a study; it is a practice. Excellence in this practice demands that an author bring her or his struggles with the questions of the ends of philosophy into dialogue with historic and contemporary texts and authors in the hope of making progress in answering those questions
  • MacIntyre defends Thomistic realism as rational enquiry directed to the discovery of truth.
  • The three Thomistic essays in this book challenge those caricatures by presenting Thomism in a way that people outside of contemporary Thomistic scholarship may find surprisingly flexible and open
  • To be a moral agent, (1) one must understand one’s individual identity as transcending all the roles that one fills; (2) one must see oneself as a practically rational individual who can judge and reject unjust social standards; and (3) one must understand oneself as “as accountable to others in respect of the human virtues and not just in respect of [one’s] role-performances
  • MacIntyre considers “the case of J” (J, for jemand, the German word for “someone”), a train controller who learned, as a standard for his social role, to take no interest in what his trains carried, even during wartime when they carried “munitions and . . . Jews on their way to extermination camps”
  • J had learned to do his work for the railroad according to one set of standards and to live other parts of his life according to other standards, so that this compliant participant in “the final solution” could contend, “You cannot charge me with moral failure” (E&P, p. 187).
  • J is guilty because he complacently accepted social structures that he should have questioned, structures that undermined his moral agency. This essay shows that MacIntyre’s ethics of human agency is not just a descriptive narrative about the manner of moral education; it is a standard-laden account of the demands of moral agency.
  • The epistemological theories of modern moral philosophy were supposed to provide rational justification for rules, policies, and practical determinations according to abstract universal standards, but MacIntyre has dismissed those theories.
  • Modern metaethics is supposed to enable its practitioners to step away from the conflicting demands of contending moral traditions and to judge those conflicts from a neutral position, but MacIntyre has rejected this project as well
  • In his ethical writings, MacIntyre seeks only to understand how to liberate the human agent from blindness and stupidity, to prepare the human agent to recognize what is good and best to do in the concrete circumstances of that agent’s own life, and to strengthen the agent to follow through on that judgment.
  • In his political writings, MacIntyre investigates the role of communities in the formation of effective rational agents, and the impact of political institutions on the lives of communities. This kind of ethics and politics is appropriately named the ethics of human agency.
  • The purpose of the modern moral philosophy of authors like Kant and Mill was to determine, rationally and universally, what kinds of behavior ought to be performed—not in terms of the agent’s desires or goals, but in terms of universal, rational duties. Those theories purported to let agents know what they ought to do by providing knowledge of duties and obligations; thus they could be described as theories of moral epistemology.
  • Contemporary virtue ethics purports to let agents know what qualities human beings ought to have, and the reasons that we ought to have them, not in terms of our fitness for human agency, but in the same universal, disinterested, non-teleological terms that it inherits from Kant and Mill.
  • For MacIntyre, moral knowledge remains a “knowing how” rather than a “knowing that;” MacIntyre seeks to identify those moral and intellectual excellences that make human beings more effective in our pursuit of the human good.
  • MacIntyre’s purpose in his ethics of human agency is to consider what it means to seek one’s good, what it takes to pursue one’s good, and what kind of a person one must become if one wants to pursue that good effectively as a human agent.
  • As a philosophy of human agency, MacIntyre’s work belongs to the traditions of Aristotle and Thomas Aquinas.
  • In keeping with the insight of Marx’s third thesis on Feuerbach, it maintained the common condition of theorists and people as peers in the pursuit of the good life.
  • He holds that the human good plays a role in our practical reasoning whether we recognize it or not, so that some people may do well without understanding why (E&P, p. 25). He also reads Aristotle as teaching that knowledge of the good can make us better agents
  • AV defines virtue in terms of the practical requirements for excellence in human agency, in an agent’s participation in practices (AV, ch. 14), in an agent’s whole life, and in an agent’s involvement in the life of her or his community
  • MacIntyre’s Aristotelian concept of “human action” opposes the notion of “human behavior” that prevailed among mid-twentieth-century determinist social scientists. Human actions, as MacIntyre understands them, are acts freely chosen by human agents in order to accomplish goals that those agents pursue
  • Human behavior, according to mid-twentieth-century determinist social scientists, is the outward activity of a subject, which is said to be caused entirely by environmental influences beyond the control of the subject.
  • Rejecting crude determinism in social science, and approaches to government and public policy rooted in determinism, MacIntyre sees the renewal of human agency and the liberation of the human agent as central goals for ethics and politics.
  • MacIntyre’s Aristotelian account of “human action” examines the habits that an agent must develop in order to judge and act most effectively in the pursuit of truly choice-worthy ends
  • MacIntyre seeks to understand what it takes for the human person to become the kind of agent who has the practical wisdom to recognize what is good and best to do and the moral freedom to act on her or his best judgment.
  • MacIntyre rejected the determinism of modern social science early in his career (“Determinism,” 1957), yet he recognizes that the ability to judge well and act freely is not simply given; excellence in judgment and action must be developed, and it is the task of moral philosophy to discover how these excellences or virtues of the human agent are established, maintained, and strengthened
  • MacIntyre’s Aristotelian philosophy investigates the conditions that support free and deliberate human action in order to propose a path to the liberation of the human agent through participation in the life of a political community that seeks its common goods through the shared deliberation and action of its members
  • As a classics major at Queen Mary College in the University of London (1945-1949), MacIntyre read the Greek texts of Plato and Aristotle, but his studies were not limited to the grammars of ancient languages. He also examined the ethical theories of Immanuel Kant and John Stuart Mill. He attended the lectures of analytic philosopher A. J. Ayer and of philosopher of science Karl Popper. He read Ludwig Wittgenstein’s Tractatus Logico-Philosophicus, Jean-Paul Sartre’s L'existentialisme est un humanisme, and Marx’s Eighteenth Brumaire of Napoleon Bonaparte (What happened, pp. 17-18). MacIntyre met the sociologist Franz Steiner, who helped direct him toward approaching moralities substantively
  • Alasdair MacIntyre’s philosophy builds on an unusual foundation. His early life was shaped by two conflicting systems of values. One was “a Gaelic oral culture of farmers and fishermen, poets and storytellers.” The other was modernity: “The modern world was a culture of theories rather than stories” (MacIntyre Reader, p. 255). MacIntyre embraced both value systems.
  • From Marxism, MacIntyre learned to see liberalism as a destructive ideology that undermines communities in the name of individual liberty and consequently undermines the moral formation of human agents
  • For MacIntyre, Marx’s way of seeing through the empty justifications of arbitrary choices to consider the real goals and consequences of political actions in economic and social terms would remain the principal insight of Marxism
  • Since his retirement from teaching, MacIntyre has continued his work of promoting a renewal of human agency through an examination of the virtues demanded by practices, integrated human lives, and responsible engagement with community life. He is currently affiliated with the Centre for Contemporary Aristotelian Studies in Ethics and Politics (CASEP) at London Metropolitan University.
  • AV rejects the view of “modern liberal individualism” in which autonomous individuals use abstract moral principles to determine what they ought to do. The critique of modern normative ethics in the first half of AV rejects modern moral reasoning for its failure to justify its premises, and criticizes the frequent use of the rhetoric of objective morality and scientific necessity to manipulate people into accepting arbitrary decisions.
  • The second half of AV proposes a conception of practice and practical reasoning and the notion of excellence as a human agent as an alternative to modern moral philosophy.
  • MacIntyre uses “modern liberal individualism” to name a much broader category that includes both liberals and conservatives in contemporary American political parlance, as well as some Marxists and anarchists (See ASIA, pp. 280-284). Conservatism, liberalism, Marxism, and anarchism all present the autonomous individual as the unit of civil society
  • The sources of modern liberal individualism—Hobbes, Locke, and Rousseau—assert that human life is solitary by nature and social by habituation and convention. MacIntyre’s Aristotelian tradition holds, on the contrary, that human life is social by nature.
  • MacIntyre identifies moral excellence with effective human agency, and seeks a political environment that will help to liberate human agents to recognize and seek their own goods, as components of the common goods of their communities, more effectively. For MacIntyre therefore, ethics and politics are bound together.
  • For MacIntyre ethics is not an application of principles to facts, but a study of moral action. Moral action, free human action, involves decisions to do things in pursuit of goals, and it involves the understanding of the implications of one’s actions for the whole variety of goals that human agents seek
  • In this sense, “To act morally is to know how to act” (SMJ, p. 56). “Morality is not a ‘knowing that’ but a ‘knowing how’”
  • If human action is a ‘knowing how,’ then ethics must also consider how one learns ‘how.’ MacIntyre finds that, as with other forms of ‘knowing how,’ one learns how to act morally within a community whose language and shared standards shape one’s judgment.
  • MacIntyre had concluded that ethics is not an abstract exercise in the assessment of facts; it is a study of free human action and of the conditions that enable rational human agency.
  • MacIntyre gives Marx credit for concluding, in the third of the Theses on Feuerbach, that the only way to change society is to change ourselves, and that “The coincidence of the changing of circumstances and of human activity or self-changing can only be comprehended and rationally understood as revolutionary practice”
  • MacIntyre distinguishes “religion which is an opiate for the people from religion which is not” (MI, p. 83). He condemns forms of religion that justify social inequities and encourage passivity. He argues that authentic Christian teaching criticizes social structures and encourages action
  • While “moral philosophy textbooks” discuss the kinds of maxims that should guide “promise-keeping, truth-telling, and the like,” moral maxims do not guide real agents in real life at all. “They do not guide us because we do not need to be guided. We know what to do” (ASIA, p. 106). Sometimes we do this without any maxims at all, or even against all the maxims we know. MacIntyre illustrates his point with Huckleberry Finn’s decision to help Jim, Miss Watson’s escaped slave, to make his way to freedom.
  • MacIntyre develops the ideas that morality emerges from history, and that morality organizes the common life of a community
  • The book concludes that the concepts of morality are neither timeless nor ahistorical, and that understanding the historical development of ethical concepts can liberate us “from any false absolutist claims” (SHE, p. 269). Yet this conclusion need not imply that morality is essentially arbitrary or that one could achieve freedom by liberating oneself from the morality of one’s society.
  • From this “Aristotelian point of view,” “modern morality” begins to go awry when moral norms are separated from the pursuit of human goods and moral behavior is treated as an end in itself. This separation has characterized Christian divine command ethics since the fourteenth century and has remained essential to secularized modern morality since the eighteenth century.
  • From MacIntyre’s “Aristotelian point of view,” the autonomy granted to the human agent by modern moral philosophy breaks down natural human communities and isolates the individual from the kinds of formative relationships that are necessary to shape the agent into an independent practical reasoner.
  • The 1977 essay “Epistemological Crises, Dramatic Narrative, and the Philosophy of Science” (hereafter EC), MacIntyre reports, “marks a major turning-point in my thought in the 1970s” (The Tasks of Philosophy, p. vii). EC may fairly be described as MacIntyre’s discourse on method.
  • It presents three general points on the method for philosophy.
  • First, philosophy makes progress through the resolution of problems. These problems arise when the theories, histories, doctrines, and other narratives that help us to organize our experience of the world fail us, leaving us in “epistemological crises.” Epistemological crises are the aftermath of events that undermine the ways that we interpret our world.
  • To live in an epistemological crisis is to be aware that one does not know what one thought one knew about some particular subject and to be anxious to recover certainty about that subject.
  • To resolve an epistemological crisis, it is not enough to impose some new way of interpreting our experience; we also need to understand why we were wrong before: “When an epistemological crisis is resolved, it is by the construction of a new narrative which enables the agent to understand both how he or she could intelligibly have held his or her original beliefs and how he or she could have been so drastically misled by them.”
  • To illustrate his position on the open-endedness of enquiry, MacIntyre compares the title characters of Shakespeare’s Hamlet and Jane Austen’s Emma. When Emma finds that she is deeply misled in her beliefs about the other characters in her story, Mr. Knightley helps her to learn the truth and the story comes to a happy ending (p. 6). Hamlet, by contrast, finds no pat answers to his questions; rival interpretations remain throughout the play, so that directors who would stage it have to impose their own interpretations on the script.
  • MacIntyre notes, “Philosophers have customarily been Emmas and not Hamlets” (p. 6); that is, philosophers have treated their conclusions as accomplished truths, rather than as “more adequate narratives” (p. 7) that remain open to further improvement.
  • Another approach to education is the method of Descartes, who begins by rejecting everything that is not clearly and distinctly true as unreliable and false in order to rebuild his understanding of the world on a foundation of undeniable truth.
  • Descartes presents himself as willfully rejecting everything he had believed, and ignores his obvious debts to the Scholastic tradition, even as he argues his case in French and Latin. For MacIntyre, seeking epistemological certainty through universal doubt as a precondition for enquiry is a mistake: “it is an invitation not to philosophy but to mental breakdown, or rather to philosophy as a means of mental breakdown.”
  • MacIntyre contrasts Descartes’ descent into mythical isolation with Galileo, who was able to make progress in astronomy and physics by struggling with the apparently insoluble questions of late medieval astronomy and physics, and radically reinterpreting the issues that constituted those questions
  • To make progress in philosophy one must sort through the narratives that inform one’s understanding, struggle with the questions that those narratives raise, and on occasion, reject, replace, or reinterpret portions of those narratives and propose those changes to the rest of one’s community for assessment. Human enquiry is always situated within the history and life of a community.
  • The third point of EC is that we can learn about progress in philosophy from the philosophy of science
  • Kuhn’s “paradigm shifts,” however, are unlike MacIntyre’s resolutions of epistemological crises in two ways.
  • First, they are not rational responses to specific problems. Kuhn compares paradigm shifts to religious conversions (pp. 150, 151, 158), stressing that they are not guided by rational norms, and he claims that the “mopping up” phase of a paradigm shift is a matter of convention in the training of new scientists and attrition among the holdouts of the previous paradigm.
  • Second, the new paradigm is treated as a closed system of belief that regulates a new period of “normal science”; Kuhn’s revolutionary scientists are Emmas, not Hamlets
  • MacIntyre proposes elements of Imre Lakatos’ philosophy of science as correctives to Kuhn’s. While Lakatos has his own shortcomings, his general account of the methodologies of scientific research programs recognizes the role of reason in the transitions between theories and between research programs (Lakatos’ analog to Kuhn’s paradigms or disciplinary matrices). Lakatos presents science as an open-ended enquiry, in which every theory may eventually be replaced by more adequate theories. For Lakatos, unlike Kuhn, rational scientific progress occurs when a new theory can account both for the apparent promise and for the actual failure of the theory it replaces.
  • The third conclusion of MacIntyre’s essay is that decisions to support some theories over others may be justified rationally to the extent that those theories allow us to understand our experience and our history, including the history of the failures of inadequate theories
  • For Aristotle, moral philosophy is a study of practical reasoning, and the excellences or virtues that Aristotle recommends in the Nicomachean Ethics are the intellectual and moral excellences that make a moral agent effective as an independent practical reasoner.
  • In AV, MacIntyre examines the current condition of secular moral and political discourse. He finds contending parties defending their decisions by appealing to abstract moral principles, but he finds their appeals eclectic, inconsistent, and incoherent.
  • MacIntyre also finds that the contending parties have little interest in the rational justification of the principles they use. The language of moral philosophy has become a kind of moral rhetoric used to manipulate others in defense of the arbitrary choices of its users.
  • The secular moral philosophers of the eighteenth and nineteenth centuries shared strong and extensive agreements about the content of morality (AV, p. 51) and believed that their moral philosophy could justify the demands of their morality rationally, free from religious authority.
  • MacIntyre traces the lineage of the culture of emotivism to the secularized Protestant cultures of northern Europe
  • Modern moral philosophy had thus set for itself an incoherent goal. It was to vindicate both the moral autonomy of the individual and the objectivity, necessity, and categorical character of the rules of morality
  • MacIntyre turns to an apparent alternative, the pragmatic expertise of professional managers. Managers are expected to appeal to the facts to make their decisions on the objective basis of effectiveness, and their authority to do this is based on their knowledge of the social sciences
  • An examination of the social sciences reveals, however, that many of the facts to which managers appeal depend on sociological theories that lack scientific status. Thus, the predictions and demands of bureaucratic managers are no less liable to ideological manipulation than the determinations of modern moral philosophers.
  • Modern moral philosophy separates moral reasoning about duties and obligations from practical reasoning about ends and practical deliberation about the means to one’s ends, and in doing so it separates morality from practice.
  • Many Europeans also lost the practical justifications for their moral norms as they approached modernity; for these Europeans, claiming that certain practices are “immoral,” and invoking Kant’s categorical imperative or Mill’s principle of utility to explain why those practices are immoral, seems no more adequate than the Polynesian appeal to taboo.
  • MacIntyre sifts the definitions of virtue offered by his predecessors and then gives his own definition of virtue, as excellence in human agency, in terms of practices, whole human lives, and traditions in chapters 14 and 15 of AV.
  • In the most frequently quoted sentence of AV, MacIntyre defines a practice as (1) a complex social activity that (2) enables participants to gain goods internal to the practice. (3) Participants achieve excellence in practices by gaining the internal goods. When participants achieve excellence, (4) the social understandings of excellence in the practice, of the goods of the practice, and of the possibility of achieving excellence in the practice “are systematically extended”
  • Practices, like chess, medicine, architecture, mechanical engineering, football, or politics, offer their practitioners a variety of goods both internal and external to these practices. The goods internal to practices include forms of understanding or physical abilities that can be acquired only by pursuing excellence in the associated practice
  • Goods external to practices include wealth, fame, prestige, and power; there are many ways to gain these external goods. They can be earned or purchased, either honestly or through deception; thus the pursuit of these external goods may conflict with the pursuit of the goods internal to practices.
  • An intelligent child is given the opportunity to win candy by learning to play chess. As long as the child plays chess only to win candy, he has every reason to cheat if by doing so he can win more candy. If the child begins to desire and pursue the goods internal to chess, however, cheating becomes irrational, because it is impossible to gain the goods internal to chess or any other practice except through an honest pursuit of excellence. Goods external to practices may nevertheless remain tempting to the practitioner.
  • Since MacIntyre finds social identity necessary for the individual, his definition of the excellence or virtue of the human agent needs a social dimension.
  • These responsibilities also include debts incurred by the unjust actions of one’s predecessors.
  • The enslavement and oppression of black Americans, the subjugation of Ireland, and the genocide of the Jews in Europe remained quite relevant to the responsibilities of citizens of the United States, England, and Germany in 1981, as they still do today.
  • Thus an American who said “I never owned any slaves,” “the Englishman who says ‘I never did any wrong to Ireland,’” or “the young German who believes that being born after 1945 means that what Nazis did to Jews has no moral relevance to his relationship to his Jewish contemporaries” all exhibit a kind of intellectual and moral failure.
  • “I am born with a past, and to cut myself off from that past in the individualist mode, is to deform my present relationships” (p. 221).  For MacIntyre, there is no moral identity for the abstract individual; “The self has to find its moral identity in and through its membership in communities” (p. 221).

Trump's Twitter ban renews calls for tech law changes by many who don't get tech or the... - 1 view

  • There is no way Wednesday's events could have happened without the convenience and ease afforded to white supremacists — and almost everyone else — by the openness of the modern consumer internet.
  • It's ironic, then, that the insurrection unfolded on the heels of President Donald Trump's continual efforts to repeal Section 230 of the Communications Decency Act, which makes it difficult to sue online platforms over the content they host (or don't) — or how they moderate it (or don't).
  • Section 230 is, of course, the rare law that is disliked by Republicans and Democrats. Biden hates it, having said: "I think social media should be more socially conscious in terms of what is important in terms of our democracy. ... Everything should not be about whether they can make a buck."
  • It's one of the most consequential laws governing the internet, and it provides a crucial liability shield for technology companies for content they didn't themselves create, like comment threads.
  • And it has never even been updated to take into account any of the technological changes that have happened since.
  • What Section 230 isn't (though it's often portrayed that way) is a bedrock for free speech protections: It's simply a rule that permits internet companies to moderate what other people put on their platforms — or not — without being on the hook legally for everything that happens to be there
  • There is an opportunity to use technology to protect people's ability to safely participate in democracy and enable a different America — the America we witnessed in Georgia on Tuesday — and a different world.
  • After Republicans lost the White House, the House and then the Senate, technology companies no longer feel pressure to cozy up to conservatives to keep their prerogatives.
  • But don't mistake the technology industry's lobbying points about free speech as being related to any real care for American democracy.
  • The major technology platforms enabling hate speech all have one thing in common with our 45th president: self-interest.
  • Freedom of speech is truly a value to cherish, but we cherish it through facilitating the expression of truth, not the unfettered right to spew lies and incite violence without consequence.

Why the modern world is bad for your brain | Science | The Guardian - 0 views

  • Our brains are busier than ever before. We’re assaulted with facts, pseudo facts, jibber-jabber, and rumour, all posing as information. Trying to figure out what you need to know and what you can ignore is exhausting.
  • Our smartphones have become Swiss army knife–like appliances that include a dictionary, calculator, web browser, email, Game Boy, appointment calendar, voice recorder, guitar tuner, weather forecaster, GPS, texter, tweeter, Facebook updater, and flashlight.
  • But there’s a fly in the ointment. Although we think we’re doing several things at once, multitasking, this is a powerful and diabolical illusion.
  • “When people think they’re multitasking, they’re actually just switching from one task to another very rapidly. And every time they do, there’s a cognitive cost in doing so.”
  • Even though we think we’re getting a lot done, ironically, multitasking makes us demonstrably less efficient.
  • Multitasking creates a dopamine-addiction feedback loop, effectively rewarding the brain for losing focus and for constantly searching for external stimulation.
  • The irony here for those of us who are trying to focus amid competing activities is clear: the very brain region we need to rely on for staying on task is easily distracted.
  • Instead of reaping the big rewards that come from sustained, focused effort, we instead reap empty rewards from completing a thousand little sugar-coated tasks.
  • The psychologist Glenn Wilson found that being in a situation where you are trying to concentrate on a task, and an email is sitting unread in your inbox, can reduce your effective IQ by 10 points.
  • Wilson showed that the cognitive losses from multitasking are even greater than the cognitive losses from pot‑smoking.
  • If students study and watch TV at the same time, for example, the information from their schoolwork goes into the striatum, a region specialised for storing new procedures and skills, not facts and ideas. Without the distraction of TV, the information goes into the hippocampus, where it is organised and categorised in a variety of ways, making it easier to retrieve.
  • All this activity gives us a sense that we’re getting things done – and in some cases we are. But we are sacrificing efficiency and deep concentration when we interrupt our priority activities with email.
  • This uncertainty wreaks havoc with our rapid perceptual categorisation system, causes stress, and leads to decision overload. Every email requires a decision! Do I respond to it? If so, now or later? How important is it? What will be the social, economic, or job-related consequences if I don’t answer, or if I don’t answer right now?
  • A lever in the cage allowed the rats to send a small electrical signal directly to their nucleus accumbens. Do you think they liked it? Boy, how they did! They liked it so much that they did nothing else. They forgot all about eating and sleeping. Long after they were hungry, they ignored tasty food if they had a chance to press that little chrome bar;
  • But remember, it is the dumb, novelty-seeking portion of the brain driving the limbic system that induces this feeling of pleasure, not the planning, scheduling, higher-level thought centres in the prefrontal cortex. Make no mistake: email-, Facebook- and Twitter-checking constitute a neural addiction.