
Contents contributed and discussions participated by Javier E

Javier E

Forget 'selfie.' The Merriam-Webster word of 2013: 'Science' - CSMonitor.com - 1 views

  • Oxford tracked a huge jump in overall usage of selfie, but Merriam-Webster stuck primarily to look-ups on its website, recording a 176 percent increase for science when compared with last year
  • "The more we thought about it, the righter it seemed in that it does lurk behind a lot of big stories that we as a society are grappling with, whether it's climate change or environmental regulation or what's in our textbooks,"
  • Science, Mr. Morse said, is connected to broad cultural oppositions — science versus faith, for instance — along with the power of observation and intuition, reason and ideology, evidence and tradition.
Javier E

Revelations That Ikea Spied on Its Employees Stir Outrage in France - NYTimes.com - 0 views

  • Ikea’s investigations were conducted for various reasons, including the vetting of job applicants, efforts to build cases against employees accused of wrongdoing, and even attempts to undermine the arguments of consumers bringing complaints against the company. The going rate charged by the private investigators was 80 to 180 euros, or $110 to $247, per inquiry, court documents show. Between 2002 and 2012, the finance department of Ikea France approved more than €475,000 in invoices from investigators.
  • the spying cases occurred in a country that, in the digital age, has elevated privacy to a level nearly equal to the national trinity of Liberté, Égalité and Fraternité.
  • Very little of the surveillance yielded information Ikea was able to use against the targets of the data sweeps. But court documents indicate that investigators suspect that Ikea may have occasionally used knowledge of personal information to quell workplace grievances or to prompt a resignation.
  • ...2 more annotations...
  • Last month, the company’s current chief, Stefan Vanoverbeke, and financial director, Dariusz Rychert, were questioned along with Mr. Baillot for 48 hours by the judicial police before being placed under formal investigation. That set in motion a process in which the next step, if it comes, would be the filing of criminal charges.
  • In transcripts of police interviews, Mr. Paris and his colleagues in the risk management department acknowledged receiving frequent requests from Ikea store managers across France for criminal background checks, driving records and vehicle registrations — though only a fraction of those inquiries uncovered a notable offense. Usually the requests were limited to one or two people after a theft or a complaint of harassment among employees. But sometimes lists containing dozens of names of employees or job applicants were submitted for vetting, and then forwarded to one of a handful of trusted private investigators for processing.
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • ...52 more annotations...
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
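The arithmetic behind the Bertrand–Mullainathan finding can be made concrete. A minimal sketch, using callback rates assumed for illustration (the study reported roughly a 50 percent gap, which is what "half again as many résumés" means):

```python
# Hypothetical callback rates chosen to reproduce the ~1.5x gap the
# study describes; the exact figures here are illustrative assumptions.
white_rate = 0.096  # assumed callback rate, white-sounding names
black_rate = 0.064  # assumed callback rate, black-sounding names

# Resumes needed to collect the same number of callbacks at each rate.
target_callbacks = 100
resumes_white = target_callbacks / white_rate
resumes_black = target_callbacks / black_rate

print(round(resumes_white))              # ~1042
print(round(resumes_black))              # ~1562
print(resumes_black / resumes_white)     # 1.5 -> "half again as many"
```

The ratio of the two rates, not their absolute levels, is what drives the "half again as many" conclusion.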
  • a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention.)
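The article describes the Xerox evaluation only at a high level; the actual model and cutoffs are proprietary. A minimal sketch of the general shape — a weighted composite of assessment components mapped to a color band — with all weights and thresholds invented for illustration:

```python
def composite(personality: float, cognitive: float, scenario: float) -> float:
    """Combine assessment components (each scored 0-1) into one score.
    The weights here are assumptions, not Xerox's actual model."""
    weights = (0.3, 0.4, 0.3)
    return weights[0] * personality + weights[1] * cognitive + weights[2] * scenario

def color_code(score: float) -> str:
    """Map a composite score to a rating band. Thresholds are invented
    for illustration."""
    if score >= 0.7:
        return "green"   # hire away
    if score >= 0.4:
        return "yellow"  # middling
    return "red"         # poor candidate

print(color_code(composite(0.8, 0.9, 0.6)))  # green
```

The point of the design is that no single answer decides the rating; the algorithm trades many weak signals off against one another before emitting a simple, actionable color.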
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
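Pentland's observation that "too many is as much of a problem as too few" describes an inverted-U relationship between face-to-face exchanges and team performance. A toy sketch of such a model, with the optimum and the quadratic form assumed purely for illustration (his actual statistical models are not published in the article):

```python
def predicted_performance(exchanges: int, optimum: int = 30) -> float:
    """Toy inverted-U model: predicted team performance (0-1) peaks at
    an assumed optimal number of face-to-face exchanges and falls off
    symmetrically on either side."""
    return max(0.0, 1.0 - ((exchanges - optimum) / optimum) ** 2)

for n in (5, 30, 60):
    print(n, round(predicted_performance(n), 2))
# 5 exchanges (too few) and 60 (too many) both score below the optimum
```

Any single-peaked curve would capture the same qualitative claim; the quadratic is just the simplest choice.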
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
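The Gild pipeline the excerpts describe — code quality, adoption by other programmers, forum reputation, language use — amounts to combining many weak signals into one programmer score. A minimal sketch of that idea; the signal names and weights below are invented for illustration, since Gild's real model is proprietary:

```python
def gild_style_score(signals: dict[str, float]) -> float:
    """Toy weighted combination of the kinds of signals the article
    mentions, each normalized to 0-1. Weights are assumptions."""
    weights = {
        "code_simplicity": 0.25,     # simplicity, elegance, documentation
        "adoption_by_others": 0.25,  # how often other coders reuse the code
        "forum_reputation": 0.20,    # e.g. Stack Overflow answer popularity
        "completion_speed": 0.15,    # productivity on paid projects
        "language_signals": 0.15,    # phrases correlated with strong coders
    }
    return sum(weights[k] * signals.get(k, 0.0) for k in weights)

score = gild_style_score({
    "code_simplicity": 0.9,
    "adoption_by_others": 0.7,
    "forum_reputation": 0.8,
    "completion_speed": 0.6,
    "language_signals": 0.5,
})
print(round(score, 3))  # 0.725
```

Missing signals simply contribute zero here, which mirrors the article's point that programmers with no open-source code can still be scored from whatever other clues exist.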
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
Javier E

Typists who clear 70 wpm can't even say where the keys are | Ars Technica - 0 views

  • The majority of typists couldn’t tell you how they type if they tried, according to a study published in October in the scientific journal Attention, Perception, & Psychophysics. The finding comes from a body of typists who averaged 72 words per minute but could not map more than an average of 15 keys on a QWERTY keyboard.
  • The basic theory of “automatic learning,” according to Vanderbilt University, asserts that people learn actions for skill-based work consciously and store the details of why and how in their short-term memory. Eventually the why and how of a certain action fades, but the performative action remains. However, in the case of typing, it appears that we don’t even store the action—that is, we have little to no “explicit knowledge” of the keyboard.
  • Typing is a learned skill, like many learned skills, for which the details are forgotten. But the authors compare it less to playing chess than to spending money: you don’t need to know which way the head on the coin faces, just the particular sizes and shapes.
  • ...1 more annotation...
  • That something as integral to typing as the location of the keys is a forgettable detail is surprising, if consistent with the idea that daily exposure to something and explicit knowledge of that thing don’t go hand in hand.
Javier E

Where's the Money? US Arts and Culture Economy By the Numbers - 0 views

  • An unprecedented survey of the role of the arts in the larger economy, last week’s breakdown of the GDP contribution of America’s creative industries in 2011
  • a joint effort of the U.S. Bureau of Economic Analysis and National Endowment for the Arts, reveals that advertising dwarfs the economic clout of every single other creative endeavor, followed by arts education in a distant second place
  • the total size of the arts and culture economy is 3.2 percent of GDP, coming in at $504 billion
  • ...2 more annotations...
  • the small size of the “Independent artists and performing arts” ($48.9 billion) compared to the “Arts education” category ($104 billion) should provide some pause as far as the economic linkage that exists between arts education and the production of what we consider high (or “fine”) art: teaching art — i.e. the promise of art — is a more lustrous pearl than the actual messy business of producing it
  • As a reminder, the annual budget of the National Endowment for Arts in 2011 was $154.7 million. In that same year, the Philadelphia “Phillies,” a Major League Baseball team, paid their players alone $173 million.
Javier E

The Sweet Caress of Cyberspace - NYTimes.com - 0 views

  • “Her” imagines a society in which human beings are so thoroughly marginalized that they’re being edited out of courtship and companionship, because they’re superfluous, messy. It’s a love story as horror story. If we no longer need anyone in the passenger seat, do we need anyone at all?
  • It’s a parable of narcissism in the digital world, which lets you sprint to the foreground of everything, giving you an audience or the illusion of one.
  • But “Her” also traces the flip side of the coin — that with our amassed knowledge and scientific accomplishments, we may be succeeding in rendering ourselves obsolete.
  • ...1 more annotation...
  • I savored a few themes in particular. One is the Internet’s extreme indulgence of the seemingly innate human impulse to contrive a habitat that’s entirely unthreatening, an ego-stroking ecosystem, a sensibility-controlled comfort zone.
Javier E

The Berkeley Model - NYTimes.com - 0 views

  • Entitled “At Berkeley” — and running some four hours long — it attempts to do nothing less than capture the breadth of activities at the University of California, Berkeley, probably the finest public university in the country
  • “I hope they come away with a feeling that it is a great university, run by people of intelligence and sensitivity, and working hard to maintain standards and integrity.”
Javier E

Scholarship and Politics - The Case of Noam Chomsky - NYTimes.com - 0 views

  • (1) The academy is a world of its own, complete with rules, protocols, systems of evaluation, recognized achievements, agreed-on goals, a roster of heroes and a list of tasks yet to be done.
  • (2) Academic work proceeds within the confines of that world, within, that is, a professional, not a public, space, although its performance may be, and often is, public.
  • (3) academic work is only tangentially, not essentially, political; politics may attend the formation of academic units and the selection of academic personnel, but political concerns and pressures have no place in the unfolding of academic argument, except as objects of its distinctive forms of attention
  • ...16 more annotations...
  • (4) The academic views of a professor are independent of his or her real-world political views; academic disputes don’t track partisan disputes or vice versa; you can’t reason from an academic’s disciplinary views to the positions he or she would take in the public sphere; they are independent variables.
  • The answer given in the first lecture — “What is Language?” — is that we are creatures with language, and that language as a uniquely human biological capacity appeared suddenly and quite late in the evolutionary story, perhaps 75,000 years ago.
  • Chomsky gave three lectures under the general title “What Kind of Creatures are We?”
  • Language, then, does not arise from the social/cultural environment, although the environment provides the stuff or input it works on. That input is “impoverished”; it can’t account for the creativity of language performance, which has its source not in the empirical world, but in an innate ability that is more powerful than the stimuli it utilizes and plays with. It follows that if you want to understand language, you shouldn’t look to linguistic behavior but to the internal mechanism — the Universal Grammar — of which particular linguistic behaviors are a non-exhaustive expression. (The capacity exceeds the empirical resources it might deploy.)
  • In his second lecture (“What Can We Understand?”), Chomsky took up the question of what humans are capable of understanding and his answer, generally, was that we can understand what we can understand, and that means that we can’t understand what is beyond our innate mental capacities
  • This does not mean, he said, that what we can’t understand is not real: “What is mysterious to me is not an argument that it does not exist.” It’s just that while language is powerful and creative, its power and creativity have limits; and since language is thought rather than an addition to or clothing of thought, the limits of language are the limits of what we can fruitfully think about
  • This is as good as it gets. There is “no evolution in our capacity for language.”
  • These assertions are offered as a counter to what Chomsky sees as the over-optimistic Enlightenment belief — common to many empiricist philosophies — that ours is a “limitless explanatory power” and that “we can do anything.”
  • In the third lecture (“What is the Common Good?”) Chomsky turned from the philosophy of mind and language to political philosophy and the question of what constitutes a truly democratic society
  • He likened dogmatic intellectual structures that interfere with free inquiry to coercive political structures that stifle the individual’s creative independence and fail to encourage humanity’s “richest diversity
  • He asserted that any institution marked by domination and hierarchy must rise to the challenge of justifying itself, and if it cannot meet the challenge, it should be dismantled.
  • He contrasted two accounts of democracy: one — associated by him with James Madison — distrusts the “unwashed” populace and puts its faith in representative government where those doing the representing (and the voting and the distributing of goods) constitute a moneyed and propertied elite
  • the other — associated by him with Adam Smith (in one of his moods), J. S. Mill, the 1960s and a tradition of anarchist writing — seeks to expand the franchise and multiply choices in the realms of thought, politics and economics. The impulse of this second, libertarian, strain of democracy, is “to free society from economic or theological guardianship,” and by “theological” Chomsky meant not formal religion as such but any assumed and frozen ideology that blocked inquiry and limited participation. There can’t, in short, be “too much democracy.”
  • It was thought of the highest order performed by a thinker, now 85 years old, who by and large eschewed rhetorical flourishes (he has called his own speaking style “boring” and says he likes it that way) and just did it, where “it” was the patient exploration of deep issues that had been explored before him by a succession of predecessors, fully acknowledged, in a conversation that is forever being continued and forever being replenished.
  • Yes, I said to myself, this is what we — those of us who bought a ticket on this particular train — do; we think about problems and puzzles and try to advance the understanding of them; and we do that kind of thinking because its pleasures are, in a strong sense, athletic and provide for us, at least on occasion, the experience of fully realizing whatever capabilities we might have. And we do it in order to have that experience, and to share it with colleagues and students of like mind, and not to make a moral or political point.
  • The term “master class” is a bit overused, but I feel no hesitation in using it here. It was a master class taught by a master, and if someone were to ask me what exactly is it that academics do, I would point to these lectures and say, simply, here it is, the thing itself.
Javier E

Rush Limbaugh Knows Nothing About Christianity « The Dish - 0 views

  • Limbaugh is onto something. The Pope of the Catholic Church really is offering a rebuttal to the Pope of the Republican party, which is what Limbaugh has largely become. In daily encyclicals, Rush is infallible in doctrine and not to be questioned in public. When he speaks on the airwaves, it is always ex cathedra. Callers can get an audience from him, but rarely a hearing. Dissent from his eternal doctrines means excommunication from the GOP and the designation of heretic. His is always the last word.
  • And in the Church of Limbaugh, market capitalism is an unqualified, eternal good. It is the everlasting truth about human beings. It is inextricable from any concept of human freedom. The fewer restrictions on it, the better.
  • The church has long opposed market capitalism as the core measure of human well-being. Aquinas even taught that interest-bearing loans were inherently unjust in the most influential theological document in church history. The fundamental reason is that market capitalism measures human life by a materialist rubric. And Jesus radically taught us to give up all our possessions, to renounce everything except our “daily bread”, to spend our lives serving the poverty-stricken takers rather than aspiring to be the wealthy and powerful makers. He told the Mark Zuckerberg of his day to give everything away to the poor, if he really wanted to be happy.
  • ...7 more annotations...
  • there is a risk that a radical capitalistic ideology could spread which refuses even to consider these problems, in the a priori belief that any attempt to solve them is doomed to failure and which blindly entrusts their solution to the free development of market forces.
  • Could anyone have offered a more potent critique of current Republican ideology than John Paul II? Could anything better illustrate John Paul II’s critique of radical capitalist ideology than the GOP’s refusal to be concerned in any way about a fundamental question like access to basic healthcare for millions of citizens in the richest country on earth?
  • the Church in no way disputes the fact that market capitalism is by far the least worst means of raising standards of living and ending poverty and generating wealth that can be used to cure disease, feed the hungry, and protect the vulnerable. What the Church is disputing is that, beyond our daily bread, material well-being is a proper criterion for judging human morality or happiness. On a personal level, the Church teaches, as Jesus unambiguously did, that material goods beyond a certain point are actually pernicious and destructive of human flourishing.
  • the Pope is not making an empirical observation. In so far as he is, he agrees with you. What he’s saying is that this passion for material things is not what makes us good or happy. That’s all
  • if the mania for more and more materialist thrills distracts us from, say, the plight of a working American facing bankruptcy because of cancer, or the child of an illegal immigrant with no secure home, then it is a deeply immoral distraction.
  • material goods are not self-evidently the purpose of life and are usually (and in Jesus’ stern teachings always) paths away from God and our own good and our own happiness.
  • Christianity is one of the most powerful critiques of radical market triumphalism.
Javier E

A News Organization That Rejects the View From Nowhere - Conor Friedersdorf - The Atlantic - 1 views

  • For many years, Rosen has been a leading critic of what he calls The View From Nowhere, or the conceit that journalists bring no prior commitments to their work. On his long-running blog, PressThink, he's advocated for "The View From Somewhere"—an effort by journalists to be transparent about their priors, whether ideological or otherwise.  Rosen is just one of several voices who'll shape NewCo. Still, the new venture may well be a practical test of his View from Somewhere theory of journalism. I chatted with Rosen about some questions he'll face. 
  • The View from Nowhere won’t be a requirement for our journalists. Nor will a single ideology prevail. NewCo itself will have a view of the world: Accountability journalism, exposing abuses of power, revealing injustices will no doubt be part of it. Under that banner many “views from somewhere” can fit.
  • The way "objectivity" evolves historically is out of something much more defensible and interesting, which is in that phrase "Of No Party or Clique." That's the founders of The Atlantic saying they want to be independent of party politics. They don't claim to have no politics, do they? They simply say: We're not the voice of an existing faction or coalition. But they're also not the Voice of God.
  • ...10 more annotations...
  • NewCo will emulate the founders of The Atlantic. At some point "independent from" turned into "objective about." That was the wrong turn, made long ago, by professional journalism, American-style.
  • You've written that The View From Nowhere is, in part, a defense mechanism against charges of bias originating in partisan politics. If you won't be invoking it, what will your defense be when those charges happen? There are two answers to that. 1) We told you where we're coming from. 2) High standards of verification. You need both.
  • What about ideological diversity? The View from Somewhere obviously permits it. You've said you'll have it. Is that because it is valuable in itself?
  • The basic insight is correct: Since "news judgment" is judgment, the product is improved when there are multiple perspectives at the table ... But, if the people who are recruited to the newsroom because they add perspectives that might otherwise be overlooked are also taught that they should leave their politics at the door, or think like professional journalists rather than representatives of their community, or privilege something called "news values" over the priorities they had when they decided to become journalists, then these people are being given a fatally mixed message, if you see what I mean. They are valued for the perspective they bring, and then told that they should transcend that perspective.
  • When people talk about objectivity in journalism they have many different things in mind. Some of these I have no quarrel with. You could even say I’m a “fan.” For example, if objectivity means trying to ground truth claims in verifiable facts, I am definitely for that. If it means there’s a “hard” reality out there that exists beyond any of our descriptions of it, sign me up. If objectivity is the requirement to acknowledge what is, regardless of whether we want it to be that way, then I want journalists who can be objective in that sense. Don’t you? If it means trying to see things in that fuller perspective Thomas Nagel talked about–pulling the camera back, revealing our previous position as only one of many–I second the motion. If it means the struggle to get beyond the limited perspective that our experience and upbringing afford us… yeah, we need more of that, not less. I think there is value in acts of description that do not attempt to say whether the thing described is good or bad. Is that objectivity? If so, I’m all for it, and I do that myself sometimes. 
  • By "we can do better than that" I mean: We can insist on the struggle to tell it like it is without also insisting on the View from Nowhere. The two are not connected. It was a mistake to think that they necessarily are. But why was this mistake made? To control people in the newsroom from "above." That's a big part of objectivity. Not truth. Control.
  • If it works out as you hope, if things are implemented well, etc., what's the potential payoff for readers? I think it's three things: First, this is a news site that is born into the digital world, but doesn't have to return profits to investors. That's not totally unique
  • Second: It's going to be a technology company as much as a news organization. That should result in better service.
  • a good formula for innovation is to start with something people want to do and eliminate some of the steps required to do it
  • The third upside is news with a human voice restored to it. This is the great lesson that blogging gives to journalism
Javier E

Out of Print, Maybe, but Not Out of Mind - NYTimes.com - 1 views

  • efforts to reimagine the core experience of the book have stumbled. Dozens of publishing start-ups tried harnessing social reading apps or multimedia, but few caught on.
  • Social Books, which let users leave public comments on particular passages and comment on passages selected by others, became Rethink Books and then faltered. Push Pop Press, whose avowed aim was to reimagine the book by mixing text, images, audio, video and interactive graphics, was acquired by Facebook in 2011 and heard from no more. Copia, another highly publicized social reading platform, changed its business model to become a classroom learning tool. The latest to stumble is Small Demons, which explores the interrelationship among books. Users who were struck by the Ziegfeld Follies in “The Great Gatsby,” for instance, could follow a link to the dancers’ appearance in 67 other books. Small Demons said it would close this month without a new investor.
  • “A lot of these solutions were born out of a programmer’s ability to do something rather than the reader’s enthusiasm for things they need,” said Peter Meyers, author of “Breaking the Page,” a forthcoming look at the digital transformation of books. “We pursued distractions and called them enhancements.”
  • ...6 more annotations...
  • The notion that books require too much time to read dates back, at least, to midcentury entrepreneurial operations like Reader’s Digest and CliffsNotes, which offered up predigested texts. So some start-ups chose a basic approach: Take a text and break it up. Safari Flow, a service from Safari Books, offers chapters of technical manuals for a $29 monthly subscription fee. Inkling does the same with more consumer-oriented titles like cookbooks. If you want only the chapter on pasta, you can buy it for $4.99 instead of having to buy the whole book. Citia is a New York start-up with a much more ambitious approach. Working in collaboration with an author, Citia editors take a nonfiction book and reorganize its ideas onto digital cards that can be read on different devices and sent through social networks
  • One of the first books given the Citia treatment was Kevin Kelly’s “What Technology Wants.” Material directly from the book is in quotation marks and the author is referred to in the third person, which lends a somewhat academic distance to the summaries. Sections of the book are summarized on one card, then the reader can drill down into subsections on cards hidden underneath.
  • What to label these stories is another question. The Internet by its nature breaks down borders and unfreezes text. Put a book online and set it free to grow and shrink with new arguments, be broken up and reassembled as readers demand, and it might be only nostalgia that calls it by its old name.
  • “We will continue to recognize books as books as they migrate to the Internet, but our understanding of storytelling will inevitably expand,” Mr. Brantley said. Among the presentations at Books in Browsers this fall: “A Book Isn’t a Book Isn’t a Book” and “The Death of the Reader.”
  • Much of the design innovation at the moment, Mr. Brantley believes, is not coming from publishers, who must still wrestle with delivering both digital and physical books. Instead it is being developed by a tech community that “doesn’t think about stories as the end product. Instead, they think about storytelling platforms that will enable new forms of both authoring and reading.”
  • He cited the enormous success of Wattpad, a Canadian start-up that advertises itself as the world’s largest storytelling community. There are 10 million stories on the site.
Javier E

New Thinking and Old Books Revisited - NYTimes.com - 0 views

  • Mark Thoma’s classic crack — “I’ve learned that new economic thinking means reading old books” — has a serious point to it. We’ve had a couple of centuries of economic thought at this point, and quite a few smart people doing the thinking. It’s possible to come up with truly new concepts and approaches, but it takes a lot more than good intentions and casual observation to get there.
  • There is definitely a faction within economics that considers it taboo to introduce anything into its analysis that isn’t grounded in rational behavior and market equilibrium
  • what I do, and what everyone I’ve just named plus many others does, is a more modest, more eclectic form of analysis. You use maximization and equilibrium where it seems reasonably consistent with reality, because of its clarifying power, but you introduce ad hoc deviations where experience seems to demand them — downward rigidity of wages, balance-sheet constraints, bubbles (which are hard to predict, but you can say a lot about their consequences).
  • ...4 more annotations...
  • You may say that what we need is reconstruction from the ground up — an economics with no vestige of equilibrium analysis. Well, show me some results. As it happens, the hybrid, eclectic approach I’ve just described has done pretty well in this crisis, so you had better show me some really superior results before it gets thrown out the window.
  • if you think you’ve found a fundamental logical flaw in one of our workhorse economic models, the odds are very strong that you’ve just made a mistake.
  • it’s quite clear that the teaching of macroeconomics has gone seriously astray. As Saraceno says, the simple models that have proved so useful since 2008 are by and large taught only at the undergrad level — they’re treated as too simple, too ad hoc, whatever, to make it into the grad courses even at places that aren’t very ideological.
  • to temper your modeling with a sense of realism you need to know something about reality — and not just the statistical properties of U.S. time series since 1947. Economic history — global economic history — should be a core part of the curriculum. Nobody should be making pronouncements on macro without knowing a fair bit about the collapse of the gold standard in the 1930s, what actually happened in the stagflation of the 1970s, the Asian financial crisis of the 90s, and, looking forward, the euro crisis.
Javier E

Arnon Grunberg Is Writing While Connected to Electrodes - NYTimes.com - 0 views

  • Over the past two weeks, Mr. Grunberg has spent several hours a day writing his novella, while a battery of sensors and cameras tracked his brain waves, heart rate, galvanic skin response (an electrical measure of emotional arousal) and facial expressions. Next fall, when the book is published, some 50 ordinary people in the Netherlands will read it under similarly controlled circumstances, sensors and all.
  • Researchers will then crunch the data in the hope of finding patterns that may help illuminate links between the way art is created and enjoyed, and possibly the nature of creativity itself.
  • the burgeoning field of neuroaesthetics, which over the last decade or so has attempted to uncover the neural underpinnings of our experience of music and visual art, using brain imaging technology. Slowly, a small but growing number of researchers have also begun using similar tools to scrutinize the perhaps more elusive, and perhaps endangered, experience of literary reading.
  • ...2 more annotations...
  • Last year, researchers at Stanford University drew headlines with the results of a functional magnetic resonance imaging (or fMRI) experiment showing that different regions of the brain were activated when subjects switched from reading Jane Austen’s “Mansfield Park” for pleasure to reading it analytically
  • And this fall, a study out of the New School for Social Research showed that readers of literary fiction scored higher on tests of empathy than readers of commercial fiction, a finding greeted with satisfied told-you-sos from many readers and writers alike.
Javier E

What makes us human? Doing pointless things for fun - 2 views

  • Playfulness is what makes us human. Doing pointless, purposeless things, just for fun. Doing things for the sheer devilment of it. Being silly for the sake of being silly. Larking around. Taking pleasure in activities that do not advantage us and have nothing to do with our survival. These are the highest signs of intelligence. It is when a creature, having met and surmounted all the practical needs that face him, decides to dance that we know we are in the presence of a human. It is when a creature, having successfully performed all necessary functions, starts to play the fool, just for the hell of it, that we know he is not a robot.
  • All at once, it was clear. The bush people, lounging about after dark in their family shelter, perhaps around a fire – basically just hanging out – had been amusing themselves doing a bit of rock art. And perhaps with some leftover red paste, a few of the younger ones had had a competition to see who could jump highest and make their fingermarks highest up the overhang. This was not even art. It called for no particular skill. It was just mucking about. And yet, for all the careful beauty of their pictures, for all the recognition of their lives from the vantage point of my life that was sparked in me by the appreciation of their artwork, it was not what was skilful that brought me closest to them. It was what was playful. It was their jumping and daubing finger-blobs competition that brought them to me, suddenly, as fellow humans across all those thousands of years. It tingled my spine.
  • An age is coming when machines will be able to do everything. “Ah,” you say, “but they will not be conscious.” But how will we know a machine is not conscious – how do we know another human being is conscious? There is only one way. When it starts to play. In playfulness lies the highest expression of the human spirit.
Javier E

In Defense of a Loaded Word - NYTimes.com - 2 views

  • words take on meaning within a context. It might be true that you refer to your spouse as Baby. But were I to take this as license to do the same, you would most likely protest. Right names depend on right relationships, a fact so basic to human speech that without it, human language might well collapse. But as with so much of what we take as human, we seem to be in need of an African-American exception
  • This is the politics of respectability — an attempt to raise black people to a superhuman standard. In this case it means exempting black people from a basic rule of communication — that words take on meaning from context and relationship. But as in all cases of respectability politics, what we are really saying to black people is, “Be less human.” This is not a fight over civil rights; it’s an attempt to raise a double standard.
  • To prevent enabling oppression, we demand that black people be twice as good. To prevent verifying stereotypes, we pledge to never eat a slice of watermelon in front of white people
  • ...6 more annotations...
  • But white racism needs no verification from black people. And a scientific poll of right-thinking humans will always conclude that watermelon is awesome. That is because its taste and texture appeal to certain attributes that humans tend to find pleasurable. Humans also tend to find community to be pleasurable, and within the boundaries of community relationships, words — often ironic and self-deprecating — are always spoken that take on other meanings when uttered by others.
  • A separate and unequal standard for black people is always wrong. And the desire to ban the word “nigger” is not anti-racism, it is finishing school
  • I am certain that should I decide to join in, I would invite the same hard conversation that would greet me, should I ever call my father Billy.
  • A few summers ago one of my best friends invited me up to what he affectionately called his “white-trash cabin” in the Adirondacks. This was not how I described the outing to my family. Two of my Jewish acquaintances once joked that I’d “make a good Jew.” My retort was not, “Yeah, I certainly am good with money.”
  • When Matt Barnes used the word “niggas” he was being inappropriate. When Richie Incognito and Riley Cooper used “nigger,” they were being violent and offensive. That we have trouble distinguishing the two evidences our discomfort with the great chasm between black and white America
  • That such a seemingly hateful word should return as a marker of nationhood and community confounds our very notions of power. “Nigger” is different because it is attached to one of the most vibrant cultures in the Western world. And yet the culture is inextricably linked to the violence that birthed us. “Nigger” is the border, the signpost that reminds us that the old crimes don’t disappear. It tells white people that, for all their guns and all their gold, there will always be places they can never go.
Javier E

History News Network | History Gets Into Bed with Psychology, and It's a Happy Match - 0 views

  • The fact that many of our self-protective delusions are built into the way the brain works is no justification for not trying to override them. Knowing how dissonance works helps us identify our own inclinations to perpetuate errors -- and protect ourselves from those who can’t. Or won’t.
  • at last, history has gotten into bed with psychological science, and it’s a happy match. History gives us the data of, in Barbara Tuchman’s splendid words, our march of folly -- repeated examples of human beings unable and unwilling to learn from mistakes, let alone to admit them. Cognitive science shows us why
  • Our brains, which have allowed us to travel into outer space, have a whole bunch of design flaws, which is why we have so much breathtaking bumbling here on Earth.
  • ...3 more annotations...
  • Of the many built-in biases in human thought, three have perhaps the greatest consequences for our own history and that of nations: the belief that we see things as they really are, rather than as we wish them to be; the belief that we are better, kinder, smarter, and more ethical than average; and the confirmation bias, which sees to it that we notice, remember, and accept information that confirms our beliefs -- and overlook, forget, and discount information that disconfirms our beliefs.
  • The great motivational theory that accommodates all of these biases is cognitive dissonance, developed by Leon Festinger in 1957 and further refined and transformed into a theory of self-justification by his student (and later my coauthor and friend) Elliot Aronson. The need to reduce dissonance is the key mechanism that underlies the reluctance to be wrong, to change our minds, to admit serious mistakes, and to be unwilling to accept unwelcome information
  • The greater the dissonance between who we are and the mistake we made or the cruelty we committed, the greater the need to justify the mistake, the crime, the villainy, instead of admitting and rectifying it
Javier E

Art Makes You Smart - NYTimes.com - 1 views

  • Through a large-scale, random-assignment study of school tours to the museum, we were able to determine that strong causal relationships do in fact exist between arts education and a range of desirable outcomes.
  • Students who, by lottery, were selected to visit the museum on a field trip demonstrated stronger critical thinking skills, displayed higher levels of social tolerance, exhibited greater historical empathy and developed a taste for art museums and cultural institutions.
  • Over the course of the following year, nearly 11,000 students and almost 500 teachers participated in our study, roughly half of whom had been selected by lottery to visit the museum
  • ...4 more annotations...
  • Applicant groups who won the lottery constituted our treatment group, while those who did not win an immediate tour served as our control group.
  • Several weeks after the students in the treatment group visited the museum, we administered surveys to all of the students. The surveys included multiple items that assessed knowledge about art, as well as measures of tolerance, historical empathy and sustained interest in visiting art museums and other cultural institutions. We also asked them to write an essay in response to a work of art that was unfamiliar to them.
  • Moreover, most of the benefits we observed are significantly larger for minority students, low-income students and students from rural schools — typically two to three times larger than for white, middle-class, suburban students — owing perhaps to the fact that the tour was the first time they had visited an art museum.
  • we can conclude that visiting an art museum exposes students to a diversity of ideas that challenge them with different perspectives on the human condition. Expanding access to art, whether through programs in schools or through visits to area museums and galleries, should be a central part of any school’s curriculum.
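The lottery design described in these excerpts lends itself to a simple estimator: when assignment is random, the difference in mean outcomes between lottery winners (treatment) and losers (control) estimates the causal effect of the tour. A minimal sketch of that logic, with invented numbers — the applicant count, scores, and built-in effect size are hypothetical, not the study's actual data:

```python
import random
import statistics

random.seed(0)

# Hypothetical lottery: 1,000 applicants, 500 randomly win a museum tour.
applicants = list(range(1000))
random.shuffle(applicants)
treatment = set(applicants[:500])

# Toy outcome: a critical-thinking score with an assumed +3 tour effect.
def score(student_id):
    base = random.gauss(50, 10)
    return base + (3.0 if student_id in treatment else 0.0)

scores = {s: score(s) for s in range(1000)}
treated = [v for s, v in scores.items() if s in treatment]
control = [v for s, v in scores.items() if s not in treatment]

# Because assignment was random, a plain difference in means is an
# unbiased estimate of the tour's causal effect.
effect = statistics.mean(treated) - statistics.mean(control)
print(f"estimated effect: {effect:.1f}")
```

The estimate hovers near the built-in +3, up to sampling noise — which is exactly why the study needed large treatment and control groups.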
Javier E

Why Do I Always Wake Up 5 Minutes Before My Alarm Goes Off? | Mental Floss - 0 views

  • At the center of your brain, a clump of nerves—called the suprachiasmatic nucleus—oversees your body’s clock: the circadian rhythm. It determines when you feel sleepy and when you feel bright-eyed. It controls your blood pressure, your body temperature, and your sense of time. It turns your body into a finely tuned machine.
  • Your sleep-wake cycle is regulated by a protein called PER. The protein level rises and falls each day, peaking in the evening and plummeting at night. When PER levels are low, your blood pressure drops, heart rate slows, and thinking becomes foggier. You get sleepy. If you follow a diligent sleep routine—waking up the same time every day—your body learns to increase your PER levels in time for your alarm. About an hour before you’re supposed to wake up, PER levels rise (along with your body temperature and blood pressure). To prepare for the stress of waking, your body releases a cocktail of stress hormones, like cortisol. Gradually, your sleep becomes lighter and lighter. And that’s why you wake up before your alarm. Your body hates your alarm clock. It’s jarring. It’s stressful. And it ruins all that hard work. It defeats the purpose of gradually waking up. So, to avoid being interrupted, your body does something amazing: It starts increasing PER and stress hormones earlier in the night. Your body gets a head start so the waking process isn’t cut short. It’s so precise that your eyelids open minutes—maybe even seconds—before the alarm goes off.
  • if you don’t wake before your alarm, you probably aren’t getting enough sleep—or you aren’t sleeping on a consistent schedule
  • ...1 more annotation...
  • Enter the snooze button. Since your body’s gone through all that work to rise gradually, a quick nap sends your internal clock spinning in the wrong direction. All the hormones that help you fall asleep meddle with the hormones that help you wake up. Your body gets confused. You feel groggier. And with each slap of the snooze, it gets worse. The snooze, it seems, is the worst way to start your day.
Javier E

Conspiracy theory psychology: People who claim to know the truth about JFK, UFOs, and 9... - 0 views

  • people who suspect conspiracies aren’t really skeptics. Like the rest of us, they’re selective doubters. They favor a worldview, which they uncritically defend. But their worldview isn’t about God, values, freedom, or equality. It’s about the omnipotence of elites.
  • the prevalence of such belief, documented in surveys, has forced scholars to take it more seriously. Conspiracy theory psychology is becoming an empirical field with a broader mission: to understand why so many people embrace this way of interpreting history.
  • “People low in trust of others are likely to believe that others are colluding against them,” the authors proposed. This sort of distrust, in other words, favors a certain kind of belief. It makes you more susceptible, not less, to claims of conspiracy.
  • ...5 more annotations...
  • The more you see the world this way—full of malice and planning instead of circumstance and coincidence—the more likely you are to accept conspiracy theories of all kinds. Once you buy into the first theory, with its premises of coordination, efficacy, and secrecy, the next seems that much more plausible.
  • The common thread between distrust and cynicism, as defined in these experiments, is a perception of bad character. More broadly, it’s a tendency to focus on intention and agency, rather than randomness or causal complexity. In extreme form, it can become paranoia
  • In mild form, it’s a common weakness known as the fundamental attribution error—ascribing others’ behavior to personality traits and objectives, forgetting the importance of situational factors and chance
  • Clearly, susceptibility to conspiracy theories isn’t a matter of objectively evaluating evidence. It’s more about alienation. People who fall for such theories don’t trust the government or the media. They aim their scrutiny at the official narrative, not at the alternative explanations
  • Conspiracy believers are the ultimate motivated skeptics. Their curse is that they apply this selective scrutiny not to the left or right, but to the mainstream. They tell themselves that they’re the ones who see the lies, and the rest of us are sheep. But believing that everybody’s lying is just another kind of gullibility.