TOK Friends / Group items tagged "interview"

Javier E

Guess Who Doesn't Fit In at Work - NYTimes.com - 0 views

  • ACROSS cultures and industries, managers strongly prize “cultural fit” — the idea that the best employees are like-minded.
  • One recent survey found that more than 80 percent of employers worldwide named cultural fit as a top hiring priority.
  • When done carefully, selecting new workers this way can make organizations more productive and profitable.
  • In the process, fit has become a catchall used to justify hiring people who are similar to decision makers and rejecting people who are not.
  • The concept of fit first gained traction in the 1980s. The original idea was that if companies hired individuals whose personalities and values — and not just their skills — meshed with an organization’s strategy, workers would feel more attached to their jobs, work harder and stay longer.
  • in many organizations, fit has gone rogue. I saw this firsthand while researching the hiring practices of the country’s top investment banks, management consultancies and law firms. I interviewed 120 decision makers and spent nine months observing
  • While résumés (and connections) influenced which applicants made it into the interview room, interviewers’ perceptions of fit strongly shaped who walked out with job offers.
  • Crucially, though, for these gatekeepers, fit was not about a match with organizational values. It was about personal fit. In these time- and team-intensive jobs, professionals at all levels of seniority reported wanting to hire people with whom they enjoyed hanging out and could foresee developing close relationships with
  • To judge fit, interviewers commonly relied on chemistry.
  • Many used the “airport test.” As a managing director at an investment bank put it, “Would I want to be stuck in an airport in Minneapolis in a snowstorm with them?”
  • interviewers were primarily interested in new hires whose hobbies, hometowns and biographies matched their own. Bonding over rowing college crew, getting certified in scuba, sipping single-malt Scotches in the Highlands or dining at Michelin-starred restaurants was evidence of fit; sharing a love of teamwork or a passion for pleasing clients was not
  • it has become a common feature of American corporate culture. Employers routinely ask job applicants about their hobbies and what they like to do for fun, while a complementary self-help industry informs white-collar job seekers that chemistry, not qualifications, will win them an offer.
  • Selection based on personal fit can keep demographic and cultural diversity low
  • In the elite firms I studied, the types of shared experiences associated with fit typically required large investments of time and money.
  • Class-biased definitions of fit are one reason investment banks, management consulting firms and law firms are dominated by people from the highest socioeconomic backgrounds
  • Also, whether the industry is finance, high-tech or fashion, a good fit in most American corporations still tends to be stereotypically masculine.
  • Perhaps most important, it is easy to mistake rapport for skill. Just as they erroneously believe that they can accurately tell when someone is lying, people tend to be overly confident in their ability to spot talent. Unstructured interviews, which are the most popular hiring tools for American managers and the primary way they judge fit, are notoriously poor predictors of job performance.
  • Organizations that use cultural fit for competitive advantage tend to favor concrete tools like surveys and structured interviews that systematically test behaviors associated with increased performance and employee retention.
  • For managers who want to use cultural fit in a more productive way, I have several suggestions.
  • First, communicate a clear and consistent idea of what the organization’s culture is (and is not) to potential employees. Second, make sure the definition of cultural fit is closely aligned with business goals. Ideally, fit should be based on data-driven analysis of what types of values, traits and behaviors actually predict on-the-job success. Third, create formal procedures like checklists for measuring fit, so that assessment is not left up to the eyes (and extracurriculars) of the beholder.
  • But cultural fit has become a new form of discrimination that keeps demographic and cultural diversity down
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate's application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention.)
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, "We're getting to the point where some of our hiring managers don't even want to interview anymore." (A toy sketch of this kind of red/yellow/green screening appears at the end of this entry.)
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it's coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people's pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
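As flagged in the Xerox annotation above, here is a toy Python sketch of that style of color-banded screening: a model reduces assessment responses to a single score, and fixed cutoffs map the score to red, yellow, or green. Every feature name, weight, and cutoff below is an invented assumption; the article says nothing about the real model's internals.

```python
# Toy red/yellow/green screening in the style the Xerox passage describes.
# The weighted-sum "model", its features, and the cutoffs are all invented.

FEATURE_WEIGHTS = {
    "scenario_questions": 0.5,  # multiple-choice judgment items, scaled to 0-1
    "cognitive_skills":   0.3,  # cognitive-skill assessment, scaled to 0-1
    "personality_fit":    0.2,  # personality testing, scaled to 0-1
}

def screen_score(features: dict[str, float]) -> float:
    """Weighted sum of normalized assessment results, yielding a 0-1 score."""
    return sum(w * features[name] for name, w in FEATURE_WEIGHTS.items())

def color_band(score: float) -> str:
    """Map a score to the article's three hiring bands."""
    if score >= 0.7:
        return "green"   # strong candidate: "hire away"
    if score >= 0.4:
        return "yellow"  # middling
    return "red"         # poor candidate

applicant = {"scenario_questions": 0.8, "cognitive_skills": 0.7, "personality_fit": 0.6}
print(color_band(screen_score(applicant)))  # 0.73 -> "green"
```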
dicindioha

Daniel Kahneman On Hiring Decisions - Business Insider - 0 views

  • Most hiring decisions come down to a gut decision. According to Nobel laureate Daniel Kahneman, however, this process is extremely flawed and there's a much better way.
    • dicindioha
       
      hiring comes down to 'gut feeling'
  • First, select a few traits that are prerequisites for success in this position (technical proficiency, engaging personality, reliability, and so on). Don't overdo it — six dimensions is a good number. The traits you choose should be as independent as possible from each other, and you should feel that you can assess them reliably by asking a few factual questions. Next, make a list of those questions for each trait and think about how you will score it, say on a 1-5 scale. You should have an idea of what you will call "very weak" or "very strong."
    • dicindioha
       
      WHAT YOU SHOULD DO IN AN INTERVIEW
  • Kahneman asked interviewers to put aside personal judgments and limit interviews to a series of factual questions meant to generate a score on six separate personality traits. A few months later, it became clear that Kahneman's systematic approach was a vast improvement over gut decisions. It was so effective that the army would use his exact method for decades to come. Why you should care is because this superior method can be copied by any organization — and really, by anyone facing a hard decision.
  • Do not skip around. To evaluate each candidate, add up the six scores ... Firmly resolve that you will hire the candidate whose final score is the highest, even if there is another one whom you like better — try to resist your wish to invent broken legs to change the ranking. (A sketch of this scoring procedure appears at the end of this entry.)
  • [You are likely to do much better] than if you do what people normally do in such situations, which is to go into the interview unprepared and to make choices by an overall intuitive judgment such as "I looked into his eyes and liked what I saw."
  •  
    We cannot always rely simply on a 'gut feeling' from our so-called 'reasoning' and emotional response to make big decisions like hiring, yet that is what happens much of the time. This is a really interesting way to do it systematically: you still use your own perspective, but the questions asked will hopefully lead you to a better outcome.
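Kahneman's procedure is mechanical enough to sketch in code. Below is a minimal Python illustration of the rule the excerpts describe: roughly six independent traits, each scored 1-5 from factual questions, the scores summed, and the highest total hired. The three trait names beyond the excerpt's examples, and all the scores, are invented for illustration.

```python
# A sketch of the structured-interview scoring summarized above.

TRAITS = [
    "technical proficiency",  # named in the excerpt
    "engaging personality",   # named in the excerpt
    "reliability",            # named in the excerpt
    "communication",          # illustrative stand-in
    "problem solving",        # illustrative stand-in
    "teamwork",               # illustrative stand-in
]

def total_score(scores: dict[str, int]) -> int:
    """Sum the per-trait scores, each on the 1-5 scale the excerpt suggests."""
    for trait in TRAITS:
        if not 1 <= scores[trait] <= 5:
            raise ValueError(f"{trait}: score must be 1-5, got {scores[trait]}")
    return sum(scores[t] for t in TRAITS)

def pick_hire(candidates: dict[str, dict[str, int]]) -> str:
    """Hire the highest total -- the point is to resist overriding the ranking."""
    return max(candidates, key=lambda name: total_score(candidates[name]))

# Invented example: A wins 23 to 22, even if B "interviews better".
candidates = {
    "A": {"technical proficiency": 4, "engaging personality": 3, "reliability": 5,
          "communication": 4, "problem solving": 3, "teamwork": 4},
    "B": {"technical proficiency": 5, "engaging personality": 5, "reliability": 2,
          "communication": 3, "problem solving": 4, "teamwork": 3},
}
print(pick_hire(candidates))  # -> A
```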
Javier E

Wine-tasting: it's junk science | Life and style | The Observer - 0 views

  • Hodgson approached the organisers of the California State Fair wine competition, the oldest contest of its kind in North America, and proposed an experiment for their annual June tasting sessions. Each panel of four judges would be presented with their usual "flight" of samples to sniff, sip and slurp. But some wines would be presented to the panel three times, poured from the same bottle each time. The results would be compiled and analysed to see whether wine tasting really is scientific.
  • Results from the first four years of the experiment, published in the Journal of Wine Economics, showed a typical judge's scores varied by plus or minus four points over the three blind tastings. A wine deemed to be a good 90 would be rated as an acceptable 86 by the same judge minutes later and then an excellent 94.
  • Hodgson's findings have stunned the wine industry. Over the years he has shown again and again that even trained, professional palates are terrible at judging wine. "The results are disturbing," says Hodgson from the Fieldbrook Winery in Humboldt County, described by its owner as a rural paradise. "Only about 10% of judges are consistent and those judges who were consistent one year were ordinary the next year. Chance has a great deal to do with the awards that wines win."
  • French academic Frédéric Brochet tested the effect of labels in 2001. He presented the same Bordeaux superior wine to 57 volunteers a week apart and in two different bottles – one for a table wine, the other for a grand cru. The tasters were fooled. When tasting a supposedly superior wine, their language was more positive – describing it as complex, balanced, long and woody. When the same wine was presented as plonk, the critics were more likely to use negatives such as weak, light and flat.
  • In 2011 Professor Richard Wiseman, a psychologist (and former professional magician) at Hertfordshire University, invited 578 people to comment on a range of red and white wines, varying from £3.49 for a claret to £30 for champagne, all tasted blind. People could tell the difference between wines under £5 and those above £10 only 53% of the time for whites and only 47% of the time for reds. Overall they would have been just as successful flipping a coin to guess.
  • why are ordinary drinkers and the experts so poor at tasting blind? Part of the answer lies in the sheer complexity of wine. For a drink made by fermenting fruit juice, wine is a remarkably sophisticated chemical cocktail. Dr Bryce Rankine, an Australian wine scientist, identified 27 distinct organic acids in wine, 23 varieties of alcohol in addition to the common ethanol, more than 80 esters and aldehydes, 16 sugars, plus a long list of assorted vitamins and minerals that wouldn't look out of place on the ingredients list of a cereal pack. There are even harmless traces of lead and arsenic that come from the soil.
  • "People underestimate how clever the olfactory system is at detecting aromas and our brain is at interpreting them," says Hutchinson."The olfactory system has the complexity in terms of its protein receptors to detect all the different aromas, but the brain response isn't always up to it. But I'm a believer that everyone has the same equipment and it comes down to learning how to interpret it." Within eight tastings, most people can learn to detect and name a reasonable range of aromas in wine
  • People struggle with assessing wine because the brain's interpretation of aroma and bouquet is based on far more than the chemicals found in the drink. Temperature plays a big part. Volatiles in wine are more active when wine is warmer. Serve a New World chardonnay too cold and you'll only taste the overpowering oak. Serve a red too warm and the heady boozy qualities will be overpowering.
  • Colour affects our perceptions too. In 2001 Frédéric Brochet of the University of Bordeaux asked 54 wine experts to test two glasses of wine – one red, one white. Using the typical language of tasters, the panel described the red as "jammy" and commented on its crushed red fruit. The critics failed to spot that both wines were from the same bottle. The only difference was that one had been coloured red with a flavourless dye.
  • Other environmental factors play a role. A judge's palate is affected by what she or he had earlier, the time of day, their tiredness, their health – even the weather.
  • Robert Hodgson is determined to improve the quality of judging. He has developed a test that will determine whether a judge's assessment of a blind-tasted glass in a medal competition is better than chance. The research will be presented at a conference in Cape Town this year. But the early findings are not promising. "So far I've yet to find someone who passes," he says.
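Hodgson's triplicate-pour design suggests a simple consistency check: how far do a judge's repeat scores for the same wine stray from one another? The Python sketch below is one minimal reading of that idea, not Hodgson's published statistic; the 4-point tolerance echoes the plus-or-minus-four spread reported above, and the sample scores are invented.

```python
# A toy consistency check for the triplicate-pour experiment described above.
# A judge counts as "consistent" if all three scores for the same wine stay
# within a tolerance of their own mean; tolerance and data are invented.

from statistics import mean

def spread(repeat_scores: list[float]) -> float:
    """Largest absolute deviation from the judge's own mean score."""
    m = mean(repeat_scores)
    return max(abs(s - m) for s in repeat_scores)

def is_consistent(repeat_scores: list[float], tolerance: float = 4.0) -> bool:
    return spread(repeat_scores) <= tolerance

# Invented triplicate scores on the 100-point scale:
judges = {
    "judge_1": [86, 90, 94],  # exactly the +/-4 swing reported above
    "judge_2": [88, 89, 88],  # tightly consistent
    "judge_3": [77, 95, 82],  # all over the map
}
for name, scores in judges.items():
    verdict = "consistent" if is_consistent(scores) else "inconsistent"
    print(f"{name}: spread {spread(scores):.1f} -> {verdict}")
```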
Javier E

How Memory Works: Interview with Psychologist Daniel L. Schacter | History News Network - 2 views

  • knowledge from a scientific perspective of how human memory works can be instructive to historians.
  • Memory is much more than a simple retrieval system, as Dr. Schacter has demonstrated in his research. Rather, the nature of memory is constructive and influenced by a person’s current state as well as intervening emotions, beliefs, events and other factors since a recalled event.
  • Dr. Schacter is William R. Kenan, Jr. Professor of Psychology at Harvard University. His books include Searching for Memory: The Brain, The Mind, and The Past, and The Seven Sins of Memory: How the Mind Forgets and Remembers, both winners of the American Psychological Association’s William James Book Award, and Forgotten Ideas, Neglected Pioneers: Richard Semon and the Story of Memory. He also has written hundreds of articles on memory and related matters. He was elected a Fellow of the American Academy of Arts and Sciences in 1996 and the National Academy of Sciences in 2013.
  • that memory is not a video recorder [but that] it’s a constructive activity that is in many ways accurate but prone to interesting errors and distortions. It’s the constructive side of memory that is most relevant to historians.
  • Is it the case then that our memories constantly change every time we access them?
  • That certainly can happen depending on how you recount a memory. What you emphasize. What you exaggerate. What you don’t talk about. All of those things will shape and sculpt the memory for future use. Certainly the potential is there.
  • Research on memory shows that the more distant in time the event, the more prone to inaccuracy the memory. There are several experiments when subjects recorded impressions of an event soon afterward, then a year later and then a few years later, and the memory changed. Yes. It's not that the information is lost but, as the memory weakens, you become more prone to incorporating other kinds of information or mixing up elements of other events. This has been seen, for example, in the study of flashbulb memories. Where were you when Kennedy was shot? Where were you when you heard about 9/11?
  • Isn't there a tendency to add details or information that may make the story more convincing or interesting later? Yes. That's more a social function of memory. It may be that you draw on your general knowledge and probable information from your memory in a social context where there may be social demands that lead you to distort the memory.
  • What are the different memory systems?
  • What is the difference between working memory and permanent memory? Working memory is really a temporary memory buffer where you hold onto information, manipulate information, use it, and it's partly a gateway to long-term memory and also a buffer that you use when you're retrieving information from long-term memory, and that information temporarily resides in working memory, so to speak.
  • Your discussion of the testimony of White House Counsel John Dean about Watergate is illuminating. There was a perception that Dean had a photographic memory and he testified in rich detail about events. Yet later studies of White House tape recordings revealed that he was often inaccurate.
  • Because of all the detail with which he reported events, and his great confidence, he was perceived as something analogous to a human tape recorder. Yet psychologist Ulric Neisser did interesting work comparing what Dean said at the hearings with the available information from the White House taping system, and he found many significant discrepancies between what Dean remembered and what was actually said. Dean usually had the gist, the meaning and the overall significance right, but the exact details in his memory were often quite different from what was actually said.
  • That seems to get into the area of false memories and how they present problems in the legal system. We know from DNA exonerations of people wrongfully convicted of crimes that faulty eyewitness memory underlies a large majority of those cases — one of the more recent estimates is that among the first 250 DNA exonerations (as of 2011), roughly 70 to 75 percent of those individuals were convicted on the basis of faulty eyewitness memory.
  • One of the interesting recent lines of research that my lab has been involved in over the past few years has been looking at similarities between what goes on between the brain and mind when we remember past events on the one hand and imagine events that might occur in the future or might have occurred in the past. What we have found, particularly with brain scanning studies, is that you get very similar brain networks coming online when you remember past events and imagine future events, for example. Many of the same brain regions or network of structures come online, and this has helped us understand more why, for example, imagining events that might have occurred can be so harmful to memory accuracy because when you imagine, you’re recruiting many of the same brain regions as accessed when you actually remember. So it’s not surprising that some of these imagined events can actually turn into false memories under the right circumstances.
  • One reasonably well accepted distinction involves episodic memory, the memory for personal experience; semantic memory, the memory for general knowledge; and procedural memory, the memory for skills and unconscious forms of memory.Those are three of the major kinds of memory and they all have different neural substrates.
  • One of the points from that Ross Perot study is that his supporters often misremembered what they felt like at the time he reported he had dropped out of the race. The nature of that misremembering depended on their state at the time they were remembering, and what decisions they had made about Perot in the interim affected how they reconstructed their earlier memories. Again, that nicely makes the point that our current emotions and current appraisals of a situation can feed back into our reconstruction of the past and sometimes lead us to distort our memories so that they better support our current emotions and our current selves. We're often using memories to justify what we currently know, believe and feel.
  • memory doesn't work like a video camera or tape recorder. That is the main point. Our latest thinking on this is the idea that one of the major functions of memory is to support our ability to plan for the future, to imagine the future, and to use our past experiences in a flexible way to simulate different outcomes of events.
  • flexibility of memory is something that makes it useful to support this very important ability to run simulations of future events. But that very flexibility might be something that contributes to some of the memory distortion we talked about. That has been prominent in the last few years in my thinking about the constructive nature of memory.
  • The historian Daniel Aaron told his students "we remember what's important." What do you think of that comment? I think that generally holds true. Certainly, again, more important memories tend to be more significant, with more emotional arousal, and may elicit "deeper processing," as we call it in cognitive psychology.
jlessner

Straight Talk for White Men - NYTimes.com - 0 views

  • SUPERMARKET shoppers are more likely to buy French wine when French music is playing, and to buy German wine when they hear German music. That’s true even though only 14 percent of shoppers say they noticed the music, a study finds.
  • Researchers discovered that candidates for medical school interviewed on sunny days received much higher ratings than those interviewed on rainy days. Being interviewed on a rainy day was a setback equivalent to having an MCAT score 10 percent lower, according to a new book called “Everyday Bias,” by Howard J. Ross.
  • Those studies are a reminder that we humans are perhaps less rational than we would like to think, and more prone to the buffeting of unconscious influences. That’s something for those of us who are white men to reflect on when we’re accused of “privilege.”
  • When I wrote a series last year, “When Whites Just Don’t Get It,” the reaction from white men was often indignant: It’s an equal playing field now! Get off our case!
  • Yet the evidence is overwhelming that unconscious bias remains widespread in ways that systematically benefit both whites and men. So white men get a double dividend, a payoff from both racial and gender biases.
  • male professors are disproportionately likely to be described as a “star” or “genius.” Female professors are disproportionately described as “nasty,” “ugly,” “bossy” or “disorganized.”
  • When students were taking the class from someone they believed to be male, they rated the teacher more highly. The very same teacher, when believed to be female, was rated significantly lower.
  • The study found that a résumé with a name like Emily or Greg received 50 percent more callbacks than the same résumé with a name like Lakisha or Jamal. Having a white-sounding name was as beneficial as eight years’ work experience.
  • Then there was the study in which researchers asked professors to evaluate the summary of a supposed applicant for a post as laboratory manager, but, in some cases, the applicant was named John and in others Jennifer. Everything else was the same. "John" was rated an average of 4.0 on a 7-point scale for competence, "Jennifer" a 3.3. When asked to propose an annual starting salary for the applicant, the professors suggested on average a salary for "John" almost $4,000 higher than for "Jennifer."
  • While we don’t notice systematic unfairness, we do observe specific efforts to redress it — such as affirmative action, which often strikes white men as profoundly unjust. Thus a majority of white Americans surveyed in a 2011 study said that there is now more racism against whites than against blacks.
Emilio Ergueta

Searching For Santa | Issue 70 | Philosophy Now - 0 views

  • I brace myself against the freezing air and remind myself that I’m here on a mission – to try and find an answer to a question which causes massive conflict to this day. Debate about it has reached fever pitch in recent years, with schoolteachers even being fired for teaching belief in him.
  • Certainly not! In fact, science disproves the existence of Santa. We know he couldn’t possibly visit all those children in a single evening, because his sleigh would explode at those speeds! We also know that he couldn’t fit down the chimney…
  • Not at all. A lot of people assume that because you don’t believe in Santa you must not get any presents, but that just isn’t the case. I get lots of presents, and I enjoy buying presents for my friends.
  • Yes, I've come here looking for Father Christmas.
  • Elder Kringle and his community are self-described ‘Santa Fundamentalists’. They believe the Santa legend exactly the way it’s told. Now I’m going to be the first person ever to be granted an interview by this strange and reclusive community.
  • And so my first interview ended. I confess to finding the anti-Santa position somewhat unnerving, but it certainly addresses some very poignant questions. Next I decided to interview Reverend William Ronald, a believer and Santa apologist, to see if I could get the other side of the story.
  • Now I was more than a little apprehensive. It seemed that he wanted to take me out of the country that very night, that very moment even, to meet a community of True Believers. Normally when bearded strangers decked out in red and green with bells make this kind of offer, the alarm bells start jingling in my mind. But I was enthralled. I couldn’t resist the opportunity to get this new angle on my story, and so I consented…
  • Well Sam, there are a lot of misunderstandings out there. You see, not all Santa believers reject the theories of parents placing the gifts, or even claims that the toys are made by people in factories and bought in shops.
  • If other people won’t lead their children in the ways of Santa then we’ll need to do it for them. Also, we would close all the toy stores; people shouldn’t be allowed to choose what toys they have. It isn’t the place of mortals to ‘Play Santa’ with the universe.
  • If we don't need Santa in order to receive presents, then why believe in him at all? Wasn't it Voltaire who said: "As long as people believe in absurdities they will continue to commit atrocities"? Does belief in Santa open up unnecessary doors for extremists? Can't we just accept that sometimes we get crappy presents and just be grateful for getting any presents at all?
  • Maybe people only believe in Santa because it boosts their ego to think that their actions and lives are worthy of 24-hour observation. I don’t know, and I can’t claim to have all the answers. But my search for Santa has certainly given me some food for thought.
Javier E

Why Kids Sext - Hanna Rosin - The Atlantic - 0 views

  • Within an hour, the deputies realized just how common the sharing of nude pictures was at the school. “The boys kept telling us, ‘It’s nothing unusual. It happens all the time,’ ” Lowe recalls. Every time someone they were interviewing mentioned another kid who might have naked pictures on his or her phone, they had to call that kid in for an interview. After just a couple of days, the deputies had filled multiple evidence bins with phones, and they couldn’t see an end to it. Fears of a cabal got replaced by a more mundane concern: what to do with “hundreds of damned phones. I told the deputies, ‘We got to draw the line somewhere or we’re going to end up talking to every teenager in the damned county!’ ”
  • Nor did the problem stop at the county’s borders. Several boys, in an effort to convince Lowe that they hadn’t been doing anything rare or deviant, showed him that he could type the hashtag symbol (#) into Instagram followed by the name of pretty much any nearby county and then thots, and find a similar account.
  • In some he sensed low self-esteem—for example, the girl who’d sent her naked picture to a boy, unsolicited: “It just showed up! I guess she was hot after him?” A handful of senior girls became indignant during the course of the interview. “This is my life and my body and I can do whatever I want with it,” or, “I don’t see any problem with it. I’m proud of my body,” Lowe remembers them saying. A few, as far as he could tell, had taken pictures especially for the Instagram accounts and had actively tried to get them posted.
  • What seemed to mortify them most was having to talk about what they’d done with a “police officer outside their age group.”
  • Most of the girls on Instagram fell into the same category as Jasmine. They had sent a picture to their boyfriend, or to someone they wanted to be their boyfriend, and then he had sent it on to others. For the most part, they were embarrassed but not devastated, Lowe said. They felt betrayed, but few seemed all that surprised that their photos had been passed around.
  • Lowe’s team explained to both the kids pictured on Instagram and the ones with photos on their phones the serious legal consequences of their actions. Possessing or sending a nude photo of a minor—even if it’s a photo of yourself—can be prosecuted as a felony under state child-porn laws. He explained that 10 years down the road they might be looking for a job or trying to join the military, or sitting with their families at church, and the pictures could wash back up; someone who had the pictures might even try to blackmail them.
  • yet the kids seemed strikingly blasé. “They’re just sitting there thinking, Wah, wah, wah,” Lowe said, turning his hands into flapping lips. “It’s not sinking in. Remember at that age, you think you’re invincible, and you’re going to do whatever the hell you want to do? We just couldn’t get them past that.”
  • while adults send naked pictures too, of course, the speed with which teens have incorporated the practice into their mating rituals has taken society by surprise.
Javier E

Documenting Sports With Tech, or It Didn't Happen - The New York Times - 0 views

  • The real-life issues now so embedded with the sports world — like debates over racial injustice, brain damage, the ethics of college sports and cheating at the Olympics, plus 100 other things — cannot be parsed to 140 characters.
  • Twitter has turned a lot of sports reporting into play-by-play, hot takes and snarky one-liners. With retweets and replies, the echo can be deafening.
  • The biggest transformation has been the use of social media, and Twitter is the opium of the sports-reporting masses
  • I’m learning I can have nothing but an iPhone and I’m fine.
  • The game changer was the smartphone. It's not only my office phone. I can also use it to record interviews (its microphone is better than the one in my old Olympus, which is important in crowded, noisy places), take pictures and videos to help me remember the details of what I see, and even type or speak notes and interview answers into emails that I send myself.
  • I use it the way other people use their phones. I email, text, tweet, post to Instagram, get directions, set timers and alarms, change flights, check weather, update my calendar, map my jogs, and listen to podcasts and Spotify during long drives or plane rides. On assignment, I’ve had entire conversations with Google Translate, two of us passing my phone back and forth.
  • Besides being an all-in-one communication tool, the iPhone helps my writing. I take photographs of places I know I’ll want to describe in detail later — the inside of someone’s home, a rocky mountain summit, a piece of jewelry that a subject is wearing, the shape of the clouds and the color of the sky. I take videos of places, too, and narrate them as I shoot so that I can watch and listen later.
  • I often do stories overseas, and for the last couple of years, I have constantly connected with sources, interview subjects and my own family on my phone through WhatsApp, a brilliant messaging service that seems to be well known everywhere except the United States.
  • I use it to text, but also to trade photographs, short videos and voice messages, instantly. And you can call from it, even use it for face-to-face video conversations, free if you’re on Wi-Fi.
  • More than anything, technology has brought the sports world into the “now.”
  • Now we can see almost any game on television, in a dozen sports from anywhere in the world, with a computer on our laps and a phone in our hands. We receive and give instant analysis through the world of social media. We can track statistics for our fantasy teams. We can tweet nasty messages to famous athletes and coaches who disappoint us. Like so many other parts of society, we’re probably watching sports more physically alone than ever, but more connected in other ways.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
Javier E

Ian Hacking, Eminent Philosopher of Science and Much Else, Dies at 87 - The New York Times - 0 views

  • In an academic career that included more than two decades as a professor in the philosophy department of the University of Toronto, following appointments at Cambridge and Stanford, Professor Hacking’s intellectual scope seemed to know no bounds. Because of his ability to span multiple academic fields, he was often described as a bridge builder.
  • “Ian Hacking was a one-person interdisciplinary department all by himself,” Cheryl Misak, a philosophy professor at the University of Toronto, said in a phone interview. “Anthropologists, sociologists, historians and psychologists, as well as those working on probability theory and physics, took him to have important insights for their disciplines.”
  • Professor Hacking wrote several landmark works on the philosophy and history of probability, including “The Taming of Chance” (1990), which was named one of the best 100 nonfiction books of the 20th century by the Modern Library.
  • ...17 more annotations...
  • “I have long been interested in classifications of people, in how they affect the people classified, and how the effects on the people in turn change the classifications,” he wrote in “Making Up People.”
  • His work in the philosophy of science was groundbreaking: He departed from the preoccupation with questions that had long concerned philosophers. Arguing that science was just as much about intervention as it was about representation, he helped bring experimentation to center stage.
  • Regarding one such question — whether unseen phenomena like quarks and electrons were real or merely the theoretical constructs of physicists — he argued for reality in the case of phenomena that figured in experiments, citing as an example an experiment at Stanford that involved spraying electrons and positrons into a ball of niobium to detect electric charges. “So far as I am concerned,” he wrote, “if you can spray them, they’re real.”
  • His book “The Emergence of Probability” (1975), which is said to have inspired hundreds of books by other scholars, examined how concepts of statistical probability have evolved over time, shaping the way we understand not just arcane fields like quantum physics but also everyday life.
  • “I was trying to understand what happened a few hundred years ago that made it possible for our world to be dominated by probabilities,” he said in a 2012 interview with the journal Public Culture. “We now live in a universe of chance, and everything we do — health, sports, sex, molecules, the climate — takes place within a discourse of probabilities.”
  • Whatever the subject, whatever the audience, one idea that pervades all his work is that “science is a human enterprise,” Ragnar Fjelland and Roger Strand of the University of Bergen in Norway wrote when Professor Hacking won the Holberg Prize. “It is always created in a historical situation, and to understand why present science is as it is, it is not sufficient to know that it is ‘true,’ or confirmed. We have to know the historical context of its emergence.”
  • Hacking often argued that as the human sciences have evolved, they have created categories of people, and that people have subsequently defined themselves as falling into those categories. Thus does human reality become socially constructed.
  • In 2000, he became the first Anglophone to win a permanent position at the Collège de France, where he held the chair in the philosophy and history of scientific concepts until he retired in 2006.
  • “I call this the ‘looping effect,’” he added. “Sometimes, our sciences create kinds of people that in a certain sense did not exist before.”
  • In “Why Race Still Matters,” a 2005 article in the journal Daedalus, he explored how anthropologists developed racial categories by extrapolating from superficial physical characteristics, with lasting effects — including racial oppression. “Classification and judgment are seldom separable,” he wrote. “Racial classification is evaluation.”
  • Similarly, he once wrote, in the field of mental health the word “normal” “uses a power as old as Aristotle to bridge the fact/value distinction, whispering in your ear that what is normal is also right.”
  • In his influential writings about autism, Professor Hacking charted the evolution of the diagnosis and its profound effects on those diagnosed, which in turn broadened the definition to include a greater number of people.
  • Encouraging children with autism to think of themselves that way “can separate the child from ‘normalcy’ in a way that is not appropriate,” he told Public Culture. “By all means encourage the oddities. By no means criticize the oddities.”
  • His emphasis on historical context also illuminated what he called transient mental illnesses, which appear to be so confined to their time that they can vanish when times change.
  • “hysterical fugue” was a short-lived epidemic of compulsive wandering that emerged in Europe in the 1880s, largely among middle-class men who had become transfixed by stories of exotic locales and the lure of travel.
  • His intellectual tendencies were unmistakable from an early age. “When he was 3 or 4 years old, he would sit and read the dictionary,” Jane Hacking said. “His parents were completely baffled.”
  • He wondered aloud, the interviewer noted, if the whole universe was governed by nonlocality — if “everything in the universe is aware of everything else.” “That’s what you should be writing about,” he said. “Not me. I’m a dilettante. My governing word is ‘curiosity.’”
Javier E

Dispute Within Art Critics Group Over Diversity Reveals a Widening Rift - The New York ... - 0 views

  • Amussen, 33, is the editor of Burnaway, which focuses on criticism in the American South and often features young Black artists. (The magazine started in 2008 in response to layoffs at the Atlanta Journal-Constitution’s culture section and now runs as a nonprofit with four full-time employees and a budget that mostly consists of grants.)
  • Efforts to revive AICA-USA are continuing. In January, Jasmine Amussen joined the organization’s board to help rethink the meaning of criticism for a younger generation.
  • The organization has yearly dues of $115 and provides free access to many museums. But some members complained that the fee was too expensive for young critics, yet not enough to support significant programming.
  • ...12 more annotations...
  • “It just came down to not having enough money,” said Terence Trouillot, a senior editor at Frieze, a contemporary art magazine. He spent nearly three years on the AICA-USA board, resigning in 2022. He said that initiatives to re-energize the group “were just moving too slowly.”
  • According to Lilly Wei, a longtime AICA-USA board member who recently resigned, the group explored different ways of protecting writers in the industry. There were unrealized plans of turning the organization into a union; others hoped to create a permanent emergency fund to keep financially struggling critics afloat. She said the organization has instead canceled initiatives, including an awards program for the best exhibitions across the country.
  • Large galleries — including Gagosian, Hauser & Wirth, and Pace Gallery — now produce their own publications with interviews and articles sometimes written by the same freelance critics who simultaneously moonlight as curators and marketers. Within its membership, AICA-USA has a number of writers who belong to all three categories.
  • “It’s crazy that the ideal job nowadays is producing catalog essays for galleries, which are basically just sales pitches,” Dillon said in a phone interview. “Critical thinking about art is not valued financially.”
  • Noah Dillon, who was on the AICA-USA board until he resigned last year, has been reluctant to recommend that anyone follow his path to become a critic. Not that they could. The graduate program in art writing that he attended at the School of Visual Arts in Manhattan also closed during the pandemic.
  • David Velasco, editor in chief of Artforum, said in an interview that he hoped the magazine’s acquisition would improve the publication’s financial picture. The magazine runs nearly 700 reviews a year, Velasco said; about half of those run online and pay $50 for roughly 250 words. “Nobody I know who knows about art does it for the money,” Velasco said, “but I would love to arrive at a point where people could.”
  • While most editors recognize the importance of criticism in helping readers decipher contemporary art, and the multibillion-dollar industry it has created, venues for such writing are shrinking. Over the years, newspapers including The Philadelphia Inquirer and The Miami Herald have trimmed critics’ jobs.
  • In December, the Penske Media Corporation announced that it had acquired Artforum, a contemporary art journal, and was bringing the title under the same ownership as its two competitors, ARTnews and Art in America. Its sister publication, Bookforum, was not acquired and ceased operations. Through the pandemic, other outlets have shuttered, including popular blogs run by SFMOMA and the Walker Art Center in Minneapolis as well as smaller magazines called Astra and Elephant.
  • The need for change in museums was pointed out in the 2022 Burns Halperin Report, published by Artnet News in December, that analyzed more than a decade of data from over 30 cultural institutions. It found that just 11 percent of acquisitions at U.S. museums were by female artists and only 2.2 percent were by Black American artists
  • (National newspapers with art critics on staff include The New York Times, The Los Angeles Times, The Boston Globe and The Washington Post.)
  • Julia Halperin, one of the study’s organizers, who recently left her position as Artnet’s executive editor, said that the industry has an asymmetric approach to diversity. “The pool of artists is diversifying somewhat, but the pool of staff critics has not,” she said.
  • the matter of diversity in criticism is compounded by the fact that opportunities for all critics have been diminished.
Javier E

Grand Old Planet - NYTimes.com - 1 views

  • Mr. Rubio was asked how old the earth is. After declaring “I’m not a scientist, man,” the senator went into desperate evasive action, ending with the declaration that “it’s one of the great mysteries.”
  • Reading Mr. Rubio’s interview is like driving through a deeply eroded canyon; all at once, you can clearly see what lies below the superficial landscape. Like striated rock beds that speak of deep time, his inability to acknowledge scientific evidence speaks of the anti-rational mind-set that has taken over his political party.
  • that question didn’t come out of the blue. As speaker of the Florida House of Representatives, Mr. Rubio provided powerful aid to creationists trying to water down science education. In one interview, he compared the teaching of evolution to Communist indoctrination tactics — although he graciously added that “I’m not equating the evolution people with Fidel Castro.”
  • ...5 more annotations...
  • What was Mr. Rubio’s complaint about science teaching? That it might undermine children’s faith in what their parents told them to believe.
  • What accounts for this pattern of denial? Earlier this year, the science writer Chris Mooney published “The Republican Brain,” which was not, as you might think, a partisan screed. It was, instead, a survey of the now-extensive research linking political views to personality types. As Mr. Mooney showed, modern American conservatism is highly correlated with authoritarian inclinations — and authoritarians are strongly inclined to reject any evidence contradicting their prior beliefs
  • it’s not symmetric. Liberals, being human, often give in to wishful thinking — but not in the same systematic, all-encompassing way.
  • We are, after all, living in an era when science plays a crucial economic role. How are we going to search effectively for natural resources if schools trying to teach modern geology must give equal time to claims that the world is only 6,000 years old? How are we going to stay competitive in biotechnology if biology classes avoid any material that might offend creationists?
  • then there’s the matter of using evidence to shape economic policy. You may have read about the recent study from the Congressional Research Service finding no empirical support for the dogma that cutting taxes on the wealthy leads to higher economic growth. How did Republicans respond? By suppressing the report. On economics, as in hard science, modern conservatives don’t want to hear anything challenging their preconceptions — and they don’t want anyone else to hear about it, either.
Emily Horwitz

More Young People Are Moving Away From Religion, But Why? : NPR - 0 views

  • One-fifth of Americans are religiously unaffiliated — higher than at any time in recent U.S. history — and those younger than 30 especially seem to be drifting from organized religion. A third of young Americans say they don't belong to any religion.
  • raised Jewish and considers herself Jewish with an "agnostic bent." She loves going to synagogue.
  • "I realize maybe there's a disconnect there — why are you doing it if you don't necessarily have a belief in God? But I think there's a cultural aspect, there's a spiritual aspect, I suppose. I find the practice of sitting and being quiet and being alone with your thoughts to be helpful, but I don't think I need to answer that question [about God] in order to participate in the traditions I was brought up with."
  • ...8 more annotations...
  • Yusuf Ahmad, 33, raised Muslim, is now an atheist.
  • The thing for me — a large part of the reason I moved away from Catholicism was because without accepting a lot of these core beliefs, I just didn't think that I could still be part of that community.
  • "It's a little troublesome now when people ask me. I tell them and they go, 'Oh, you're a Christian,' and I try to skirt the issue now. They go, 'What does that mean?' and it's like, "It's Latin for 'I made a mistake when I was 18.'
  • I don't [believe in God] but I really want to. That's the problem with questions like these: you don't have anything that clearly states, 'Yes, this is fact,' so I'm constantly struggling.
  • I remember growing up, in like fifth [or] sixth grade I'd hear these stories and be like, 'That's crazy! Why would this guy do this? Just because he heard a voice in his head, he went to sacrifice his son and it turned into a goat?' There's no way that this happened. I wasn't buying it.
  • I remember a theology test in eighth grade where there was a question about homosexuality, and the right answer was that if you are homosexual, then that is not a sin because that's how God made you, but acting upon it would be a sin. That's what I put down as the answer, but I vividly remember thinking to myself that that was not the right answer."
  • We didn't have a lot of money, the household wasn't very stable a lot of the time, so when something bad would happen, say a prayer, go to church. When my mom got cancer the first time, it was something that was useful at the time for me as a coping mechanism.
  • So at some point you start to say, 'Why does all this stuff happen to people?' And if I pray and nothing good happens, is that supposed to be I'm being tried? I find that almost kind of cruel in some ways. It's like burning ants with a magnifying glass. Eventually that gets just too hard to believe anymore."
  •  
    An interesting interview with young adults of many different faiths about why they have lost their beliefs in God. Interestingly enough, not a lot of these were about science, as I had initially expected upon clicking on the link to the article.
Javier E

I Love You: An Interview with Dominique Ovalle : The Other Journal - 0 views

  • There is a tendency for some people to sneer at beauty or to revile it, because it is so attractive and magnetic. That makes it untrustworthy to fearful people. If people have been let down before—by life or the actions of others—there may be a tendency to mistrust things that appear to be good
  • It is hard to swallow that some things are good, beautiful, and true. Hans Urs von Balthasar said, “We can be sure that whoever sneers at [beauty’s] name as if she were the ornament of a bourgeois past—whether he admits it or not—can no longer pray and soon will no longer be able to love.”
  • When people do encounter something pure and beautiful, they have an opportunity to accept it, to believe it. That is the pivotal moment: when art meets life, when it meets reality, when it meets you and me. That’s where the conversation is.
sissij

A Scar on the Chinese Soul - The New York Times - 1 views

  • It “is that the Chinese, without direct orders, were so cruel to each other.”
  • The blurry distinction between perpetrators and victims makes collective healing by confronting the past a thorny project.
  • neither joining the Red Guards nor believing in Maoism protected someone from suffering long-term trauma.
  • ...3 more annotations...
  • Cultural Revolution trauma differs from that related to other horrific events, like the Holocaust and the Rwandan genocide, studies have noted, in part because in China, people were persecuted not for “unalterable” characteristics such as ethnicity and race, but for having the wrong frame of mind.
  • The idea that life experiences could cause inheritable genetic changes has been identified among children of Holocaust survivors, who have been shown to have an increased likelihood of stress-related illnesses.
  • The possibility of epigenetic inheritance has been raised by Chinese academics regarding the Cultural Revolution, but to research the topic would most certainly invite state punishment.
    • sissij
       
      I think this article is a bit exaggerated. Nobody is perfect, not even the Mao who saved China from the hands of the Japanese. The title "A Scar on the Chinese Soul" is just too heavy. It shows only one perspective of the event and can be misleading to the general population of America. From a media course I took during the winter break, I learned that cultural elements inevitably influence the media. For America, the key words are "freedom", "supremacy", "heroism", "democracy"... I can see those elements influencing the view of the author in this article. And he is not the only one. There is a book called "Street of Eternal Happiness" that also dwells on Chinese suffering. There is confirmation bias in whom the author interviews and what data he uses in support of his argument. --Sissi (1/19/2017)
silveiragu

Noam Chomsky Calls Postmodern Critiques of Science Over-Inflated "Polysyllabic Truisms"... - 0 views

  • we recently featured an interview in which Noam Chomsky slams postmodernist intellectuals like Slavoj Zizek and Jacques Lacan as “charlatans” and posers.
  • The turn against postmodernism has been long in coming,
  • Chomsky characterizes leftist postmodern academics as “a category of intellectuals who are undoubtedly perfectly sincere”
  • ...4 more annotations...
  • in his critique, such thinkers use “polysyllabic words and complicated constructions” to make claims that are “all very inflated” and which have “a terrible effect on the third world.”
  • It’s considered very left wing, very advanced. Some of what appears in it sort of actually makes sense, but when you reproduce it in monosyllables, it turns out to be truisms. It’s perfectly true that when you look at scientists in the West, they’re mostly men, it’s perfectly true that women have had a hard time breaking into the scientific fields, and it’s perfectly true that there are institutional factors determining how science proceeds that reflect power structures.
  • you don’t get to be a respected intellectual by presenting truisms in monosyllables.
  • Chomsky’s cranky contrarianism is nothing new, and some of his polemic recalls the analytic case against “continental” philosophy or Karl Popper’s case against pseudo-science, although his investment is political as much as philosophical.
  •  
    An interesting synopsis and analysis, linked to a relatively short interview with a great thinker.