
TOK Friends: Group items tagged "photo"


Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • ...52 more annotations...
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now.”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention.) A toy sketch of this kind of color-coded scoring appears after these annotations.
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore.”
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk. A minimal curve-fitting sketch of this inverted-U relationship appears after these annotations.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges. A toy sketch of this kind of multi-signal scoring appears after these annotations.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history.
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
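
A rough illustration of the color-coded screening described in the Xerox annotation above. Everything here is hypothetical: the feature names, weights, penalty, and cutoffs are invented for illustration, since the article does not disclose the actual model.

```python
# Toy sketch of a red/yellow/green applicant screen. All features,
# weights, and thresholds are invented; the real model is proprietary.

def screen_applicant(answers: dict) -> str:
    """Combine weighted assessment scores (each 0-100) into a color rating."""
    weights = {
        "personality": 0.35,
        "cognitive_skill": 0.35,
        "scenario_judgment": 0.30,
    }
    composite = sum(weights[k] * answers[k] for k in weights)

    # Example of a non-monotonic rule like the ones the article hints at
    # ("at least one but not more than four social networks"):
    if not (1 <= answers.get("social_networks", 0) <= 4):
        composite -= 10  # purely illustrative penalty

    if composite >= 75:
        return "green"   # hire away
    if composite >= 50:
        return "yellow"  # middling
    return "red"         # poor candidate

print(screen_applicant({
    "personality": 80, "cognitive_skill": 70,
    "scenario_judgment": 85, "social_networks": 2,
}))  # -> green
```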
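A minimal sketch of the inverted-U relationship Pentland reports, where team performance rises and then falls with the number of face-to-face exchanges. The data points below are fabricated; only the quadratic shape of the fit reflects the claim.

```python
# Toy inverted-U fit: team performance vs. face-to-face exchange counts.
# All numbers are made up for illustration.
import numpy as np

exchanges = np.array([2, 5, 8, 12, 16, 20, 25, 30])       # per member per day
performance = np.array([40, 60, 75, 85, 82, 70, 55, 42])  # arbitrary score

# A quadratic captures "too many is as much of a problem as too few".
a, b, c = np.polyfit(exchanges, performance, deg=2)
optimum = -b / (2 * a)  # vertex of the parabola: the sweet spot

print(f"fitted curve: {a:.2f}x^2 + {b:.2f}x + {c:.2f}")
print(f"estimated optimal exchange count: {optimum:.1f} per day")
```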
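The Gild scoring described above amounts to folding many noisy signals about a programmer into one number. A minimal sketch under invented feature names and weights (Gild's actual feature set and model are proprietary):

```python
# Toy multi-signal coder score. Every feature name and weight below is
# hypothetical; the sketch only shows the general shape of the idea.

def coder_score(signals: dict) -> float:
    """Weighted sum of signals about a programmer, each normalized to 0-1."""
    weights = {
        "code_simplicity": 0.25,     # static analysis of open-source code
        "adoption_by_others": 0.30,  # how often other coders reuse the code
        "forum_answer_votes": 0.20,  # e.g., upvotes on Q&A answers
        "answer_breadth": 0.10,      # how widely the advice ranges
        "language_signals": 0.15,    # phrases correlated with strong coders
    }
    return round(sum(w * signals.get(k, 0.0) for k, w in weights.items()), 3)

print(coder_score({
    "code_simplicity": 0.8, "adoption_by_others": 0.6,
    "forum_answer_votes": 0.9, "answer_breadth": 0.5, "language_signals": 0.7,
}))  # -> 0.715
```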
Javier E

I asked Tinder for my data. It sent me 800 pages of my deepest, darkest secrets | Techn... - 0 views

  • I emailed Tinder requesting my personal data and got back way more than I bargained for. Some 800 pages came back containing information such as my Facebook “likes”, my photos from Instagram (even after I deleted the associated account), my education, the age-rank of men I was interested in, how many times I connected, when and where every online conversation with every single one of my matches happened … the list goes on.
  • “You are lured into giving away all this information,” says Luke Stark, a digital technology sociologist at Dartmouth University. “Apps such as Tinder are taking advantage of a simple emotional phenomenon; we can’t feel data. This is why seeing everything printed strikes you. We are physical creatures. We need materiality.”
  • What will happen if this treasure trove of data gets hacked, is made public or simply bought by another company? I can almost feel the shame I would experience. The thought that, before sending me these 800 pages, someone at Tinder might have read them already makes me cringe.
  • ...3 more annotations...
  • In May, an algorithm was used to scrape 40,000 profile images from the platform in order to build an AI to “genderise” faces. A few months earlier, 70,000 profiles from OkCupid (owned by Tinder’s parent company Match Group) were made public by a Danish researcher some commentators have labelled a “white supremacist”, who used the data to try to establish a link between intelligence and religious beliefs. The data is still out there.
  • The trouble is these 800 pages of my most intimate data are actually just the tip of the iceberg. “Your personal data affects who you see first on Tinder, yes,” says Dehaye. “But also what job offers you have access to on LinkedIn, how much you will pay for insuring your car, which ad you will see in the tube and if you can subscribe to a loan.” “We are leaning towards a more and more opaque society, towards an even more intangible world where data collected about you will decide even larger facets of your life. Eventually, your whole existence will be affected.”
  • As a typical millennial constantly glued to my phone, my virtual life has fully merged with my real life. There is no difference any more. Tinder is how I meet people, so this is my reality. It is a reality that is constantly being shaped by others – but good luck trying to find out how.
Javier E

I Sent All My Text Messages in Calligraphy for a Week - Cristina Vanko - The Atlantic - 2 views

  • I decided to blend a newfound interest in calligraphy with my lifelong passion for written correspondence to create a new kind of text messaging. The idea: I wanted to message friends using calligraphic texts for one week. The average 18-to-24-year-old sends and gets something like 4,000 messages a month, which includes sending more than 500 texts a week, according to Experian. The week of my experiment, I only sent 100
  • We are a youth culture that heavily relies on emojis. I didn’t realize how much I depend on emojis and emoticons to express myself until I didn’t have them. Hand-drawn emoticons, though original, just aren’t the same: I couldn’t match the cleanliness of a typeface. Sketching emojis is too time-consuming. To bridge the gap between time and the need for graphic imagery, I sent out selfies on special occasions when my facial expression spoke louder than words.
  • That week, the sense of urgency I normally felt about my phone virtually vanished. It was like the days when texts were rationed and I felt no anxiety about viewing "read" receipts. I didn’t feel naked without having my phone on me every moment.
  • ...10 more annotations...
  • So while the experiment began as an exercise to learn calligraphy, it doubled as a useful sort of digital detox that revealed my relationship with technology. Here's what I learned:
  • Receiving handwritten messages made people feel special. The awesome feeling of receiving personalized mail really can be replicated with a handwritten text.
  • Handwriting allows for more self-expression. I found I could give words a certain flourish to mimic the intonation of spoken language. Expressing myself via handwriting could also give the illusion of real-time presence. One friend told me, “it’s like you’re here with us!”
  • Before I started, I established rules for myself: I could create only handwritten text messages for seven days, absolutely no using my phone’s keyboard. I had to write out my messages on paper, photograph them, then hit “send.” I didn’t reveal my plan to my friends unless asked
  • Sometimes you don't need to respond. Most conversations aren’t life or death situations, so it was refreshing to feel 100 percent present in all interactions. I didn’t interrupt conversations by checking social media or shooting text messages to friends. I was more in tune with my surroundings. On transit, I took part in people watching—which, yes, meant mostly watching people staring at their phones. I smiled more at passersby while walking since I didn’t feel the need to avoid human interaction by staring at my phone.
  • A phone isn't only a texting device. As I texted less, I used my phone less frequently—mostly because I didn’t feel the need to look at it to keep me busy, nor did I want to feel guilty for utilizing the keyboard through other applications. I still took photos, streamed music, and logged workouts since I felt okay with pressing buttons for selection purposes
  • People don’t expect to receive phone calls anymore. Texting brings about a less intimidating, more convenient experience. But it wasn't that long ago when real-time voice calls were the norm. It's clear to me that, these days, people prefer to be warned about an upcoming phone call before it comes in.
  • Having a pen and paper is handy at all times. Writing out responses is a great reminder to slow down and use your hands. While all keys on a keyboard feel the same, it’s difficult to replicate the tactile activity of tracing a letter’s shape
  • My sent messages were more thoughtful.
  • I was more careful with grammar and spelling. People often ignore the rules of grammar and spelling just to maintain the pace of texting conversation. But because a typical calligraphic text took minutes to craft, I had time to make sure I got things right. The usual texting acronyms and misspellings look absurd when texted with type, but they'd be especially ridiculous written by hand.
aprossi

Kevin Seefried, seen carrying Confederate flag inside Capitol during riot, arrested - CNN - 0 views

  • Man carrying Confederate flag inside the US Capitol during riot arrested, identified as Kevin Seefried
  • The FBI has arrested Kevin Seefried, seen carrying a Confederate flag inside the US Capitol, according to a federal criminal complaint.
  • Seefried had been a focus of the FBI's efforts to get the public to help them identify riot participants. The complaint identifies him as the man seen in the photos, widely circulated online, carrying a large Confederate flag inside the US Capitol during the January 6 siege.
  • ...3 more annotations...
  • Kevin Seefried told the FBI he had brought the Confederate flag with him to Washington from his home in Delaware, where he normally displays it outside.
  • Seefried was charged with knowingly entering or remaining in any restricted building or grounds without lawful authority, and violent entry and disorderly conduct on Capitol grounds.
  • Some people who stormed the Capitol have already come forward or have been identified by CNN and other news organizations. Many face criminal charges, and some have lost or left their jobs because of their participation.
cvanderloo

In Texas, price gouging during disasters is illegal - it is also on very shaky ethical ... - 1 views

  • In Houston, as millions suffered power and water outages, food shortages and subfreezing temperatures, another problem confronted families: price hikes. Steep increases in the price of food, gas and fuel have been reported across Texas. And as millions of Texans lost power, exorbitant prices were being asked for hotel rooms with power, with some climbing to US$1,000 a night.
  • Disaster creates a scarcity of basic necessities; retailers and providers respond by sharply raising the price tags on sought-after commodities.
  • Contrarian voices argue that price hikes are good – they provide incentives for sellers to bring extra supplies and prevent hoarding.
  • ...7 more annotations...
  • Whether price gouging helps bring more supply to disaster victims is speculative, but a surer outcome is that it will disproportionately burden the worst-off.
  • that is, an obligation to help others in danger when doing so entails only a small cost to yourself.
    • cvanderloo
       
      duty of easy rescue
  • Picture a hiker lost in the woods suffering serious dehydration. A second hiker walks by and offers to sell her his extra water, but for a large sum. This violates the duty of easy rescue because it risks failing to save someone who can easily be saved, so long as the second hiker does not need the water himself.
  • Rescuing someone with little risk or cost to yourself is a moral duty, not a duty enforced by law in the U.S. So, some people might ask, why should it be enforced on would-be price gougers?
  • We are all better off when we cooperate to provide services that at some point we all may need.
    • cvanderloo
       
      social contract theory
  • This extends to rescue services such as firefighters, paramedics and first responders. But when life-threatening conditions arise from lack of food, water, shelter and power, this burden of rescue can be delegated to sellers of necessities and providers of utilities. At the least, society requires that they not raise prices and turn away those who cannot pay.
  • But actual inequality provides a reason to enforce laws against price gouging. When prices rise, the worst-off suffer the most.
katedriscoll

Cognitive Bias: Understanding How It Affects Your Decisions - 0 views

  • A cognitive bias is a flaw in your reasoning that leads you to misinterpret information from the world around you and to come to an inaccurate conclusion. Because you are flooded with information from millions of sources throughout the day, your brain develops ranking systems to decide which information deserves your attention and which information is important enough to store in memory. It also creates shortcuts meant to cut down on the time it takes for you to process information. The problem is that the shortcuts and ranking systems aren’t always perfectly objective because their architecture is uniquely adapted to your life experiences
  • Anchoring bias is the tendency to rely heavily on the first information you learn when you are evaluating something. In other words, what you learn early in an investigation often has a greater impact on your judgment than information you learn later. In one study, for example, researchers gave two groups of study participants some written background information about a person in a photograph. Then they asked them to describe how they thought the people in the photos were feeling. People who read more negative background information tended to infer more negative feelings, and people who read positive background information tended to infer more positive feelings. Their first impressions heavily influenced their ability to infer emotions in others.
  • Another common bias is the tendency to give greater credence to ideas that come to mind easily. If you can immediately think of several facts that support a judgment, you may be inclined to think that judgment is correct. For example, if a person sees multiple headlines about shark attacks in a coastal area, that person might form a belief that the risk of shark attacks is higher than it is. The American Psychological Association points out that when information is readily available around you, you’re more likely to remember it. Information that is easy to access in your memory seems more reliable.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived. A stylized sketch of these click-paid economics appears after these annotations.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists.
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight.”
caelengrubb

Anxiety, loneliness and Fear of Missing Out: The impact of social media on young people... - 0 views

  • By 2021, it is forecast that there will be around 3 billion active monthly users of social media. From the statistics alone, it’s clear that social media has become an integral (and to a large extent, unavoidable) part of our lives.
  • One implication of social media’s rapid rise, that of its relationship with young people’s mental health, has gathered a significant amount of attention in recent years.
  • So-called ‘social media addiction’ has been referred to by a wide variety of studies and experiments. It is thought that addiction to social media affects around 5% of young people, and was recently described as potentially more addictive than alcohol and cigarettes
  • ...8 more annotations...
  • The ‘urge’ to check one’s social media may be linked to both instant gratification (the need to experience fast, short term pleasure) and dopamine production (the chemical in the brain associated with reward and pleasure).
  • The popular concept of Fear of Missing Out (FOMO) refers to ‘a pervasive apprehension that others might be having rewarding experiences from which one is absent’ and is ‘characterised by the desire to stay continually connected with what others are doing’.
  • Data from qualitative studies has shown that using social media compulsively can damage sleeping patterns, having an adverse effect on young people’s performance in school
  • Researchers at the University of Glasgow found that young people had difficulty relaxing following night-time social media use, reducing their brain’s ability to prepare for sleep. Sleep loss works in a vicious cycle of reinforcement with mental health: loss of sleep due to night-time social media use can lead to poorer mental health, and poor mental health can lead to intense night-time use and sleep loss.
  • What is dangerous about this compulsive use is that, if gratification is not experienced, users may internalise beliefs that this is due to being ‘unpopular’, ‘unfunny’ etc. A lack of ‘likes’ on a status update may cause negative self-reflection, prompting continual ‘refreshing’ of the page in the hope of seeing that another person has ‘enjoyed’ the post, thus helping to achieve personal validation.
  • FOMO has been linked to intensive social media use and is associated with lower mood and life satisfaction.
  • Social media has been linked to poor self-esteem and self-image through the advent of image manipulation on photo sharing platforms. In particular, the notion of the ‘idealised body image’ has arguably been detrimental to self-esteem and image, especially that of young women. The 24/7 circulation of easily viewable manipulated images promotes and entrenches unrealistic expectations of how young people should look and behave.
  • The evidence suggests that social media use is strongly associated with anxiety, loneliness and depression
caelengrubb

Social Media Effects on Teens | Impact of Social Media on Self-Esteem - 0 views

  • In fact, experts worry that the social media and text messages that have become so integral to teenage life are promoting anxiety and lowering self-esteem.
  • A survey conducted by the Royal Society for Public Health asked 14-24 year olds in the UK how social media platforms impacted their health and wellbeing. The survey results found that Snapchat, Facebook, Twitter and Instagram all led to increased feelings of depression, anxiety, poor body image and loneliness.
  • For one thing, modern teens are learning to do most of their communication while looking at a screen, not another person.
  • ...7 more annotations...
  • Learning how to make friends is a major part of growing up, and friendship requires a certain amount of risk-taking. This is true for making a new friend, but it’s also true for maintaining friendships. When there are problems that need to be faced—big ones or small ones—it takes courage to be honest about your feelings and then hear what the other person has to say
  • But when friendship is conducted online and through texts, kids are doing this in a context stripped of many of the most personal—and sometimes intimidating—aspects of communication
  • The other big danger that comes from kids communicating more indirectly is that it has gotten easier to be cruel. “Kids text all sorts of things that you would never in a million years contemplate saying to anyone’s face,”
  • “Girls are socialized more to compare themselves to other people, girls in particular, to develop their identities, so it makes them more vulnerable to the downside of all this.” She warns that a lack of solid self-esteem is often to blame. “We forget that relational aggression comes from insecurity and feeling awful about yourself, and wanting to put other people down so you feel better.”
  • Teenage girls sort through hundreds of photos, agonizing over which ones to post online. Boys compete for attention by trying to out-gross one another, pushing the envelope as much as they can in the already disinhibited atmosphere online.
  • Another big change that has come with new technology and especially smart phones is that we are never really alone. Kids update their status, share what they’re watching, listening to, and reading, and have apps that let their friends know their specific location on a map at all times.
  • Offline, the gold standard advice for helping kids build healthy self-esteem is to get them involved in something that they’re interested in
caelengrubb

The Linguistic Evolution of 'Like' - The Atlantic - 0 views

  • In our mouths or in print, in villages or in cities, in buildings or in caves, a language doesn’t sit still. It can’t. Language change has proceeded apace even in places known for preserving a language in amber.
  • It’s under this view of language—as something becoming rather than being, a film rather than a photo, in motion rather than at rest—that we should consider the way young people use (drum roll, please) like
  • First, let’s take like in just its traditional, accepted forms. Even in its dictionary definition, like is the product of stark changes in meaning that no one would ever guess.
  • ...19 more annotations...
  • To an Old English speaker, the word that later became like was the word for, of all things, “body.”
  • The word was lic, and lic was part of a word, gelic, that meant “with the body,” as in “with the body of,” which was a way of saying “similar to”—as in like
  • It was just that, step by step, the syllable lic, which to an Old English speaker meant “body,” came to mean, when uttered by people centuries later, “similar to”—and life went on.
  • Like has become a piece of grammar: It is the source of the suffix -ly.
  • Like has become a part of compounds. Likewise began as like plus a word, wise, which was different from the one meaning “smart when either a child or getting old.”
  • Dictionaries tell us it’s pronounced “like-MINE-did,” but I, for one, say “LIKE- minded” and have heard many others do so
  • Therefore, like is ever so much more than some isolated thing clinically described in a dictionary with a definition like “(preposition) ‘having the same characteristics or qualities as; similar to.’”
  • What we are seeing in like’s transformations today are just the latest chapters in a story that began with an ancient word that was supposed to mean “body.”
  • Because we think of like as meaning “akin to” or “similar to,” kids decorating every sentence or two with it seems like overuse. After all, how often should a coherently minded person need to note that something is similar to something rather than just being that something?
  • The new like, then, is associated with hesitation.
  • So today’s like did not spring mysteriously from a crowd on the margins of unusual mind-set and then somehow jump the rails from them into the general population.
  • The problem with the hesitation analysis is that this was a thoroughly confident speaker.
  • It’s real-life usage of this kind—to linguists it is data, just like climate patterns are to meteorologists—that suggests that the idea of like as the linguistic equivalent to slumped shoulders is off.
  • Understandably so, of course—the meaning of like suggests that people are claiming that everything is “like” itself rather than itself.
  • The new like acknowledges unspoken objection while underlining one’s own point (the factuality). Like grandparents translates here as “There were, despite what you might think, actually grandparents.”
  • Then there is a second new like, which is closer to what people tend to think of all its new uses: it is indeed a hedge.
  • Then, the two likes I have mentioned must be distinguished from yet a third usage, the quotative like—as in “And she was like, ‘I didn’t even invite him.’”
  • This is yet another way that like has become grammar. The meaning “similar to” is as natural a source here as it was for -ly: mimicking people’s utterances is talking similarly to, as in “like,” them.
  • Thus the modern American English speaker has mastered not just two, but actually three different new usages of like.
jmfinizio

Opinion: The real key to winning this election - CNN - 0 views

  • the ghosts seem to be turning out in large numbers to cast their ballots early.
  • The long lines are an important reminder that the 2020 election will be won or lost based on the ground game.
  • Voting restrictions (photo identification requirements, registration limits and more) that have been imposed in more than 25 states heighten the importance of obtaining as large a margin as possible.
  • ...5 more annotations...
  • This election feels historic down to the bones.
  • This means that each party needs to make sure that every supporter has a clear voting plan if they have not already mailed in their ballots.
  • Each party needs to do the better job selling the message that not voting is simply not an option.
  • We live in a passive, observational age where so much of our politics has turned into what we watch, hear, read, email and tweet.
  • The party that forgets to pay sufficient attention to the ground game is the one that will be rendered powerless come January 2021.
katedriscoll

KNOWLEDGE AND TECHNOLOGY - TOK RESOURCE.ORG - 0 views

  • As TOK students consider how digital technology impacts knowledge, and themselves as knowers, intriguing ethical issues emerge. In the class activities below students will explore knowledge questions relating to:
  • New York Times Magazine, November 11, 2018. Behind the Cover: What Will Become of Us? Design Director, Gail Bichler. Concept by Delcan & company. Photo illustration by Jamie Chung. Prop styling by Pink Sparrow. C.G. work by Justin Metz. “We liked the idea of a robot hand holding a human skull for its reference to ‘Hamlet’ and the humor of a robot’s contemplating the future (or is it the past?) of humans.” See video link above for a glimpse inside the process of its creation.
Javier E

The Irrational Consumer: Why Economics Is Dead Wrong About How We Make Choices - Derek ... - 4 views

  • Derek Thompson is a senior editor at The Atlantic, where he oversees business coverage. He has also written for Slate, BusinessWeek, and the Daily Beast.
  • First, making a choice is physically exhausting, literally, so that somebody forced to make a number of decisions in a row is likely to get lazy and dumb.
  • Second, having too many choices can make us less likely to come to a conclusion. In a famous study of the so-called "paradox of choice", psychologists Mark Lepper and Sheena Iyengar found that customers presented with six jam varieties were more likely to buy one than customers offered a choice of 24.
  • ...7 more annotations...
  • Many of our mistakes stem from a central "availability bias." Our brains are computers, and we like to access recently opened files, even though many decisions require a deep body of information that might require some searching. Cheap example: We remember the first, last, and peak moments of certain experiences.
  • The third check against the theory of the rational consumer is the fact that we're social animals. We let our friends and family and tribes do our thinking for us
  • neurologists are finding that many of the biases behavioral economists perceive in decision-making start in our brains. “Brain studies indicate that organisms seem to be on a hedonic treadmill, quickly habituating to homeostasis,” McFadden writes. In other words, perhaps our preference for the status quo isn’t just figuratively in our heads, but also literally sculpted by the hand of evolution inside of our brains.
  • The popular psychological theory of “hyperbolic discounting” says people don’t properly evaluate rewards over time. The theory seeks to explain why many groups -- nappers, procrastinators, Congress -- take rewards now and pain later, over and over again. But neurology suggests that it hardly makes sense to speak of “the brain,” in the singular, because it’s two very different parts of the brain that process choices for now and later. The choice to delay gratification is mostly processed in the frontal system. But studies show that the choice to do something immediately gratifying is processed in a different system, the limbic system, which is more viscerally connected to our behavior, our “reward pathways,” and our feelings of pain and pleasure. (A minimal sketch of these two discount curves appears after this list.)
  • the final message is that neither the physiology of pleasure nor the methods we use to make choices are as simple or as single-minded as the classical economists thought. A lot of behavior is consistent with pursuit of self-interest, but in novel or ambiguous decision-making environments there is a good chance that our habits will fail us and inconsistencies in the way we process information will undo us.
  • Our brains seem to operate like committees, assigning some tasks to the limbic system, others to the frontal system. The "switchboard" does not seem to achieve complete, consistent communication between different parts of the brain. Pleasure and pain are experienced in the limbic system, but not on one fixed "utility" or "self-interest" scale. Pleasure and pain have distinct neural pathways, and these pathways adapt quickly to homeostasis, with sensation coming from changes rather than levels
  • Social networks are sources of information, on what products are available, what their features are, and how your friends like them. If the information is accurate, this should help you make better choices. On the other hand, it also makes it easier for you to follow the crowd rather than engaging in the due diligence of collecting and evaluating your own information and playing it against your own preferences
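  • A minimal sketch, in Python, of the two discount curves contrasted above; the parameter values (delta, k) and the dollar amounts are illustrative assumptions, not figures from the article:

```python
def exponential_value(amount, delay, delta=0.95):
    """Classical discounting: value falls by a constant factor per period,
    so a preference between two dated rewards never flips over time."""
    return amount * delta ** delay

def hyperbolic_value(amount, delay, k=0.25):
    """Behavioral discounting: value drops steeply at short delays, then
    flattens, the shape that produces preference reversals."""
    return amount / (1 + k * delay)

# Seen from far away, the hyperbolic chooser waits for the larger reward:
print(hyperbolic_value(100, 30), hyperbolic_value(110, 31))  # ~11.8 < ~12.6
# Up close, the same curve flips to the immediate one:
print(hyperbolic_value(100, 0), hyperbolic_value(110, 1))    # 100.0 > 88.0
```

The exponential curve, by contrast, ranks the later $110 higher at both horizons, which is one way to see why the classical model cannot capture the now-versus-later flip the article attributes to the limbic and frontal systems.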
pier-paolo

ON EDUCATION; A Failure of Logic And Logistics - The New York Times - 0 views

  • THE federal No Child Left Behind law of 2002 may go down in history as the most unpopular piece of education legislation ever created. It has been criticized for setting impossibly high standards -- that every child in America must be proficient in reading and math by 2014
  • Now it turns out that about a third of the 8,000 transfers -- children often traveling over an hour to attend crowded schools -- have been moved from one school labeled failing under the law to another failing school.
  • Overcrowding breeds tension.
  • ...3 more annotations...
  • How could they? As might be expected from a law that tries to create a single accountability formula for every American school, No Child Left Behind is replete with technicalities and split hairs.
  • Mayor Michael R. Bloomberg did not want to take on the Bush administration over the federal law. The chancellor denied this, saying “nothing is served” by turning a tough equity issue into politics.
  • Recently, Mr. Klein had his photo taken with Bill Gates, who presented the city with $51 million to create small high schools. But principals of small high schools, like Louis Delgado of Vanguard in Manhattan, say transfers have devastated them this year.
adonahue011

Twitter is Showing That People Are Anxious and Depressed - The New York Times - 1 views

  • the lab offers this answer: Sunday, May 31. That day was not only the saddest day of 2020 so far, it was also the saddest day recorded by the lab in the last 13 years. Or at least, the saddest day on Twitter.
    • adonahue011
       
      The lab is offering the idea that May 31st was the saddest day of 2020, and the saddest in the last 13 years. The toll 2020 has put on all of us mentally is probably something at times we cannot even recognize.
  • measuring word choices across millions of tweets, every day, the world over, to come up with a moving measure of well-being. (A minimal sketch of this word-averaging approach appears after this list.)
    • adonahue011
       
      They use a machine to track the words people are using on twitter specifically to measure the well-being of people
  • the main finding to emerge was our tendency toward relentless positivity on social media.
  • ...27 more annotations...
  • “Happiness is hard to know. It’s hard to measure,”
  • “We don’t have a lot of great data about how people are doing.”
    • adonahue011
       
      This is an interesting statement because it is so true. Yet it is so important to know how people are doing. Often times I think we personally miss some of the feelings we have, which is something we talked about in TOK. We cut out certain memories or feelings to make the narrative we want
  • to parse our national mental health through the prism of our online life.
  • that stockpile of information towered as high as it does now, in the summer of 2020
  • Twitter reported a 34 percent increase in daily average user growth.
    • adonahue011
       
      Important statistic because we all took part in this
  • has gathered a random 10 percent of all public tweets, every day, across a dozen languages.
  • Twitter included “terrorist,” “violence” and “racist.” This was about a week after George Floyd was killed, near the start of the protests that would last all summer.
  • During the pandemic, the Hedonometer’s sadness readings have set multiple records. This year, “there was a full month — and we never see this — there was a full month of days that the Hedonometer was reading sadder than the Boston Marathon day.”
    • adonahue011
       
      This is saddening because it is the reality we have all had to learn how to deal with.
  • “These digital traces are markers that we’re not aware of, but they leave marks that tell us the degree to which you are avoiding things, the degree to which you are connected to people,”
    • adonahue011
       
      I agree with this statement because it is so similar to what we discussed in TOK with the idea that our brain lets us avoid things when we don't feel like we can deal with them.
  • one of the challenges of this line of research is that language itself is always evolving — and algorithms are notoriously bad at discerning context.
  • they were able to help predict which ones might develop postpartum depression, based on their posts before the birth of their babies.
    • adonahue011
       
      This type of research seems like a positive way to utilize social media. Not that the saddening posts are good but the way we can perceive this information is important
  • Using data from social media for the study of mental health also helps address the WEIRD problem:
  • psychology research is often exclusively composed of subjects who are Western, Educated, and from Industrialized, Rich, and Democratic countries.
    • adonahue011
       
      I never thought of this but it is so true! Using social media means that the stats are global.
  • “We’re now able to look at a much more diverse variety of mental health experiences.”
  • but also anxiety, depression, stress and suicidal thoughts. Unsurprisingly, she found that all these levels were significantly higher than during the same months of 2019.
  • is really a representative place to check the state of the general population’s mental health.
  • argues that in the rush to embrace data, many researchers ignore the distorting effects of the platforms themselves.
    • adonahue011
       
      Contrasting opinion from the rest of the article
  • emotionally invested in the content we are presented with, coaxed toward remaining in a certain mental state.
    • adonahue011
       
      Interesting idea though I tend to think more in the opposite direction that social media is a pretty solid reflection.
  • The closest we get to looking at national mental health otherwise is through surveys like the one Gallup performs
  • the lowest rates of life satisfaction this year in over a decade, including during the 2008 recession
  • I have never been more exhausted at the end of the day than I am now,” said Michael Garfinkle, a psychoanalyst in New York.
  • There are so many contenders to consider: was it Thursday, March 12, the day after Tom Hanks announced he was sick and the N.B.A. announced it was canceled? Was it Monday, June 1, the day peaceful protesters were tear gassed so that President Trump could comfortably stroll to his Bible-wielding photo op?
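  • A minimal sketch of the word-averaging idea behind the Hedonometer readings described above; the tiny happiness lexicon is an invented stand-in for the large human-rated word list the real instrument uses:

```python
from collections import Counter
import re

HAPPINESS = {  # hypothetical 1-9 ratings; the real ones come from human judges
    "love": 8.4, "happy": 8.3, "weekend": 7.9,
    "sad": 2.4, "violence": 1.9, "racist": 2.0,
}

def day_score(tweets):
    """Frequency-weighted average happiness of all rated words in a day's tweets."""
    counts = Counter(
        word
        for tweet in tweets
        for word in re.findall(r"[a-z']+", tweet.lower())
        if word in HAPPINESS
    )
    total = sum(counts.values())
    if total == 0:
        return None  # no rated words observed that day
    return sum(HAPPINESS[w] * n for w, n in counts.items()) / total

# Lower daily averages mark sadder days, e.g. the May 31 record low.
print(day_score(["Happy weekend, love it!", "so sad about the violence"]))
```

Tracking this average day after day yields a moving well-being measure of the kind the lab reports; a real pipeline would also have to cope with the context problems the article flags, such as sarcasm and evolving slang.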
martinelligi

How Social Media Is Hurting Your Memory | Time - 0 views

  • Social platforms let us stay in touch with friends and forge new relationships like never before, but those increases in communication and social connection may come at a cost. In a new paper published in the Journal of Experimental Social Psychology, researchers showed that those who documented and shared their experiences on social media formed less precise memories of those events.
  • In a series of three studies led by Diana Tamir of Princeton University, researchers explored how taking photos and videos for social media affects people’s enjoyment, engagement and memory of those experiences.
  • Tamir and her team found that sharing experiences on social media did not seem to affect how much people felt that they had enjoyed the experience or were engaged. However, those who wrote down, recorded or shared their experiences performed about 10% worse on memory tests across all experiments.
  • ...2 more annotations...
  • This availability of external information causes us to neglect the information itself and instead remember where to find it. For example, one study found that if people playing a trivia game believe that a computer is storing each trivia question for them to study later, they do not form a memory of the information they want. Instead, they form a memory of how to retrieve that information on the computer.
  • With the rise of shared content, the exciting activities that you could be doing at any given moment are more apparent than ever, which can lead to a feeling of apprehension that others are having rewarding experiences without you. FOMO, not surprisingly, is associated with being less satisfied with your life, in a worse mood and emotionally unfulfilled.
margogramiak

How To Fight Deforestation In The Amazon From Your Couch | HuffPost - 0 views

  • If you’ve got as little as 30 seconds and a decent internet connection, you can help combat the deforestation of the Amazon. 
  • Some 15% of the Amazon, the world’s largest rainforest and a crucial carbon repository, has been cut or burned down. Around two-thirds of the Amazon lie within Brazil’s borders, where almost 157 square miles of forest were cleared in April alone. In addition to storing billions of tons of carbon, the Amazon is home to tens of millions of people and some 10% of the Earth’s biodiversity.
    • margogramiak
       
      all horrifying stats.
  • you just have to be a citizen that is concerned about the issue of deforestation,
    • margogramiak
       
      that's me!
  • ...12 more annotations...
  • to build an artificial intelligence model that can recognize signs of deforestation. That data can be used to alert governments and conservation organizations where intervention is needed and to inform policies that protect vital ecosystems. It may even one day predict where deforestation is likely to happen next.
    • margogramiak
       
      That sounds super cool, and definitely useful.
  • To monitor deforestation, conservation organizations need an eye in the sky.
    • margogramiak
       
      bird's eye view pictures of deforestation are always super impactful.
  • WRI’s Global Forest Watch online tracking system receives images of the world’s forests taken every few days by NASA satellites. A simple computer algorithm scans the images, flagging instances where before there were trees and now there are not. But slight disturbances, such as clouds, can trip up the computer, so experts are increasingly interested in using artificial intelligence. (A minimal sketch of this before-and-after flagging appears after this list.)
    • margogramiak
       
      that's so cool.
  • Inman was surprised how willing people have been to spend their time clicking on abstract-looking pictures of the Amazon.
    • margogramiak
       
      I'm glad so many people want to help.
  • “Look at these nine blocks and make a judgment about each one. Does that satellite image look like a situation where human beings have transformed the landscape in some way?” Inman explained.
    • margogramiak
       
      seems simple enough
  • It’s not always easy; that’s the point. For example, a brown patch in the trees could be the result of burning to clear land for agriculture (earning a check mark for human impact), or it could be the result of a natural forest fire (no check mark). Keen users might be able to spot subtle signs of intervention the computer would miss, like the thin yellow line of a dirt road running through the clearing. 
    • margogramiak
       
      I was thinking about this issue... that's a hard problem to solve.
  • SAS’s website offers a handful of examples comparing natural forest features and manmade changes. 
    • margogramiak
       
      I guess that would be helpful. What happens if someone messes up though?
  • users have analyzed almost 41,000 images, covering an area of rainforest nearly the size of the state of Montana. Deforestation caused by human activity is evident in almost 2 in 5 photos.
    • margogramiak
       
      wow.
  • The researchers hope to use historical images of these new geographies to create a predictive model that could identify areas most at risk of future deforestation. If they can show that their AI model is successful, it could be useful for NGOs, governments and forest monitoring bodies, enabling them to carefully track forest changes and respond by sending park rangers and conservation teams to threatened areas. In the meantime, it’s a great educational tool for the citizen scientists who use the app
    • margogramiak
       
      But then what do they do with this data? How do they use it to make a difference?
  • Users simply select the squares in which they’ve spotted some indication of human impact: the tell-tale quilt of farm plots, a highway, a suspiciously straight edge of tree line. 
    • margogramiak
       
      I could do that!
  • we have still had people from 80 different countries come onto the app and make literally hundreds of judgments that enabled us to resolve 40,000 images,
    • margogramiak
       
      I like how in a sense it makes all the users one big community because of their common goal of wanting to help the earth.
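  • A minimal sketch of the “trees before, no trees now” flagging step described above, using NDVI, a standard vegetation index computed from near-infrared and red reflectance; the thresholds and the tiny example arrays are illustrative assumptions, not WRI’s actual pipeline:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: near 1 over dense canopy,
    near 0 over bare ground."""
    return (nir - red) / (nir + red + 1e-9)

def flag_loss(before, after, veg=0.6, bare=0.3):
    """True wherever a pixel was vegetated before but looks bare after."""
    return (ndvi(*before) > veg) & (ndvi(*after) < bare)

# Tiny made-up 2x2 scenes: (near-infrared band, red band) for each date.
before = (np.array([[0.8, 0.8], [0.8, 0.2]]),
          np.array([[0.1, 0.1], [0.1, 0.15]]))
after = (np.array([[0.8, 0.3], [0.3, 0.2]]),
         np.array([[0.1, 0.25], [0.25, 0.15]]))

print(flag_loss(before, after))
# [[False  True]
#  [ True False]]  -> the two pixels that lost canopy are flagged
```

This is exactly the kind of naive check that clouds can trip up, which is why the researchers turn to trained models and to the human judgments collected through the app.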
Javier E

How Do I Find Meaning and Beauty in My Life? - The New York Times - 1 views

  • I’ve spent much of my life and thought and income in pursuit of beauty in one form or another: design, fashion, the beauty or “wellness” industries. This is very much a professional hazard: My career in glossy magazines and advertising as a photo editor is all about making beautiful images of beautiful things that I’ve selected look even more beautiful.
  • Often, when I think how much of my time I’ve devoted to my own appearance or to matters of aesthetics I cringe, though I’ve often been the person in any given room defending things like style and design from accusations of superficiality and frivolity.
  • I’ve come to a place in which I no longer know what my own life should look like. I literally do not know what to do with myself and what I should believe in anymore, and this does, in fact, seem kind of frivolous, given the very urgent concerns of the society we live in.
  • ...10 more annotations...
  • I’m unable to save money for retirement and I get very anxious when I think about the future.
  • Worse, I’ve lost my sense of meaning to myself. I feel like the culture has moved on without me, and I don’t know what to do.
  • I feel alone in many ways and unsuccessful by most measures. I don’t own a home and no one needs me; I am nobody’s mother and now I am nobody’s child, as my parents are no longer living.
  • on bad ones, I feel such futility, like I’ve squandered my own youth and beauty in the hall of mirrors that is our consumerist society.
  • A: What you wrote is in no way frivolous; it concerns the entire value, direction and purpose of your life.
  • You’ve realized that, in curating what we call taste, you’ve unwittingly become part of the system, the media-driven consumerist machine that creates the desire to possess things you’ll likely never be able to afford.
  • Beauty — distracting, exclusionary, often more talismanic and notional than “real” — has never been a steady organizing principle for life
  • your sense of what is beautiful has become more complicated, as you have. Your work is no longer satisfying to you because it is no longer relevant to the person you are.
  • your way of seeing things, including yourself, seems not wholly true or right or your own, and is in dire need of a refresh. It feels reductive and merciless, informed too much by the very aspects of our culture that have become deadening to you.
  • You’re seeking new paths to the shore, new things that sustain you, and this requires doing the work to find out what that looks like. It requires acknowledging the value shift that probably began happening inside you long ago.
huffem4

My White Friend Asked Me on Facebook to Explain White Privilege. I Decided to Be Honest... - 1 views

  • I realized many of my friends—especially the white ones—have no idea what I’ve experienced/dealt with unless they were present (and aware) when it happened. There are two reasons for this: 1) because not only as a human being do I suppress the painful and uncomfortable in an effort to make it go away, I was also taught within my community (I was raised in the ’70s and ’80s—it’s shifted somewhat now) and by society at large NOT to make a fuss, speak out, or rock the boat. To just “deal with it,” lest more trouble follow (which, sadly, it often does); 2) fear of being questioned or dismissed with “Are you sure that’s what you heard?” or “Are you sure that’s what they meant?” and being angered and upset all over again by well-meaning-but-hurtful and essentially unsupportive responses.
  • the white privilege in this situation is being able to move into a “nice” neighborhood and be accepted, not harassed, made to feel unwelcome, or prone to acts of vandalism and hostility.
  • if you’ve never had a defining moment in your childhood or your life where you realize your skin color alone makes other people hate you, you have white privilege.
  • ...9 more annotations...
  • if you’ve never been ‘the only one’ of your race in a class, at a party, on a job, etc. and/or it’s been pointed out in a “playful” fashion by the authority figure in said situation, you have white privilege.
  • if you’ve never been on the receiving end of the assumption that when you’ve achieved something it’s only because it was taken away from a white person who “deserved it,” you have white privilege.
  •  if no one has ever questioned your intellectual capabilities or attendance at an elite institution based solely on your skin color, you have white privilege
  • if you have never experienced or considered how damaging it is/was/could be to grow up without myriad role models and images in school that reflect you in your required reading material or in the mainstream media, you have white privilege
  • if you’ve never been blindsided when you are just trying to enjoy a meal by a well-paid faculty member’s patronizing and racist assumptions about how grateful black people must feel to be in their presence, you have white privilege
  • if you’ve never been on the receiving end of a boss’s prejudiced, uninformed “how dare she question my ideas” badmouthing based solely on his ego and your race, you have white privilege
  • if you’ve never had to mask the fruits of your success with a floppy-eared, stuffed bunny rabbit so you won’t get harassed by the cops on the way home from your gainful employment (or never had a first date start this way), you have white privilege
  • if you’ve never had to rewrite stories and headlines or swap photos while being trolled by racists when all you’re trying to do on a daily basis is promote positivity and share stories of hope and achievement and justice, you have white privilege
  • As to you “being part of the problem,” trust me, nobody is mad at you for being white. Nobody. Just like nobody should be mad at me for being black. Or female. Or whatever. But what IS being asked of you is to acknowledge that white privilege DOES exist and not only to treat people of races that differ from yours “with respect and humor,” but also to stand up for fair treatment and justice, not to let “jokes” or “off-color” comments by friends, co-workers, or family slide by without challenge, and to continually make an effort to put yourself in someone else’s shoes, so we may all cherish and respect our unique and special contributions to society as much as we do our common ground.