
Home/ TOK Friends/ Group items tagged apps

Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • ...43 more annotations...
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform – after three years of struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. An ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day”.
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who exploit the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • ...39 more annotations...
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

Opinion | The Apps on My Phone Are Stalking Me - The New York Times - 0 views

  • There is much about the future that keeps me up at night — A.I. weaponry, undetectable viral deepfakes
  • but in the last few years, one technological threat has blipped my fear radar much faster than others. That fear? Ubiquitous surveillance.
  • I am no longer sure that human civilization can undo or evade living under constant, extravagantly detailed physical and even psychic surveillance
  • ...24 more annotations...
  • as a species, we are not doing nearly enough to avoid always being watched or otherwise digitally recorded.
  • your location, your purchases, video and audio from within your home and office, your online searches and every digital wandering, biometric tracking of your face and other body parts, your heart rate and other vital signs, your every communication, recording, and perhaps your deepest thoughts or idlest dreams
  • in the future, if not already, much of this data and more will be collected and analyzed by some combination of governments and corporations, among them a handful of megacompanies whose powers nearly match those of governments
  • Over the last year, as part of Times Opinion’s Privacy Project, I’ve participated in experiments in which my devices were closely monitored in order to determine the kind of data that was being collected about me.
  • I’ve realized how blind we are to the kinds of insights tech companies are gaining about us through our gadgets. Our blindness not only keeps us glued to privacy-invading tech
  • it also means that we’ve failed to create a political culture that is in any way up to the task of limiting surveillance.
  • few of our cultural or political institutions are even much trying to tamp down the surveillance state.
  • Yet the United States and other supposedly liberty-loving Western democracies have not ruled out such a future
  • like Barack Obama before him, Trump and the Justice Department are pushing Apple to create a backdoor into the data on encrypted iPhones — they want the untrustworthy F.B.I. and any local cop to be able to see everything inside anyone’s phone.
  • the fact that both Obama and Trump agreed on the need for breaking iPhone encryption suggests how thoroughly political leaders across a wide spectrum have neglected privacy as a fundamental value worthy of protection.
  • Among other revelations: Advertising companies and data brokers are keeping insanely close tabs on smartphones’ location data, tracking users so precisely that their databases could arguably compromise national security or political liberty.
  • Americans are sleepwalking into a future nearly as frightening as the one the Chinese are constructing. I choose the word “sleepwalking” deliberately, because when it comes to digital privacy, a lot of us prefer the comfortable bliss of ignorance.
  • Tracking technologies have become cheap and widely available — for less than $100, my colleagues were able to identify people walking by surveillance cameras in Bryant Park in Manhattan.
  • The Clearview AI story suggests another reason to worry that our march into surveillance has become inexorable: Each new privacy-invading technology builds on a previous one, allowing for scary outcomes from new integrations and collections of data that few users might have anticipated.
  • The upshot: As the location-tracking apps followed me, I was able to capture the pings they sent to online servers — essentially recording their spying
  • On the map, you can see the apps are essentially stalking me. They see me drive out one morning to the gas station, then to the produce store, then to Safeway; later on I passed by a music school, stopped at a restaurant, then Whole Foods.
  • But location was only one part of the data the companies had about me; because geographic data is often combined with other personal information — including a mobile advertising ID that can help merge what you see and do online with where you go in the real world — the story these companies can tell about me is actually far more detailed than I can tell about myself.
  • I can no longer pretend I’ve got nothing to worry about. Sure, I’m not a criminal — but do I want anyone to learn everything about me?
  • more to the point: Is it wise for us to let any entity learn everything about everyone?
  • The remaining uncertainty about the surveillance state is not whether we will submit to it — only how readily and completely, and how thoroughly it will warp our society.
  • Will we allow the government and corporations unrestricted access to every bit of data we ever generate, or will we decide that some kinds of collections, like the encrypted data on your phone, should be forever off limits, even when a judge has issued a warrant for it?
  • In the future, will there be room for any true secret — will society allow any unrecorded thought or communication to evade detection and commercial analysis?
  • How completely will living under surveillance numb creativity and silence radical thought?
  • Can human agency survive the possibility that some companies will know more about all of us than any of us can ever know about ourselves?
sissij

Manterruption is a Thing, and Now There is an App to Detect it in Daily Conversation | ... - 0 views

  • Introducing our word of the day – “manterruption”. It’s a pretty self-explanatory term, describing a behavior when men interrupt women unnecessarily, which leads to a pretty serious imbalance in the amount of female vs. male contributions in a conversation.
  • A 2004 study on gender issues at Harvard Law School found that men were 50% more likely than women to volunteer at least one comment during class and 144% more likely to volunteer three or more comments. 
  • which as a consequence leaves decision-making mostly to men.
  • ...4 more annotations...
  • Meaning, women’s voices bring a different and valuable perspective in a conversation and should be heard more.
  • Here's the thing, though: while fighting for the cause of hearing the female perspective equally in all matters of business, government, and life is definitely worthwhile, blaming it all on interrupting men doesn’t seem fair. Because it is not just men who interrupt women, women do it too. As a matter of fact, a study done in a tech company showed that 87% of the time that women interrupt, they are interrupting other women.
  • There are also other dynamics at play, for example, seniority. It is still more likely that men will hold a more senior position in a professional environment and, generally, people with a higher rank tend to interrupt more and be interrupted less.
  • Hearing the voices and perspectives of both genders equally is incredibly important, but we should make sure we are addressing the right root causes and are not antagonizing those who need to be on the same side for progress to be made. 
  •  
    I think this app is very interesting. There is obvious gender inequality in society: men are often more accustomed to taking leadership roles than women. I think counting how many times a woman is interrupted by a man is a very interesting way to show how society is still dominated by men. I also really like that the author discusses other possible factors behind why women are more likely to be interrupted by men; arguing only one side wouldn't make a strong argument. Gender inequality is a big and heavy label, and we should give it more thought before we apply it to any phenomenon. --Sissi (3/14/2017)
Javier E

Opinion | Why a Digital Diary Will Change Your Life - The New York Times - 0 views

  • At first, my plan was to do what I always do when I see something halfway noteworthy, which is to tell a few hundred thousand people on Twitter, Facebook, Instagram or, in my lowest moments, even LinkedIn.
  • Smartphones and social networks have turned me into a lonely, needy man who requires constant affirmation. In desperate pursuit of such affirmation, my mind has come to resemble one of those stamping-machine assembly lines you see in cartoons, but for shareable content: The raw, analog world in all its glory enters via conveyor belt on one end, and, after some raucous puffs of smoke, it gets flattened and packaged in my head into insipid quips meant to inspire you to tap a tiny heart on a screen.
  • instead of sharing the silly lampshade joke, I journaled it in Day One, a magnificent digital diary app that has transformed my relationship with my phone, improved my memory, and given me a deeper perspective on my life than the one I was getting through the black mirror of social media.
  • ...14 more annotations...
  • In recent years, Twitter and much of the rest of the internet have been getting hotter, more reflexively outraged, less fun. Venturing onto social media these days, I often feel like a cat burglar stepping through a field of upturned rakes. I could imagine my dumb joke getting picked apart for all the ways it was problematic — “New York Times writer casually encourages bestial sexual assault! #deertoo” — bringing me ever closer to cancellation.
  • Think of Day One as a private social network for an audience of one: yourself.
  • You post updates to it just as you might on Instagram or Facebook.
  • The app — which runs on Macs, iPhones and iPads, syncing your entries between your devices — can handle long text journals, short picture-focused status updates, and pretty much anything else that comes across the digital transom.
  • I use it to jot down my deepest thoughts and shallowest jokes; to rant and to vent; to come to terms with new ideas I’m playing with, ideas that need time to marinate in secret before they’re ready for the world; and to collect and reflect upon all the weird and crazy and touching artifacts of life
  • It’s unsocial. Indeed, it’s downright antisocial. Nothing about the app is meant to be shared — it is protected with your Apple security credentials and backs up its data to the cloud using end-to-end encryption, so that the only way someone can get into your diary is by getting hold of your device and your system passcode.
  • Day One creates something so rare it feels almost sacred: A completely private digital space.
  • The best way to describe this feeling is to liken it to friendship. I feel comfortable dishing to Day One the way I would to a close friend I trust completely.
  • one of the few digital spaces that provides you mental space for contemplation and consideration
  • journaling has been shown to be good for mind and body, reducing stress and anxiety, improving interpersonal relationships, and promoting creativity
  • a digital journal offers several benefits over paper. Easy accessibility is a big one
  • you can tap out a journal while you’re in line at the supermarket
  • because so much happens on screens now, Day One offers greater fidelity to daily life. Instead of describing the insane conversation I had with my co-worker, I can just post a screenshot.
  • photography, which adds emotional heft to the rigidity of text.
peterconnelly

Virtual learning apps tracked and shared kids' data and online activities with advertis... - 0 views

  • Millions of students who participated in virtual learning during the Covid-19 pandemic had their personal data and online behaviors tracked by educational apps and websites without their consent and in many cases shared with third-party advertising technology companies, a new report has found.
  • Of the education products it reviewed, Human Rights Watch found that 146 (89%) appeared to engage in data practices that "risked or infringed on children's rights."
  • Han said the majority of the apps and websites examined by Human Rights Watch sent information about children to Google and Facebook, which collectively dominate the digital advertising market.
  • ...4 more annotations...
  • Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project and a fellow at the NYU School of Law, said the findings add to mounting concerns around the collection of data among young people. In recent months, there has been intense scrutiny from lawmakers about the impact tech platforms have on teens.
  • "We already knew technologies were being abused and putting children at risk, but this report is really important because it shows the scale of harm and how the same mistake is being made by educators and governments around the world," he said.
  • "Students must be able to do their schoolwork without surveillance by companies looking to harvest their data to pad their bottom line," said Samuel Levine
  • "The data must serve a purpose, but the purpose cannot be advertising," he said. "If it is not something we do in physical classrooms, it is not something that should be part of digital school life."
Javier E

Nepal Bans TikTok, Saying It Disturbs 'Social Harmony' - WSJ - 0 views

  • NEW DELHI—Nepal is banning TikTok over concerns that the video platform is “disturbing social harmony.”
  • “Through social-media platform TikTok, there’s a continuous dissemination of content disturbing our social harmony and family structures,” said Sharma.
  • Nepal has faced increasing problems with TikTok, including cases of cyberbullying,
  • ...4 more annotations...
  • Over the past four years, more than 1,600 cybercrime cases on TikTok have been reported to authorities, said Kuber Kadayat, a Nepal police spokesman. He said most of the complaints were related to sharing nude photos or financial extortion.
  • Last week, the government said it would require social-media companies running platforms used in Nepal to set up liaison offices in the country.
  • India banned TikTok—citing threats to national security—along with dozens of other Chinese apps in 2020 after a clash between Indian and Chinese troops on the countries’ disputed border.
  • Nepal isn’t the first to cite concerns about the content shared on the app. Pakistan has banned the app multiple times after authorities received complaints of indecent content, later lifting the bans after receiving promises from TikTok to better control the content. In August, Senegal blocked access to the app citing hateful and subversive content being shared on it.
paisleyd

'Brain training' app may improve memory, daily functioning of people with schizophrenia... - 0 views

  • A 'brain training' iPad game developed and tested by researchers at the University of Cambridge may improve the memory of patients with schizophrenia
  • Schizophrenia is a long-term mental health condition that causes a range of psychological symptoms, ranging from changes in behaviour through to hallucinations and delusions
  • patients are still left with debilitating cognitive impairments, including in their memory
  • ...10 more annotations...
  • increasing evidence that computer-assisted training and rehabilitation can help people with schizophrenia overcome some of their symptoms
  • Schizophrenia is estimated to cost £13.1 billion per year in total in the UK, so even small improvements in cognitive functions could help patients make the transition to independent living
  • The game, Wizard, was the result of a nine-month collaboration between psychologists, neuroscientists, a professional game-developer and people with schizophrenia
  • patients who had played the memory game made significantly fewer errors and needed significantly fewer attempts to remember the location of different patterns in the CANTAB PAL test relative to the control group. In addition, patients in the cognitive training group saw an increase in their score on the GAF scale
  • Participants in the training group played the memory game for a total of eight hours over a four-week period; participants in the control group continued their treatment as usual. At the end of the four weeks, the researchers tested all participants' episodic memory using the Cambridge Neuropsychological Test Automated Battery (CANTAB) PAL, as well as their level of enjoyment and motivation, and their score on the Global Assessment of Functioning (GAF) scale
  • The memory task was woven into a narrative in which the player was allowed to choose their own character and name; the game rewarded progress with additional in-game activities to provide the user with a sense of progression independent of the cognitive training process
  • Because the game is interesting, even those patients with a general lack of motivation are spurred on to continue the training
  • used in conjunction with medication and current psychological therapies, this could help people with schizophrenia minimise the impact of their illness on everyday life
  • It is not clear exactly how the apps also improved the patients' daily functioning, but the researchers suggest it may be because improvements in memory had a direct impact on global functions or that the cognitive training may have had an indirect impact on functionality by improving general motivation and restoring self-esteem
  • This new app will allow the Wizard memory game to become widely available, inexpensively. State-of-the-art neuroscience at the University of Cambridge, combined with the innovative approach at Peak, will help bring the games industry to a new level and promote the benefits of cognitive enhancement
Javier E

WhatsApp urges users to update app after discovering spyware vulnerability | Technology... - 0 views

  • WhatsApp is encouraging users to update to the latest version of the app after discovering a vulnerability that allowed spyware to be injected into a user’s phone through the app’s phone call function.
  • The spyware was developed by the Israeli cyber intelligence company NSO Group, according to the Financial Times,
  • Attackers could transmit the malicious code to a target’s device by calling the user and infecting the call whether or not the recipient answered the call.
  • ...3 more annotations...
  • The spyware’s capabilities are near absolute. Once installed on a phone, the software can extract all of the data that’s already on the device (text messages, contacts, GPS location, email, browser history, etc) in addition to creating new data by using the phone’s microphone and camera to record the user’s surroundings and ambient sounds, according to a 2016 report by the New York Times.
  • NSO limits sales of its spyware, Pegasus, to state intelligence agencies
  • WhatsApp has about 1.5bn users around the world. The messaging app uses end-to-end encryption, making it popular and secure for activists and dissidents. The Pegasus spyware does not affect or involve the app’s encryption.
Javier E

Parents' Dilemma: When to Give Children Smartphones - WSJ - 0 views

  • Experience has already shown parents that ceding control over the devices has reshaped their children’s lives, allowing an outside influence on school work, friendships, recreation, sleep, romance, sex and free time.
  • Nearly 75% of teenagers had access to smartphones, concluded a 2015 study by Pew Research Center—unlocking the devices about 95 times a day on average,
  • They spent, on average, close to nine hours a day tethered to screens large and small.
  • ...15 more annotations...
  • The more screen time, the more revenue.
  • The goal of Facebook Inc., Alphabet Inc.’s Google, Snap Inc. and their peers is to create or host captivating experiences that keep users glued to their screens, whether for Instagram, YouTube, Snapchat or Facebook
  • Snapchat users 25 and younger, for example, were spending 40 minutes a day on the app, Chief Executive Evan Spiegel said in August. Alphabet boasted to investors recently that YouTube’s 1.5 billion users were spending an average 60 minutes a day on mobile.
  • Facebook’s stock slid 4.5% to close at $179 Friday after CEO Mark Zuckerberg announced plans Thursday to overhaul the Facebook news feed in a way that could reduce the time users spend.
  • Tech companies are working to instill viewing habits earlier than ever. The number of users of YouTube Kids is soaring. Facebook recently launched Messenger Kids, a messaging app for children as young as 6.
  • Ms. Ho’s 16-year-old son, Brian, is an Eagle Scout and chorister who at times finds it hard to break away from online videogames, even at 3 a.m. The teen recently told his mother he thinks he is addicted. Ms. Ho’s daughter, Samantha, 14, also is glued to her device, in conversations with friends.
  • “You think you’re buying a piece of technology,” Ms. Shepardson said. “Now it’s like oxygen to her.”
  • Psychologists say social media creates anxiety among children when they are away from their phones—what they call “fear of missing out,” whether on social plans, conversations or damaging gossip teens worry could be about themselves.
  • About half the teens in a survey of 620 families in 2016 said they felt addicted to their smartphones. Nearly 80% said they checked the phones more than hourly and felt the need to respond instantly to messages
  • Children set up Instagram accounts under pseudonyms that friends but not parents recognize. Some teens keep several of these so-called Finsta accounts without their parents knowing.
  • An app called Secret Calculator looks and works like an iPhone calculator but doubles as a private vault to hide files, photos and videos.
  • Mr. Zuckerberg told investors late last year that Facebook planned to boost video offerings, noting that live video generates 10 times as many user interactions. Netflix Inc. chief executive Reed Hastings said in April, about the addictiveness of its shows, that the company was “competing with sleep on the margins.”
  • Keeping children away from disturbing content, though, is easier than keeping them off their phones.
  • About 16% of the nation’s high-school students were bullied online in 2015, according to the U.S. Centers for Disease Control and Prevention. Children who are cyberbullied are three times more likely to contemplate suicide
  • Smartphones “bring the outside in,” said Ms. Ahn, whose husband works for a major tech company. “We want the family to be the center of gravity.”
ilanaprincilus06

New York Launches First COVID-19 Vaccination, Test Result App For Event Attendance : Co... - 0 views

  • Cuomo announced Friday that the state's health status certification, called the Excelsior Pass, will help New Yorkers voluntarily share vaccination and COVID-19 negative statuses with entertainment venues and other businesses to put the state's economy back on track.
  • New Yorkers can always show alternate proof of vaccination or testing, like another mobile application or paper form, directly at a business or venue.
  • The pass could see New York's Broadway theaters, concert venues and sports arenas fill seats again after closures that started in March of 2020.
  • ...4 more annotations...
  • Airlines and technology companies have been working on developing technology to do so, but New York's is the first pass being made widely available to residents.
  • The idea is similar to mobile airline boarding passes: they can be printed or stored on smartphones, and participating businesses and venues can use a companion app to confirm patrons' health status.
  • rather than boost the economy and encourage vaccination, efforts like the Excelsior Pass could wind up furthering the spread of variants. It's also still not clear that vaccinated people cannot spread the virus to people who have not been vaccinated.
  • Some worry that the passes might encourage fraud and increase the spread of the virus by people who claim to be vaccinated or COVID-19 negative but aren't.
runlai_jiang

A New Antidote for Noisy Airports: Slower Planes - WSJ - 0 views

  • Urban airports like Boston’s Logan thought they had silenced noise issues with quieter planes. Now complaints pour in from suburbs 10 to 15 miles away because new navigation routes have created relentless noise for some homeowners. (Scott McCartney, The Wall Street Journal, March 7, 2018)
  • It turns out engines aren’t the major culprit anymore. New airplanes are much quieter. It’s the “whoosh” that big airplanes make racing through the air.
  • Computer models suggest slowing departures by 30 knots—about 35 miles an hour—would reduce noise on the ground significantly.
  • ...9 more annotations...
  • The FAA says it’s impressed and is moving forward with recommendations Boston has made.
  • A working group is forming to evaluate the main recommendation to slow departing jets to a speed limit of 220 knots during the climb to 10,000 feet, down from 250 knots.
  • New routes put planes over quiet communities. Complaints soared. Phoenix neighborhoods sued the FAA; Chicago neighborhoods are pushing for rotating runway use. Neighborhoods from California to Washington, D.C., are fighting the new procedures that airlines and the FAA insist are vital to future travel.
  • “It’s a concentration problem. It’s a frequency problem. It’s not really a noise problem.”
  • “The flights wake you up. We get a lot of complaints from young families with children,” says Mr. Wright, a data analyst who works from home for a major health-care company.
  • In Boston, an analysis suggested only 54% of the complaints Massport received resulted from noise louder than 45 decibels—about the level of background noise. When it’s relentless, you notice it more.
  • With a 30-knot reduction, noise directly under the flight track would decrease by between 1.5 and 5 decibels and the footprint on the ground would get a lot skinnier, sharply reducing the number of people affected, Mr. Hansman says.
  • The industry trade association Airlines for America has offered cautious support of the Boston recommendations. In a statement, the group said the changes must be safe, work with a variety of aircraft and not reduce the airport’s capacity for takeoffs and landings.
  • Air-traffic controllers will need to delay a departure a bit to put more room between a slower plane and a faster one, or modify its course slightly.
johnsonel7

Coronavirus: China to boost mass surveillance machine, experts say - 0 views

  • China could use the coronavirus outbreak to boost its mass surveillance capabilities as it looks to technology to help contain the epidemic in the world’s second-largest economy. The Communist Party has built a vast surveillance state through different methods with technology at its core. As artificial intelligence and the use of data become more advanced, Beijing has found increasingly effective ways to track the Chinese population, including facial recognition.
  • With over 77,000 coronavirus cases confirmed in China alone, the government has mobilized its surveillance machine, a move experts said could continue even after the virus has been contained.
  • The Chinese government has also enlisted the help of tech giants like Tencent, owner of popular messaging app WeChat and Alibaba subsidiary, Ant Financial, which runs payments app Alipay. On both WeChat and Alipay, users can put in their Chinese ID numbers and where they have travelled. Users will then be assigned a QR code based on a traffic light color system which instructs them about how long they need to be in quarantine, or whether they are free to travel. A QR code is a type of barcode which is widely used on digital platforms in China.
  • ...2 more annotations...
  • “The Party has increasingly treated ‘stability maintenance’ — a euphemism for social control — as an overarching priority, and devoted enormous resources to security agencies for monitoring dissidents, breaking up protests, censoring the internet, and developing and implementing mass surveillance systems,” she wrote in a recent paper.
  • “Once these systems are in place, those involved in its developments — particularly companies with money to be made — argue for their expansion or their wider use, a phenomenon known as ‘mission creep.’ What initially started as a system to crack down on crime — which is already a dubious and vague enough justification to encompass political crimes in China — is now used for other purposes including for fighting the coronavirus outbreak.”
Javier E

The Worst Part of the Woodward Tapes Isn't COVID. - 0 views

  • 1. Woodward
  • I'd like to take the other side of this Trump-Woodward story and offer two curveball views:
  • (1) I do not believe that Donald Trump "knew" how dangerous the coronavirus was. Allow me to explain.
  • ...21 more annotations...
  • This is simply how the man talks. About everything. What’s more, he says everything, takes both sides of everything:
  • Does he believe any of this, either way? Almost certainly not. The man has the brain of a goldfish: He "believes" whatever is in front of him in the moment. No matter whether or not it contradicts something he believed five minutes ago or will believe ten minutes from now.
  • All this guy does is try to create panic. That's his move
  • (2) The most alarming part of the Woodward tapes is the way Trump talks about Kim Jong Un and the moment when Trump literally takes sides with Kim Jong Un against a former American president.
  • In a way, it would be comforting to believe that our president was intelligent enough to grasp the seriousness of the coronavirus, even if his judgment in how to deal with the outbreak was malicious or poor.
  • All of the available evidence suggests the opposite:
  • Donald Trump lacks the cognitive ability to understand any concepts more complicated than self-promotion or self-preservation.
  • Put those two together—constant, exaggerated self-aggrandizement and the perpetual attempt to stoke panic—and what you have is a guy who was just saying stuff to Woodward.
  • After the Woodward tapes, anyone still deluding themselves about the authoritarian danger Trump poses to America is, finally, all out of excuses.
  • This, right here, is the most damning revelation from the Woodward tapes (so far): Trump reflected on his relationships with authoritarian leaders generally, including Turkish President Recep Tayyip Erdogan. “It’s funny, the relationships I have, the tougher and meaner they are, the better I get along with them,” he told Woodward. “You know? Explain that to me someday, okay?” It's not hard to explain. And it's not funny.
  • You have this incredible rise in interest in technology and excitement about technology and the beat itself really took off while I was there. But then at the same time, you have this massive new centralization of government control over technology and the use of technology to control people and along with that rising nationalism.
  • Paul Mozur, who covers China and tech for the New York Times and is currently living in Taiwan, after the Chinese expelled all foreign journalists.
  • That was more apparent, I think, over the past five years or so after Xi Jinping really consolidated power, but the amount of cameras that went up on street corners, the degree to which you used to be able to — there’s a moment maybe seven or eight years ago — where Jack Ma talked about the Tiananmen Square crackdowns on Chinese social media and now that’s just so utterly unthinkable. The degree to which the censorship has increased now to the level where if you say certain things on WeChat, it’s very possible the police will show up at your door where you actually have a truly fully formed Internet Police. . .
  • I think a lot of Chinese people feel more secure from the cameras, there’s been a lot of propaganda out there saying the cameras are here for your safety. There is this extremely positive, almost Utopian take on technology in China, and a lot of the stuff that I think, our knee-jerk response from the United States would be to be worried about, they kind of embrace as a vision of the future. .
  • The main reasons WeChat is a concern, if you were the United States government, are these. Number one, it’s become a major vector for the spread of Chinese propaganda and censorship. Because it’s a social network anchored by a vast majority of users in China who are censored and receptive to all this propaganda, even if you’re overseas using WeChat and not censored in the same way, what you get is mostly content shared by people living in a censored environment, so it basically stays a censored environment. I call that a super filter bubble; the idea is that there are multiple filter bubbles contending on a website like Facebook, but with WeChat, because it’s so dominated by government controls, you get one really big mega pro-China filter bubble that then is spread all over the world over the app, even if people outside of China don’t face the same censorship. So that’s one thing.
  • The second is the surveillance is immense and anybody who creates an account in China brings the surveillance with them overseas
  • And most people, frankly, using WeChat overseas probably created the accounts in China, and even when they don’t create the account in China, when national security priorities hit a certain level, I think they’re probably going to use it to monitor people anyway. I’ve run into a number of people who have had run-ins with the Chinese Internet Police either in China, but some of them outside of China, in their day-to-day life using WeChat, and then they return home and it becomes apparent that the Internet Police were watching them the whole time, and they get a visit and the police have a discussion with them about what their activities have been
  • So it’s also a major way that the Chinese government is able to spy on and monitor people overseas and then unsurprisingly, because of that, it’s used as a way for the Chinese intel services to harass people overseas. . . .
  • WeChat is particularly suited to this in part because every single person who uses WeChat within China has it linked to their real identity. And then because everybody on WeChat has linked to their real identity, you can map their relationship networks and lean on them that way.
  • It also has a bunch of tools that the Chinese police use, for instance keyword alarms: if you were to say “Tiananmen,” they get a warning anytime you say it, and then they go look at what you’ve written. So there’s all these tools that are uniquely created for Chinese state surveillance within the app that they can also use, so there’s a bunch of ways that the app is just better suited to this. (A minimal sketch of such a keyword alert follows these annotations.)
  • It’s also one of the very few unblocked communication tools that goes between the two countries. So for all these reasons it’s a very, very big deal. For the Chinese government, it’s an important tool of social control, and it’s been a way that they’ve been able to take the social controls that exist within China and expand them to the diaspora community in some pretty unnerving ways.
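The keyword-alarm tooling described above is, at its core, pattern matching over a message stream. The sketch below is a generic, hypothetical illustration of how such an alert might work; the watch list, message format, and notification step are assumptions, not details of WeChat's actual surveillance systems.

```python
import re

# Illustrative watch list; the interview suggests police can register their own terms.
WATCHED_TERMS = ["Tiananmen", "June 4"]
_pattern = re.compile("|".join(re.escape(term) for term in WATCHED_TERMS), re.IGNORECASE)

def check_message(sender_id: str, text: str) -> None:
    """Raise a (hypothetical) alert when a watched term appears in a message."""
    match = _pattern.search(text)
    if match:
        notify_monitor(sender_id, term=match.group(0), message=text)

def notify_monitor(sender_id: str, term: str, message: str) -> None:
    # Placeholder for whatever alerting channel a real system would use.
    print(f"ALERT: user {sender_id} mentioned '{term}': {message!r}")

check_message("user-123", "Thinking about Tiananmen today.")
```

Because accounts are tied to real identities, as the next annotation notes, a match like this can be linked directly to a person and their relationship network, which is what makes the mechanism so effective for social control.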
Javier E

I Was Powerless Over Diet Coke - The New York Times - 0 views

  • What makes it so hard to quit?
  • two culprits: aspartame and caffeine. Or, to be more precise: addiction to sweetness and to caffeine. Individually, they’re bad; together, they’re an addict’s nightmare.
  • A 12-ounce can of regular Coke has 34 milligrams of caffeine, whereas Diet Coke has 11 milligrams more, according to Coca-Cola. (An 8-ounce cup of coffee has about 95 mg.) Artificial sweeteners activate the brain’s reward system, but only about half as much as regular sugar, said Dr. Peeke. Faux sugar doesn’t pack the same wallop as the real stuff, so it keeps you wanting more and more.
  • ...4 more annotations...
  • Not only is this tied to weight gain, especially in the belly, but it also leaves you with cravings. Aspartame is 200 times sweeter than table sugar. Serious drinkers are so used to the super-sweet taste that everything else seems bland in comparison.
  • Coca-Cola has a different take on what people refer to as an addiction. “Food and beverages, like chocolate, for example, can trigger what scientists call ‘reward centers’ in the brain, but so can other things like music or laughter,” said Daphne Dickerson, a spokeswoman for Coca-Cola. “Regularly consuming food and beverages that taste good and that you enjoy is not the same as being addicted to them.”
  • In September 2020, Ms. Beller was diagnosed with breast cancer. She didn’t quit Diet Coke until after surgery, when doctors found more cancer and she realized she’d have to undergo chemotherapy.
  • She used the Quitzilla app, a habit breaker and sobriety counter, which tracked her progress. “Every time I had a craving, just looking at the app did something good in my brain,” she said. She didn’t have a lot of physical side effects, but she did long for the drink. She credits the app with helping her stay on track. (A minimal sketch of such a counter follows these annotations.)
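A sobriety counter of the kind described above mostly amounts to storing a quit date, logging cravings, and reporting the current streak on demand. The sketch below is a generic, hypothetical illustration, not Quitzilla's actual design; the class and method names are invented for the example.

```python
from datetime import datetime

class HabitTracker:
    """Minimal streak counter: record a quit date, log cravings, report progress."""

    def __init__(self, habit: str, quit_date: datetime):
        self.habit = habit
        self.quit_date = quit_date
        self.cravings: list[datetime] = []

    def log_craving(self) -> str:
        # Checking progress at the moment of a craving is the small "reward"
        # the quoted user describes.
        self.cravings.append(datetime.now())
        return self.progress()

    def progress(self) -> str:
        days = (datetime.now() - self.quit_date).days
        return f"{days} days without {self.habit}, {len(self.cravings)} cravings resisted"

tracker = HabitTracker("Diet Coke", datetime(2020, 11, 1))
print(tracker.log_craving())
```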
Javier E

Everyone's Over Instagram - The Atlantic - 0 views

  • “Gen Z’s relationship with Instagram is much like millennials’ relationship with Facebook: Begrudgingly necessary,” Casey Lewis, a youth-culture consultant who writes the newsletter After School, told me over email. “They don’t want to be on it, but they feel it’s weird if they’re not.”
  • a recent Piper Sandler survey found that, of 14,500 teens surveyed across 47 states, only 20 percent named Instagram their favorite social-media platform (TikTok came first, followed by Snapchat).
  • Simply being on Instagram is a very different thing from actively engaging with it. Participating means throwing pictures into a void, which is why it’s become kind of cringe. To do so earnestly suggests a blithe unawareness of your surroundings, like shouting into the phone in public.
  • ...10 more annotations...
  • In other words, Instagram is giving us the ick: that feeling when a romantic partner or crush does something small but noticeable—like wearing a fedora—that immediately turns you off forever.
  • “People who aren’t influencers only use [Instagram] to watch other people make big announcements,” Lee Tilghman, a former full-time Instagram influencer, told me over the phone. “My close friends who aren’t influencers, they haven’t posted in, like, two years.”
  • although Instagram now has 2 billion monthly users, it faces an existential problem: What happens when the 18-to-29-year-olds who are most likely to use the app, at least in America, age out or go elsewhere? Last year, The New York Times reported that Instagram was privately worried about attracting and retaining the new young users that would sustain its long-term growth—not to mention whose growing shopping potential is catnip to advertisers.
  • Over the summer, these frustrations boiled over. An update that promised, among other things, algorithmically recommended video content that would fill the entire screen was a bridge too far. Users were fed up with watching the app contort itself into a TikTok copycat that prioritized video and recommended posts over photos from friends.
  • Internal documents obtained by The Wall Street Journal show that Instagram users spend 17.6 million hours a day watching Reels, Instagram’s TikTok knockoff, compared with the 197.8 million hours people spend watching TikTok every day. The documents also revealed that Reels engagement has declined by 13.6 percent in recent months, with most users generating “no engagement whatsoever.”
  • Instagram may not be on its deathbed, but its transformation from cool to cringe is a sea change in the social-media universe. The platform was perhaps the most significant among an old generation of popular apps that embodied the original purpose of social media: to connect online with friends and family. Its decline is about not just a loss of relevance, but a capitulation to a new era of “performance” media, in which we create online primarily to reach people we don’t know instead of the people we do.
  • Lavish brand deals, in which an influencer promotes a brand’s product to their audience for a fee, have been known to pay anywhere from $100 to $10,000 per post, depending on the size of the creator’s following and their engagement. Now Tilghman, who became an Instagram influencer in 2015 and at one point had close to 400,000 followers, says she’s seen her rate go down by 80 percent over the past five years. The market’s just oversaturated.
  • The author Jessica DeFino, who joined Instagram in 2018 on the advice of publishing agents, similarly began stepping back from the platform in 2020, feeling overwhelmed by the constant feedback of her following. She has now set up auto-replies to her Instagram DMs: If one of her 59,000 followers sends her a message, they’re met with an invitation to instead reach out to DeFino via email.
  • Would she get back on Instagram as a regular user? Only if she “created a private, personal account — somewhere I could limit my interactions to just family and friends,” she says. “Like what Instagram was in the beginning, I guess.”
  • That is if, by then, Instagram’s algorithm-driven, recommendation-fueled, shopping-heavy interface would even let her. Ick.
Javier E

Instagram's Algorithm Delivers Toxic Video Mix to Adults Who Follow Children - WSJ - 0 views

  • Instagram’s Reels video service is designed to show users streams of short videos on topics the system decides will interest them, such as sports, fashion or humor. 
  • The Meta Platforms-owned social app does the same thing for users its algorithm decides might have a prurient interest in children, testing by The Wall Street Journal showed.
  • The Journal sought to determine what Instagram’s Reels algorithm would recommend to test accounts set up to follow only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform.
  • ...30 more annotations...
  • Following what it described as Meta’s unsatisfactory response to its complaints, Match began canceling Meta advertising for some of its apps, such as Tinder, in October. It has since halted all Reels advertising and stopped promoting its major brands on any of Meta’s platforms. “We have no desire to pay Meta to market our brands to predators or place our ads anywhere near this content,” said Match spokeswoman Justine Sacco.
  • The Journal set up the test accounts after observing that the thousands of followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults
  • The Journal also tested what the algorithm would recommend after its accounts followed some of those users as well, which produced more-disturbing content interspersed with ads.
  • The Canadian Centre for Child Protection, a child-protection group, separately ran similar tests on its own, with similar results.
  • Meta said the Journal’s tests produced a manufactured experience that doesn’t represent what billions of users see. The company declined to comment on why the algorithms compiled streams of separate videos showing children, sex and advertisements, but a spokesman said that in October it introduced new brand safety tools that give advertisers greater control over where their ads appear, and that Instagram either removes or reduces the prominence of four million videos suspected of violating its standards each month. 
  • The Journal reported in June that algorithms run by Meta, which owns both Facebook and Instagram, connect large communities of users interested in pedophilic content. The Meta spokesman said a task force set up after the Journal’s article has expanded its automated systems for detecting users who behave suspiciously, taking down tens of thousands of such accounts each month. The company also is participating in a new industry coalition to share signs of potential child exploitation.
  • “Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions,” said Samantha Stetson, a Meta vice president who handles relations with the advertising industry. She said the prevalence of inappropriate content on Instagram is low, and that the company invests heavily in reducing it.
  • Even before the 2020 launch of Reels, Meta employees understood that the product posed safety concerns, according to former employees.
  • Robbie McKay, a spokesman for Bumble, said it “would never intentionally advertise adjacent to inappropriate content,” and that the company is suspending its ads across Meta’s platforms.
  • Meta created Reels to compete with TikTok, the video-sharing platform owned by Beijing-based ByteDance. Both products feed users a nonstop succession of videos posted by others, and make money by inserting ads among them. Both companies’ algorithms show a user the videos the platforms calculate are most likely to keep that user engaged, based on his or her past viewing behavior. (A simplified sketch of this kind of engagement-driven ranking appears after these annotations.)
  • The Journal reporters set up the Instagram test accounts as adults on newly purchased devices and followed the gymnasts, cheerleaders and other young influencers. The tests showed that following only the young girls triggered Instagram to begin serving videos from accounts promoting adult sex content alongside ads for major consumer brands, such as one for Walmart that ran after a video of a woman exposing her crotch. 
  • When the test accounts then followed some users who followed those same young people’s accounts, they yielded even more disturbing recommendations. The platform served a mix of adult pornography and child-sexualizing material, such as a video of a clothed girl caressing her torso and another of a child pantomiming a sex act.
  • Experts on algorithmic recommendation systems said the Journal’s tests showed that while gymnastics might appear to be an innocuous topic, Meta’s behavioral tracking has discerned that some Instagram users following preteen girls will want to engage with videos sexualizing children, and then directs such content toward them.
  • Instagram’s system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos—and ads for some of the biggest U.S. brands.
  • Preventing the system from pushing noxious content to users interested in it, they said, requires significant changes to the recommendation algorithms that also drive engagement for normal users. Company documents reviewed by the Journal show that the company’s safety staffers are broadly barred from making changes to the platform that might reduce daily active users by any measurable amount.
  • The test accounts showed that advertisements were regularly added to the problematic Reels streams. Ads encouraging users to visit Disneyland for the holidays ran next to a video of an adult acting out having sex with her father, and another of a young woman in lingerie with fake blood dripping from her mouth. An ad for Hims ran shortly after a video depicting an apparently anguished woman in a sexual situation along with a link to what was described as “the full video.”
  • Current and former Meta employees said in interviews that the tendency of Instagram algorithms to aggregate child sexualization content from across its platform was known internally to be a problem. Once Instagram pigeonholes a user as interested in any particular subject matter, they said, its recommendation systems are trained to push more related content to them.
  • Part of the problem is that automated enforcement systems have a harder time parsing video content than text or still images. Another difficulty arises from how Reels works: Rather than showing content shared by users’ friends, the way other parts of Instagram and Facebook often do, Reels promotes videos from sources they don’t follow
  • In an analysis conducted shortly before the introduction of Reels, Meta’s safety staff flagged the risk that the product would chain together videos of children and inappropriate content, according to two former staffers. Vaishnavi J, Meta’s former head of youth policy, described the safety review’s recommendation as: “Either we ramp up our content detection capabilities, or we don’t recommend any minor content,” meaning any videos of children.
  • At the time, TikTok was growing rapidly, drawing the attention of Instagram’s young users and the advertisers targeting them. Meta didn’t adopt either of the safety analysis’s recommendations at that time, according to J.
  • Stetson, Meta’s liaison with digital-ad buyers, disputed that Meta had neglected child safety concerns ahead of the product’s launch. “We tested Reels for nearly a year before releasing it widely, with a robust set of safety controls and measures,” she said. 
  • After initially struggling to maximize the revenue potential of its Reels product, Meta has improved how its algorithms recommend content and personalize video streams for users
  • Among the ads that appeared regularly in the Journal’s test accounts were those for “dating” apps and livestreaming platforms featuring adult nudity, massage parlors offering “happy endings” and artificial-intelligence chatbots built for cybersex. Meta’s rules are supposed to prohibit such ads.
  • The Journal informed Meta in August about the results of its testing. In the months since then, tests by both the Journal and the Canadian Centre for Child Protection show that the platform continued to serve up a series of videos featuring young children, adult content and apparent promotions for child sex material hosted elsewhere. 
  • As of mid-November, the center said Instagram is continuing to steadily recommend what the nonprofit described as “adults and children doing sexual posing.”
  • Meta hasn’t offered a timetable for resolving the problem or explained how in the future it would restrict the promotion of inappropriate content featuring children. 
  • The Journal’s test accounts found that the problem even affected Meta-related brands. Ads for the company’s WhatsApp encrypted chat service and Meta’s Ray-Ban Stories glasses appeared next to adult pornography. An ad for Lean In Girls, the young women’s empowerment nonprofit run by former Meta Chief Operating Officer Sheryl Sandberg, ran directly before a promotion for an adult sex-content creator who often appears in schoolgirl attire. Sandberg declined to comment. 
  • Through its own tests, the Canadian Centre for Child Protection concluded that Instagram was regularly serving videos and pictures of clothed children who also appear in the National Center for Missing and Exploited Children’s digital database of images and videos confirmed to be child abuse sexual material. The group said child abusers often use the images of the girls to advertise illegal content for sale in dark-web forums.
  • The nature of the content—sexualizing children without generally showing nudity—reflects the way that social media has changed online child sexual abuse, said Lianna McDonald, executive director for the Canadian center. The group has raised concerns about the ability of Meta’s algorithms to essentially recruit new members of online communities devoted to child sexual abuse, where links to illicit content in more private forums proliferate.
  • “Time and time again, we’ve seen recommendation algorithms drive users to discover and then spiral inside of these online child exploitation communities,” McDonald said, calling it disturbing that ads from major companies were subsidizing that process.
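The ranking mechanism referenced above boils down to a feedback loop: score candidate videos by predicted engagement for a given user, surface the top-scoring ones, and fold the resulting watch behavior back into the user's profile. The sketch below is a deliberately simplified, hypothetical version of that loop, not Meta's or ByteDance's actual algorithm; the topic tags, scoring rule, and update rule are all assumptions. It illustrates why a system optimized purely for engagement keeps pushing more of whatever a user lingers on, the "pigeonholing" dynamic the Journal describes.

```python
from collections import defaultdict

# Each candidate video carries topic tags; a user's profile is a per-topic affinity score.
videos = [
    {"id": 1, "topics": ["gymnastics"]},
    {"id": 2, "topics": ["cooking"]},
    {"id": 3, "topics": ["gymnastics", "fitness"]},
]

def score(video, affinity):
    # Predicted engagement: sum of the user's affinity for the video's topics.
    return sum(affinity[topic] for topic in video["topics"])

def recommend(affinity, k=2):
    # Serve the k videos with the highest predicted engagement.
    return sorted(videos, key=lambda v: score(v, affinity), reverse=True)[:k]

def record_watch(affinity, video, watch_fraction):
    # Feedback loop: watching a video raises affinity for its topics,
    # which raises the score of similar videos next time.
    for topic in video["topics"]:
        affinity[topic] += watch_fraction

affinity = defaultdict(float)
record_watch(affinity, videos[0], watch_fraction=0.9)   # the user lingers on one topic...
print([v["id"] for v in recommend(affinity)])           # ...and now sees more of it: [1, 3]
```

Nothing in the loop inspects what the content actually is, only what the user engaged with, which is consistent with the article's point that curbing noxious recommendations requires changes to the ranking system itself rather than tweaks around its edges.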
Javier E

Generative AI Brings Cost of Creation Close to Zero, Andreessen Horowitz's Martin Casad... - 0 views

  • The value of ChatGPT-like technology comes from bringing the cost of producing images, text and other creative projects close to zero
  • With only a few prompts, generative AI technology—such as the giant language models underlying the viral ChatGPT chatbot—can enable companies to create sales and marketing materials from scratch quickly for a fraction of the price of using current software tools, and paying designers, photographers and copywriters, among other expenses
  • “That’s very rare in my 20 years of experience in doing just frontier tech, to have four or five orders of magnitude of improvement on something people care about
  • ...4 more annotations...
  • many corporate technology chiefs have taken a wait-and-see approach to the technology, which has developed a reputation for producing false, misleading and unintelligible results—dubbed AI ‘hallucinations’. 
  • Though ChatGPT, which is available free online, is considered a consumer app, OpenAI has encouraged companies and startups to build apps on top of its language models—in part by providing paid access to those models through its API. (A minimal sketch of such an API call appears after these annotations.)
  • There are “certain spaces where it’s clearly directly applicable,” such as summarizing documents or responding to customer queries. Many startups are racing to apply the technology to a wider set of enterprise use cases.
  • “I think it’s going to creep into our lives in ways we least expect it,” Mr. Casado said.
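In practice, "building apps on top of its language models" means calling OpenAI's hosted API over HTTPS rather than receiving model code. Below is a minimal sketch of generating marketing copy through the publicly documented chat completions endpoint; the prompt, model name, and environment-variable handling are illustrative assumptions, not anything specific to the article or to Andreessen Horowitz.

```python
import os
import requests

# Assumes an API key in the OPENAI_API_KEY environment variable (illustrative setup).
API_KEY = os.environ["OPENAI_API_KEY"]

def draft_marketing_copy(product_description: str) -> str:
    """Ask a hosted language model for a short piece of ad copy."""
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "gpt-3.5-turbo",  # example model name
            "messages": [
                {"role": "user",
                 "content": f"Write a two-sentence ad for: {product_description}"}
            ],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(draft_marketing_copy("a reusable coffee cup that keeps drinks hot for 12 hours"))
```

The same pattern, a single authenticated POST that returns generated text, is what lets startups wrap products such as the document-summarization and customer-query use cases mentioned above around a general-purpose model.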
Javier E

A Psychiatrist Tried to Quit Gambling. Betting Apps Kept Her Hooked. - WSJ - 0 views

  • Yet she was up against an industry skilled in the art of leveraging data analytics and human behavior to keep customers betting. Gambling companies tracked the ups and downs of Fischer’s betting behavior and gave bonus credits to keep her playing. VIP customer representatives offered encouragement and gifts.