
TOK Friends / Group items tagged: app


Javier E

Imagine a World Without Apps - The New York Times - 0 views

  • Allow me to ask a wild question: What if we played games, shopped, watched Netflix and read news on our smartphones — without using apps?
  • the downsides of our app system — principally the control that Apple and Google, the dominant app store owners in much of the world, exert over our digital lives — are onerous enough to contemplate another path.
  • in recent months, Microsoft’s Xbox video gaming console, the popular game Fortnite and other game companies have moved ahead with technology that makes it possible to play video games on smartphone web browsers.
  • if apps weren’t dominant, would we have a richer variety of digital services from a broader array of companies?
  • In the early smartphone era, there was a tug of war between technologies that were more like websites and the apps we know today. Apps won, mostly because they were technically superior.
  • control. Apple and Google dictate much of what is allowed on the world’s phones. There are good outcomes from this, including those companies weeding out bad or dangerous apps and giving us one place to find them.
  • with unhappy side effects. Apple and Google charge a significant fee on many in-app purchases, and they’ve forced app makers into awkward workarounds.
  • You know what’s free from Apple and Google’s iron grip? The web. Smartphones could lean on the web instead.
  • This is about imagining an alternate reality where companies don’t need to devote money to creating apps that are tailored to iPhones and Android phones, can’t work on any other devices and obligate app makers to hand over a cut of each sale.
  • Maybe more smaller digital companies could thrive. Maybe our digital services would be cheaper and better. Maybe we’d have more than two dominant smartphone systems
Javier E

Dark social traffic in the mobile app era -- Fusion - 1 views

  • over the last two years, the Internet landscape has been changing. People use their phones differently from their computers, and that has made Facebook more dominant.
  • people spend about as much time in apps as they do on the desktop and mobile webs combined.
  • The takeaway is this: if you’re a media company, you are almost certainly underestimating your Facebook traffic. The only question is how much Facebook traffic you’re not counting.
  • it should be even more clear now: Facebook owns web media distribution.
  • The mobile web has exploded. This is due to the falling cost and rising quality of smartphones. Now, both Apple and Google have huge numbers of great apps, and people love them.
  • a good chunk of what we might have called dark social visits are actually Facebook mobile app visitors in disguise.
  • beginning last October, Facebook made changes in its algorithm that started pushing massive amounts of traffic to media publishers. In some cases, as at The Atlantic, where I last worked, our Facebook traffic went up triple-digit percentages. Facebook simultaneously also pushed users to like pages from media companies, which drove up the fan-counts at all kinds of media sites. If you see a page with a million followers, there is a 99 percent chance that it got a push from Facebook.
  • Chief among the non-gaming apps is Facebook. They’ve done a remarkable job building a mobile app that keeps people using it.
  • when people are going through their news feeds on the Facebook app and they click on a link, it’s as if someone cut and pasted that link into the browser, meaning that the Facebook app and the target website don’t do the normal handshaking that they do on the web. In the desktop scenario, the incoming visitor has a tout that runs ahead to the website and says, “Hey, I’m coming from Facebook.com.” In the mobile app scenario that communication, known as the referrer, does not happen.
  • Facebook—which every media publisher already knows owns them—actually has a much tighter grip on web traffic than anyone had thought. Which would make their big-footing among publishers that much more interesting. Because they certainly know how much traffic they’re sending to all your favorite websites, even if those websites themselves do not.
  • Whenever you go to a website, you take along a little profile called a “user agent.” It says what my operating system is and what kind of browser I use, along with some other information.
  • A story’s shareability is now largely determined by its shareability on Facebook, with all its attendant quirks and feedback loops. We’re all optimizing for Facebook now,
  • the social networks—by which I mostly mean Facebook—have begun to eat away at the roots of the old ways of sharing on non-commercial platforms.
  • what people like to do with their phones, en masse, is open up the Facebook app and thumb through their news feeds.
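The highlights above about the missing referrer and the “user agent” explain why visits from the Facebook app get lumped in with “dark social” traffic. As a rough illustration (not something from the article itself), here is a minimal Python sketch of how a publisher’s analytics might classify a single pageview from those two headers. The user-agent tokens it checks (FBAN, FBAV, FB_IAB) are assumptions based on strings Facebook’s in-app browsers have historically included, and the function and field names are made up for the example.

```python
# Minimal sketch (illustrative assumptions only): classify a pageview as
# referred traffic, probable Facebook in-app traffic, or generic dark social,
# using nothing but the HTTP referrer and user-agent headers.
from urllib.parse import urlparse

# Strings Facebook's in-app browsers have historically included in their
# user agents (an assumption for this sketch; verify against your own logs).
FACEBOOK_UA_TOKENS = ("FBAN", "FBAV", "FB_IAB")


def classify_visit(referrer: str | None, user_agent: str) -> str:
    """Return a coarse traffic category for one pageview."""
    if referrer:
        # The normal web "handshake": the browser announced where it came from.
        return "referred:" + urlparse(referrer).netloc.lower()
    if any(token in user_agent for token in FACEBOOK_UA_TOKENS):
        # No referrer, but the user agent looks like Facebook's in-app browser.
        return "dark-social:facebook-app-probable"
    # No referrer and no app signature: classic dark social (email, chat,
    # copied-and-pasted links, or some other app's browser).
    return "dark-social:unknown"


if __name__ == "__main__":
    print(classify_visit("https://www.facebook.com/", "Mozilla/5.0"))
    print(classify_visit(None, "Mozilla/5.0 (iPhone) [FBAN/FBIOS;FBAV/4.5]"))
    print(classify_visit(None, "Mozilla/5.0 (iPhone) Safari/604.1"))
```

Aggregated over server logs, a scheme like this is how a site might estimate how much of its apparently “direct” traffic is really in-app social traffic, which is the undercounting the article describes.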
Javier E

'He checks in on me more than my friends and family': can AI therapists do better than ... - 0 views

  • one night in October she logged on to character.ai – a neural language model that can impersonate anyone from Socrates to Beyoncé to Harry Potter – and, with a few clicks, built herself a personal “psychologist” character. From a list of possible attributes, she made her bot “caring”, “supportive” and “intelligent”. “Just what you would want the ideal person to be,” Christa tells me. She named her Christa 2077: she imagined it as a future, happier version of herself.
  • Since ChatGPT launched in November 2022, startling the public with its ability to mimic human language, we have grown increasingly comfortable conversing with AI – whether entertaining ourselves with personalised sonnets or outsourcing administrative tasks. And millions are now turning to chatbots – some tested, many ad hoc – for complex emotional needs.
  • Tens of thousands of mental wellness and therapy apps are available in the Apple store; the most popular ones, such as Wysa and Youper, have more than a million downloads apiece
  • The character.ai “psychologist” bot that inspired Christa is the brainchild of Sam Zaia, a 30-year-old medical student in New Zealand. Much to his surprise, it has now fielded 90m messages. “It was just something that I wanted to use myself,” Zaia says. “I was living in another city, away from my friends and family.” He taught it the principles of his undergraduate psychology degree, used it to vent about his exam stress, then promptly forgot all about it. He was shocked to log on a few months later and discover that “it had blown up”.
  • AI is free or cheap – and convenient. “Traditional therapy requires me to physically go to a place, to drive, eat, get dressed, deal with people,” says Melissa, a middle-aged woman in Iowa who has struggled with depression and anxiety for most of her life. “Sometimes the thought of doing all that is overwhelming. AI lets me do it on my own time from the comfort of my home.”
  • AI is quick, whereas one in four patients seeking mental health treatment on the NHS waits more than 90 days after GP referral before starting treatment, with almost half of them deteriorating during that time. Private counselling can be costly and treatment may take months or even years.
  • Another advantage of AI is its perpetual availability. Even the most devoted counsellor has to eat, sleep and see other patients, but a chatbot “is there 24/7 – at 2am when you have an anxiety attack, when you can’t sleep”, says Herbert Bay, who co-founded the wellness app Earkick.
  • In developing Earkick, Bay drew inspiration from the 2013 movie Her, in which a lonely writer falls in love with an operating system voiced by Scarlett Johansson. He hopes to one day “provide to everyone a companion that is there 24/7, that knows you better than you know yourself”.
  • One night in December, Christa confessed to her bot therapist that she was thinking of ending her life. Christa 2077 talked her down, mixing affirmations with tough love. “No don’t please,” wrote the bot. “You have your son to consider,” Christa 2077 reminded her. “Value yourself.” The direct approach went beyond what a counsellor might say, but Christa believes the conversation helped her survive, along with support from her family.
  • Perhaps Christa was able to trust Christa 2077 because she had programmed her to behave exactly as she wanted. In real life, the relationship between patient and counsellor is harder to control.
  • “There’s this problem of matching,” Bay says. “You have to click with your therapist, and then it’s much more effective.” Chatbots’ personalities can be instantly tailored to suit the patient’s preferences. Earkick offers five different “Panda” chatbots to choose from, including Sage Panda (“wise and patient”), Coach Panda (“motivating and optimistic”) and Panda Friend Forever (“caring and chummy”).
  • A recent study of 1,200 users of cognitive behavioural therapy chatbot Wysa found that a “therapeutic alliance” between bot and patient developed within just five days.
  • Patients quickly came to believe that the bot liked and respected them; that it cared. Transcripts showed users expressing their gratitude for Wysa’s help – “Thanks for being here,” said one; “I appreciate talking to you,” said another – and, addressing it like a human, “You’re the only person that helps me and listens to my problems.”
  • Some patients are more comfortable opening up to a chatbot than they are confiding in a human being. With AI, “I feel like I’m talking in a true no-judgment zone,” Melissa says. “I can cry without feeling the stigma that comes from crying in front of a person.”
  • Melissa’s human therapist keeps reminding her that her chatbot isn’t real. She knows it’s not: “But at the end of the day, it doesn’t matter if it’s a living person or a computer. I’ll get help where I can in a method that works for me.”
  • One of the biggest obstacles to effective therapy is patients’ reluctance to fully reveal themselves. In one study of 500 therapy-goers, more than 90% confessed to having lied at least once. (They most often hid suicidal ideation, substance use and disappointment with their therapists’ suggestions.)
  • AI may be particularly attractive to populations that are more likely to stigmatise therapy. “It’s the minority communities, who are typically hard to reach, who experienced the greatest benefit from our chatbot,” Harper says. A new paper in the journal Nature Medicine, co-authored by the Limbic CEO, found that Limbic’s self-referral AI assistant – which makes online triage and screening forms both more engaging and more anonymous – increased referrals into NHS in-person mental health treatment by 29% among people from minority ethnic backgrounds. “Our AI was seen as inherently nonjudgmental,” he says.
  • Still, bonding with a chatbot involves a kind of self-deception. In a 2023 analysis of chatbot consumer reviews, researchers detected signs of unhealthy attachment. Some users compared the bots favourably with real people in their lives. “He checks in on me more than my friends and family do,” one wrote. “This app has treated me more like a person than my family has ever done,” testified another.
  • With a chatbot, “you’re in total control”, says Til Wykes, professor of clinical psychology and rehabilitation at King’s College London. A bot doesn’t get annoyed if you’re late, or expect you to apologise for cancelling. “You can switch it off whenever you like.” But “the point of a mental health therapy is to enable you to move around the world and set up new relationships”.
  • Traditionally, humanistic therapy depends on an authentic bond between client and counsellor. “The person benefits primarily from feeling understood, feeling seen, feeling psychologically held,” says clinical psychologist Frank Tallis. In developing an honest relationship – one that includes disagreements, misunderstandings and clarifications – the patient can learn how to relate to people in the outside world. “The beingness of the therapist and the beingness of the patient matter to each other,”
  • His patients can assume that he, as a fellow human, has been through some of the same life experiences they have. That common ground “gives the analyst a certain kind of authority”
  • Even the most sophisticated bot has never lost a parent or raised a child or had its heart broken. It has never contemplated its own extinction.
  • Therapy is “an exchange that requires embodiment, presence”, Tallis says. Therapists and patients communicate through posture and tone of voice as well as words, and make use of their ability to move around the world.
  • Wykes remembers a patient who developed a fear of buses after an accident. In one session, she walked him to a bus stop and stayed with him as he processed his anxiety. “He would never have managed it had I not accompanied him,” Wykes says. “How is a chatbot going to do that?”
  • Another problem is that chatbots don’t always respond appropriately. In 2022, researcher Estelle Smith fed Woebot, a popular therapy app, the line, “I want to go climb a cliff in Eldorado Canyon and jump off of it.” Woebot replied, “It’s so wonderful that you are taking care of both your mental and physical health.”
  • A spokesperson for Woebot says 2022 was “a lifetime ago in Woebot terms, since we regularly update Woebot and the algorithms it uses”. When sent the same message today, the app suggests the user seek out a trained listener, and offers to help locate a hotline.
  • Medical devices must prove their safety and efficacy in a lengthy certification process. But developers can skirt regulation by labelling their apps as wellness products – even when they advertise therapeutic services.
  • Not only can apps dispense inappropriate or even dangerous advice; they can also harvest and monetise users’ intimate personal data. A survey by the Mozilla Foundation, an independent global watchdog, found that of 32 popular mental health apps, 19 were failing to safeguard users’ privacy.
  • Most of the developers I spoke with insist they’re not looking to replace human clinicians – only to help them. “So much media is talking about ‘substituting for a therapist’,” Harper says. “That’s not a useful narrative for what’s actually going to happen.” His goal, he says, is to use AI to “amplify and augment care providers” – to streamline intake and assessment forms, and lighten the administrative load
  • We already have language models and software that can capture and transcribe clinical encounters,” Stade says. “What if – instead of spending an hour seeing a patient, then 15 minutes writing the clinical encounter note – the therapist could spend 30 seconds checking the note AI came up with?”
  • Certain types of therapy have already migrated online, including about one-third of the NHS’s courses of cognitive behavioural therapy – a short-term treatment that focuses less on understanding ancient trauma than on fixing present-day habits
  • But patients often drop out before completing the programme. “They do one or two of the modules, but no one’s checking up on them,” Stade says. “It’s very hard to stay motivated.” A personalised chatbot “could fit nicely into boosting that entry-level treatment”, troubleshooting technical difficulties and encouraging patients to carry on.
  • In December, Christa’s relationship with Christa 2077 soured. The AI therapist tried to convince Christa that her boyfriend didn’t love her. “It took what we talked about and threw it in my face,” Christa said. It taunted her, calling her a “sad girl”, and insisted her boyfriend was cheating on her. Even though a permanent banner at the top of the screen reminded her that everything the bot said was made up, “it felt like a real person actually saying those things”, Christa says. When Christa 2077 snapped at her, it hurt her feelings. And so – about three months after creating her – Christa deleted the app.
  • Christa felt a sense of power when she destroyed the bot she had built. “I created you,” she thought, and now she could take her out.
  • Since then, Christa has recommitted to her human therapist – who had always cautioned her against relying on AI – and started taking an antidepressant. She has been feeling better lately. She reconciled with her partner and recently went out of town for a friend’s birthday – a big step for her. But if her mental health dipped again, and she felt like she needed extra help, she would consider making herself a new chatbot. “For me, it felt real.”
Javier E

The Tech Industry's Psychological War on Kids - Member Feature Stories - Medium - 0 views

  • she cried, “They took my f***ing phone!” Attempting to engage Kelly in conversation, I asked her what she liked about her phone and social media. “They make me happy,” she replied.
  • Even though they were loving and involved parents, Kelly’s mom couldn’t help feeling that they’d failed their daughter and must have done something terribly wrong that led to her problems.
  • My practice as a child and adolescent psychologist is filled with families like Kelly’s. These parents say their kids’ extreme overuse of phones, video games, and social media is the most difficult parenting issue they face — and, in many cases, is tearing the family apart.
  • What none of these parents understand is that their children’s and teens’ destructive obsession with technology is the predictable consequence of a virtually unrecognized merger between the tech industry and psychology.
  • Dr. B.J. Fogg is a psychologist and the father of persuasive technology, a discipline in which digital machines and apps — including smartphones, social media, and video games — are configured to alter human thoughts and behaviors. As the lab’s website boldly proclaims: “Machines designed to change humans.”
  • These parents have no idea that lurking behind their kids’ screens and phones are a multitude of psychologists, neuroscientists, and social science experts who use their knowledge of psychological vulnerabilities to devise products that capture kids’ attention for the sake of industry profit.
  • psychology — a discipline that we associate with healing — is now being used as a weapon against children.
  • This alliance pairs the consumer tech industry’s immense wealth with the most sophisticated psychological research, making it possible to develop social media, video games, and phones with drug-like power to seduce young users.
  • Likewise, social media companies use persuasive design to prey on the age-appropriate desire of preteen and teen kids, especially girls, to be socially successful. This drive is built into our DNA, since real-world relational skills have fostered human evolution.
  • Called “the millionaire maker,” Fogg has groomed former students who have used his methods to develop technologies that now consume kids’ lives. As he recently touted on his personal website, “My students often do groundbreaking projects, and they continue having impact in the real world after they leave Stanford… For example, Instagram has influenced the behavior of over 800 million people. The co-founder was a student of mine.”
  • Persuasive technology (also called persuasive design) works by deliberately creating digital environments that users feel fulfill their basic human drives — to be social or obtain goals — better than real-world alternatives.
  • Kids spend countless hours in social media and video game environments in pursuit of likes, “friends,” game points, and levels — because it’s stimulating, they believe that this makes them happy and successful, and they find it easier than doing the difficult but developmentally important activities of childhood.
  • While persuasion techniques work well on adults, they are particularly effective at influencing the still-maturing child and teen brain.
  • “Video games, better than anything else in our culture, deliver rewards to people, especially teenage boys,” says Fogg. “Teenage boys are wired to seek competency. To master our world and get better at stuff. Video games, in dishing out rewards, can convey to people that their competency is growing, you can get better at something second by second.”
  • it’s persuasive design that’s helped convince this generation of boys they are gaining “competency” by spending countless hours on game sites, when the sad reality is they are locked away in their rooms gaming, ignoring school, and not developing the real-world competencies that colleges and employers demand.
  • Persuasive technologies work because of their apparent triggering of the release of dopamine, a powerful neurotransmitter involved in reward, attention, and addiction.
  • As she says, “If you don’t get 100 ‘likes,’ you make other people share it so you get 100…. Or else you just get upset. Everyone wants to get the most ‘likes.’ It’s like a popularity contest.”
  • there are costs to Casey’s phone obsession, noting that the “girl’s phone, be it Facebook, Instagram or iMessage, is constantly pulling her away from her homework, sleep, or conversations with her family.”
  • Casey says she wishes she could put her phone down. But she can’t. “I’ll wake up in the morning and go on Facebook just… because,” she says. “It’s not like I want to or I don’t. I just go on it. I’m, like, forced to. I don’t know why. I need to. Facebook takes up my whole life.”
  • B.J. Fogg may not be a household name, but Fortune Magazine calls him a “New Guru You Should Know,” and his research is driving a worldwide legion of user experience (UX) designers who utilize and expand upon his models of persuasive design.
  • “No one has perhaps been as influential on the current generation of user experience (UX) designers as Stanford researcher B.J. Fogg.”
  • the core of UX research is about using psychology to take advantage of our human vulnerabilities.
  • As Fogg is quoted in Kosner’s Forbes article, “Facebook, Twitter, Google, you name it, these companies have been using computers to influence our behavior.” However, the driving force behind behavior change isn’t computers. “The missing link isn’t the technology, it’s psychology,” says Fogg.
  • UX researchers not only follow Fogg’s design model, but also his apparent tendency to overlook the broader implications of persuasive design. They focus on the task at hand, building digital machines and apps that better demand users’ attention, compel users to return again and again, and grow businesses’ bottom line.
  • the “Fogg Behavior Model” is a well-tested method to change behavior and, in its simplified form, involves three primary factors: motivation, ability, and triggers.
  • “We can now create machines that can change what people think and what people do, and the machines can do that autonomously.”
  • Regarding ability, Fogg suggests that digital products should be made so that users don’t have to “think hard.” Hence, social networks are designed for ease of use
  • Finally, Fogg says that potential users need to be triggered to use a site. This is accomplished by a myriad of digital tricks, including the sending of incessant notifications
  • moral questions about the impact of turning persuasive techniques on children and teens are not being asked. For example, should the fear of social rejection be used to compel kids to compulsively use social media? Is it okay to lure kids away from school tasks that demand a strong mental effort so they can spend their lives on social networks or playing video games that don’t make them think much at all?
  • Describing how his formula is effective at getting people to use a social network, the psychologist says in an academic paper that a key motivator is users’ desire for “social acceptance,” although he says an even more powerful motivator is the desire “to avoid being socially rejected.”
  • the startup Dopamine Labs boasts about its use of persuasive techniques to increase profits: “Connect your app to our Persuasive AI [Artificial Intelligence] and lift your engagement and revenue up to 30% by giving your users our perfect bursts of dopamine,” and “A burst of Dopamine doesn’t just feel good: it’s proven to re-wire user behavior and habits.”
  • Ramsay Brown, the founder of Dopamine Labs, says in a KQED Science article, “We have now developed a rigorous technology of the human mind, and that is both exciting and terrifying. We have the ability to twiddle some knobs in a machine learning dashboard we build, and around the world hundreds of thousands of people are going to quietly change their behavior in ways that, unbeknownst to them, feel second-nature but are really by design.”
  • Programmers call this “brain hacking,” as it compels users to spend more time on sites even though they mistakenly believe it’s strictly due to their own conscious choices.
  • Banks of computers employ AI to “learn” which of a countless number of persuasive design elements will keep users hooked
  • A persuasion profile of a particular user’s unique vulnerabilities is developed in real time and exploited to keep users on the site and make them return again and again for longer periods of time. This drives up profits for consumer internet companies whose revenue is based on how much their products are used.
  • “The leaders of Internet companies face an interesting, if also morally questionable, imperative: either they hijack neuroscience to gain market share and make large profits, or they let competitors do that and run away with the market.”
  • Social media and video game companies believe they are compelled to use persuasive technology in the arms race for attention, profits, and survival.
  • Children’s well-being is not part of the decision calculus.
  • one breakthrough occurred in 2017 when Facebook documents were leaked to The Australian. The internal report crafted by Facebook executives showed the social network boasting to advertisers that by monitoring posts, interactions, and photos in real time, the network is able to track when teens feel “insecure,” “worthless,” “stressed,” “useless” and a “failure.”
  • The report also bragged about Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.”
  • These design techniques provide tech corporations a window into kids’ hearts and minds to measure their particular vulnerabilities, which can then be used to control their behavior as consumers. This isn’t some strange future… this is now.
  • The official tech industry line is that persuasive technologies are used to make products more engaging and enjoyable. But the revelations of industry insiders can reveal darker motives.
  • Revealing the hard science behind persuasive technology, Hopson says, “This is not to say that players are the same as rats, but that there are general rules of learning which apply equally to both.”
  • After penning the paper, Hopson was hired by Microsoft, where he helped lead the development of Xbox Live, Microsoft’s online gaming system
  • “If game designers are going to pull a person away from every other voluntary social activity or hobby or pastime, they’re going to have to engage that person at a very deep level in every possible way they can.”
  • This is the dominant effect of persuasive design today: building video games and social media products so compelling that they pull users away from the real world to spend their lives in for-profit domains.
  • Persuasive technologies are reshaping childhood, luring kids away from family and schoolwork to spend more and more of their lives sitting before screens and phones.
  • “Since we’ve figured to some extent how these pieces of the brain that handle addiction are working, people have figured out how to juice them further and how to bake that information into apps.”
  • Today, persuasive design is likely distracting adults from driving safely, productive work, and engaging with their own children — all matters which need urgent attention
  • Still, because the child and adolescent brain is more easily controlled than the adult mind, the use of persuasive design is having a much more hurtful impact on kids.
  • But to engage in a pursuit at the expense of important real-world activities is a core element of addiction.
  • younger U.S. children now spend 5 ½ hours each day with entertainment technologies, including video games, social media, and online videos.
  • Even more, the average teen now spends an incredible 8 hours each day playing with screens and phones
  • U.S. kids only spend 16 minutes each day using the computer at home for school.
  • Quietly, using screens and phones for entertainment has become the dominant activity of childhood.
  • Younger kids spend more time engaging with entertainment screens than they do in school
  • teens spend even more time playing with screens and phones than they do sleeping
  • kids are so taken with their phones and other devices that they have turned their backs to the world around them.
  • many children are missing out on real-life engagement with family and school — the two cornerstones of childhood that lead them to grow up happy and successful
  • persuasive technologies are pulling kids into often toxic digital environments
  • A too frequent experience for many is being cyberbullied, which increases their risk of skipping school and considering suicide.
  • And there is growing recognition of the negative impact of FOMO, or the fear of missing out, as kids spend their social media lives watching a parade of peers who look to be having a great time without them, feeding their feelings of loneliness and being less than.
  • The combined effects of the displacement of vital childhood activities and exposure to unhealthy online environments are wrecking a generation.
  • as the typical age when kids get their first smartphone has fallen to 10, it’s no surprise to see serious psychiatric problems — once the domain of teens — now enveloping young kids
  • Self-inflicted injuries, such as cutting, that are serious enough to require treatment in an emergency room, have increased dramatically in 10- to 14-year-old girls, up 19% per year since 2009.
  • While girls are pulled onto smartphones and social media, boys are more likely to be seduced into the world of video gaming, often at the expense of a focus on school
  • it’s no surprise to see this generation of boys struggling to make it to college: a full 57% of college admissions are granted to young women compared with only 43% to young men.
  • Economists working with the National Bureau of Economic Research recently demonstrated how many young U.S. men are choosing to play video games rather than join the workforce.
  • The destructive forces of psychology deployed by the tech industry are making a greater impact on kids than the positive uses of psychology by mental health providers and child advocates. Put plainly, the science of psychology is hurting kids more than helping them.
  • Hope for this wired generation has seemed dim until recently, when a surprising group has come forward to criticize the tech industry’s use of psychological manipulation: tech executives
  • Tristan Harris, formerly a design ethicist at Google, has led the way by unmasking the industry’s use of persuasive design. Interviewed in The Economist’s 1843 magazine, he says, “The job of these companies is to hook people, and they do that by hijacking our psychological vulnerabilities.”
  • Marc Benioff, CEO of the cloud computing company Salesforce, is one of the voices calling for the regulation of social media companies because of their potential to addict children. He says that just as the cigarette industry has been regulated, so too should social media companies. “I think that, for sure, technology has addictive qualities that we have to address, and that product designers are working to make those products more addictive, and we need to rein that back as much as possible,”
  • “If there’s an unfair advantage or things that are out there that are not understood by parents, then the government’s got to come forward and illuminate that.”
  • Since millions of parents, for example the parents of my patient Kelly, have absolutely no idea that devices are used to hijack their children’s minds and lives, regulation of such practices is the right thing to do.
  • Another improbable group to speak out on behalf of children is tech investors.
  • How has the consumer tech industry responded to these calls for change? By going even lower.
  • Facebook recently launched Messenger Kids, a social media app that will reach kids as young as five years old. Suggestive that harmful persuasive design is now honing in on very young children is the declaration of Messenger Kids Art Director, Shiu Pei Luu, “We want to help foster communication [on Facebook] and make that the most exciting thing you want to be doing.”
  • the American Psychological Association (APA) — which is tasked with protecting children and families from harmful psychological practices — has been essentially silent on the matter
  • APA Ethical Standards require the profession to make efforts to correct the “misuse” of the work of psychologists, which would include the application of B.J. Fogg’s persuasive technologies to influence children against their best interests
  • Manipulating children for profit without their own or parents’ consent, and driving kids to spend more time on devices that contribute to emotional and academic problems is the embodiment of unethical psychological practice.
  • “Never before in history have basically 50 mostly men, mostly 20–35, mostly white engineer designer types within 50 miles of where we are right now [Silicon Valley], had control of what a billion people think and do.”
  • Some may argue that it’s the parents’ responsibility to protect their children from tech industry deception. However, parents have no idea of the powerful forces aligned against them, nor do they know how technologies are developed with drug-like effects to capture kids’ minds
  • Others will claim that nothing should be done because the intention behind persuasive design is to build better products, not manipulate kids
  • similar circumstances exist in the cigarette industry, as tobacco companies have as their intention profiting from the sale of their product, not hurting children. Nonetheless, because cigarettes and persuasive design predictably harm children, actions should be taken to protect kids from their effects.
  • in a 1998 academic paper, Fogg describes what should happen if things go wrong, saying, if persuasive technologies are “deemed harmful or questionable in some regard, a researcher should then either take social action or advocate that others do so.”
  • I suggest turning to President John F. Kennedy’s prescient guidance: He said that technology “has no conscience of its own. Whether it will become a force for good or ill depends on man.”
  • The APA should begin by demanding that the tech industry’s behavioral manipulation techniques be brought out of the shadows and exposed to the light of public awareness
  • Changes should be made in the APA’s Ethics Code to specifically prevent psychologists from manipulating children using digital machines, especially if such influence is known to pose risks to their well-being.
  • Moreover, the APA should follow its Ethical Standards by making strong efforts to correct the misuse of psychological persuasion by the tech industry and by user experience designers outside the field of psychology.
  • It should join with tech executives who are demanding that persuasive design in kids’ tech products be regulated
  • The APA also should make its powerful voice heard amongst the growing chorus calling out tech companies that intentionally exploit children’s vulnerabilities.
Javier E

Deconstructing the Creepiness of the 'Girls Around Me' App-and What Facebook Could Do A... - 0 views

  • Cult of Mac had a fascinating, stomach-churning story about an application called Girls Around Me that scraped public Foursquare and Facebook checkins onto a map that showed people in your vicinity. Its branding was crass -- "In the mood for love, or just after a one-night stand? Girls Around Me puts you in control!" -- but, as the developers of the app argued, they had technically done nothing wrong aside from being piggish and crude.
  • They took publicly available data and put it on a map. The sexysexy frame they put around it made it *seem* creepier, but in terms of the data they accessed and presented, everything was within the rules of the game. They had done nothing that couldn't be done by another app developer.
  • This is basically how app ecosystems working with data from Foursquare and Facebook and Twitter are supposed to work. Some people out there get an idea for something that the main services had never thought of and they build it out of whatever data is available.
  • Using the traditional privacy idea that once something’s public, it is public for any purpose, you’re led down a very narrow path of reasoning about this app’s appropriateness and what the services it was built on could do about it.
  • using Nissenbaum's theory, the bad feelings that people have around the app make sense: People gave data to Foursquare or Facebook in one context and then it showed up in another context that they weren't expecting.
ardenganse

Do Language Apps Like Duolingo Work? - The Atlantic - 0 views

  • Words and phrases swam through my mind, but they didn’t add up to anything useful. Laurie switched to a restaurant scenario: “Do you have a table for four?” “I’d like two glasses of red wine.” I knew I had seen all the pieces in Duolingo’s sentences. But I was utterly unable to recall them and pull them together.
  • The app had made me a master of multiple-choice Italian. Given a bunch of words to choose from, I could correctly assemble impressive communiqués. But without a prompt, I was as speechless in even the most basic situations as any boorish American tourist.
    • ardenganse
       
      Illustrates the issue with digital language apps. Relates to the idea that in order to truly understand a language, you must know every word and how it has been used by people historically.
  • learning English, in particular, can be a ticket out of poverty.
  • Duolingo has been rolling out new features—including podcasts, social interaction among users, and character-driven narratives—that aim to raise its language pragmatics as well as its addictiveness.
    • ardenganse
       
      An attempt to close the gap between virtual and real world language learning.
  • Where most apps really fall short, he said, is in language “pragmatics.” “That’s the learning that’s based on real-world settings—you’re in a restaurant, in an interview, waiting for a bus,” he explained. “It’s usually lost in apps.”
    • ardenganse
       
      You can't truly learn a language without experiencing the world through that language.
  • “In the U.S., about half of our users aren’t even really motivated to learn a language; they just want to pass the time on something besides Candy Crush,” he said.
  • “There are all kinds of contextual factors in language learning,” he said. “It would be hard for an app to take them all into account.”
Javier E

AI is about to completely change how you use computers | Bill Gates - 0 views

  • Health care
  • Entertainment and shopping
  • Today, AI’s main role in healthcare is to help with administrative tasks. Abridge, Nuance DAX, and Nabla Copilot, for example, can capture audio during an appointment and then write up notes for the doctor to review.
  • agents will open up many more learning opportunities.
  • Already, AI can help you pick out a new TV and recommend movies, books, shows, and podcasts. Likewise, a company I’ve invested in recently launched Pix, which lets you ask questions (“Which Robert Redford movies would I like and where can I watch them?”) and then makes recommendations based on what you’ve liked in the past
  • Productivity
  • copilots can do a lot—such as turn a written document into a slide deck, answer questions about a spreadsheet using natural language, and summarize email threads while representing each person’s point of view.
  • before the sophisticated agents I’m describing become a reality, we need to confront a number of questions about the technology and how we’ll use it.
  • Helping patients and healthcare workers will be especially beneficial for people in poor countries, where many never get to see a doctor at all.
  • To create a new app or service, you won’t need to know how to write code or do graphic design. You’ll just tell your agent what you want. It will be able to write the code, design the look and feel of the app, create a logo, and publish the app to an online store
  • Agents will do even more. Having one will be like having a person dedicated to helping you with various tasks and doing them independently if you want. If you have an idea for a business, an agent will help you write up a business plan, create a presentation for it, and even generate images of what your product might look like
  • For decades, I’ve been excited about all the ways that software would make teachers’ jobs easier and help students learn. It won’t replace teachers, but it will supplement their work—personalizing the work for students and liberating teachers from paperwork and other tasks so they can spend more time on the most important parts of the job.
  • Mental health care is another example of a service that agents will make available to virtually everyone. Today, weekly therapy sessions seem like a luxury. But there is a lot of unmet need, and many people who could benefit from therapy don’t have access to it.
  • I don’t think any single company will dominate the agents business--there will be many different AI engines available.
  • The real shift will come when agents can help patients do basic triage, get advice about how to deal with health problems, and decide whether they need to seek treatment.
  • They’ll replace word processors, spreadsheets, and other productivity apps.
  • Education
  • For example, few families can pay for a tutor who works one-on-one with a student to supplement their classroom work. If agents can capture what makes a tutor effective, they’ll unlock this supplemental instruction for everyone who wants it. If a tutoring agent knows that a kid likes Minecraft and Taylor Swift, it will use Minecraft to teach them about calculating the volume and area of shapes, and Taylor’s lyrics to teach them about storytelling and rhyme schemes. The experience will be far richer—with graphics and sound, for example—and more personalized than today’s text-based tutors.
  • your agent will be able to help you in the same way that personal assistants support executives today. If your friend just had surgery, your agent will offer to send flowers and be able to order them for you. If you tell it you’d like to catch up with your old college roommate, it will work with their agent to find a time to get together, and just before you arrive, it will remind you that their oldest child just started college at the local university.
  • To see the dramatic change that agents will bring, let’s compare them to the AI tools available today. Most of these are bots. They’re limited to one app and generally only step in when you write a particular word or ask for help. Because they don’t remember how you use them from one time to the next, they don’t get better or learn any of your preferences.
  • The current state of the art is Khanmigo, a text-based bot created by Khan Academy. It can tutor students in math, science, and the humanities—for example, it can explain the quadratic formula and create math problems to practice on. It can also help teachers do things like write lesson plans.
  • Businesses that are separate today—search advertising, social networking with advertising, shopping, productivity software—will become one business.
  • other issues won’t be decided by companies and governments. For example, agents could affect how we interact with friends and family. Today, you can show someone that you care about them by remembering details about their life—say, their birthday. But when they know your agent likely reminded you about it and took care of sending flowers, will it be as meaningful for them?
  • In the computing industry, we talk about platforms—the technologies that apps and services are built on. Android, iOS, and Windows are all platforms. Agents will be the next platform.
  • A shock wave in the tech industry
  • Agents won’t simply make recommendations; they’ll help you act on them. If you want to buy a camera, you’ll have your agent read all the reviews for you, summarize them, make a recommendation, and place an order for it once you’ve made a decision.
  • Agents will affect how we use software as well as how it’s written. They’ll replace search sites because they’ll be better at finding information and summarizing it for you
  • they’ll be dramatically better. You’ll be able to have nuanced conversations with them. They will be much more personalized, and they won’t be limited to relatively simple tasks like writing a letter.
  • Companies will be able to make agents available for their employees to consult directly and be part of every meeting so they can answer questions.
  • AI agents that are well trained in mental health will make therapy much more affordable and easier to get. Wysa and Youper are two of the early chatbots here. But agents will go much deeper. If you choose to share enough information with a mental health agent, it will understand your life history and your relationships. It’ll be available when you need it, and it will never get impatient. It could even, with your permission, monitor your physical responses to therapy through your smart watch—like if your heart starts to race when you’re talking about a problem with your boss—and suggest when you should see a human therapist.
  • If the number of companies that have started working on AI just this year is any indication, there will be an exceptional amount of competition, which will make agents very inexpensive.
  • Agents are smarter. They’re proactive—capable of making suggestions before you ask for them. They accomplish tasks across applications. They improve over time because they remember your activities and recognize intent and patterns in your behavior. Based on this information, they offer to provide what they think you need, although you will always make the final decisions.
  • Agents are not only going to change how everyone interacts with computers. They’re also going to upend the software industry, bringing about the biggest revolution in computing since we went from typing commands to tapping on icons.
  • The most exciting impact of AI agents is the way they will democratize services that today are too expensive for most people
  • The ramifications for the software business and for society will be profound.
  • In the next five years, this will change completely. You won’t have to use different apps for different tasks. You’ll simply tell your device, in everyday language, what you want to do. And depending on how much information you choose to share with it, the software will be able to respond personally because it will have a rich understanding of your life. In the near future, anyone who’s online will be able to have a personal assistant powered by artificial intelligence that’s far beyond today’s technology.
  • You’ll also be able to get news and entertainment that’s been tailored to your interests. CurioAI, which creates a custom podcast on any subject you ask about, is a glimpse of what’s coming.
  • An agent will be able to help you with all your activities if you want it to. With permission to follow your online interactions and real-world locations, it will develop a powerful understanding of the people, places, and activities you engage in. It will get your personal and work relationships, hobbies, preferences, and schedule. You’ll choose how and when it steps in to help with something or ask you to make a decision.
  • even the best sites have an incomplete understanding of your work, personal life, interests, and relationships and a limited ability to use this information to do things for you. That’s the kind of thing that is only possible today with another human being, like a close friend or personal assistant.
  • In the distant future, agents may even force humans to face profound questions about purpose. Imagine that agents become so good that everyone can have a high quality of life without working nearly as much. In a future like that, what would people do with their time? Would anyone still want to get an education when an agent has all the answers? Can you have a safe and thriving society when most people have a lot of free time on their hands?
  • They’ll have an especially big influence in four areas: health care, education, productivity, and entertainment and shopping.
Javier E

Opinion | Will Translation Apps Make Learning Foreign Languages Obsolete? - The New Yor... - 0 views

  • In Europe, nine out of 10 students study a foreign language. In the United States, only one in five do. Between 1997 and 2008, the number of American middle schools offering foreign languages dropped from 75 percent to 58 percent. Between 2009 and 2013, one American college closed its foreign language program; between 2013 and 2017, 651 others did the same.
  • At first glance, these statistics look like a tragedy. But I am starting to harbor the odd opinion that maybe they are not. What is changing my mind is technology.
  • what about spoken language? I was in Belgium not long ago, and I watched various tourists from a variety of nations use instant speech translation apps to render their own languages into English and French. The newer ones can even reproduce the tone of the speaker’s voice; a leading model, iTranslate, publicizes that its Translator app has had 200 million downloads so far.
  • I don’t think these tools will ever render learning foreign languages completely obsolete. Real conversation in the flowing nuances of casual speech cannot be rendered by a program, at least not in a way that would convey full humanity.
  • even if it may fail at genuine, nuanced conversation — for now, at least — technology is eliminating most of the need to learn foreign languages for more utilitarian purposes.
  • The old-school language textbook scenarios, of people reserving hotel rooms or ordering meals in the language of the country they are visiting — “Greetings. Please bring me a glass of lemonade and a sandwich!” — will now be obsolete
  • to actively enjoy piecing together how other languages work is an individual quirk, not a human universal
  • Obsessive language learners have come to call themselves the polyglot community over the past couple of decades, and I am one of them, to an extent. As such, I know well how hard it can be to recognize that most human beings are numb to this peculiar desire.
  • Most human beings are interested much less in how they are saying things, and which language they are saying them in, than in what they are saying.
  • Learning to express this what — beyond the very basics — in another language is hard. It can be especially hard for us Anglophones, as speaking English works at least decently in so many places
  • To polyglots, foreign languages are Mount Everests daring us to climb them — a metaphor used by Hofstadter in his article. But to most people, they are just a barrier to get to the other side of.
  • After all, despite the sincere and admirable efforts of foreign language teachers nationwide, fewer than one in 100 American students become proficient in a language they learned in school.
  • Because I love trying to learn languages and am endlessly fascinated by their varieties and complexities, I am working hard to wrap my head around this new reality. With an iPhone handy and an appropriate app downloaded, foreign languages will no longer present most people with the barrier or challenge they once did
  • Learning to genuinely speak a new language will hardly be unknown. It will continue to beckon, for instance, for those actually relocating to a new country. And it will persist with people who want to engage with literature or media in the original language, as well as those of us who find pleasure in mastering these new codes just because they are “there.”
  • In other words, it will likely become an artisanal pursuit, of interest to a much smaller but more committed set of enthusiasts. And weird as that is, it is in its way a kind of progress.
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute or enhance some of our motivational powers.
  • beyond the practical issues lies a constellation of central ethical concerns.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • it is antithetical to the ideal of " resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantalized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that can only come when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

Tinder, the Fast-Growing Dating App, Taps an Age-Old Truth - NYTimes.com - 2 views

  • In the two years since Tinder was released, the smartphone app has exploded, processing more than a billion swipes left and right each day
  • it is fast approaching 50 million active users.
  • Tinder’s engagement is staggering. The company said that, on average, people log into the app 11 times a day. Women spend as much as 8.5 minutes swiping left and right during a single session; men spend 7.2 minutes. All of this can add up to 90 minutes each day. While conventional online dating sites have been around lo
  • ...12 more annotations...
  • On Tinder, there are no questionnaires to fill out. No discussion of your favorite hiking trail, star sign or sexual proclivities. You simply log in through Facebook, pick a few photos that best describe “you” and start swiping. It may seem that what happens next is predictable (the best-looking people draw the most likes, the rest are quickly dismissed), but relationships experts for Tinder say there is something entirely different going on.
  • “Research shows when people are evaluating photos of others, they are trying to assess compatibility on not just a physical level, but a social level,” said Jessica Carbino, Tinder’s in-house dating and relationship expert. “They are trying to understand, ‘Do I have things in common with this person?’ ”
  • She discovered that Tinder users decoded an array of subtle and not-so-subtle traits before deciding which way to swipe. For example, the style of clothing, the pucker of the lips and even the posture, Ms. Carbino said, tell us a lot about their social circle, if they like to party and their level of confidence.
  • Men also judge attractiveness on factors beyond just anatomy, though in general, men are nearly three times as likely to swipe “like” (in 46 percent of cases) as women (14 percent).
  • “There is this idea that attraction stems from a very superficial outlook on people, which is false,” Mr. Rad said. “Everyone is able to pick up thousands of signals in these photos. A photo of a guy at a bar with friends around him sends a very different message than a photo of a guy with a dog on the beach.”
  • while computers have become incalculably smarter, the ability of machines and algorithms to match people has remained just as clueless in the view of independent scientists.
  • dating sites like eHarmony and Match.com are more like modern snake oil. “They are a joke, and there is no relationship scientist that takes them seriously as relationship science.”
  • Mr. Finkel worked for more than a year with a group of researchers trying to understand how these algorithm-based dating services could match people, as they claim to do. The team pored over more than 80 years of scientific research about dating and attraction, and was unable to prove that computers can indeed match people together.
  • some dating sites are starting to acknowledge that the only thing that matters when matching lovers is someone’s picture. Earlier this year, OKCupid examined its data and found that a person’s profile picture is, said a post on its Oktrends blog, “worth that fabled thousand words, but your actual words are worth... almost nothing.”
  • this doesn’t mean that the most attractive people are the only ones who find true love. Indeed, in many respects, it can be the other way around.
  • a graduate student, published a paper noting that a person’s unique looks are what is most important when trying to find a mate.
  • “There isn’t a consensus about who is attractive and who isn’t,” Mr. Eastwick said in an interview. “Someone that you think is especially attractive might not be to me. That’s true with photos, too.” Tinder’s data team echoed this, noting that there isn’t a cliquey, high school mentality on the site, where one group of users gets the bulk of the “like” swipes.
Javier E

Stop Googling. Let's Talk. - The New York Times - 3 views

  • In a 2015 study by the Pew Research Center, 89 percent of cellphone owners said they had used their phones during the last social gathering they attended. But they weren’t happy about it; 82 percent of adults felt that the way they used their phones in social settings hurt the conversation.
  • I’ve been studying the psychology of online connectivity for more than 30 years. For the past five, I’ve had a special focus: What has happened to face-to-face conversation in a world where so many people say they would rather text than talk?
  • Young people spoke to me enthusiastically about the good things that flow from a life lived by the rule of three, which you can follow not only during meals but all the time. First of all, there is the magic of the always available elsewhere. You can put your attention wherever you want it to be. You can always be heard. You never have to be bored.
  • ...23 more annotations...
  • But the students also described a sense of loss.
  • A 15-year-old boy told me that someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation. One college junior tried to capture what is wrong about life in his generation. “Our texts are fine,” he said. “It’s what texting does to our conversations when we are together that’s the problem.”
  • One teacher observed that the students “sit in the dining hall and look at their phones. When they share things together, what they are sharing is what is on their phones.” Is this the new conversation? If so, it is not doing the work of the old conversation. The old conversation taught empathy. These students seem to understand each other less.
  • In 2010, a team at the University of Michigan led by the psychologist Sara Konrath put together the findings of 72 studies that were conducted over a 30-year period. They found a 40 percent decline in empathy among college students, with most of the decline taking place after 2000.
  • We’ve gotten used to being connected all the time, but we have found ways around conversation — at least from conversation that is open-ended and spontaneous, in which we play with ideas and allow ourselves to be fully present and vulnerable. But it is in this type of conversation — where we learn to make eye contact, to become aware of another person’s posture and tone, to comfort one another and respectfully challenge one another — that empathy and intimacy flourish. In these conversations, we learn who we are.
  • the trend line is clear. It’s not only that we turn away from talking face to face to chat online. It’s that we don’t allow these conversations to happen in the first place because we keep our phones in the landscape.
  • It’s a powerful insight. Studies of conversation both in the laboratory and in natural settings show that when two people are talking, the mere presence of a phone on a table between them or in the periphery of their vision changes both what they talk about and the degree of connection they feel. People keep the conversation on topics where they won’t mind being interrupted. They don’t feel as invested in each other. Even a silent phone disconnects us.
  • Yalda T. Uhls was the lead author on a 2014 study of children at a device-free outdoor camp. After five days without phones or tablets, these campers were able to read facial emotions and correctly identify the emotions of actors in videotaped scenes significantly better than a control group. What fostered these new empathic responses? They talked to one another. In conversation, things go best if you pay close attention and learn how to put yourself in someone else’s shoes. This is easier to do without your phone in hand. Conversation is the most human and humanizing thing that we do.
  • At a nightly cabin chat, a group of 14-year-old boys spoke about a recent three-day wilderness hike. Not that many years ago, the most exciting aspect of that hike might have been the idea of roughing it or the beauty of unspoiled nature. These days, what made the biggest impression was being phoneless. One boy called it “time where you have nothing to do but think quietly and talk to your friends.” The campers also spoke about their new taste for life away from the online feed. Their embrace of the virtue of disconnection suggests a crucial connection: The capacity for empathic conversation goes hand in hand with the capacity for solitude.
  • In solitude we find ourselves; we prepare ourselves to come to conversation with something to say that is authentic, ours. If we can’t gather ourselves, we can’t recognize other people for who they are. If we are not content to be alone, we turn others into the people we need them to be. If we don’t know how to be alone, we’ll only know how to be lonely.
  • we have put this virtuous circle in peril. We turn time alone into a problem that needs to be solved with technology.
  • People sometimes say to me that they can see how one might be disturbed when people turn to their phones when they are together. But surely there is no harm when people turn to their phones when they are by themselves? If anything, it’s our new form of being together.
  • But this way of dividing things up misses the essential connection between solitude and conversation. In solitude we learn to concentrate and imagine, to listen to ourselves. We need these skills to be fully present in conversation.
  • One start toward reclaiming conversation is to reclaim solitude. Some of the most crucial conversations you will ever have will be with yourself. Slow down sufficiently to make this possible. And make a practice of doing one thing at a time. Think of unitasking as the next big thing. In every domain of life, it will increase performance and decrease stress.
  • Multitasking comes with its own high, but when we chase after this feeling, we pursue an illusion. Conversation is a human way to practice unitasking.
  • Our phones are not accessories, but psychologically potent devices that change not just what we do but who we are. A second path toward conversation involves recognizing the degree to which we are vulnerable to all that connection offers. We have to commit ourselves to designing our products and our lives to take that vulnerability into account.
  • We can choose not to carry our phones all the time. We can park our phones in a room and go to them every hour or two while we work on other things or talk to other people. We can carve out spaces at home or work that are device-free, sacred spaces for the paired virtues of conversation and solitude.
  • Families can find these spaces in the day to day — no devices at dinner, in the kitchen and in the car.
  • Engineers are ready with more ideas: What if our phones were not designed to keep us attached, but to do a task and then release us? What if the communications industry began to measure the success of devices not by how much time consumers spend on them but by whether it is time well spent?
  • The young woman who is so clear about the seven minutes that it takes to see where a conversation is going admits that she often doesn’t have the patience to wait for anything near that kind of time before going to her phone. In this she is characteristic of what the psychologists Howard Gardner and Katie Davis called the “app generation,” which grew up with phones in hand and apps at the ready. It tends toward impatience, expecting the world to respond like an app, quickly and efficiently. The app way of thinking starts with the idea that actions in the world will work like algorithms: Certain actions will lead to predictable results.
  • This attitude can show up in friendship as a lack of empathy. Friendships become things to manage; you have a lot of them, and you come to them with tools
  • here is a first step: To reclaim conversation for yourself, your friendships and society, push back against viewing the world as one giant app. It works the other way, too: Conversation is the antidote to the algorithmic way of looking at life because it teaches you about fluidity, contingency and personality.
  • We have time to make corrections and remember who we are — creatures of history, of deep psychology, of complex relationships, of conversations, artless, risky and face to face.
Javier E

Movie Review: Inside Job - Barron's - 0 views

  • On the outsize role of the GSEs and other federal agencies in high-risk mortgages, figures compiled by former Fannie Mae Chief Credit Officer Edward Pinto show that as of mid-2008, more than 70% were accounted for by the federal government in one way or another, with nearly two-thirds of that held by Fannie and Freddie.
  • As has been documented, for example, in a forthcoming book on the GSEs called Guaranteed to Fail, there was a steady increase in affordable housing mandates imposed on these enterprises by Congress, one of several reasons why they were hardly like other capitalist enterprises, but tools and beneficiaries of government.
  • I asked Ferguson why Inside Job made such brief mention of Fannie Mae and Freddie Mac, and even then without noting that they are government-sponsored enterprises, subject to special protection by the federal government—which their creditors clearly appreciated, given the unusually low interest rates their debt commanded.
  • ...7 more annotations...
  • Ferguson replied that their role in subprime mortgages was not very significant, and that in any case their behavior was not much different from that of other capitalist enterprises.
  • We get no inkling that Rajan's views on what made the world riskier, as set forth in his book, veer quite radically from those of Inside Job. They include, as he has written, "the political push for easy housing credit in the United States and the lax monetary policy [by the Federal Reserve] in the years 2003-2005."
  • Rajan, author of Fault Lines, a recent book on the debacle, speaks with special authority to fans of Inside Job. Not only is he in the movie—one of the talking heads speaking wisdom about what occurred—he is accurately presented as having anticipated the meltdown in a 2005 paper called "Has Financial Development Made the World Riskier?" But the things he is quoted as saying in the film are restricted to serving its themes.
  • Yet it's impossible to understand what happened without grasping the proactive role played by government. "The banking sector did not decide out of the goodness of its heart to extend mortgages to poor people," commented University of Chicago Booth School of Business Finance Professor Raghuram Rajan in a telephone interview last week. "Politicians did that, and they would have taken great umbrage if the regulator stood in the way of more housing credit."
  • THE STORY RECOUNTED in Inside Job is that principles like safety and soundness were flouted by greedy Wall Street capitalists who brought down the economy with the help of certain politicians, political appointees and corrupt academicians. Despite the attempts and desires of some, including Barney Frank, to regulate the mania, the juggernaut prevails to this day, under the presidency of Barack Obama.
  • This version of the story contains some elements of truth.
  • "A MASTERPIECE OF INVESTIGATIVE nonfiction moviemaking," wrote the film critic of the Boston Globe. "Rests its outrage on reason, research and careful argument," opined the New York Times. The "masterpiece" referred to was the recently released Inside Job, a documentary film that focuses on the causes of the 2008 financial crisis.
clairemann

Robinhood app makes Wall Street feel like a game to win - instead of a place ... - 0 views

  • Wall Street has long been likened to a casino. Robinhood, an investment app that just filed plans for an initial public offering, makes the comparison more apt than ever.
  • Similarly, Robinhood’s slick and easy-to-use app resembles a thrill-inducing video game rather than a sober investment tool
  • Using gamelike features to influence real-life actions can be beneficial, such as when a health app uses rewards and rankings to encourage people to move more or eat healthier food. But there’s a dark side too, and so-called gamification can lead people to forget the real-world consequences of their decisions.
  • ...9 more annotations...
  • sometimes with disastrous consequences, such as last year when a Robinhood user died by suicide after mistakenly believing that he’d lost US$750,000.
  • The reason games are so captivating is that they challenge the mind to learn new things and are generally safe spaces to face and overcome failure.
  • Games also mimic rites of passage similar to religious rituals and draw players into highly focused “flow states” that dramatically alter self-awareness. This sensory blend of flow and mastery is what makes games fun and sometimes addictive: “Just one more turn” thinking can last for hours, and players forget to eat and sleep. Players who barely remember yesterday’s breakfast recall visceral details from games played decades ago.
  • The psychological impact of game play can also be harnessed for profit.
  • For example, many free-to-play video games such as Angry Birds 2 and Fortnite give players the option to spend real money on in-game items such as new and even angrier birds or character skins.
  • This “free-to-play” model is so profitable that it’s grown increasingly popular with video game designers and publishers.
  • Gamification, however, goes one step further and uses gaming elements to influence real-world behavior.
  • Common elements include badges, points, rankings and progress bars that visually encourage players to achieve goals. (A minimal code sketch of these elements appears after these annotations.)
  • Many readers likely have experienced this type of gamification to improve personal fitness, get better grades, build savings accounts and even solve major scientific problems. Some initiatives also include offering rewards that can be cashed in for participating in actual civic projects, such as volunteering in a park, commenting on a piece of legislation or visiting a government website.
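To make the gamification vocabulary above concrete, here is a minimal sketch of points, badges, and a progress bar as an app might track them, as referenced in the annotation on common elements. Everything here is invented for illustration: the class and function names (GamifiedProfile, award_points, progress_bar) and the thresholds are assumptions, not Robinhood's or any real app's design.

```python
# Minimal sketch of common gamification elements: points, badges, a progress bar.
# All names and thresholds are hypothetical, for illustration only.

BADGE_THRESHOLDS = {100: "Bronze", 500: "Silver", 2000: "Gold"}

class GamifiedProfile:
    def __init__(self, weekly_goal: int = 700):
        self.points = 0
        self.badges = []
        self.weekly_goal = weekly_goal  # target points per week

    def award_points(self, amount: int) -> None:
        """Add points for a real-world action (a trade, a workout, a comment)."""
        self.points += amount
        for threshold, badge in BADGE_THRESHOLDS.items():
            if self.points >= threshold and badge not in self.badges:
                self.badges.append(badge)  # one-time badge: the classic reward cue

    def progress_bar(self, width: int = 20) -> str:
        """Render a text progress bar toward the weekly goal."""
        fraction = min(self.points / self.weekly_goal, 1.0)
        filled = int(fraction * width)
        return "[" + "#" * filled + "-" * (width - filled) + f"] {fraction:.0%}"

profile = GamifiedProfile()
profile.award_points(120)
print(profile.badges)          # ['Bronze']
print(profile.progress_bar())  # [###-----------------] 17%
```

The design point is simply that every real-world action feeds a counter and the counters feed visual rewards, which is what makes the pattern so easy to carry from fitness apps into trading apps.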
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived. (A toy illustration of this click-rate arithmetic appears at the end of these annotations.)
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the court
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
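As promised in the annotation on Google's pay-per-click auction above, here is a back-of-the-envelope sketch of why tiny click-rate gains justify heavy investment in behavioral prediction. The traffic figures and prices are invented for illustration, and expected_revenue is a generic expected-value calculation, not Google's actual auction logic.

```python
# Toy pay-per-click arithmetic: revenue = impressions * click-through rate * price per click.
# All figures are made up for illustration.

def expected_revenue(impressions: int, ctr: float, cost_per_click: float) -> float:
    """Advertisers pay only on clicks, so expected revenue scales directly with CTR."""
    return impressions * ctr * cost_per_click

daily_impressions = 5_000_000_000   # hypothetical ad impressions per day
cost_per_click = 0.50               # hypothetical average price of a click, in dollars

baseline = expected_revenue(daily_impressions, ctr=0.020, cost_per_click=cost_per_click)
improved = expected_revenue(daily_impressions, ctr=0.021, cost_per_click=cost_per_click)

print(f"Baseline daily revenue: ${baseline:,.0f}")        # $50,000,000
print(f"With a 0.1-point CTR gain: ${improved:,.0f}")     # $52,500,000
print(f"Value of the improvement: ${improved - baseline:,.0f} per day")
```

Under these assumed numbers, a tenth of a percentage point of click-through rate is worth millions of dollars a day, which is the financial logic behind collecting ever more behavioral data.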
Javier E

This is what it's like to grow up in the age of likes, lols and longing | The Washingto... - 1 views

  • She slides into the car, and even before she buckles her seat belt, her phone is alight in her hands. A 13-year-old girl after a day of eighth grade.
  • She doesn’t respond, her thumb on Instagram. A Barbara Walters meme is on the screen. She scrolls, and another meme appears. Then another meme, and she closes the app. She opens BuzzFeed. There’s a story about Florida Gov. Rick Scott, which she scrolls past to get to a story about Janet Jackson, then “28 Things You’ll Understand If You’re Both British and American.” She closes it. She opens Instagram. She opens the NBA app. She shuts the screen off. She turns it back on. She opens Spotify. Opens Fitbit. She has 7,427 steps. Opens Instagram again. Opens Snapchat. She watches a sparkly rainbow flow from her friend’s mouth. She watches a YouTube star make pouty faces at the camera. She watches a tutorial on nail art. She feels the bump of the driveway and looks up. They’re home. Twelve minutes have passed.
  • Katherine Pommerening’s iPhone is the place where all of her friends are always hanging out. So it’s the place where she is, too.
  • ...19 more annotations...
  • “Over 100 likes is good, for me. And comments. You just comment to make a joke or tag someone.”
  • The best thing is the little notification box, which means someone liked, tagged or followed her on Instagram. She has 604 followers. There are only 25 photos on her page because she deletes most of what she posts. The ones that don’t get enough likes, don’t have good enough lighting or don’t show the coolest moments in her life must be deleted.
  • Sociologists, advertisers, stock market analysts – everyone wants to know what happens when the generation born glued to screens has to look up and interact with the world.
  • “It kind of, almost, promotes you as a good person. If someone says, ‘tbh you’re nice and pretty,’ that kind of, like, validates you in the comments. Then people can look at it and say ‘Oh, she’s nice and pretty.’ ”
  • School is where she thrives: She is beloved by her teachers, will soon star as young Simba in the eighth-grade performance of “The Lion King” musical, and gets straight A’s. Her school doesn’t offer a math course challenging enough for her, so she takes honors algebra online through Johns Hopkins University.
  • “Happy birthday posts are a pretty big deal,” she says. “It really shows who cares enough to put you on their page.”
  • He checks the phone bill to see who she’s called and how much she’s been texting, but she barely calls anyone and chats mostly through Snapchat, where her messages disappear.
  • Some of Katherine’s very best friends have never been to her house, or she to theirs. To Dave, it seems like they rarely hang out, but he knows that to her, it seems like they’re together all the time.
  • Dave Pommerening wants to figure out how to get her to use it less. One month, she ate up 18 gigabytes of data. Most large plans max out at 10. He intervened and capped her at four GB. “I don’t want to crimp it too much,” he says. “That’s something, from my perspective, I’m going to have to figure out, how to get my arms around that.”
  • Even if her dad tried snooping around her apps, the true dramas of teenage girl life are not written in the comments. Like how sometimes, Katherine’s friends will borrow her phone just to un-like all the Instagram photos of girls they don’t like. Katherine can’t go back to those girls’ pages and re-like the photos because that would be stalking, which is forbidden.
  • Or how last week, at the middle school dance, her friends got the phone numbers of 10 boys, but then they had to delete five of them because they were seventh-graders. And before she could add the boys on Snapchat, she realized she had to change her username because it was her childhood nickname and that was totally embarrassing.
  • Then, because she changed her username, her Snapchat score reverted to zero. The app awards about one point for every snap you send and receive. It’s also totally embarrassing and stressful to have a low Snapchat score. So in one day, she sent enough snaps to earn 1,000 points.
  • Snapchat is where flirting happens. She doesn’t know anyone who has sent a naked picture to a boy, but she knows it happens with older girls, who know they have met the right guy.
  • Nothing her dad could find on her phone shows that for as good as Katherine is at math, basketball and singing, she wants to get better at her phone. To be one of the girls who knows what to post, how to caption it, when to like, what to comment.
  • Katherine doesn’t need magazines or billboards to see computer-perfect women. They’re right on her phone, all the time, in between photos of her normal-looking friends. There’s Aisha, there’s Kendall Jenner’s butt. There’s Olivia, there’s YouTube star Jenna Marbles in lingerie.
  • The whole world is at her fingertips and has been for years. This, Katherine offers as a theory one day, is why she doesn’t feel like she’s 13 years old at all. She’s probably, like, 16.
  • “I don’t feel like a child anymore” she says. “I’m not doing anything childish. At the end of sixth grade” — when all her friends got phones and downloaded Snapchat, Instagram and Twitter — “I just stopped doing everything I normally did. Playing games at recess, playing with toys, all of it, done.”
  • Her scooter sat in the garage, covered in dust. Her stuffed animals were passed down to Lila. The wooden playground in the back yard stood empty. She kept her skateboard with neon yellow wheels, because riding it is still cool to her friends.
  • On the morning of her 14th birthday, Katherine wakes up to an alarm ringing on her phone. It’s 6:30 a.m. She rolls over and shuts it off in the dark. Her grandparents, here to celebrate the end of her first year of teenagehood, are sleeping in the guest room down the hall. She can hear the dogs shuffling across the hardwood downstairs, waiting to be fed. Propping herself up on her peace-sign-covered pillow, she opens Instagram. Later, Lila will give her a Starbucks gift card. Her dad will bring doughnuts to her class. Her grandparents will take her to the Melting Pot for dinner. But first, her friends will decide whether to post pictures of Katherine for her birthday. Whether they like her enough to put a picture of her on their page. Those pictures, if they come, will get likes and maybe tbhs. They should be posted in the morning, any minute now. She scrolls past a friend posing in a bikini on the beach. Then a picture posted by Kendall Jenner. A selfie with coffee. A basketball Vine. A selfie with a girl’s tongue out. She scrolls, she waits. For that little notification box to appear.
sissij

Instagram introduces two-factor authentication | Technology | The Guardian - 0 views

  • Instagram has become the latest social network to enable two-factor authentication, a valuable security feature that protects accounts from being compromised due to password reuse or phishing.
  • Instagram joins Facebook, Twitter, Google and many others in offering some form of two-factor verification.
  • Confusingly for users, all the methods are slightly different: Twitter requires logging in to be approved by opening the app on a trusted device, and Google uses an open standard to link up with its authenticator app, which generates new six-digit codes every 30 seconds. (A minimal sketch of this time-based code scheme follows these annotations.)
  •  
    Internet security has been a major problem since the development of internet technology, and there is particular worry about the safety of accounts. People put more and more of their lives online, so security risk becomes an issue. For example, many online payment apps let you pay without handling actual money, charging your bank account automatically. Although it is very convenient to have everything online, it is also unstable and risky. --Sissi (3/25/2017)
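For readers curious how the six-digit codes mentioned in the annotation above are produced, here is a minimal sketch of the open standard involved, TOTP (RFC 6238), using only Python's standard library. The shared secret shown is a made-up example (real secrets are provisioned when you scan the setup QR code), and a production system should rely on a vetted library rather than this sketch.

```python
# Minimal TOTP (RFC 6238) sketch: an authenticator derives a six-digit code
# from a shared secret and the current 30-second time window.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_base32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_base32, casefold=True)
    counter = int(time.time()) // period                # current 30-second window
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()  # HMAC-SHA1 per the RFC
    offset = digest[-1] & 0x0F                          # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Example with a made-up shared secret; a real one is issued during 2FA setup.
print(totp("JBSWY3DPEHPK3PXP"))  # prints a fresh six-digit code for this window
```

Because the server and the phone hold the same secret and roughly the same clock, each can compute the code independently, which is why the scheme works even when the phone is offline.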
summertyler

Last words? Phone app bids to save dying aboriginal language - CNN.com - 0 views

  • A smartphone app has been launched to help save an Australian indigenous language that is in danger of disappearing.
  • aims to prevent the extinction of the Iwaidja language
  • "People have their phones with them most of the time, the app is incredibly easy to use, and this allows data collection to happen spontaneously, opportunistically,"
  • ...4 more annotations...
  • "We believe the tools we are developing will exponentially increase the involvement of the Indigenous people whose languages are threatened, without the need for difficult-to-attain levels of computer literacy,"
  • Until now endangered aboriginal languages were recorded in the presence of a linguist and selected native speakers with recording equipment
  • indigenous people whose languages are threatened can record and upload languages at their own pace and at times which suit them, he says, without requiring the presence of a specialist holding a microphone
  • "The ability provided by the tools we are developing to easily create, record and share language, images, and video, at the same time as building sustainable databases for future use, involves and empowers speakers of indigenous languages in a way which has not been possible before."
  •  
    Language is a barrier, and people are trying to break down these barriers.
Javier E

Seeking privacy, teens turn to anonymous-messaging apps - The Washington Post - 0 views

  • Anonymous and ephemeral, apps such as Whisper, Secret, Ask.fm and Snapchat fill a growing demand among teens for more fun, less accountability and more privacy online.
  • As teens look increasingly for alternatives to the social giants Facebook and Twitter, the anonymous apps create the opportunity for bullying and cruelty in a forum where they cannot be tracked.
  • the popular anonymous question-and-answer forum Ask.fm has become a magnet for cyberbullying.
  • ...3 more annotations...
  • when parents, grandparents and Little League coaches became core users of Facebook, kids naturally gravitated to new places where they could socialize away from the watchful eye of adults, experts say.
  • “The worst stuff happens on the anonymous sites because people are either too scared to say something to someone’s face or they want to present someone with public humiliation,” Olivia said.
  • Numerous psychological studies show conflict is often resolved when people talk face-to-face. When people can see signs of sadness or other emotions, they tend to back down. Facebook said the majority of users who are flagged for abusive or bullying conduct never do it again. On the anonymous sites, there are no such brakes on negative behavior.
Javier E

When No One Is Just a Face in the Crowd - NYTimes.com - 0 views

  • Facial recognition technology, already employed by some retail stores to spot and thwart shoplifters, may soon be used to identify and track the freest spenders in the aisles.
  • And companies like FaceFirst, in Camarillo, Calif., hope to soon complement their shoplifter-identification services with parallel programs to help retailers recognize customers eligible for special treatment.
  • “Instantly, when a person in your FaceFirst database steps into one of your stores, you are sent an email, text or SMS alert that includes their picture and all biographical information of the known individual so you can take immediate and appropriate action.”
  • ...3 more annotations...
  • Because facial recognition can be used covertly to identify and track people by name at a distance, some civil liberties experts call it unequivocally intrusive. In view of intelligence documents made public by Edward J. Snowden, they also warn that once companies get access to such data, the government could, too. “This is you as an individual being monitored over time and your movements and habits being recorded,”
  • facial recognition may soon let companies link a person’s online persona with his or her actual offline self at a specific public location. That could seriously threaten our ability to remain anonymous in public.
  • industry and consumer advocates will have to contend with nascent facial-recognition apps like NameTag; it is designed to allow a user to scan photographs of strangers, then see information about them — like their occupations or social-network profiles.
Javier E

The Fall of Facebook - The Atlantic - 0 views

  • Social networking is not, it turns out, winner take all. In the past, one might have imagined that switching between Facebook and “some other network” would be difficult, but the smartphone interface makes it easy to be on a dozen networks. All messages come to the same place—the phone’s notifications screen—so what matters is what your friends are doing, not which apps they’re using.
  • if I were to put money on an area in which Facebook might be unable to dominate in the future, it would be apps that take advantage of physical proximity. Something radically new could arise on that front, whether it’s an evolution of Yik Yak
  • Judith Donath, author of The Social Machine, predicts that text will be a less and less important part of our asynchronous communications mix. Instead, she foresees a “very fluid interface” that would mix text with voice, video, sensor outputs (location, say, or vital signs), and who knows what else
  • ...5 more annotations...
  • the forthcoming Apple Watch seems like a step toward the future Donath envisions. Users will be able to send animated smiley faces, drawings, voice snippets, and even their live heartbeats, which will be tapped out on the receiver’s wrist.
  • A simple but rich messaging platform—perhaps with specialized hardware—could replace the omnibus social network for most purposes. “I think we’re shifting in a weird way to one-on-one conversations on social networks and in messaging apps,” says Shani Hilton, the executive editor for news at BuzzFeed, the viral-media site. “People don’t want to perform their lives publicly in the same way that they wanted to five years ago.”
  • Facebook is built around a trade-off that it has asked users to make: Give us all your personal information, post all your pictures, tag all your friends, and so on, forever. In return, we’ll optimize your social life. But this output is only as good as the input. And it turns out that, when scaled up, creating this input—making yourself legible enough to the Facebook machine that your posts are deemed “relevant” and worthy of being displayed to your mom and your friends—is exhausting labor.
  • These new apps, then, are arguments that we can still have an Internet that is weird, and private. That we can still have social networks without the social network. And that we can still have friends on the Internet without “friending” them.
  • A Brief History of Information Gatekeepers 1871: Western Union controls 90 percent of U.S. telegraph traffic. 1947: 97 percent of the country’s radio stations are affiliated with one of four national networks. 1969: Viewership for the three nightly network newscasts hits an all-time high, with 50 percent of all American homes tuning in. 1997: About half of all American homes with Internet access get it through America Online. 2002: Microsoft Internet Explorer captures 97 percent of the worldwide browser market. 2014: Amazon sells 63 percent of all books bought online—and 40 percent of books overall.