
Home/ TOK Friends/ Group items tagged youtube


Javier E

How YouTube Drives People to the Internet's Darkest Corners - WSJ - 0 views

  • YouTube is the new television, with more than 1.5 billion users, and videos the site recommends have the power to influence viewpoints around the world.
  • Those recommendations often present divisive, misleading or false content despite changes the site has recently made to highlight more-neutral fare, a Wall Street Journal investigation found.
  • Behind that growth is an algorithm that creates personalized playlists. YouTube says these recommendations drive more than 70% of its viewing time, making the algorithm among the single biggest deciders of what people watch.
  • People cumulatively watch more than a billion YouTube hours daily world-wide, a 10-fold increase from 2012
  • After the Journal this week provided examples of how the site still promotes deceptive and divisive videos, YouTube executives said the recommendations were a problem.
  • When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints.
  • Such recommendations play into concerns about how social-media sites can amplify extremist voices, sow misinformation and isolate users in “filter bubbles”
  • Unlike the Facebook Inc. and Twitter Inc. sites, where users see content from accounts they choose to follow, YouTube takes an active role in pushing to users information they likely wouldn’t otherwise have seen.
  • “The editorial policy of these new platforms is to essentially not have one,”
  • “That sounded great when it was all about free speech and ‘in the marketplace of ideas, only the best ones win.’ But we’re seeing again and again that that’s not what happens. What’s happening instead is the systems are being gamed and people are being gamed.”
  • YouTube has been tweaking its algorithm since last autumn to surface what its executives call “more authoritative” news sources.
  • YouTube last week said it is considering a design change to promote relevant information from credible news sources alongside videos that push conspiracy theories.
  • The Journal investigation found YouTube’s recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven’t shown interest in such content.
  • YouTube engineered its algorithm several years ago to make the site “sticky”—to recommend videos that keep users staying to watch still more, said current and former YouTube engineers who helped build it. The site earns money selling ads that run before and during videos.
  • YouTube’s algorithm tweaks don’t appear to have changed how YouTube recommends videos on its home page. On the home page, the algorithm provides a personalized feed for each logged-in user largely based on what the user has watched.
  • There is another way to calculate recommendations, demonstrated by YouTube’s parent, Alphabet Inc.’s Google. It has designed its search-engine algorithms to recommend sources that are authoritative, not just popular.
  • Google spokeswoman Crystal Dahlen said that Google improved its algorithm last year “to surface more authoritative content, to help prevent the spread of blatantly misleading, low-quality, offensive or downright false information,” adding that it is “working with the YouTube team to help share learnings.”
  • In recent weeks, it has expanded that change to other news-related queries. Since then, the Journal’s tests show, news searches in YouTube return fewer videos from highly partisan channels.
  • YouTube’s recommendations became even more effective at keeping people on the site in 2016, when the company began employing an artificial-intelligence technique called a deep neural network that makes connections between videos that humans wouldn’t. The algorithm uses hundreds of signals, YouTube says, but the most important remains what a given user has watched.
  • Using a deep neural network makes the recommendations more of a black box to engineers than previous techniques,
  • “We don’t have to think as much,” he said. “We’ll just give it some raw data and let it figure it out.”
  • To better understand the algorithm, the Journal enlisted former YouTube engineer Guillaume Chaslot, who worked on its recommendation engine, to analyze thousands of YouTube’s recommendations on the most popular news-related queries
  • Mr. Chaslot created a computer program that simulates the “rabbit hole” users often descend into when surfing the site. In the Journal study, the program collected the top five results to a given search. Next, it gathered the top three recommendations that YouTube promoted once the program clicked on each of those results. Then it gathered the top three recommendations for each of those promoted videos, continuing four clicks from the original search.
  • The first analysis, of November’s top search terms, showed YouTube frequently led users to divisive and misleading videos. On the 21 news-related searches left after eliminating queries about entertainment, sports and gaming—such as “Trump,” “North Korea” and “bitcoin”—YouTube most frequently recommended these videos:
  • The algorithm doesn’t seek out extreme videos, they said, but looks for clips that data show are already drawing high traffic and keeping people on the site. Those videos often tend to be sensationalist and on the extreme fringe, the engineers said.
  • Repeated tests by the Journal as recently as this week showed the home page often fed far-right or far-left videos to users who watched relatively mainstream news sources, such as Fox News and MSNBC.
  • Searching some topics and then returning to the home page without doing a new search can produce recommendations that push users toward conspiracy theories even if they seek out just mainstream sources.
  • After searching for “9/11” last month, then clicking on a single CNN clip about the attacks, and then returning to the home page, the fifth and sixth recommended videos were about claims the U.S. government carried out the attacks. One, titled “Footage Shows Military Plane hitting WTC Tower on 9/11—13 Witnesses React,” had 5.3 million views.
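The crawl the Journal describes is essentially a breadth-limited tree walk: top five search results, then the top three recommendations per video, repeated until four clicks from the original search. A minimal sketch of that procedure, assuming hypothetical `search_results` and `top_recommendations` functions standing in for the site's search page and sidebar (this is not Mr. Chaslot's actual program):

```python
def simulate_rabbit_hole(query, search_results, top_recommendations,
                         clicks=4, n_results=5, n_recs=3):
    """Collect every video surfaced within `clicks` hops of a search.

    The first click lands on a top search result; each later click
    follows the top recommendations shown next to the previous level.
    """
    frontier = search_results(query)[:n_results]
    collected = list(frontier)
    for _ in range(clicks - 1):  # the search result itself was click one
        frontier = [rec
                    for video in frontier
                    for rec in top_recommendations(video)[:n_recs]]
        collected.extend(frontier)
    return collected
```

With the defaults this gathers 5 + 15 + 45 + 135 = 200 videos per query, which is then enough to tally how often conspiratorial or partisan channels appear in the recommendation tree.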
sanderk

How YouTube's Recommendation Algorithm Really Works - The Atlantic - 0 views

  • YouTube wants to recommend things people will like, and the clearest signal of that is whether other people liked them. Pew found that 64 percent of recommendations went to videos with more than a million views. The 50 videos that YouTube recommended most often had been viewed an average of 456 million times each. Popularity begets popularity, at least in the case of users (or bots, as here) that YouTube doesn’t know much about.
  • First, as Pew’s software made choices, the system selected longer videos. It’s as if the software recognizes that the user is going to be around for a while, and starts to serve up longer fare. Second, it also began to recommend more popular videos regardless of how popular the starting video was.
  • The system learns from a video’s early performance, and if it does well, views can grow rapidly. In one case, a highly recommended kids’ video went from 34,000 views when Pew first encountered it in July to 30 million in August.
  • So, the challenge becomes how to recommend “new videos that users want to watch” when those videos are new to the system and low in views. (Finding fresh, potentially hot videos is important, YouTube researchers have written, for “propagating viral content.”)
  • more than 70 percent of the videos that YouTube recommended showed up on the list only once. It’s impossible to examine how hundreds of thousands of videos connect to each first random video when there are such limited data about each one.
  • People want to know if YouTube regularly radicalizes people with its recommendations, as the scholar Zeynep Tufekci has suggested. This study suggests that YouTube pushes an anonymous user toward more popular, not more fringe, content.
  • For my November magazine story about children’s YouTube, the company’s answer to these kinds of troubling suggestions was that YouTube isn’t for kids. Children, they told me, should be using only the YouTube Kids app, which has been built as a safe space for them
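Pew's numbers amount to a popularity prior for anonymous users: with no watch history to condition on, raw view counts dominate whatever topical signal exists. A toy illustration of that cold-start bias (the scoring formula and weights here are invented for illustration, not YouTube's):

```python
import math

def cold_start_score(views, relevance):
    """Toy ranking score: log-scale popularity can swamp topical relevance."""
    return relevance + math.log10(views + 1)

candidates = {
    "niche_but_relevant": (34_000, 0.9),      # strong topical match, few views
    "viral_but_generic": (456_000_000, 0.4),  # weak match, enormous reach
}
ranked = sorted(candidates,
                key=lambda name: cold_start_score(*candidates[name]),
                reverse=True)
```

Here the 456-million-view video outranks the better topical match, mirroring the "popularity begets popularity" pattern Pew observed for users the system knows nothing about.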
sissij

YouTube Filtering Draws Ire of Gay and Transgender Creators - The New York Times - 0 views

  • YouTube said on Sunday that it was investigating the simmering complaints by some users that its family-friendly “restricted mode” wrongly filters out some lesbian, gay, bisexual and transgender videos.
  • In a statement, YouTube said that many videos featuring lesbian, gay, bisexual and transgender content were unaffected by the filter, an optional parental-control setting, and that it only targeted those that discussed sensitive topics such as politics, health and sexuality.
  • In a statement, YouTube described restricted mode as “an optional feature used by a very small subset of users who want to have a more limited YouTube experience.”
  • the system is “not 100 percent accurate.”
  • Over the weekend, many video creators and users complained on Twitter, recycling the hashtag #YouTubeIsOverParty, which was trending worldwide by Sunday night.
  •  
    Restriction on social media has always been a controversial issue. I think the problem of the filtering system blocking videos by gay and transgender creators shouldn't be blamed entirely on YouTube. The system is probably not based on information about the creators' sexuality; I think it might take in comments and survey results from viewers. If so, this reflects that the mainstream community and mindset still reject and repel transgender and gay people. People don't want to touch sensitive topics. --Sissi (3/20/2017)
Javier E

How 2020 Forced Facebook and Twitter to Step In - The Atlantic - 0 views

  • mainstream platforms learned their lesson, accepting that they should intervene aggressively in more and more cases when users post content that might cause social harm.
  • During the wildfires in the American West in September, Facebook and Twitter took down false claims about their cause, even though the platforms had not done the same when large parts of Australia were engulfed in flames at the start of the year
  • Twitter, Facebook, and YouTube cracked down on QAnon, a sprawling, incoherent, and constantly evolving conspiracy theory, even though its borders are hard to delineate.
  • Content moderation comes to every content platform eventually, and platforms are starting to realize this faster than ever.
  • Nothing symbolizes this shift as neatly as Facebook’s decision in October (and Twitter’s shortly after) to start banning Holocaust denial. Almost exactly a year earlier, Zuckerberg had proudly tied himself to the First Amendment in a widely publicized “stand for free expression” at Georgetown University.
  • The evolution continues. Facebook announced earlier this month that it will join platforms such as YouTube and TikTok in removing, not merely labeling or down-ranking, false claims about COVID-19 vaccines.
  • the pandemic also showed that complete neutrality is impossible. Even though it’s not clear that removing content outright is the best way to correct misperceptions, Facebook and other platforms plainly want to signal that, at least in the current crisis, they don’t want to be seen as feeding people information that might kill them.
  • When internet platforms announce new policies, assessing whether they can and will enforce them consistently has always been difficult. In essence, the companies are grading their own work. But too often what can be gleaned from the outside suggests that they’re failing.
  • It tweaked its algorithm to boost authoritative sources in the news feed and turned off recommendations to join groups based around political or social issues. Facebook is reversing some of these steps now, but it cannot make people forget this toolbox exists in the future
  • As platforms grow more comfortable with their power, they are recognizing that they have options beyond taking posts down or leaving them up. In addition to warning labels, Facebook implemented other “break glass” measures to stem misinformation as the election approached.
  • Platforms don’t deserve praise for belatedly noticing dumpster fires that they helped create and affixing unobtrusive labels to them
  • Warning labels for misinformation might make some commentators feel a little better, but whether labels actually do much to contain the spread of false information is still unknown.
  • News reporting suggests that insiders at Facebook knew they could and should do more about misinformation, but higher-ups vetoed their ideas. YouTube barely acted to stem the flood of misinformation about election results on its platform.
  • Even before the pandemic, YouTube had begun adjusting its recommendation algorithm to reduce the spread of borderline and harmful content, and is introducing pop-up nudges to encourage users
  • And if 2020 finally made clear to platforms the need for greater content moderation, it also exposed the inevitable limits of content moderation.
  • Down-ranking, labeling, or deleting content on an internet platform does not address the social or political circumstances that caused it to be posted in the first place
  • even the most powerful platform will never be able to fully compensate for the failures of other governing institutions or be able to stop the leader of the free world from constructing an alternative reality when a whole media ecosystem is ready and willing to enable him. As Renée DiResta wrote in The Atlantic last month, “reducing the supply of misinformation doesn’t eliminate the demand.”
  • Even so, this year’s events showed that nothing is innate, inevitable, or immutable about platforms as they currently exist. The possibilities for what they might become—and what role they will play in society—are limited more by imagination than any fixed technological constraint, and the companies appear more willing to experiment than ever.
Javier E

Opinion | Elle Mills: Why I Quit YouTube - The New York Times - 0 views

  • The peak of my YouTube career didn’t always match my childhood fantasy of what this sort of fame might look like. Instead, I was constantly terrified of losing my audience and the validation that came with it. My self-worth had become so intertwined with my career that maintaining it genuinely felt life-or-death. I was stuck in a never-ending cycle of constantly trying to top myself to remain relevant.
  • YouTube soon became a game of, “What’s the craziest thing you’d do for attention?”
  • there’s an overwhelming guilt I feel when I look back at all those who naïvely participated in my videos. A part of me feels like I took advantage of their own longing to be seen. I gained fame and success from the exploitation of their lives. They didn’t.
  • I knew that my audience wanted to feel authenticity from me. To give that to them, I revealed pieces of myself that I might have been wiser to keep private.
  • when metrics substitute for self-worth, it’s easy to fall into the trap of giving precious pieces of yourself away to feed an audience that’s always hungry for more and more.
  • In 2018, I impulsively released a video about my struggle with burnout, which featured intimate footage of my emotional breakdowns. Those breakdowns were, in part, a product of severe anxiety and depression brought about by chasing the exact success for which many other teenagers yearn.
  • I was entering adulthood and trying to live my childhood dream, but now, to be “authentic,” I had to be the product I had long been posting online, as opposed to the person I was growing up to be.
  • Online culture encourages young people to turn themselves into a product at an age when they’re only starting to discover who they are. When an audience becomes emotionally invested in a version of you that you outgrow, keeping the product you’ve made aligned with yourself becomes an impossible dilemma.
  • Sometimes, I barely recognize the person I used to be. Although a part of me resents that I’ll never be able to forget her, I’m also grateful to her. My YouTube channel, for all the trouble it brought me, connected me to the people who wanted to hear my stories and prepared me for a real shot at a directing career. In the last year, I’ve directed a short film and am writing a feature, which showed me new ways of creating that aren’t at the expense of my privacy.
Javier E

'ContraPoints' Is Political Philosophy Made for YouTube - The Atlantic - 1 views

  • While Wynn positions herself on the left, she is no dogmatic ideologue, readily admitting to points on the right and criticizing leftist arguments when warranted
  • She has described her work as “edutainment” and “propaganda,” and it’s both
  • But what makes her videos unique is the way Wynn combines those two elements: high standards of rational argument and not-quite-rational persuasion. ContraPoints offers compelling speech aimed at truth, rendered in the raucous, meme-laden idiom of the internet.
  • In 2014, Wynn noticed a trend on YouTube that disturbed her: Videos with hyperbolic titles like “why feminism ruins everything,” “SJW cringe compilation,” and “Ben Shapiro DESTROYS Every College Snowflake” were attracting millions of views and spawning long, jeering comment threads. Wynn felt she was watching the growth of a community of outrage that believes feminists, Marxists, and multiculturalists are conspiring to destroy freedom of speech, liquidate gender norms, and demolish Western civilization
  • Wynn created ContraPoints to offer entertaining, coherent rebuttals to these kinds of ideas. Her videos also explain left-wing talking points—like rape culture and cultural appropriation—and use philosophy to explore topics that are important to Wynn, such as the meaning of gender for trans people.
  • Wynn thinks it’s a mistake to assume that viewers of angry, right-wing videos are beyond redemption. “It’s quite difficult to get through to the people who are really committed to these anti-progressive beliefs,” Wynn told me recently. However, she said, she believes that many viewers find such ideas “psychologically resonant” without being hardened reactionaries. This broad, not fully committed center—comprising people whose minds can still be changed—is Wynn’s target audience.
  • Usually, the videos to which Wynn is responding take the stance of dogged reason cutting through the emotional excesses of so-called “political correctness.” For example, the American conservative commentator Ben Shapiro, who is a target of a recent ContraPoints video, has made “facts don’t care about your feelings” his motto. Wynn’s first step in trying to win over those who find anti-progressive views appealing is to show that these ideas often rest on a flimsy foundation. To do so, she fully adopts the rational standards of argument that her rivals pride themselves on following, and demonstrates how they fail to achieve them
  • Wynn dissects her opponents’ positions, holding up fallacies, evasions, and other rhetorical tricks for examination, all the while providing a running commentary on good argumentative method.
  • The host defends her own positions according to the same principles. Wynn takes on the strongest version of her opponent’s argument, acknowledges when she thinks her opponents are right and when she has been wrong, clarifies when misunderstood, and provides plenty of evidence for her claims
  • for Wynn, the true key to persuasion is to engage her audience on an emotional level.
  • she critiques many of her leftist allies for being bad at persuasion.
  • Socrates persuaded by both the logic of argument and the dynamic of fandom. Wynn is beginning to grow a dedicated following of her own: Members of online discussion groups refer to her as “mother” and “the queen,” produce fan art, and post photos of themselves dressed as characters from her videos.
  • she shares Socrates’s view that philosophy is more an erotic art than a martial one
  • As she puts it, she’s not trying to destroy the people she addresses, but seduce them
  • Wynn is a former Ph.D. student in philosophy, and though her videos are too rich with dick jokes for official settings, her argumentative practice would pass muster in any grad seminar.
  • One thing she has come across repeatedly is a disdain for the left’s perceived moral superiority. Anti-progressives of all stripes, Wynn told me, show an “intense defensiveness against being told what to do” and a “repulsion in response to moralizing.”
  • Matching her speech to the audience’s tastes presents a prickly rhetorical challenge. In an early video, Contra complains: “The problem is this medium. These goddamn savages demand a circus, and I intend to give them one, but behind the curtain, I really just want to have a conversation.”
  • Philosophical conversation requires empathy and good-faith engagement. But the native tongue of political YouTube is ironic antagonism. It’s Wynn’s inimitable way of combining these two ingredients that gives ContraPoints its distinctive mouthfeel.
  • Wynn spends weeks in the online communities of her opponents—whether they’re climate skeptics or trans-exclusionary feminists—trying to understand what they believe and why they believe it. In Socrates’s words, she’s studying the souls of her audience.
pier-paolo

The YouTube Paradox - The New York Times - 0 views

  • The online video-sharing upstart, which shows more than 100 million video clips a day, has titans wondering: friend or foe?
  • trying to improve the site for its users while working to find arrangements that will satisfy Hollywood.
  • that YouTube and MySpace, the social networking site, “are copyright infringers and owe us tens of millions of dollars.”
  • an early Internet video site that was bought by Yahoo, said that YouTube would eventually be “sued into oblivion” because of copyright violations.
  • dealing with YouTube “is not the typical situation we run into every day” because the company has not thought through many of the issues raised by its business.
  • “Part of what we are doing is working with them to figure out what their business model can be,”
  • The minute a Yahoo, Microsoft or News Corp. buys YouTube, you have a target for anyone who wants to sue.
Javier E

TikTok Brain Explained: Why Some Kids Seem Hooked on Social Video Feeds - WSJ - 0 views

  • Remember the good old days when kids just watched YouTube all day? Now that they binge on 15-second TikToks, those YouTube clips seem like PBS documentaries.
  • Many parents tell me their kids can’t sit through feature-length films anymore because to them the movies feel painfully slow. Others have observed their kids struggling to focus on homework. And reading a book? Forget about it.
  • What is happening to kids’ brains?
  • “It is hard to look at increasing trends in media consumption of all types, media multitasking and rates of ADHD in young people and not conclude that there is a decrease in their attention span,
  • Emerging research suggests that watching short, fast-paced videos makes it harder for kids to sustain activities that don’t offer instant—and constant—gratification.
  • One of the few studies specifically examining TikTok-related effects on the brain focused on Douyin, the TikTok equivalent in China, made by the same Chinese parent company, ByteDance Ltd. It found that the personalized videos the app’s recommendation engine shows users activate the reward centers of the brain, as compared with the general-interest videos shown to new users.
  • Brain scans of Chinese college students showed that areas involved in addiction were highly activated in those who watched personalized videos.
  • It also found some people have trouble controlling when to stop watching.
  • “If kids’ brains become accustomed to constant changes, the brain finds it difficult to adapt to a nondigital activity where things don’t move quite as fast,”
  • A TikTok spokeswoman said the company wants younger teens to develop positive digital habits early on, and that it recently made some changes aimed at curbing extensive app usage. For example, TikTok won’t allow users ages 13 to 15 to receive push notifications after 9 p.m. TikTok also periodically reminds users to take a break to go outside or grab a snack.
  • Kids have a hard time pulling away from videos on YouTube, too, and Google has made several changes to help limit its use, including turning off autoplay by default on accounts of people under 18.
  • When kids do things that require prolonged focus, such as reading or solving math problems, they’re using directed attention
  • This function starts in the prefrontal cortex, the part of the brain responsible for decision making and impulse control.
  • “Directed attention is the ability to inhibit distractions and sustain attention and to shift attention appropriately. It requires higher-order skills like planning and prioritizing,”
  • Kids generally have a harder time doing this—and putting down their videogame controllers—because the prefrontal cortex isn’t fully developed until age 25.
  • “We speculate that individuals with lower self-control ability have more difficulty shifting attention away from favorite video stimulation,”
  • “In the short-form snackable world, you’re getting quick hit after quick hit, and as soon as it’s over, you have to make a choice,” said Mass General’s Dr. Marci, who wrote the new book “Rewired: Protecting Your Brain in the Digital Age.” The more developed the prefrontal cortex, the better the choices.
  • Dopamine is a neurotransmitter that gets released in the brain when it’s expecting a reward. A flood of dopamine reinforces cravings for something enjoyable, whether it’s a tasty meal, a drug or a funny TikTok video.
  • “TikTok is a dopamine machine,” said John Hutton, a pediatrician and director of the Reading & Literacy Discovery Center at Cincinnati Children’s Hospital. “If you want kids to pay attention, they need to practice paying attention.”
  • Researchers are just beginning to conduct long-term studies on digital media’s effects on kids’ brains. The National Institutes of Health is funding a study of nearly 12,000 adolescents as they grow into adulthood to examine the impact that many childhood experiences—from social media to smoking—have on cognitive development.
  • she predicts they will find that when brains repeatedly process rapid, rewarding content, their ability to process less-rapid, less-rewarding things “may change or be harmed.”
  • “It’s like we’ve made kids live in a candy store and then we tell them to ignore all that candy and eat a plate of vegetables,”
  • “We have an endless flow of immediate pleasures that’s unprecedented in human history.”
  • Parents and kids can take steps to boost attention, but it takes effort
  • Swap screen time for real time. Exercise and free play are among the best ways to build attention during childhood,
  • “Depriving kids of tech doesn’t work, but simultaneously reducing it and building up other things, like playing outside, does,”
  • Practice restraint.
  • “When you practice stopping, it strengthens those connections in the brain to allow you to stop again next time.”
  • Use tech’s own tools. TikTok has a screen-time management setting that allows users to cap their app usage.
  • Ensure good sleep. Teens are suffering from a sleep deficit.
Javier E

YouTube to Curb Its Referrals to Conspiracy Theories and Other False Claims - WSJ - 0 views

  • Videos that could “misinform users in harmful ways,” such as ones that claim the Earth isn’t round or question the actors behind the Sept. 11 terrorist attacks, will no longer be recommended with as much prominence, the Alphabet Inc. unit said in a blog post Friday.
  • Though the factors underpinning YouTube’s recommendation system are largely unknown, its influence is apparent in the numbers. YouTube has said its recommendations drive more than 70% of users’ viewing time, and that it recommends more than 200 million videos daily on its home page alone.
Javier E

'Meta-Content' Is Taking Over the Internet - The Atlantic - 0 views

  • Jenn, however, has complicated things by adding an unexpected topic to her repertoire: the dangers of social media. She recently spoke about disengaging from it for her well-being; she also posted an Instagram Story about the risks of ChatGPT
  • and, in none other than a YouTube video, recommended Neil Postman’s Amusing Ourselves to Death, a seminal piece of media critique from 1985 that denounces television’s reduction of life to entertainment.
  • (Her other book recommendations included Stolen Focus, by Johann Hari, and Recapture the Rapture, by Jamie Wheal.)
  • Social-media platforms are “preying on your insecurities; they’re preying on your temptations,” Jenn explained to me in an interview that shifted our parasocial connection, at least for an hour, to a mere relationship. “And, you know, I do play a role in this.” Jenn makes money through aspirational advertising, after all—a familiar part of any influencer’s job.
  • She’s pro–parasocial relationships, she explains to the camera, but only if we remain aware that we’re in one. “This relationship does not replace existing friendships, existing relationships,” she emphasizes. “This is all supplementary. Like, it should be in addition to your life, not a replacement.” I sat there watching her talk about parasocial relationships while absorbing the irony of being in one with her.
  • The open acknowledgment of social media’s inner workings, with content creators exposing the foundations of their content within the content itself, is what Alice Marwick, an associate communications professor at the University of North Carolina at Chapel Hill, described to me as “meta-content.”
  • Meta-content can be overt, such as the vlogger Casey Neistat wondering, in a vlog, if vlogging your life prevents you from being fully present in it;
  • But meta-content can also be subtle: a vlogger walking across the frame before running back to get the camera. Or influencers vlogging themselves editing the very video you’re watching, in a moment of space-time distortion.
  • Viewers don’t seem to care. We keep watching, fully accepting the performance. Perhaps that’s because the rise of meta-content promises a way to grasp authenticity by acknowledging artifice; especially in a moment when artifice is easier to create than ever before, audiences want to know what’s “real” and what isn’t.
  • “The idea of a space where you can trust no sources, there’s no place to sort of land, everything is put into question, is a very unsettling, unsatisfying way to live.
  • So we continue to search for, as Murray observes, the “agreed-upon things, our basic understandings of what’s real, what’s true.” But when the content we watch becomes self-aware and even self-critical, it raises the question of whether we can truly escape the machinations of social media. Maybe when we stare directly into the abyss, we begin to enjoy its company.
  • “The difference between BeReal and the social-media giants isn’t the former’s relationship to truth but the size and scale of its deceptions.” BeReal users still angle their camera and wait to take their daily photo at an aesthetic time of day. The snapshots merely remind us how impossible it is to stop performing online.
  • Jenn’s concern over the future of the internet stems, in part, from motherhood. She recently had a son, Lennon (whose first birthday party I watched on YouTube), and worries about the digital world he’s going to inherit.
  • Back in the age of MySpace, she had her own internet friends and would sneak out to parking lots at 1 a.m. to meet them in real life: “I think this was when technology was really used as a tool to connect us.” Now, she explained, it’s beginning to ensnare us. Posting content online is no longer a means to an end so much as the end itself.
  • We used to view influencers’ lives as aspirational, a reality that we could reach toward. Now both sides acknowledge that they’re part of a perfect product that the viewer understands is unattainable and the influencer acknowledges is not fully real.
  • “I forgot to say this to her in the interview, but I truly think that my videos are less about me and more of a reflection of where you are currently … You are kind of reflecting on your own life and seeing what resonates [with] you, and you’re discarding what doesn’t. And I think that’s what’s beautiful about it.”
  • meta-content is fundamentally a compromise. Recognizing the delusion of the internet doesn’t alter our course within it so much as remind us how trapped we truly are—and how we wouldn’t have it any other way.
fischerry

What is Natural Selection? - YouTube - 0 views

  • What is Natural Selection?
dpittenger

How Many Things Are There? - 1 views

  • This is a YouTube video that discusses how many "things" there are. He defines what counts as a "thing" and then eventually calculates how many things there are. This is related to TOK because it shows how "things" can be relative to a person, and it also shows that everything could possibly be limited.
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • ...43 more annotations...
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day.”
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

The Lasting Lessons of John Conway's Game of Life - The New York Times - 0 views

  • “Because of its analogies with the rise, fall and alterations of a society of living organisms, it belongs to a growing class of what are called ‘simulation games,’” Mr. Gardner wrote when he introduced Life to the world 50 years ago with his October 1970 column.
  • The Game of Life motivated the use of cellular automata in the rich field of complexity science, with simulations modeling everything from ants to traffic, clouds to galaxies. More trivially, the game attracted a cult of “Lifenthusiasts,” programmers who spent a lot of time hacking Life — that is, constructing patterns in hopes of spotting new Life-forms.
  • The tree of Life also includes oscillators, such as the blinker, and spaceships of various sizes (the glider being the smallest).
  • ...24 more annotations...
  • Patterns that didn’t change one generation to the next, Dr. Conway called still lifes — such as the four-celled block, the six-celled beehive or the eight-celled pond. Patterns that took a long time to stabilize, he called methuselahs.
  • The second thing Life shows us is something that Darwin hit upon when he was looking at Life, the organic version. Complexity arises from simplicity!
  • I first encountered Life at the Exploratorium in San Francisco in 1978. I was hooked immediately by the thing that has always hooked me — watching complexity arise out of simplicity.
  • Life shows you two things. The first is sensitivity to initial conditions. A tiny change in the rules can produce a huge difference in the output, ranging from complete destruction (no dots) through stasis (a frozen pattern) to patterns that keep changing as they unfold.
  • Life shows us complex virtual “organisms” arising out of the interaction of a few simple rules — so goodbye “Intelligent Design.”
  • I’ve wondered for decades what one could learn from all that Life hacking. I recently realized it’s a great place to try to develop “meta-engineering” — to see if there are general principles that govern the advance of engineering and help us predict the overall future trajectory of technology.
  • Melanie Mitchell— Professor of complexity, Santa Fe Institute
  • Given Conway’s proof that the Game of Life can be made to simulate a Universal Computer — that is, it could be “programmed” to carry out any computation that a traditional computer can do — the extremely simple rules can give rise to the most complex and most unpredictable behavior possible. This means that there are certain properties of the Game of Life that can never be predicted, even in principle!
  • I use the Game of Life to make vivid for my students the ideas of determinism, higher-order patterns and information. One of its great features is that nothing is hidden; there are no black boxes in Life, so you know from the outset that anything that you can get to happen in the Life world is completely unmysterious and explicable in terms of a very large number of simple steps by small items.
  • In Thomas Pynchon’s novel “Gravity’s Rainbow,” a character says, “But you had taken on a greater and more harmful illusion. The illusion of control. That A could do B. But that was false. Completely. No one can do. Things only happen.” This is compelling but wrong, and Life is a great way of showing this.
  • In Life, we might say, things only happen at the pixel level; nothing controls anything, nothing does anything. But that doesn’t mean that there is no such thing as action, as control; it means that these are higher-level phenomena composed (entirely, with no magic) from things that only happen.
  • Stephen Wolfram— Scientist and C.E.O., Wolfram Research
  • Brian Eno— Musician, London
  • Bert Chan— Artificial-life researcher and creator of the continuous cellular automaton “Lenia,” Hong Kong
  • it did have a big impact on beginner programmers, like me in the 90s, giving them a sense of wonder and a kind of confidence that some easy-to-code math models can produce complex and beautiful results. It’s like a starter kit for future software engineers and hackers, together with Mandelbrot Set, Lorenz Attractor, et cetera.
  • if we think about our everyday life, about corporations and governments, the cultural and technical infrastructures humans built for thousands of years, they are not unlike the incredible machines that are engineered in Life.
  • In normal times, they are stable and we can keep building stuff one component upon another, but in harder times like this pandemic or a new Cold War, we need something that is more resilient and can prepare for the unpreparable. That would need changes in our “rules of life,” which we take for granted.
  • Rudy Rucker— Mathematician and author of “Ware Tetralogy,” Los Gatos, Calif.
  • That’s what chaos is about. The Game of Life, or a kinky dynamical system like a pair of pendulums, or a candle flame, or an ocean wave, or the growth of a plant — they aren’t readily predictable. But they are not random. They do obey laws, and there are certain kinds of patterns — chaotic attractors — that they tend to produce. But again, unpredictable is not random. An important and subtle distinction which changed my whole view of the world.
  • William Poundstone— Author of “The Recursive Universe: Cosmic Complexity and the Limits of Scientific Knowledge,” Los Angeles, Calif.
  • The Game of Life’s pulsing, pyrotechnic constellations are classic examples of emergent phenomena, introduced decades before that adjective became a buzzword.
  • Fifty years later, the misfortunes of 2020 are the stuff of memes. The biggest challenges facing us today are emergent: viruses leaping from species to species; the abrupt onset of wildfires and tropical storms as a consequence of a small rise in temperature; economies in which billions of free transactions lead to staggering concentrations of wealth; an internet that becomes more fraught with hazard each year
  • Looming behind it all is our collective vision of an artificial intelligence-fueled future that is certain to come with surprises, not all of them pleasant.
  • The name Conway chose — the Game of Life — frames his invention as a metaphor. But I’m not sure that even he anticipated how relevant Life would become, and that in 50 years we’d all be playing an emergent game of life and death.
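The excerpts above mention still lifes, blinkers, and gliders without quoting the rules themselves, but Conway's well-known B3/S23 rules (birth on exactly 3 live neighbors, survival on 2 or 3) fit in a few lines. Here is a minimal Python sketch, assuming for simplicity a small wrapped (toroidal) grid rather than the unbounded plane Life is usually defined on:

```python
def step(grid):
    """Advance one generation of Conway's Game of Life (rules B3/S23).

    The grid wraps at the edges (toroidal), so every cell has
    exactly eight neighbors.
    """
    rows, cols = len(grid), len(grid[0])

    def live_neighbors(r, c):
        return sum(grid[(r + dr) % rows][(c + dc) % cols]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))

    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            n = live_neighbors(r, c)
            # Birth: a dead cell with exactly 3 live neighbors comes alive.
            # Survival: a live cell with 2 or 3 live neighbors stays alive.
            new[r][c] = 1 if n == 3 or (grid[r][c] and n == 2) else 0
    return new

# The blinker mentioned above: three live cells in a row on a 5x5 grid.
blinker = [[0] * 5 for _ in range(5)]
for c in (1, 2, 3):
    blinker[2][c] = 1

gen1 = step(blinker)   # flips to a vertical bar of three cells
gen2 = step(gen1)      # flips back: a period-2 oscillator
```

Stepping the blinker twice returns it to its starting position, which is exactly what makes it the oscillator the excerpt refers to; complexity arising from these few lines of rules is the point the contributors keep circling back to.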
Javier E

Dhar Mann, YouTube's Moral Philosopher - The New York Times - 0 views

  • His “focus on universal truths,” he believes, is what has allowed him “to build such a massive audience.”
  • It’s 25 to 34. “Facebook and YouTube don’t give data for audience under 13 so I can’t say for sure 7 to 10 is the fastest growing audience, it just feels like it based on my interactions with people,” he wrote in an email.
  • Most of his videos incorporate timely narratives about police-calling Karens and Covid-19 hoarders, but in style and tone they are more reminiscent of 1980s after-school specials and the educational short films of the ’50s than other content that’s popular today.
  • ...3 more annotations...
  • The characters are broad and simple, each representing a demographic that any fourth grader could recognize: angry mom, spoiled wife, mean girl, lazy husband. They seem almost like instructional videos an alien species might watch to learn the basic points of American social dynamics.
  • Mr. Mann’s moral philosophy can at times feel thin and absolutist. A common narrative arc involves a bully mocking the protagonist for being poor or having acne; then a twist of fate strikes the bully with poverty or pimples. The videos often imply that having any kind of social problem is a form of shameful karmic punishment.
  • the size of his audience suggests that Mr. Mann is tapping into something millions of people find compelling. In trying times — say, a pandemic with no end in sight paired with devastating wildfires on several continents and a bleak climate outlook — people want to see villains reformed and lessons delivered. No ambiguity, no debates. Everything turns out just right.
Javier E

Why Didn't the Government Stop the Crypto Scam? - 0 views

  • By 1935, the New Dealers had set up a new agency, the Securities and Exchange Commission, and cleaned out the FTC. Yet there was still immense concern that Roosevelt had not been able to tame Wall Street. The Supreme Court didn’t really ratify the SEC as a constitutional body until 1938, and nearly struck it down in 1935 when a conservative Supreme Court made it harder for the SEC to investigate cases.
  • It took a few years, but New Dealers finally implemented a workable set of securities rules, with the courts agreeing on basic definitions of what was a security. By the 1950s, SEC investigators could raise an eyebrow and change market behavior, and the amount of cheating in finance had dropped dramatically.
  • Institutional change, in other words, takes time.
  • ...22 more annotations...
  • It’s a lesson to remember as we watch the crypto space melt down, with ex-billionaire Sam Bankman-Fried
  • It’s not like perfidy in crypto was some hidden secret. At the top of the market, back in December 2021, I wrote a piece very explicitly saying that crypto was a set of Ponzi schemes. It went viral, and I got a huge amount of hate mail from crypto types
  • one of the more bizarre aspects of the crypto meltdown is the deep anger not just at those who perpetrated it, but at those who were trying to stop the scam from going on. For instance, here’s crypto exchange Coinbase CEO Brian Armstrong, who just a year ago was fighting regulators vehemently, blaming the cops for allowing gambling in the casino he helps run.
  • FTX.com was an offshore exchange not regulated by the SEC. The problem is that the SEC failed to create regulatory clarity here in the US, so many American investors (and 95% of trading activity) went offshore. Punishing US companies for this makes no sense.
  • many crypto ‘enthusiasts’ watching Gensler discuss regulation with his predecessor “called for their incarceration or worse.”
  • Cryptocurrencies are securities, and should fit under securities law, which would have imposed rules that would foster a de facto ban of the entire space. But since regulators had not actually treated them as securities for the last ten years, a whole new gray area of fake law had emerged
  • Almost as soon as he took office, Gensler sought to fix this situation, and treat them as securities. He began investigating important players
  • But the legal wrangling to just get the courts to treat crypto as a set of speculative instruments regulated under securities law made the law moot
  • In May of 2022, a year after Gensler began trying to do something about Terra/Luna, Kwon’s scheme blew up. In a comically-too-late-to-matter gesture, an appeals court then said that the SEC had the right to compel information from Kwon’s now-bankrupt scheme. It is absolute lunacy that well-settled law, like the ability for the SEC to investigate those in the securities business, is now being re-litigated.
  • Securities and Exchange Commission Chair Gary Gensler, who took office in April of 2021 with a deep background in Wall Street, regulatory policy, and crypto, which he had taught at MIT years before joining the SEC. Gensler came in with the goal of implementing the rule of law in the crypto space, which he knew was full of scams and based on unproven technology. Yesterday, on CNBC, he was again confronted with Andrew Ross Sorkin essentially asking, “Why were you going after minor players when this Ponzi scheme was so flagrant?”
  • it wasn’t just the courts who were an impediment. Gensler wasn’t the only cop on the beat. Other regulators, like those at the Commodities Futures Trading Commission, the Federal Reserve, or the Office of Comptroller of the Currency, not only refused to take action, but actively defended their regulatory turf against an attempt from the SEC to stop the scams.
  • Behind this was the fist of political power. Everyone saw the incentives the Senate laid down when every single Republican, plus a smattering of Democrats, defeated the nomination of crypto-skeptic Saule Omarova in becoming the powerful bank regulator at the Comptroller of the Currency
  • Instead of strong figures like Omarova, we had a weakling acting Comptroller Michael Hsu at the OCC, put there by the excessively cautious Treasury Secretary Janet Yellen. Hsu refused to stop bank interactions with crypto or fintech because, as he told Congress in 2021, “These trends cannot be stopped.”
  • It’s not just these regulators; everyone wanted a piece of the bureaucratic pie. In March of 2022, before it all unraveled, the Biden administration issued an executive order on crypto. In it, Biden said that virtually every single government agency would have a hand in the space.
  • That’s… insane. If everyone’s in charge, no one is.
  • And behind all of these fights was the money and political prestige of some most powerful people in Silicon Valley, who were funding a large political fight to write the rules for crypto, with everyone from former Treasury Secretary Larry Summers to former SEC Chair Mary Jo White on the payroll.
  • (Even now, even after it was all revealed as a Ponzi scheme, Congress is still trying to write rules favorable to the industry. It’s like, guys, stop it. There’s no more bribe money!)
  • Moreover, the institution Gensler took over was deeply weakened. Since the Reagan administration, wave after wave of political leader at the SEC has gutted the place and dumbed down the enforcers. Courts have tied up the commission in knots, and Congress has defanged it
  • Under Trump crypto exploded, because his SEC chair Jay Clayton had no real policy on crypto (and then immediately went into the industry after leaving.) The SEC was so dormant that when Gensler came into office, some senior lawyers actually revolted over his attempt to make them do work.
  • In other words, the regulators were tied up in the courts, they were against an immensely powerful set of venture capitalists who have poured money into Congress and D.C., they had feeble legal levers, and they had to deal with ‘crypto enthusiasts' who thought they should be jailed or harmed for trying to impose basic rules around market manipulation.
  • The bottom line is, Gensler is just one regulator, up against a lot of massed power, money, and bad institutional habits. And we as a society simply made the choice through our elected leaders to have little meaningful law enforcement in financial markets, which first became blindingly obvious in 2008 during the financial crisis, and then became comical ten years later when a sector whose only real use cases were money laundering, Ponzi scheming or buying drugs on the internet, managed to rack up enough political power to bring Tony Blair and Bill Clinton to a conference held in a tax haven billed as ‘the future.’
Javier E

Ad About Women's Self-Image Creates a Sensation - NYTimes.com - 0 views

  • An online video, presented in three- and six-minute versions, shows a forensic sketch artist who is asked to draw a series of women based only on their descriptions. Seated at a drafting table with his back to his subject, the artist, Gil Zamora, asks the women a series of questions about their features. “Tell me about your chin,” he says in the soft voice reminiscent of a therapist’s. Crow’s feet, big jaws, protruding chins and dark circles are just some of the many physical features that women criticized about themselves. After he finishes a drawing of a woman, he then draws another sketch of the same woman, only this time it is based on how someone else describes her. The sketches are then hung side by side and the women are asked to compare them. In every instance, the second sketch is more flattering than the first.
  • The video, shot in a loft in San Francisco, has become a sensation online. The three-minute version has been viewed more than 7.5 million times on the Dove YouTube channel, and the version that is twice as long has been viewed more than 936,000 times.
  • Dove executives said the campaign resulted from company research that showed only 4 percent of women consider themselves beautiful.
  • ...3 more annotations...
  • “As women we are so hard on ourselves physically and emotionally,” Ms. Olive said. “It gets you to stop and think about how we think of ourselves.”
  • Ms. Brice took issue with the tag line for the ad, “You’re More Beautiful Than You Think.” “I think it makes people much more susceptible to absorbing the subconscious messages,” Ms. Brice said, “that at the heart of it all is that beauty is still what defines women. It is a little hypocritical.”
  • “What if I did look like that woman on the left?” she said, referring to the less flattering sketches of the women. “There are people that look like that.”