History Readings: Group items tagged YouTube

Javier E

'Fiction is outperforming reality': how YouTube's algorithm distorts truth | Technology... - 0 views

  • There are 1.5 billion YouTube users in the world, which is more than the number of households that own televisions. What they watch is shaped by this algorithm, which skims and ranks billions of videos to identify 20 “up next” clips that are both relevant to a previous video and most likely, statistically speaking, to keep a person hooked on their screen.
  • Company insiders tell me the algorithm is the single most important engine of YouTube’s growth
  • YouTube engineers describe it as one of the “largest scale and most sophisticated industrial recommendation systems in existence”
  • Lately, it has also become one of the most controversial. The algorithm has been found to be promoting conspiracy theories about the Las Vegas mass shooting and incentivising, through recommendations, a thriving subculture that targets children with disturbing content
  • One YouTube creator who was banned from making advertising revenues from his strange videos – which featured his children receiving flu shots, removing earwax, and crying over dead pets – told a reporter he had only been responding to the demands of Google’s algorithm. “That’s what got us out there and popular,” he said. “We learned to fuel it and do whatever it took to please the algorithm.”
  • academics have speculated that YouTube’s algorithms may have been instrumental in fuelling disinformation during the 2016 presidential election. “YouTube is the most overlooked story of 2016,” Zeynep Tufekci, a widely respected sociologist and technology critic, tweeted back in October. “Its search and recommender algorithms are misinformation engines.”
  • Those are not easy questions to answer. Like all big tech companies, YouTube does not allow us to see the algorithms that shape our lives. They are secret formulas, proprietary software, and only select engineers are entrusted to work on the algorithm
  • Guillaume Chaslot, a 36-year-old French computer programmer with a PhD in artificial intelligence, was one of those engineers.
  • The experience led him to conclude that the priorities YouTube gives its algorithms are dangerously skewed.
  • Chaslot said none of his proposed fixes were taken up by his managers. “There are many ways YouTube can change its algorithms to suppress fake news and improve the quality and diversity of videos people see,” he says. “I tried to change YouTube from the inside but it didn’t work.”
  • Chaslot explains that the algorithm never stays the same. It is constantly changing the weight it gives to different signals: the viewing patterns of a user, for example, or the length of time a video is watched before someone clicks away.
  • The engineers he worked with were responsible for continuously experimenting with new formulas that would increase advertising revenues by extending the amount of time people watched videos. “Watch time was the priority,” he recalls. “Everything else was considered a distraction.”
  • Chaslot was fired by Google in 2013, ostensibly over performance issues. He insists he was let go after agitating for change within the company, using his personal time to team up with like-minded engineers to propose changes that could diversify the content people see.
  • He was especially worried about the distortions that might result from a simplistic focus on showing people videos they found irresistible, creating filter bubbles, for example, that only show people content that reinforces their existing view of the world.
  • “YouTube is something that looks like reality, but it is distorted to make you spend more time online,” he tells me when we meet in Berkeley, California. “The recommendation algorithm is not optimising for what is truthful, or balanced, or healthy for democracy.”
  • YouTube told me that its recommendation system had evolved since Chaslot worked at the company and now “goes beyond optimising for watchtime”.
  • It did not say why Google, which acquired YouTube in 2006, waited over a decade to make those changes
  • Chaslot believes such changes are mostly cosmetic, and have failed to fundamentally alter some disturbing biases that have evolved in the algorithm
  • It finds videos through a word search, selecting a “seed” video to begin with, and recording several layers of videos that YouTube recommends in the “up next” column. It does so with no viewing history, ensuring the videos being detected are YouTube’s generic recommendations, rather than videos personalised to a user. And it repeats the process thousands of times, accumulating layers of data about YouTube recommendations to build up a picture of the algorithm’s preferences.
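The crawl described above can be sketched in a few lines. This is a hypothetical illustration, not Chaslot's actual code: YouTube exposes no public "up next" API, so the `FAKE_UP_NEXT` table below stands in for what a real scraper would pull from the watch page using a logged-out, history-free session.

```python
from collections import Counter

# Stand-in for YouTube's "up next" column: maps a video to the clips the
# algorithm would recommend next. A real crawler would scrape these from
# the watch page with no cookies or viewing history.
FAKE_UP_NEXT = {
    "seed": ["a", "b"],
    "a": ["c", "b"],
    "b": ["c", "d"],
    "c": ["d", "a"],
    "d": ["a", "c"],
}

def crawl(video, layers, counts=None):
    """Follow every recommendation for `layers` levels, tallying how often
    each clip surfaces -- a rough proxy for the algorithm's preferences."""
    if counts is None:
        counts = Counter()
    if layers == 0:
        return counts
    for rec in FAKE_UP_NEXT.get(video, []):
        counts[rec] += 1
        crawl(rec, layers - 1, counts)
    return counts

# Repeating this from many seed searches, as the program does thousands
# of times, builds up a picture of what gets amplified.
print(crawl("seed", 3).most_common())
```

Videos that keep reappearing across seeds and layers are, in aggregate, the ones the algorithm prefers.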
  • Each study finds something different, but the research suggests YouTube systematically amplifies videos that are divisive, sensational and conspiratorial.
  • When his program found a seed video by searching the query “who is Michelle Obama?” and then followed the chain of “up next” suggestions, for example, most of the recommended videos said she “is a man”
  • He believes one of the most shocking examples was detected by his program in the run-up to the 2016 presidential election. As he observed in a short, largely unnoticed blogpost published after Donald Trump was elected, the impact of YouTube’s recommendation algorithm was not neutral during the presidential race: it was pushing videos that were, in the main, helpful to Trump and damaging to Hillary Clinton.
  • “It was strange,” he explains to me. “Wherever you started, whether it was from a Trump search or a Clinton search, the recommendation algorithm was much more likely to push you in a pro-Trump direction.”
  • Trump won the electoral college as a result of 80,000 votes spread across three swing states. There were more than 150 million YouTube users in the US. The videos contained in Chaslot’s database of YouTube-recommended election videos were watched, in total, more than 3bn times before the vote in November 2016.
  • “Algorithms that shape the content we see can have a lot of impact, particularly on people who have not made up their mind,”
  • “Gentle, implicit, quiet nudging can over time edge us toward choices we might not have otherwise made.”
  • But what was most compelling was how often Chaslot’s software detected anti-Clinton conspiracy videos appearing “up next” beside other videos.
  • I spent weeks watching, sorting and categorising the trove of videos with Erin McCormick, an investigative reporter and expert in database analysis. From the start, we were stunned by how many extreme and conspiratorial videos had been recommended, and the fact that almost all of them appeared to be directed against Clinton.
  • “This research captured the apparent direction of YouTube’s political ecosystem,” he says. “That has not been done before.”
  • There were too many videos in the database for us to watch them all, so we focused on 1,000 of the top-recommended videos. We sifted through them one by one to determine whether the content was likely to have benefited Trump or Clinton. Just over a third of the videos were either unrelated to the election or contained content that was broadly neutral or even-handed. Of the remaining 643 videos, 551 were videos favouring Trump, while only 92 favoured the Clinton campaign.
  • The sample we had looked at suggested Chaslot’s conclusion was correct: YouTube was six times more likely to recommend videos that aided Trump than his adversary.
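The figures quoted above are internally consistent; a quick arithmetic check, using only the numbers reported in the article:

```python
total_sampled = 1000
pro_trump, pro_clinton = 551, 92
partisan = pro_trump + pro_clinton        # 643, as reported

neutral_or_unrelated = total_sampled - partisan
assert neutral_or_unrelated == 357        # "just over a third" of 1,000

ratio = pro_trump / pro_clinton
print(round(ratio, 2))                    # roughly 6, hence "six times more likely"
```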
  • The spokesperson added: “Our search and recommendation systems reflect what people search for, the number of videos available, and the videos people choose to watch on YouTube. That’s not a bias towards any particular candidate; that is a reflection of viewer interest.”
  • YouTube seemed to be saying that its algorithm was a neutral mirror of the desires of the people who use it – if we don’t like what it does, we have ourselves to blame. How does YouTube interpret “viewer interest” – and aren’t “the videos people choose to watch” influenced by what the company shows them?
  • Offered the choice, we may instinctively click on a video of a dead man in a Japanese forest, or a fake news clip claiming Bill Clinton raped a 13-year-old. But are those in-the-moment impulses really a reflection of the content we want to be fed?
  • YouTube’s recommendation system has probably figured out that edgy and hateful content is engaging. “This is a bit like an autopilot cafeteria in a school that has figured out children have sweet teeth, and also like fatty and salty foods,” she says. “So you make a line offering such food, automatically loading the next plate as soon as the bag of chips or candy in front of the young person has been consumed.”
  • Once that gets normalised, however, what is fractionally more edgy or bizarre becomes, Tufekci says, novel and interesting. “So the food gets higher and higher in sugar, fat and salt – natural human cravings – while the videos recommended and auto-played by YouTube get more and more bizarre or hateful.”
  • “This is important research because it seems to be the first systematic look into how YouTube may have been manipulated,” he says, raising the possibility that the algorithm was gamed as part of the same propaganda campaigns that flourished on Twitter and Facebook.
  • “We believe that the activity we found was limited because of various safeguards that we had in place in advance of the 2016 election, and the fact that Google’s products didn’t lend themselves to the kind of micro-targeting or viral dissemination that these actors seemed to prefer.”
  • Senator Mark Warner, the ranking Democrat on the intelligence committee, later wrote to the company about the algorithm, which he said seemed “particularly susceptible to foreign influence”. The senator demanded to know what the company was specifically doing to prevent a “malign incursion” of YouTube’s recommendation system. Walker, in his written reply, offered few specifics
  • Tristan Harris, a former Google insider turned tech whistleblower, likes to describe Facebook as a “living, breathing crime scene for what happened in the 2016 election” that federal investigators have no access to. The same might be said of YouTube. About half the videos Chaslot’s program detected being recommended during the election have now vanished from YouTube – many of them taken down by their creators. Chaslot has always thought this suspicious. These were videos with titles such as “Must Watch!! Hillary Clinton tried to ban this video”, watched millions of times before they disappeared. “Why would someone take down a video that has been viewed millions of times?” he asks
  • I shared the entire database of 8,000 YouTube-recommended videos with John Kelly, the chief executive of the commercial analytics firm Graphika, which has been tracking political disinformation campaigns. He ran the list against his own database of Twitter accounts active during the election, and concluded many of the videos appeared to have been pushed by networks of Twitter sock puppets and bots controlled by pro-Trump digital consultants with “a presumably unsolicited assist” from Russia.
  • “I don’t have smoking-gun proof of who logged in to control those accounts,” he says. “But judging from the history of what we’ve seen those accounts doing before, and the characteristics of how they tweet and interconnect, they are assembled and controlled by someone – someone whose job was to elect Trump.”
  • After the Senate’s correspondence with Google over possible Russian interference with YouTube’s recommendation algorithm was made public last week, YouTube sent me a new statement. It emphasised changes it made in 2017 to discourage the recommendation system from promoting some types of problematic content. “We appreciate the Guardian’s work to shine a spotlight on this challenging issue,” it added. “We know there is more to do here and we’re looking forward to making more announcements in the months ahead.”
  • In the months leading up to the election, the Next News Network turned into a factory of anti-Clinton news and opinion, producing dozens of videos a day and reaching an audience comparable to that of MSNBC’s YouTube channel. Chaslot’s research indicated Franchi’s success could largely be credited to YouTube’s algorithms, which consistently amplified his videos to be played “up next”. YouTube had sharply dismissed Chaslot’s research.
  • I contacted Franchi to see who was right. He sent me screen grabs of the private data given to people who upload YouTube videos, including a breakdown of how their audiences found their clips. The largest source of traffic to the Bill Clinton rape video, which was viewed 2.4m times in the month leading up to the election, was YouTube recommendations.
  • The same was true of all but one of the videos Franchi sent me data for. A typical example was a Next News Network video entitled “WHOA! HILLARY THINKS CAMERA’S OFF… SENDS SHOCK MESSAGE TO TRUMP” in which Franchi, pointing to a tiny movement of Clinton’s lips during a TV debate, claims she says “fuck you” to her presidential rival. The data Franchi shared revealed in the month leading up to the election, 73% of the traffic to the video – amounting to 1.2m of its views – was due to YouTube recommendations. External traffic accounted for only 3% of the views.
  • many of the other creators of anti-Clinton videos I spoke to were amateur sleuths or part-time conspiracy theorists. Typically, they might receive a few hundred views on their videos, so they were shocked when their anti-Clinton videos started to receive millions of views, as if they were being pushed by an invisible force.
  • In every case, the largest source of traffic – the invisible force – came from the clips appearing in the “up next” column. William Ramsey, an occult investigator from southern California who made “Irrefutable Proof: Hillary Clinton Has a Seizure Disorder!”, shared screen grabs that showed the recommendation algorithm pushed his video even after YouTube had emailed him to say it violated its guidelines. Ramsey’s data showed the video was watched 2.4m times by US-based users before election day. “For a nobody like me, that’s a lot,” he says. “Enough to sway the election, right?”
  • Daniel Alexander Cannon, a conspiracy theorist from South Carolina, tells me: “Every video I put out about the Clintons, YouTube would push it through the roof.” His best-performing clip was a video titled “Hillary and Bill Clinton ‘The 10 Photos You Must See’”, essentially a slideshow of appalling (and seemingly doctored) images of the Clintons with voiceover in which Cannon speculates on their health. It has been seen 3.7m times on YouTube, and 2.9m of those views, Cannon said, came from “up next” recommendations.
  • his research also does something more important: revealing how thoroughly our lives are now mediated by artificial intelligence.
  • Less than a generation ago, the way voters viewed their politicians was largely shaped by tens of thousands of newspaper editors, journalists and TV executives. Today, the invisible codes behind the big technology platforms have become the new kingmakers.
  • They pluck from obscurity people like Dave Todeschini, a retired IBM engineer who “let off steam” during the election by recording himself opining on Clinton’s supposed involvement in paedophilia, child sacrifice and cannibalism. “It was crazy, it was nuts,” he said of the avalanche of traffic to his YouTube channel, which by election day had more than 2m views
Javier E

Why Conspiracy Videos Go Viral on YouTube - The Atlantic - 0 views

  • “Many young people have absorbed a YouTube-centric worldview, including rejecting mainstream information sources in favor of platform-native creators bearing ‘secret histories’ and faux-authoritative explanations.”
  • YouTube likes to say that this problematic stuff is “less than one percent of the content on YouTube.” This is, undoubtedly, true, simply because there is so much stuff on YouTube
  • One exploration from 2015 found that fully half of its videos had fewer than 350 views, and that 90 percent had fewer than roughly 11,000 views. That is to say, YouTube is driven not by the tail of barely viewed videos, but by the head of wildly popular stuff
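That head-versus-tail shape is easy to see in a simulated heavy-tailed view distribution. The log-normal parameters below are invented for illustration; only the qualitative pattern (a tiny head collecting most of the views) mirrors the 2015 finding.

```python
import random

random.seed(1)  # deterministic illustration

# 100,000 videos with log-normally distributed view counts: most get
# almost nothing, while a small head collects nearly everything.
views = sorted(int(random.lognormvariate(5, 3)) for _ in range(100_000))

median = views[len(views) // 2]               # half of all videos sit below this
p90 = views[int(len(views) * 0.9)]            # 90 percent sit below this
head_share = sum(views[-1000:]) / sum(views)  # share of views held by the top 1%

print(median, p90, round(head_share, 2))
```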
  • that doesn’t mean a smallish number of videos can’t assemble a vast audience, some of whom are led further into the lizard-person weirdness of the fringe.
  • The deeper argument that YouTube is making is that conspiracy videos on the platform are just a kind of mistake.
  • But the conspiratorial mind-set is threaded through the social fabric of YouTube. In fact, it’s intrinsic to the production economy of the site.
  • YouTube offers infinite opportunities to create, a closed ecosystem, an opaque algorithm, and the chance for a very small number of people to make a very large amount of money
  • Add in certain kinds of grievance politics, and you have the perfect recipe for hundreds of videos about YouTube “censoring” people or suppressing their views in some way.
  • The internet was supposed to set media free, which, for the content creator, should have removed all barriers to fame. But it did this for everyone, and suddenly every corner of the internet was a barrel of crabs, a hurly-burly of dumb, fierce competition from which only a select few scrabble out. They are plucked from above by the recommendation algorithm, which bestows the local currency (views) for reasons that no one can quite explain
  • “Our ancestors’ legacy to us is a brain programmed to see coincidence and infer cause.
  • what that means, Brotherton says, is that “sometimes, it would seem, buying into a conspiracy is the cognitive equivalent of seeing meaning in randomness.”
  • Google and Twitter spawned verbs, but YouTube created a noun: YouTuber. YouTube mints personalities engaged in great dramas among networks of other YouTubers
  • Creators are, in fact, responsible for YouTube’s massive revenues, and yet they are individually powerless to dictate the terms of their relationship, even strung together in so-called multichannel networks of creators. YouTube wants views where it makes money; YouTubers want views on their content, whether it is to YouTube’s benefit or not.
  • While these conditions of production—which incentivize content creation at a very low cost to YouTube—exist on other modern social platforms, YouTube’s particular constellation of them is special
  • Crucially, YouTubers must get viewers to emotionally invest in them, because they need people to “like, comment, and subscribe.” The dedicated community around YouTubers has to support them with concrete actions to pull them up the rankings
  • But because of that very accessibility, many, many people see the videos on YouTube and say, “I could do that.
  • The content-production system has created a kind of conspiracist politics that is native to YouTube
  • Richard Hofstadter identified “the paranoid style” in American politics decades ago. The “paranoid spokesman” was “overheated, oversuspicious, overaggressive, grandiose, and apocalyptic in expression,” seeing himself as the guardian of “a nation, a culture, a way of life” against “the hostile and conspiratorial world.
  • This audience of the aggrieved just happens to be the perfect group for successful YouTubers to find
  • Once something is known to work in the YouTube world—once it’s clear that the demand is out there—the supply side of video makers kicks in. Each is trying to find just the right conspiracy and spin on a conspiracy to move up the logarithmic scale of YouTube popularity
  • Now that YouTube corporate is attempting to use its levers to tamp down the worst conspiratorial thinking, isn’t that exactly what the conspiracists would predict would happen to the truth?
  • it’s not only that conspiracy content made YouTube viewers more prone to believe conspiracies. It’s that the economics and illusions of content production on YouTube itself made conspiracy content more likely to be created and viewed.
Javier E

How YouTube Radicalized Brazil - The New York Times - 0 views

  • “YouTube became the social media platform of the Brazilian right,”
  • Members of the nation’s newly empowered far right — from grass-roots organizers to federal lawmakers — say their movement would not have risen so far, so fast, without YouTube’s recommendation engine.
  • New research has found they may be correct. YouTube’s search and recommendation system appears to have systematically diverted users to far-right and conspiracy channels in Brazil.
  • A New York Times investigation in Brazil found that, time and again, videos promoted by the site have upended central elements of daily life
  • YouTube’s recommendation system is engineered to maximize watchtime, among other factors, the company says, but not to favor any political ideology.
  • Some parents look to “Dr. YouTube” for health advice but get dangerous misinformation instead, hampering the nation’s efforts to fight diseases like Zika. Viral videos have incited death threats against public health advocates.
  • And in politics, a wave of right-wing YouTube stars ran for office alongside Mr. Bolsonaro, some winning by historic margins. Most still use the platform, governing the world’s fourth-largest democracy through internet-honed trolling and provocation
  • Teachers describe classrooms made unruly by students who quote from YouTube conspiracy videos or who, encouraged by right-wing YouTube stars, secretly record their instructors
  • But the emotions that draw people in — like fear, doubt and anger — are often central features of conspiracy theories, and in particular, experts say, of right-wing extremism
  • As the system suggests more provocative videos to keep users watching, it can direct them toward extreme content they might otherwise never find. And it is designed to lead users to new topics to pique new interest
  • The system now drives 70 percent of total time on the platform.
  • Zeynep Tufekci, a social media scholar, has called it “one of the most powerful radicalizing instruments of the 21st century.”
  • Danah Boyd, founder of the think tank Data & Society, attributed the disruption in Brazil to YouTube’s unrelenting push for viewer engagement, and the revenues it generates.
  • Maurício Martins, the local vice president of Mr. Bolsonaro’s party in Niterói, credited “most” of the party’s recruitment to YouTube — including his own.
  • “Before that, I didn’t have an ideological political background,” Mr. Martins said. YouTube’s auto-playing recommendations, he declared, were “my political education.”
  • “It was like that with everyone,”
  • “Sometimes I’m watching videos about a game, and all of a sudden it’s a Bolsonaro video,”
  • More and more, his fellow students are making extremist claims, often citing as evidence YouTube stars like Mr. Moura, the guitarist-turned-conspiracist.
  • “If social media didn’t exist, I wouldn’t be here,” he said. “Jair Bolsonaro wouldn’t be president.”
  • In the months after YouTube changed its algorithm, positive mentions of Mr. Bolsonaro ballooned. So did mentions of conspiracy theories that he had floated. This began as polls still showed him to be deeply unpopular, suggesting that the platform was doing more than merely reflecting political trends.
  • Jonas Kaiser and Yasodara Córdova, with Adrian Rauchfleisch of National Taiwan University, programmed a Brazil-based server to enter a popular channel or search term, then open YouTube’s top recommendations, then follow the recommendations on each of those, and so on.
  • By repeating this thousands of times, the researchers tracked how the platform moved users from one video to the next. They found that after users watched a video about politics or even entertainment, YouTube’s recommendations often favored right-wing, conspiracy-filled channels like Mr. Moura’s
  • Crucially, users who watched one far-right channel would often be shown many more.
  • The algorithm had united once-marginal channels — and then built an audience for them
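In graph terms, what the researchers describe is rising in-degree within a cluster of channels. A toy sketch with hypothetical channel names (the edge list is invented, not the study's data):

```python
from collections import defaultdict

# Directed edges: watching channel X, the sidebar recommends channel Y.
recommends = [
    ("mainstream_news", "fringe_a"),   # the on-ramp into the cluster
    ("fringe_a", "fringe_b"),
    ("fringe_b", "fringe_c"),
    ("fringe_c", "fringe_a"),
    ("fringe_a", "fringe_c"),
    ("mainstream_news", "sports"),
]

in_degree = defaultdict(int)
for _src, dst in recommends:
    in_degree[dst] += 1

# Channels most often recommended *into*: the cluster the system keeps
# funnelling viewers toward, and therefore builds an audience for.
ranked = sorted(in_degree.items(), key=lambda kv: -kv[1])
print(ranked)
```

Once two fringe channels recommend each other, each new viewer of one is fed the other, which is the uniting effect the study observed.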
  • One of those channels belonged to Mr. Bolsonaro, who had long used the platform to post hoaxes and conspiracies
  • The conspiracies were not limited to politics. Many Brazilians searching YouTube for health care information found videos that terrified them: some said Zika was being spread by vaccines, or by the insecticides meant to curb the spread of the mosquito-borne disease that has ravaged northeastern Brazil.
  • The videos appeared to rise on the platform in much the same way as extremist political content: by making alarming claims and promising forbidden truths that kept users glued to their screens.
  • Doctors, social workers and former government officials said the videos had created the foundation of a public health crisis as frightened patients refused vaccines and even anti-Zika insecticides.
  • Not long after YouTube installed its new recommendation engine, Dr. Santana’s patients began telling him that they’d seen videos blaming Zika on vaccines — and, later, on larvicides. Many refused both.
  • Medical providers, she said, were competing “every single day” against “Dr. Google and Dr. YouTube” — and they were losing
  • Brazil’s medical community had reason to feel outmatched. The Harvard researchers found that YouTube’s systems frequently directed users who searched for information on Zika, or even those who watched a reputable video on health issues, toward conspiracy channels
  • As the far right rose, many of its leading voices had learned to weaponize the conspiracy videos, offering their vast audiences a target: people to blame
  • Eventually, the YouTube conspiracists turned their spotlight on Debora Diniz, a women’s rights activist whose abortion advocacy had long made her a target of the far right
  • Bernardo Küster, a YouTube star whose homemade rants had won him 750,000 subscribers and an endorsement from Mr. Bolsonaro, accused her of involvement in the supposed Zika plots.
  • As far-right and conspiracy channels began citing one another, YouTube’s recommendation system learned to string their videos together
  • However implausible any individual rumor might be on its own, joined together, they created the impression that dozens of disparate sources were revealing the same terrifying truth.
  • When the university where Ms. Diniz taught received a warning that a gunman would shoot her and her students, and the police said they could no longer guarantee her safety, she left Brazil.
  • “The YouTube system of recommending the next video and the next video,” she said, had created “an ecosystem of hate.”
  • “I heard here that she’s an enemy of Brazil. I hear in the next one that feminists are changing family values. And the next one I hear that they receive money from abroad,” she said. “That loop is what leads someone to say, ‘I will do what has to be done.’”
  • In Brazil, this is a growing online practice known as “linchamento” — lynching. Mr. Bolsonaro was an early pioneer, spreading videos in 2012 that falsely accused left-wing academics of plotting to force schools to distribute “gay kits” to convert children to homosexuality.
  • Mr. Jordy, his tattooed Niterói protégé, was untroubled to learn that his own YouTube campaign, accusing teachers of spreading communism, had turned their lives upside down. One of those teachers, Valeria Borges, said she and her colleagues had been overwhelmed with messages of hate, creating a climate of fear.
  • Mr. Jordy, far from disputing this, said it had been his goal. “I wanted her to feel fear,” he said
  • The group’s co-founder, a man-bunned former rock guitarist named Pedro D’Eyrot, said “we have something here that we call the dictatorship of the like.”
  • Reality, he said, is shaped by whatever message goes most viral.
  • Even as he spoke, a two-hour YouTube video was captivating the nation. Titled “1964” for the year of Brazil’s military coup, it argued that the takeover had been necessary to save Brazil from communism. Mr. Dominguez, the teenager learning to play guitar, said the video persuaded him that his teachers had fabricated the horrors of military rule.
lilyrashkind

Why YouTube Has Survived Russia's Social Media Crackdown | Time - 0 views

  • In a style part investigative journalism, part polemic, the video’s hosts report that one of President Vladimir Putin’s allies, Russian senator Valentina Matviyenko, owns a multimillion-dollar villa on the Italian seafront. The video contrasts the luxurious lifestyle of Matviyenko and her family with footage of dead Russian soldiers, and with images of Russian artillery hitting civilian apartment buildings in Ukraine. A voiceover calls the war “senseless” and “unimaginable.” A slide at the end urges Russians to head to squares in their cities to protest at specific dates and times. In less than a week, the video racked up more than 4 million views.
  • TV news is dominated by the misleading narrative that Russia’s invasion of Ukraine is actually a peace-keeping exercise. Despite this, YouTube has largely been spared from the Kremlin’s crackdown on American social media platforms since Russia invaded Ukraine nearly a month ago.
  • The app had been a particular venue for activism: Many Russian celebrities spoke out against the invasion of Ukraine in their Instagram stories, and Navalny’s Instagram page posted a statement criticizing the war, and calling on Russians to come out in protest.
  • On March 11, YouTube’s parent company Google announced that it would block Russian state-backed media globally, including within Russia. The policy was an expansion of an earlier announcement that these channels would be blocked within the European Union. “Our Community Guidelines prohibit content denying, minimizing or trivializing well-documented violent events, and we remove content about Russia’s invasion in Ukraine that violates this policy,” Google said in a statement. “In line with that, effective immediately, we are also blocking YouTube channels associated with Russian state-funded media, globally.”
  • That could leave many millions of Russians cut off from independent news and content shared by opposition activists like Navalny’s team. (It would also effectively delete 75 million YouTube users, or some 4% of the platform’s global total—representing a small but still-significant portion of Google’s overall profits.)
  • Part of the reason for YouTube’s survival amid the crackdown is its popularity, experts say. “YouTube is by far and away the most popular social media platform in Russia,” says Justin Sherman, a non-resident fellow at the Atlantic Council’s cyber statecraft initiative. The platform is even more popular than VK, the Russian-owned answer to Facebook.
  • Today, YouTube remains the most significant way for tens of millions of ordinary Russians to receive largely uncensored information from the outside world.
  • Still, Sherman says the situation is volatile, with Russia now more likely than ever before to ban YouTube. For an authoritarian government like Russia’s, “part of the decision to allow a foreign platform in your country is that you get to use it to spread propaganda and disinformation, even if people use it to spread truth and organize against you,” he says. “If you start losing the ability to spread misinformation and propaganda, but people can still use it to spread truth and organize, then all of a sudden, you start wondering why you’re allowing that platform in your country in the first place.” YouTube did not respond to a request for comment.
  • On the same day as Navalny’s channel posted the video about Matviyenko, elsewhere on YouTube a very different spectacle was playing out. In a video posted to the channel of the Kremlin-funded media outlet RT, (formerly known as Russia Today,) a commentator dismissed evidence of Russian bombings of Ukrainian cities. She blamed “special forces of NATO countries” for allegedly faking images of bombed-out Ukrainian schools, kindergartens and other buildings.
  • “YouTube has, over the years, been a really important place for spreading Russian propaganda,” Donovan said in an interview with TIME days before YouTube banned Russian state-backed media.
  • In July 2021, the Russian government passed a law that would require foreign tech companies with more than 500,000 users to open a local office within Russia. (A similar law passed previously in India had been used by the government there to pressure tech companies to take down opposition accounts and posts critical of the government, by threatening employees with arrest.)
  • The heightened risk to free expression in Russia: Experts say that Russia’s ongoing crackdown on social media platforms heralds a significant shift in the shape of the Russian internet—and a potential end to the era where the Kremlin tolerated largely free expression on YouTube in return for access to a tool that allowed it to spread disinformation far and wide.
Javier E

A New Generation's Vanity, Heard Through Hit Lyrics - NYTimes.com - 1 views

  • psychologists report finding what they were looking for: a statistically significant trend toward narcissism and hostility in popular music
  • the words “I” and “me” appear more frequently along with anger-related words, while there’s been a corresponding decline in “we” and “us” and the expression of positive emotions.
  • ...4 more annotations...
  • “Late adolescents and college students love themselves more today than ever before,”
  • The researchers find that hit songs in the 1980s were more likely to emphasize happy togetherness, like the racial harmony sought by Paul McCartney and Stevie Wonder in “Ebony and Ivory” and the group exuberance promoted by Kool & the Gang: “Let’s all celebrate and have a good time.” Diana Ross and Lionel Richie sang of “two hearts that beat as one,” and John Lennon’s “(Just Like) Starting Over” emphasized the preciousness of “our life together.” Today’s songs, according to the researchers’ linguistic analysis, are more likely to be about one very special person: the singer. “I’m bringing sexy back,” Justin Timberlake proclaimed in 2006. The year before, Beyoncé exulted in how hot she looked while dancing — “It’s blazin’, you watch me in amazement.” And Fergie, who boasted about her “humps” while singing with the Black Eyed Peas, subsequently released a solo album in which she told her lover that she needed quality time alone: “It’s personal, myself and I.”
  • a meta-analysis published last year in Social Psychological and Personality Science, Dr. Twenge and Joshua D. Foster looked at data from nearly 50,000 students — including the new data from critics — and concluded that narcissism has increased significantly in the past three decades.
  • Their song-lyrics analysis shows a decline in words related to social connections and positive emotions (like “love” or “sweet”) and an increase in words related to anger and antisocial behavior (like “hate” or “kill”).
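The word-frequency approach the researchers describe — tallying first-person singular versus plural pronouns across lyrics — can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not the study's actual method; the pronoun lists and sample lyric are assumptions for demonstration.

```python
import re
from collections import Counter

# Assumed word lists for the sketch; the study's actual lexicon may differ.
FIRST_SINGULAR = {"i", "me", "my", "mine", "myself"}
FIRST_PLURAL = {"we", "us", "our", "ours", "ourselves"}

def pronoun_rates(lyrics: str) -> dict:
    """Return the share of tokens that are first-person singular vs plural."""
    # Letters-only tokenization: "I'm" splits into "i" + "m",
    # so the singular "i" is still counted.
    words = re.findall(r"[a-z]+", lyrics.lower())
    counts = Counter(words)
    total = sum(counts.values())
    return {
        "singular_share": sum(counts[w] for w in FIRST_SINGULAR) / total,
        "plural_share": sum(counts[w] for w in FIRST_PLURAL) / total,
    }

sample = "I'm bringing sexy back, you watch me in amazement"
print(pronoun_rates(sample))  # {'singular_share': 0.2, 'plural_share': 0.0}
```

Run over a decade's worth of hit lyrics, shares like these are what the trend analysis compares year over year.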
Javier E

What if Being a YouTube Celebrity Is Actually Backbreaking Work? - The New York Times - 0 views

  • It’s been two years. Chamberlain now has 8 million YouTube followers. She brought in the editing tricks that first set her friends and family rolling on the floor, but now they take longer to perfect.
  • Chamberlain edits each video she makes for between 20 and 30 hours, often at stretches of 10 or 15 hours at a time. Her goal is to be funny, to keep people watching. It’s as if the comic value of each video is inversely proportional to how little humor she experiences while making it. During her marathon editing sessions, she said, she laughs for “maybe, maybe 10 seconds max.”
  • Like other professional social media users, she has found that the work takes a physical toll. (She releases roughly one video a week.) She used to edit at a desktop, but she developed back pain. Now she works from her bed. She keeps blue mood lighting on, but her vision has deteriorated. She wears reading glasses “like I’m 85 years old, because my eyes do actually get really strained.”
  • ...11 more annotations...
  • “It’s almost like when you’re doing your homework, you’re halfway through a math work sheet, you’re really in it right there. You can’t hear anything, you can’t see anything,” she said. “Or if you’re watching a movie and you’re so zoned in you don’t even remember what real life is. You just think you’re in the movie. That’s exactly how it is, but times five. I’m so zoned in. I have this weird mind-set where it’s me quickly analyzing every five seconds, ‘Is this boring, is this stupid, can I cut this? Yes. No. Yes. No. Yes. No.’”
  • In June 2018, Chamberlain left the Bay Area to live alone in L.A. and fully immerse herself in YouTubeland.
  • I created this kind of style that was super cool to me and super exciting for me, and now that other people are doing it, now all of a sudden I’m unoriginal, which is something that I’ve always really tried to be. That’s what makes me feel good creatively. So when people started to say that, I kind of had a full, you know, not like mental breakdown, but we could also say that. Not a mental breakdown! But I definitely freaked out.”
  • Chamberlain’s parents have supported her unconventional choices, like dropping out of school in the beginning of her junior year and moving to Los Angeles to live by herself while still a teenager. She says that they were and are her best friends.
  • Over these two years, Chamberlain invented the way people talk on YouTube now, particularly the way they communicate authenticity. Her editing tricks and her mannerisms are ubiquitous. There is an entire subgenre of videos that mimic her style, and a host of YouTubers who talk, or edit, just like her. The Atlantic recently noted this and declared she is “the most important YouTuber” working today.
  • Professional YouTubers are the children of reality television. The dramas of their videos are often inextricable from their lives. When Jake Paul and Tana Mongeau, two famous YouTubers, said they were engaged last month, it was impossible for fans to parse whether they were telling the truth. It barely mattered
  • YouTubers tend to bond and/or feud with one another constantly, because this is social media as much as it is performance art. They recreate the overheated dynamics of the high school environment that Chamberlain wanted to escape.
  • Chamberlain has now decided upon a new approach. “I’m just going to not stick to one thing so strictly,” she said. Her recent videos are less jittery, less edited. She has been trying to let her narrative and her scripts speak, with fewer interruptions than before.
  • Recently, she has tried anthologies and also stunts, like spending 24 hours on the balcony of her house
  • “I’m trying to make the stuff that I’m filming more dynamic so that when I’m editing there’s less pressure on me to kind of create something that’s not there,”
  • I’m starting to realize that editing is very personal, and 90 percent of the editing is just so that I’m not bored. So I don’t have to overdo it. I’m trying to find that balance right now, so that I don’t overwork myself
aleija

YouTube Cut Down Misinformation. Then It Boosted Fox News. - The New York Times - 0 views

  • That algorithm decided which videos YouTube recommended that users watch next; the company said it was responsible for 70 percent of the one billion hours a day people spent on YouTube. But it had become clear that those recommendations tended to steer viewers toward videos that were hyperpartisan, divisive, misleading or downright false.
  • In the weeks leading up to Tuesday’s election, YouTube recommended far fewer fringe channels alongside news videos than it did in 2016, which helped reduce the spread of disinformation, according to research by Guillaume Chaslot, a former Google engineer who helped build YouTube’s recommendation engine and now studies it.
  • The ascent of Fox News on the social media platforms was a reminder that tech companies have been walking a tricky line between limiting misinformation and appeasing politicians complaining that Silicon Valley is biased — all while still keeping people clicking, watching and sharing on their sites.
  • ...2 more annotations...
  • “The channel most recommended in our data set in 2016 was Alex Jones,” the notorious internet conspiracy theorist, who has since been barred from YouTube, Mr. Chaslot said. “Now it’s Fox News.”
  • YouTube’s promotion of Fox News’s unabashedly conservative pundits also undercut arguments from some of those same pundits that the biggest tech companies are trying to silence them.
Javier E

How 2020 Forced Facebook and Twitter to Step In - The Atlantic - 0 views

  • mainstream platforms learned their lesson, accepting that they should intervene aggressively in more and more cases when users post content that might cause social harm.
  • During the wildfires in the American West in September, Facebook and Twitter took down false claims about their cause, even though the platforms had not done the same when large parts of Australia were engulfed in flames at the start of the year
  • Twitter, Facebook, and YouTube cracked down on QAnon, a sprawling, incoherent, and constantly evolving conspiracy theory, even though its borders are hard to delineate.
  • ...15 more annotations...
  • It tweaked its algorithm to boost authoritative sources in the news feed and turned off recommendations to join groups based around political or social issues. Facebook is reversing some of these steps now, but it cannot make people forget this toolbox exists in the future
  • Nothing symbolizes this shift as neatly as Facebook’s decision in October (and Twitter’s shortly after) to start banning Holocaust denial. Almost exactly a year earlier, Zuckerberg had proudly tied himself to the First Amendment in a widely publicized “stand for free expression” at Georgetown University.
  • The evolution continues. Facebook announced earlier this month that it will join platforms such as YouTube and TikTok in removing, not merely labeling or down-ranking, false claims about COVID-19 vaccines.
  • the pandemic also showed that complete neutrality is impossible. Even though it’s not clear that removing content outright is the best way to correct misperceptions, Facebook and other platforms plainly want to signal that, at least in the current crisis, they don’t want to be seen as feeding people information that might kill them.
  • As platforms grow more comfortable with their power, they are recognizing that they have options beyond taking posts down or leaving them up. In addition to warning labels, Facebook implemented other “break glass” measures to stem misinformation as the election approached.
  • Down-ranking, labeling, or deleting content on an internet platform does not address the social or political circumstances that caused it to be posted in the first place
  • Content moderation comes to every content platform eventually, and platforms are starting to realize this faster than ever.
  • Platforms don’t deserve praise for belatedly noticing dumpster fires that they helped create and affixing unobtrusive labels to them
  • Warning labels for misinformation might make some commentators feel a little better, but whether labels actually do much to contain the spread of false information is still unknown.
  • News reporting suggests that insiders at Facebook knew they could and should do more about misinformation, but higher-ups vetoed their ideas. YouTube barely acted to stem the flood of misinformation about election results on its platform.
  • When internet platforms announce new policies, assessing whether they can and will enforce them consistently has always been difficult. In essence, the companies are grading their own work. But too often what can be gleaned from the outside suggests that they’re failing.
  • And if 2020 finally made clear to platforms the need for greater content moderation, it also exposed the inevitable limits of content moderation.
  • Even before the pandemic, YouTube had begun adjusting its recommendation algorithm to reduce the spread of borderline and harmful content, and is introducing pop-up nudges to encourage users
  • even the most powerful platform will never be able to fully compensate for the failures of other governing institutions or be able to stop the leader of the free world from constructing an alternative reality when a whole media ecosystem is ready and willing to enable him. As Renée DiResta wrote in The Atlantic last month, “reducing the supply of misinformation doesn’t eliminate the demand.”
  • Even so, this year’s events showed that nothing is innate, inevitable, or immutable about platforms as they currently exist. The possibilities for what they might become—and what role they will play in society—are limited more by imagination than any fixed technological constraint, and the companies appear more willing to experiment than ever.
Javier E

Opinion | Algorithms Won't Fix What's Wrong With YouTube - The New York Times - 0 views

  • YouTube’s recommendation algorithm is a set of rules followed by cold, hard computer logic. It was designed by human engineers, but is then programmed into and run automatically by computers, which return recommendations, telling viewers which videos they should watch.
  • Google Brain, an artificial intelligence research team within the company, powers those recommendations, and bases them on a user’s prior viewing. The system is highly intelligent, accounting for variations in the way people watch their videos.
  • In 2016, a paper by three Google employees revealed the deep neural networks behind YouTube’s recommended videos, which rifle through every video we’ve previously watched. The algorithm then uses that information to select a few hundred videos we might like to view from the billions on the site, which are then winnowed down to dozens, which are then presented on our screens.
  • ...15 more annotations...
  • In the three years since Google Brain began making smart recommendations, watch time from the YouTube home page has grown 20-fold. More than 70 percent of the time people spend watching videos on YouTube, they spend watching videos suggested by Google Brain.
  • The more videos that are watched, the more ads that are seen, and the more money Google makes.
  • “We also wanted to serve the needs of people when they didn’t necessarily know what they wanted to look for.”
  • Last week, The New York Times reported that YouTube’s algorithm was encouraging pedophiles to watch videos of partially-clothed children, often after they watched sexual content.
  • So YouTube’s nuance-blind algorithm — trained to think with simple logic — serving up more videos to sate a sadist’s appetite is a job well done.
  • The result? The algorithm — and, consequently, YouTube — incentivizes bad behavior in viewers.
  • the algorithm relies on snapshots of visual content, rather than actions. If you (or your child) watch one Peppa Pig video, you’ll likely want another. And as long as it’s Peppa Pig in the frame, it doesn’t matter what the character does in the skit.
  • it didn’t take long for inappropriate videos to show up in YouTube Kids’ ‘Now playing’ feeds
  • Using cheap, widely available technology, animators created original video content featuring some of Hollywood’s best-loved characters. While an official Disney Mickey Mouse would never swear or act violently, in these videos Mickey and other children’s characters were sexual or violent
  • there’s a 3.5 percent chance of a child coming across inappropriate footage within 10 clicks of a child-friendly video.
  • Just four in 10 parents always monitor their child’s YouTube usage — and one in 20 children aged 4-to-12 say their parents never check what they’re watching.
  • At the height of the panic around Mr. Crowder’s videos, YouTube’s public policy on hate speech and harassment appeared to shift four times in a 24-hour period as the company sought to clarify what the new normal was.
  • One possible solution that would address both problems would be to strip out YouTube’s recommendation altogether. But it is highly unlikely that YouTube would ever do such a thing: that algorithm drives vast swaths of YouTube’s views, and to take it away would reduce the time viewers spend watching its videos, as well as reduce Google’s ad revenue.
  • it must, at the very least, make significant changes, and have greater human involvement in the recommendation process. The platform has some human moderators looking at so-called “borderline” content to train its algorithms, but more humanity is needed in the entire process.
  • Currently, the recommendation engine cannot understand why it shouldn’t recommend videos of children to pedophiles, and it cannot understand why it shouldn’t suggest sexually explicit videos to children. It cannot understand, because the incentives are twisted: every new video view, regardless of who the viewer is and what the viewer’s motives may be, is considered a success.
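The architecture described in this item — a candidate-generation stage that narrows billions of videos to a few hundred, followed by a ranking stage that winnows those to the dozens shown on screen — can be sketched with toy embeddings. This is a hedged illustration of the two-stage design from the 2016 paper, not YouTube's actual model; the corpus size, embedding dimension, and scoring functions below are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus: each video is represented by a learned embedding vector.
NUM_VIDEOS, DIM = 100_000, 32
video_embeddings = rng.normal(size=(NUM_VIDEOS, DIM)).astype(np.float32)

def candidate_generation(user_vec, k=300):
    """Stage 1: narrow the full corpus to a few hundred candidates
    via nearest-neighbor search in a shared embedding space."""
    scores = video_embeddings @ user_vec
    return np.argpartition(-scores, k)[:k]  # indices of the top-k scores

def rank(user_vec, candidates, watch_history, n=20):
    """Stage 2: re-score only the candidates with a richer model
    (here just similarity, with already-watched videos excluded)."""
    scores = video_embeddings[candidates] @ user_vec
    scores[np.isin(candidates, list(watch_history))] = -np.inf
    return candidates[np.argsort(-scores)[:n]]

user = rng.normal(size=DIM).astype(np.float32)  # stand-in for a watch-history embedding
cands = candidate_generation(user)
top20 = rank(user, cands, watch_history={42, 7})
print(len(top20))  # 20 "up next" recommendations
```

The key point the critics make applies at stage 2: if the only objective is predicted watch time, every additional view scores as a success regardless of who is watching or why.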
blythewallick

YouTube ads of 100 top brands fund climate misinformation - study | Technology | The Gu... - 0 views

  • Some of the biggest companies in the world are funding climate misinformation by advertising on YouTube, according to a study from activist group Avaaz.
  • “This is not about free speech, this is about the free advertising YouTube is giving to factually inaccurate videos that risk confusing people about one of the biggest crises of our time,” said Julie Deruy, a senior campaigner at the group. “YouTube should not feature, suggest, promote, advertise or lead users to misinformation.”
  • “YouTube has previously taken welcome steps to protect its users from anti-vaccine and conspiracy theories,” Avaaz argued
  • ...5 more annotations...
  • Include climate misinformation in its “borderline content” policy, which limits the algorithmic distribution of videos that do not reach the bar required to fully remove them from the site.
  • Demonetise misinformation, “ensuring such content does not include advertising and is not financially incentivised. YouTube should start immediately with the option for advertisers to exclude their ads from videos with climate misinformation.”
  • Work with independent fact-checkers to inform users who have seen or interacted with verifiably false or misleading information.
  • Provide transparency to researchers by releasing data showing how many views are driven to misinformation by its own recommendation algorithms.
  • “In 2019 alone, the consumption on authoritative news publishers’ channels grew by 60%. As our systems appear to have done in the majority of cases in this report, we prioritise authoritative voices for millions of news and information queries, and surface information panels on topics prone to misinformation – including climate change – to provide users with context alongside their content. We continue to expand these efforts to more topics and countries.”
aidenborst

What to Expect From Facebook, Twitter and YouTube on Election Day - The New York Times - 1 views

  • Facebook, YouTube and Twitter were misused by Russians to inflame American voters with divisive messages before the 2016 presidential election. The companies have spent the past four years trying to ensure that this November isn’t a repeat.
  • Since 2016, Facebook has poured billions of dollars into beefing up its security operations to fight misinformation and other harmful content. It now has more than 35,000 people working on this area, the company said.
  • Facebook has made changes up till the last minute. Last week, it said it had turned off political and social group recommendations and temporarily removed a feature in Instagram’s hashtag pages to slow the spread of misinformation.
  • ...11 more annotations...
  • Facebook’s app will also look different on Tuesday. To prevent candidates from prematurely and inaccurately declaring victory, the company plans to add a notification at the top of News Feeds letting people know that no winner has been chosen until election results are verified by news outlets like Reuters and The Associated Press
  • After the polls close, Facebook plans to suspend all political ads from circulating on the social network and its photo-sharing site
  • Twitter has also worked to combat misinformation since 2016, in some cases going far further than Facebook. Last year, for instance, it banned political advertising entirely, saying the reach of political messages “should be earned, not bought.”
  • In October, Twitter began experimenting with additional techniques to slow the spread of misinformation.
  • In September, Twitter added an Election Hub that users can use to look for curated information about polling, voting and candidates.
  • Twitter plans to add labels to tweets from candidates who claim victory before the election is called by authoritative sources.
  • Twitter will eventually allow people to retweet again without prompting them to add their own context. But many of the changes for the election — like the ban on political ads and the fact-checking labels — are permanent
  • For Google’s YouTube, it wasn’t the 2016 election that sounded a wake-up call about the toxic content spreading across its website. That moment came in 2017 when a group of men drove a van into pedestrians on London Bridge after being inspired by YouTube videos of inflammatory sermons from an Islamic cleric.
  • It has overhauled its policies to target misinformation, while tweaking its algorithms to slow the spread of what it deems borderline content — videos that do not blatantly violate its rules but butt up against them.
  • On Tuesday, Mr. Mohan plans to check in regularly with his teams to keep an eye on anything unusual, he said. There will be no “war room,” and he expects that most decisions to keep or remove videos will be clear and that the usual processes for making those decisions will be sufficient.
  • Starting on Tuesday and continuing as needed, YouTube will display a fact-check information panel above election-related search results and below videos discussing the results, the company said.
Javier E

Laura Lee, Jeffree Star and that racism scandal that's tearing Beauty YouTube apart - T... - 0 views

  • Beyond the walls of YouTube creators and viewers, there’s been a wider discussion about personal online archives and their meaning in 2018. The same Internet sleuthing tactics that held Star and his ex-friends accountable for their past words are also used to bully and harass the marginalized.
  • the consequences for terrible online pasts often spread faster than the more complicated questions they raise — in particular, the questions of what responsibility people bear for their own online past, and whether becoming a public figure changes that. And what even is a public figure in 2018?
  • But YouTubers, like every person involved in this battle, are role models. They are definitely public figures.
  • ...1 more annotation...
  • The thing about online archives is that they flatten your past into your present, into a world where, for YouTube creators, they have influence and power.
anonymous

YouTube Suspends Trump's Account Over Concerns About Violence : Insurrection At The Cap... - 0 views

  • YouTube, citing "the ongoing potential for violence," has suspended President Trump's account for at least a week
  • The social media platform is the latest to take action against Trump following a riot at the U.S. Capitol last week organized by the president's supporters.
  • As the nation prepares for President-elect Joe Biden's inauguration next week, law enforcement officials have said there are credible threats of more acts of violence from supporters of Trump.
  • ...6 more annotations...
  • The suspension of Trump's channel came after comments the president made at a news conference on Tuesday that streamed on the platform
  • Trump's channel is prevented from uploading new videos or livestreams for a minimum of seven days, which could be extended.
  • A first strike for a YouTube account leads to a one-week suspension, a second strike results in a two-week suspension, and a third strike will result in channel termination.
  • "After careful review, and in light of concerns about the ongoing potential for violence, we removed new content uploaded to the Donald J. Trump channel and issued a strike for violating our policies for inciting violence,"
  • "We are also indefinitely disabling comments under videos on the channel, we've taken similar actions in the past for other cases involving safety concerns,"
  • Twitter and Facebook removed Trump's accounts following the clash at the U.S. Capitol and have taken down content supporting last week's siege. Amazon, Google and Apple have also removed the Parler app, which is reportedly used by many of Trump's supporters.
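The escalation ladder described above (first strike: one week; second strike: two weeks; third strike: termination) amounts to a simple lookup. A minimal sketch, with the tier labels as assumptions for illustration:

```python
def strike_penalty(strike_count: int) -> str:
    """Map a channel's strike count to the penalty described in
    YouTube's three-strike policy."""
    if strike_count <= 0:
        return "no action"
    if strike_count == 1:
        return "7-day suspension"
    if strike_count == 2:
        return "14-day suspension"
    return "channel termination"

print(strike_penalty(1))  # 7-day suspension
```

Note that, as the article describes, the one-week minimum can also be extended at the platform's discretion, which a static lookup like this does not capture.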
Javier E

Pro-China YouTube Network Used A.I. to Malign U.S., Report Finds - The New York Times - 0 views

  • The 10-minute post was one of more than 4,500 videos in an unusually large network of YouTube channels spreading pro-China and anti-U.S. narratives, according to a report this week from the Australian Strategic Policy Institute
  • Some of the videos used artificially generated avatars or voice-overs, making the campaign the first influence operation known to the institute to pair A.I. voices with video essays.
  • The campaign’s goal, according to the report, was clear: to influence global opinion in favor of China and against the United States.
  • ...17 more annotations...
  • The videos promoted narratives that Chinese technology was superior to America’s, that the United States was doomed to economic collapse, and that China and Russia were responsible geopolitical players. Some of the clips fawned over Chinese companies like Huawei and denigrated American companies like Apple.
  • Content from at least 30 channels in the network drew nearly 120 million views and 730,000 subscribers since last year, along with occasional ads from Western companies
  • Disinformation — such as the false claim that some Southeast Asian nations had adopted the Chinese yuan as their own currency — was common. The videos were often able to quickly react to current events
  • The coordinated campaign might be “one of the most successful influence operations related to China ever witnessed on social media.”
  • Historically, its influence operations have focused on defending the Communist Party government and its policies on issues like the persecution of Uyghurs or the fate of Taiwan
  • Efforts to push pro-China messaging have proliferated in recent years, but have featured largely low-quality content that attracted limited engagement or failed to sustain meaningful audiences
  • “This campaign actually leverages artificial intelligence, which gives it the ability to create persuasive threat content at scale at a very limited cost compared to previous campaigns we’ve seen,”
  • YouTube said in a statement that its teams work around the clock to protect its community, adding that “we have invested heavily in robust systems to proactively detect coordinated influence operations.” The company said it welcomed research efforts and that it had shut down several of the channels mentioned in the report for violating the platform’s policies.
  • China began targeting the United States more directly amid the mass pro-democracy protests in Hong Kong in 2019 and continuing with the Covid-19 pandemic, echoing longstanding Russian efforts to discredit American leadership and influence at home and abroad.
  • Over the summer, researchers at Microsoft and other companies unearthed evidence of inauthentic accounts that China employed to falsely accuse the United States of using energy weapons to ignite the deadly wildfires in Hawaii in August.
  • Meta announced last month that it removed 4,789 Facebook accounts from China that were impersonating Americans to debate political issues, warning that the campaign appeared to be laying the groundwork for interference in the 2024 presidential elections.
  • It was the fifth network with ties to China that Meta had detected this year, the most of any other country.
  • The advent of artificial technology seems to have drawn special interest from Beijing. Ms. Keast of the Australian institute said that disinformation peddlers were increasingly using easily accessible video editing and A.I. programs to create large volumes of convincing content.
  • She said that the network of pro-China YouTube channels most likely fed English-language scripts into readily available online text-to-video software or other programs that require no technical expertise and can produce clips within minutes. Such programs often allow users to select A.I.-generated voice narration and customize the gender, accent and tone of voice.
  • In 39 of the videos, Ms. Keast found at least 10 artificially generated avatars advertised by a British A.I. company
  • she also discovered what may be the first example in an influence operation of a digital avatar created by a Chinese company — a woman in a red dress named Yanni.
  • The scale of the pro-China network is probably even larger, according to the report. Similar channels appeared to target Indonesian and French people. Three separate channels posted videos about chip production that used similar thumbnail images and the same title translated into English, French and Spanish.
Javier E

A Handful of Accounts Create Most of What We See on Social Media - WSJ - 0 views

  • Social media is turning into old-fashioned network television.
  • A handful of accounts create most of the content that we see. Everyone else? They play the role of the audience, which is there to mostly amplify and applaud
  • The personal tidbits that people used to share on social media have been relegated to private group chats and their equivalent.
  • ...23 more annotations...
  • The transformation of social media into mass media is largely because the rise of TikTok has demonstrated to every social-media company on the planet that people still really like things that can re-create the experience of TV
  • Advertisers also like things that function like TV, of course—after all, people are never more suggestible than when lulled into a sort of anesthetized mindlessness.
  • In this future, people who are good at making content with high production values will thrive, as audiences and tech company algorithms gravitate toward more professional content.
  • On these formerly-social platforms, whether content is coming from creators with better equipment and more skills, or Hollywood studios testing the waters, hardly matters. In the end, it will all look remarkably similar to the consumer.
  • It will look like flipping through cable channels does, only our thumb on the remote has been replaced by our thumb on the screen of our phone, swiping from one TikTok, YouTube Short, or Instagram Reel to the next.
  • A telling indicator is the rise of a new kind of entertainment professional—the “creator.”
  • A creator is anyone who records or makes something that can go viral on the internet
  • TikTok is now more popular than Netflix among consumers younger than 35,
  • While YouTube and TikTok have always been about video, just about every other social-media platform that wants to keep people engaged is emphasizing it more than ever, so that’s what creators have to make,
  • His agency gets involved with creators and musicians at the earliest stages of their careers, helping them plan content, update their style, understand what the algorithms of different platforms demand, and connecting them with potentially lucrative brand deals
  • Even more telling: In first place is YouTube, the original online TV analog.
  • Where attention flows, money—and content—must also. In 2023 brands will spend an estimated $6 billion on marketing through influencers—a subspecies of creators
  • Globally, the total addressable market for this kind of marketing is currently $250 billion
  • Then there is a new generation of shows that are going straight to TikTok, bypassing even streaming services
  • In the wake of the success of YouTube and TikTok, Facebook, Instagram, and even LinkedIn are all pushing more and more content made by professionals into our feeds.
  • In order to quantify how TikTok has mastered the art of discerning our interests and feeding us the most compelling possible content, Faltesek, of Oregon State University, conducted a two-year project to study exactly what kind of content TikTok pushes
  • With a team of students, he created dozens of fresh TikTok user accounts that didn’t like or interact with content in any way—they just let the algorithm play one video after another.
  • At the end of this exhaustive process of gathering data on TikTok’s algorithm, the conclusion became obvious, says Faltesek. “TikTok is television. It flips channels like TV, it provides a flow like TV.”
  • By this logic, Instagram’s move to copy TikTok, which is in turn encroaching on the turf of YouTube by allowing longer videos, and the increasing dominance of professional content on all three, means they’re all turning into TV. Even Threads, the new offering from Facebook parent company Meta, is fast becoming a broadcast medium for news, as Twitter was before it.
  • In every case, the structure of social networks has become one in which a handful of accounts create most of the content that others see, and the role of everyone else on the network is, primarily, to amplify and consume that content.
  • Some, like Magana, believe we’ll eventually see an ever more complete blending of what were once “social” platforms with the traditional television networks and even film studios.  
  • Others aren’t convinced they’ll eat the rest of the entertainment industry. “It’s hard to say this kind of short-form video will be the only kind of TV,” she reflects. “A long time ago, the internet became the new thing, but we still have the other forms on television, and scripted streaming shows. It’s almost like this is just another avenue for that—of watching shows and movies on your phone.”
redavistinnell

David Bowie Dies at 69; He Transcended Music, Art and Fashion - The New York Times - 0 views

  • David Bowie Dies at 69; He Transcended Music, Art and Fashion
  • He died after an 18-month battle with cancer, according to a statement on Mr. Bowie’s social-media accounts.
  • He had also collaborated on an Off Broadway musical, “Lazarus,” that was a surreal sequel to his definitive 1976 film role, “The Man Who Fell to Earth.”
  • ...7 more annotations...
  • Mr. Bowie earned admiration and emulation across the musical spectrum — from rockers, balladeers, punks, hip-hop acts, creators of pop spectacles and even classical composers like Philip Glass, who based two symphonies on Mr. Bowie’s albums “Low” and “ ‘Heroes’.”
  • Yet throughout Mr. Bowie’s metamorphoses, he was always recognizable. His voice was widely imitated but always his own; his message was that there was always empathy beyond difference.
  • He was Ziggy Stardust, the otherworldly pop star at the center of his 1972 album “The Rise and Fall of Ziggy Stardust and the Spiders From Mars.”
  • The arrival of MTV in the 1980s was the perfect complement to Mr. Bowie’s sense of theatricality and fashion. “Ashes to Ashes,” the “Space Oddity” sequel that revealed “we know Major Tom’s a junkie,” and “Let’s Dance,” which offered, “Put on your red shoes and dance the blues,” gave him worldwide popularity.
  • He also pushed the limits of “Fashion” and “Fame,” writing songs with those titles and also thinking deeply about the possibilities and strictures of pop renown.
  • Mr. Bowie largely left the spotlight after a heart attack in 2004 brought an abrupt end to a tour supporting his album “Reality.” The singer experienced pain during a performance at a German festival and sought treatment for what he believed was a shoulder injury; doctors then discovered a blocked artery.
  • And he collaborated with musicians like Brian Eno in the Berlin years and, in his final recordings, with the jazz musicians Maria Schneider and Donny McCaslin, introducing them to many new listeners.
Javier E

Why Pete Buttigieg Seems 'Authentic' - The Atlantic - 0 views

  • Authenticity is not about being honest; it’s about seeming unscripted. If you sound rehearsed, then you can’t possibly be saying whatever you’re thinking right now; you’re saying something you decided to say at some moment in the past
  • As a candidate and as president, Obama had the gift of seeming unrehearsed. He could deliver scripted speeches with the emotion, humor, energy, and surprise of someone articulating his ideas for the first time.
  • At the other end of the spectrum we find Hillary Clinton. Despite her obvious qualifications, she was hamstrung as a presidential candidate by an inability to sound like a normal person when addressing large audiences. Her performances in the major televised contexts in which most Americans saw her in 2016 were generally robotic and awkward—filled with strange pauses and painfully delivered jokes, drained of spontaneity.
  • ...6 more annotations...
  • the paradox of a serial liar such as Trump coming across as authentic isn’t much of a paradox at all. Trump lies authentically. He is so committed to saying whatever he feels like that he doesn’t let the truth get in the way.
  • If the art of authenticity resides in making the scripted seem spontaneous, doesn’t that make it fundamentally inauthentic? Short answer: yes. Great orators such as Obama—or Ronald Reagan, literally an actor—have the gift of obscuring the artificiality of political communication.
  • What sets Buttigieg apart as a political talent, then, is not really his intellect. It’s his ability to give a speech, or answer questions onstage, in a way that makes it seem as though he’s earnestly thinking through his beliefs in real time.
  • just as Trump’s most loyal voters can’t help but be taken in by the billionaire president’s man-of-the-people routine, well-educated liberals can’t help being drawn to someone who plays the part of the thoughtful intellectual.
  • public speaking has outsize importance, at least at the national level, simply because voters overwhelmingly get their input about a candidate’s personality by seeing them give a speech or interview or participate in a televised debate.
  • Authenticity is like political magic. The best you can do is remind yourself it’s a trick.
Javier E

PewDiePie Put in Spotlight After New Zealand Shooting - The New York Times - 0 views

  • A gunman who had broadcast part of the attack that left at least 49 people dead urged those watching to “subscribe to PewDiePie,” referring to the alias used by Felix Kjellberg, a Swede whose channel has long dominated YouTube and courted controversy along the way.
  • As PewDiePie, Mr. Kjellberg has amassed a following of 89 million YouTube subscribers over the past decade with videos that feature a mix of comedic rants, commentary and video game narration.
  • According to Forbes, a sponsored video with PewDiePie can cost up to $450,000. Mr. Kjellberg earned $15.5 million last year.
  • ...1 more annotation...
  • A post on The Daily Stormer, a neo-Nazi website, noted that the videos raised questions about Mr. Kjellberg’s own views, but concluded that they didn’t matter: “the effect is the same; it normalizes Nazism, and marginalizes our enemies.”
Javier E

The Coronavirus Can Be Stopped, but Only With Harsh Steps, Experts Say - The New York T... - 0 views

  • Terrifying though the coronavirus may be, it can be turned back. China, South Korea, Singapore and Taiwan have demonstrated that, with furious efforts, the contagion can be brought to heel.
  • for the United States to repeat their successes will take extraordinary levels of coordination and money from the country’s leaders, and extraordinary levels of trust and cooperation from citizens. It will also require international partnerships in an interconnected world.
  • This contagion has a weakness.
  • ...72 more annotations...
  • the coronavirus more often infects clusters of family members, friends and work colleagues,
  • “You can contain clusters,” Dr. Heymann said. “You need to identify and stop discrete outbreaks, and then do rigorous contact tracing.”
  • The microphone should not even be at the White House, scientists said, so that briefings of historic importance do not dissolve into angry, politically charged exchanges with the press corps, as happened again on Friday.
  • Americans must be persuaded to stay home, they said, and a system put in place to isolate the infected and care for them outside the home
  • Travel restrictions should be extended, they said; productions of masks and ventilators must be accelerated, and testing problems must be resolved.
  • It was not at all clear that a nation so fundamentally committed to individual liberty and distrustful of government could learn to adapt to many of these measures, especially those that smack of state compulsion.
  • What follows are the recommendations offered by the experts interviewed by The Times.
  • they were united in the opinion that politicians must step aside and let scientists both lead the effort to contain the virus and explain to Americans what must be done.
  • medical experts should be at the microphone now to explain complex ideas like epidemic curves, social distancing and off-label use of drugs.
  • doing so takes intelligent, rapidly adaptive work by health officials, and near-total cooperation from the populace. Containment becomes realistic only when Americans realize that working together is the only way to protect themselves and their loved ones.
  • Above all, the experts said, briefings should focus on saving lives and making sure that average wage earners survive the coming hard times — not on the stock market, the tourism industry or the president’s health.
  • “At this point in the emergency, there’s little merit in spending time on what we should have done or who’s at fault,”
  • The next priority, experts said, is extreme social distancing. If it were possible to wave a magic wand and make all Americans freeze in place for 14 days while sitting six feet apart, epidemiologists say, the whole epidemic would sputter to a halt.
  • The virus would die out on every contaminated surface and, because almost everyone shows symptoms within two weeks, it would be evident who was infected. If we had enough tests for every American, even the completely asymptomatic cases could be found and isolated.
  • The crisis would be over.
  • Obviously, there is no magic wand, and no 300 million tests. But the goal of lockdowns and social distancing is to approximate such a total freeze.
  • In contrast to the halting steps taken here, China shut down Wuhan — the epicenter of the nation’s outbreak — and restricted movement in much of the country on Jan. 23, when the country had a mere 500 cases and 17 deaths. Its rapid action had an important effect: With the virus mostly isolated in one province, the rest of China was able to save Wuhan.
  • Even as many cities fought their own smaller outbreaks, they sent 40,000 medical workers into Wuhan, roughly doubling its medical force.
  • Stop transmission within cities
  • the weaker the freeze, the more people die in overburdened hospitals — and the longer it ultimately takes for the economy to restart.
  • People in lockdown adapt. In Wuhan, apartment complexes submit group orders for food, medicine, diapers and other essentials. Shipments are assembled at grocery warehouses or government pantries and dropped off. In Italy, trapped neighbors serenade one another.
  • Each day’s delay in stopping human contact, experts said, creates more hot spots, none of which can be identified until about a week later, when the people infected there start falling ill.
  • South Korea avoided locking down any city, but only by moving early and with extraordinary speed. In January, the country had four companies making tests, and as of March 9 had tested 210,000 citizens — the equivalent of testing 2.3 million Americans.
  • As of the same date, fewer than 9,000 Americans had been tested.
  • Fix the testing mess
  • Testing must be done in a coordinated and safe way, experts said. The seriously ill must go first, and the testers must be protected. In China, those seeking a test must describe their symptoms on a telemedicine website. If a nurse decides a test is warranted, they are directed to one of dozens of “fever clinics” set up far from all other patients.
  • Isolate the infected
  • As soon as possible, experts said, the United States must develop an alternative to the practice of isolating infected people at home, as it endangers families. In China, 75 to 80 percent of all transmission occurred in family clusters.
  • Cellphone videos from China show police officers knocking on doors and taking temperatures. In some, people who resist are dragged away by force. The city of Ningbo offered bounties of $1,400 to anyone who turned in a coronavirus sufferer.
  • In China, said Dr. Bruce Aylward, leader of the World Health Organization’s observer team there, people originally resisted leaving home or seeing their children go into isolation centers with no visiting rights — just as Americans no doubt would.
  • In China, they came to accept it.“They realized they were keeping their families safe,” he said. “Also, isolation is really lonely. It’s psychologically difficult. Here, they were all together with other people in the same boat. They supported each other.”
  • Find the fevers
  • Make masks ubiquitous
  • In China, having a fever means a mandatory trip to a fever clinic to check for coronavirus. In the Wuhan area, different cities took different approaches.
  • In most cities in affected Asian countries, it is commonplace before entering any bus, train or subway station, office building, theater or even a restaurant to get a temperature check. Washing your hands in chlorinated water is often also required.
  • The city of Qianjiang, by contrast, offered the same amount of money to any resident who came in voluntarily and tested positive
  • Voluntary approaches, like explaining to patients that they will be keeping family and friends safe, are more likely to work in the West, she added.
  • Trace the contacts
  • Finding and testing all the contacts of every positive case is essential, experts said. At the peak of its epidemic, Wuhan had 18,000 people tracking down individuals who had come in contact with the infected.
  • Dr. Borio suggested that young Americans could use their social networks to “do their own contact tracing.” Social media also is used in Asia, but in different ways
  • When he lectured at a Singapore university, Dr. Heymann said, dozens of students were in the room. But just before he began class, they were photographed to record where everyone sat.
  • Instead of a policy that advises the infected to remain at home, as the Centers for Disease and Prevention now does, experts said cities should establish facilities where the mildly and moderately ill can recuperate under the care and observation of nurses.
  • There is very little data showing that flat surgical masks protect healthy individuals from disease. Nonetheless, Asian countries generally make it mandatory that people wear them.
  • The Asian approach is less about data than it is about crowd psychology, experts explained. All experts agree that the sick must wear masks to keep in their coughs. But if a mask indicates that the wearer is sick, many people will be reluctant to wear one. If everyone is required to wear masks, the sick automatically have one on and there is no stigma attached.
  • Also, experts emphasized, Americans should be taught to take seriously admonitions to stop shaking hands and hugging
  • Preserve vital services
  • Only the federal government can enforce interstate commerce laws to ensure that food, water, electricity, gas, phone lines and other basic needs keep flowing across state lines to cities and suburbs
  • “I sense that most people — and certainly those in business — get it. They would prefer to take the bitter medicine at once and contain outbreaks as they start rather than gamble with uncertainty.”
  • Produce ventilators and oxygen
  • The manufacturers, including a dozen in the United States, say there is no easy way to ramp up production quickly. But it is possible other manufacturers, including aerospace and automobile companies, could be enlisted to do so.
  • Canadian nurses are disseminating a 2006 paper describing how one ventilator can be modified to treat four patients simultaneously. Inventors have proposed combining C-PAP machines, which many apnea sufferers own, and oxygen tanks to improvise a ventilator.
  • One of the lessons of China, he noted, was that many Covid-19 patients who would normally have been intubated and on ventilators managed to survive with oxygen alone.
  • Retrofit hospitals
  • In Wuhan, the Chinese government famously built two new hospitals in two weeks. All other hospitals were divided: 48 were designated to handle 10,000 serious or critical coronavirus patients, while others were restricted to handling emergencies like heart attacks and births.
  • Wherever that was impractical, hospitals were divided into “clean” and “dirty” zones, and the medical teams did not cross over. Walls to isolate whole wards were built
  • Decide when to close schools
  • Recruit volunteers
  • China’s effort succeeded, experts said, in part because of hundreds of thousands of volunteers. The government declared a “people’s war” and rolled out a “Fight On, Wuhan! Fight On, China!” campaign.
  • Many people idled by the lockdowns stepped up to act as fever checkers, contact tracers, hospital construction workers, food deliverers, even babysitters for the children of first responders, or as crematory workers.
  • “In my experience, success is dependent on how much the public is informed and participates,” Admiral Ziemer said. “This truly is an ‘all hands on deck’ situation.”
  • Prioritize the treatments
  • Clinicians in China, Italy and France have thrown virtually everything they had in hospital pharmacies into the fight, and at least two possibilities have emerged that might save patients: the anti-malaria drugs chloroquine and hydroxychloroquine, and the antiviral remdesivir, which has no licensed use.
  • An alternative is to harvest protective antibodies from the blood of people who have survived the illness,
  • The purified blood serum — called immunoglobulin — could possibly be used in small amounts to protect emergency medical workers, too.
  • “Unfortunately, the first wave won’t benefit from this,” Dr. Hotez said. “We need to wait until we have enough survivors.”
  • Find a vaccine
  • testing those candidate vaccines for safety and effectiveness takes time.
  • The roadblock, vaccine experts explained, is not bureaucratic. It is that the human immune system takes weeks to produce antibodies, and some dangerous side effects can take weeks to appear.
  • After extensive animal testing, vaccines are normally given to about 50 healthy human volunteers to see if they cause any unexpected side effects and to measure what dose produces enough antibodies to be considered protective.
  • If that goes well, the trial enrolls hundreds or thousands of volunteers in an area where the virus is circulating. Half get the vaccine, the rest do not — and the investigators wait. If the vaccinated half do not get the disease, the green light for production is finally given.
  • In the past, some experimental vaccines have produced serious side effects, like Guillain-Barre syndrome, which can paralyze and kill. A greater danger, experts said, is that some experimental vaccines, paradoxically, cause “immune enhancement,” meaning they make it more likely, not less, that recipients will get a disease. That would be a disaster.
  • One candidate coronavirus vaccine Dr. Hotez invented 10 years ago in the wake of SARS, he said, had to be abandoned when it appeared to make mice more likely to die from pneumonia when they were experimentally infected with the virus.
  • Reach out to other nations
criscimagnael

Jan. 6 Committee Subpoenas Twitter, Meta, Alphabet and Reddit - The New York Times - 0 views

  • The House committee investigating the Jan. 6 attack on the Capitol issued subpoenas on Thursday to four major social media companies — Alphabet, Meta, Reddit and Twitter — criticizing them for allowing extremism to spread on their platforms and saying they have failed to cooperate adequately with the inquiry.
  • In letters accompanying the subpoenas, the panel named Facebook, a unit of Meta, and YouTube, which is owned by Alphabet’s Google subsidiary, as among the worst offenders that contributed to the spread of misinformation and violent extremism.
  • The committee sent letters in August to 15 social media companies — including sites where misinformation about election fraud spread, such as the pro-Trump website TheDonald.win — seeking documents pertaining to efforts to overturn the election and any domestic violent extremists associated with the Jan. 6 rally and attack.
  • ...16 more annotations...
  • “It’s disappointing that after months of engagement, we still do not have the documents and information necessary to answer those basic questions,”
  • On Twitter, many of Mr. Trump’s followers used the site to amplify and spread false allegations of election fraud, while connecting with other Trump supporters and conspiracy theorists using the site. And on YouTube, some users broadcast the events of Jan. 6 using the platform’s video streaming technology.
  • In the year since the events of Jan. 6, social media companies have been heavily scrutinized for whether their sites played an instrumental role in organizing the attack.
  • In the months surrounding the 2020 election, employees inside Meta raised warning signs that Facebook posts and comments containing “combustible election misinformation” were spreading quickly across the social network, according to a cache of documents and photos reviewed by The New York Times.
  • Frances Haugen, a former Facebook employee turned whistle-blower, said the company relaxed its safeguards too quickly after the election, which then led it to be used in the storming of the Capitol.
  • In the days after the attack, Reddit banned a discussion forum dedicated to former President Donald J. Trump, where tens of thousands of Mr. Trump’s supporters regularly convened to express solidarity with him.
  • After months of discussions with the companies, only the four large corporations were issued subpoenas on Thursday, because the committee said the firms were “unwilling to commit to voluntarily and expeditiously” cooperating with its work.
  • The committee said letters to the four firms accompanied the subpoenas. The panel said YouTube served as a platform for “significant communications by its users that were relevant to the planning and execution of Jan. 6 attack on the United States Capitol,” including livestreams of the attack as it was taking place.
  • The panel said Facebook and other Meta platforms were used to share messages of “hate, violence and incitement; to spread misinformation, disinformation and conspiracy theories around the election; and to coordinate or attempt to coordinate the Stop the Steal movement.”
  • “Meta has declined to commit to a deadline for producing or even identifying these materials,” Mr. Thompson wrote to Mark Zuckerberg, Meta’s chief executive.
  • The panel said it was focused on Reddit because the platform hosted the r/The_Donald subreddit community that grew significantly before migrating in 2020 to the website TheDonald.win, which ultimately hosted significant discussion and planning related to the Jan. 6 attack.
  • “Unfortunately, the select committee believes Twitter has failed to disclose critical information,” the panel stated.
  • In recent years, Big Tech and Washington have had a history of butting heads. Some Republicans have accused sites including Facebook, Instagram and Twitter of silencing conservative voices.
  • The Federal Trade Commission is investigating whether a number of tech companies have grown too big, and in the process abused their market power to stifle competition. And a bipartisan group of senators and representatives continues to say sites like Facebook and YouTube are not doing enough to curb the spread of misinformation and conspiracy theories.
  • Meta said that it had “produced documents to the committee on a schedule committee staff requested — and we will continue to do so.”
  • The panel has interviewed more than 340 witnesses and issued dozens of subpoenas, including for bank and phone records.