
Media in Middle East & North Africa: Group items tagged "memes"


Ed Webb

Memes: A gamechanger in Egyptian politics - Focus - Weekly - Ahram Online

  • How has the political culture changed in Egypt due to the emergence of political memes and the influencers behind them?
  • President Abdel-Fattah Al-Sisi also recently laughed when he was shown some of the memes referencing the most recent spike in petrol prices, telling the first session of the seventh National Youth Conference held in the New Administrative Capital last month that when the government tackles vital issues it takes into consideration the reaction of people on social media. He said that the government routinely gauges the possible reaction of the people before taking difficult decisions. This demonstrates how the country’s political culture has evolved in the digital age, since physical banners, such as those that were once used in demonstrations, have now often been replaced by digital memes, with the government closely monitoring the Internet in order to gauge the state of public opinion.
  • Khaled Al-Baramawi, a media commentator, said that Egyptians had always been known for their wit and satire, particularly on political issues. “With the rise of social media, Egyptians have become top producers of digital content,” he said. “When you combine Egyptian humour and online activity, you can understand why digital satire is so popular among Egyptians, including on Facebook, Twitter, WhatsApp and other platforms. Politics is a conversation staple, and memes are often produced based on statements by officials.”
  • For example, “Nationalist Memes” is a Facebook page with 11,000 followers and one administrator who advocates for Egyptian nationalism and promotes this idea to the public. The aim is to instill patriotism into a generation that may be sceptical of it and to respond to rumours on social media.
  • “opposition political parties are very weak in Egypt and have no impact on the streets. People do not care to hear about them or know what they are doing. Memes, however, are followed by many Egyptians, especially young people who may sign up to several meme pages and interact with them all the time. Their impact is so great that many of the ideas doing the rounds of young people in Egypt have their origin in a meme on the Internet.”
    • Ed Webb
       
      Opposition parties have been weak for decades in Egypt. This speculation about memes is weakly supported.
Ed Webb

Muzzled by the Bots - www.slate.com

  • It's through such a combination of humans and bots that memes emerge
    • Ed Webb
       
      Android meme production
  • with just some clever manipulation, bots might get you to follow the right humans—and it's the humans, not bots, who would then influence your thinking
  • The digitization of our public life is also giving rise to many new intermediaries that are mostly of an invisible—and possibly suspect—variety
  • a single Californian company making decisions over what counts as hate speech and profanity for some of the world's most popular sites without anyone ever examining whether its own algorithms might be biased or excessively conservative
  • It's the proliferation—not elimination—of intermediaries that has made blogging so widespread.  The right term here is “hyperintermediation,” not “disintermediation.”
  • this marriage of big data and automated content moderation might also have a darker side, particularly in undemocratic regimes, for whom a war on spam and hate speech—waged with the help of domestic spam-fighting champions—is just a pretense to suppress dissenting opinions. In their hands, solutions like Impermium's might make censorship more fine-grained and customized, eliminating the gaps that plague “dumb” systems that censor in bulk
  • Just imagine what kind of new censorship possibilities open up once moderation decisions can incorporate geolocational information (what some researchers already call “spatial big data”): Why not block comments, videos, or photos uploaded by anyone located in, say, Tahrir Square or some other politically explosive location?
  • For governments and corporations alike, the next frontier is to learn how to identify, pre-empt, and disrupt emerging memes before they coalesce behind a catchy hashtag—this is where “big data” analytics would be most helpful. Thus, one of the Russian security agencies has recently awarded a tender to create bots that can both spot the formation of memes and disrupt and counter them in real time through “mass distribution of messages in social networks with a view to the formation of public opinion.” Moscow is learning from Washington here: Last year the Pentagon awarded a $2.7 million contract to the San Diego-based firm Ntrepid in order to build software to create multiple fake online identities and “counter violent extremist and enemy propaganda outside the US.” “Big data”-powered analytics would make spotting such “enemy propaganda” much easier.
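The last excerpt describes, in general terms, analytics that try to spot a meme forming before it coalesces behind a hashtag. Purely as an illustration of the simplest version of that idea, here is a minimal sketch of hashtag burst detection; the window size, thresholds, and sample posts are assumptions made for the example and are not drawn from the Russian tender, Ntrepid, or Impermium.

```python
from collections import Counter, deque
import re

HASHTAG = re.compile(r"#\w+")

def emerging_hashtags(hourly_posts, window=6, spike_ratio=3.0, min_count=20):
    """Flag hashtags whose count in the latest hour is several times their
    recent rolling average -- a crude stand-in for 'spotting a meme forming'.

    hourly_posts: list of lists of post strings, one inner list per hour,
                  oldest first. All numeric defaults are illustrative.
    """
    history = deque(maxlen=window)   # rolling per-hour hashtag counts
    flagged = []
    for hour, posts in enumerate(hourly_posts):
        counts = Counter(tag.lower() for p in posts for tag in HASHTAG.findall(p))
        for tag, n in counts.items():
            past = [h.get(tag, 0) for h in history]
            baseline = (sum(past) / len(past)) if past else 0
            if n >= min_count and n > spike_ratio * max(baseline, 1):
                flagged.append((hour, tag, n, round(baseline, 1)))
        history.append(counts)
    return flagged

if __name__ == "__main__":
    # Example: a tag that suddenly appears in hour 3
    quiet = ["nothing to see #weather"] * 5
    burst = ["join the protest #tahrir"] * 40
    print(emerging_hashtags([quiet, quiet, quiet, quiet + burst]))
```

Even this toy version makes the article's point concrete: the same spike-spotting that flags spam or a breaking news topic can just as easily hand a censor a list of hashtags to suppress while they are still small.
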
Ed Webb

The Making of a YouTube Radical - The New York Times

  • Mr. Cain, 26, recently swore off the alt-right nearly five years after discovering it, and has become a vocal critic of the movement. He is scarred by his experience of being radicalized by what he calls a “decentralized cult” of far-right YouTube personalities, who convinced him that Western civilization was under threat from Muslim immigrants and cultural Marxists, that innate I.Q. differences explained racial disparities, and that feminism was a dangerous ideology.
  • Over years of reporting on internet culture, I’ve heard countless versions of Mr. Cain’s story: an aimless young man — usually white, frequently interested in video games — visits YouTube looking for direction or distraction and is seduced by a community of far-right creators. Some young men discover far-right videos by accident, while others seek them out. Some travel all the way to neo-Nazism, while others stop at milder forms of bigotry.
  • YouTube and its recommendation algorithm, the software that determines which videos appear on users’ home pages and inside the “Up Next” sidebar next to a video that is playing. The algorithm is responsible for more than 70 percent of all time spent on the site
  • YouTube has inadvertently created a dangerous on-ramp to extremism by combining two things: a business model that rewards provocative videos with exposure and advertising dollars, and an algorithm that guides users down personalized paths meant to keep them glued to their screens
  • “If I’m YouTube and I want you to watch more, I’m always going to steer you toward Crazytown.”
  • 94 percent of Americans ages 18 to 24 use YouTube, a higher percentage than for any other online service
  • YouTube has been a godsend for hyper-partisans on all sides. It has allowed them to bypass traditional gatekeepers and broadcast their views to mainstream audiences, and has helped once-obscure commentators build lucrative media businesses
  • Bellingcat, an investigative news site, analyzed messages from far-right chat rooms and found that YouTube was cited as the most frequent cause of members’ “red-pilling” — an internet slang term for converting to far-right beliefs
  • The internet was an escape. Mr. Cain grew up in postindustrial Appalachia and was raised by his conservative Christian grandparents. He was smart, but shy and socially awkward, and he carved out an identity during high school as a countercultural punk. He went to community college, but dropped out after three semesters. Broke and depressed, he resolved to get his act together. He began looking for help in the same place he looked for everything: YouTube.
  • they rallied around issues like free speech and antifeminism, portraying themselves as truth-telling rebels doing battle against humorless “social justice warriors.” Their videos felt like episodes in a long-running soap opera, with a constant stream of new heroes and villains. To Mr. Cain, all of this felt like forbidden knowledge — as if, just by watching some YouTube videos, he had been let into an exclusive club. “When I found this stuff, I felt like I was chasing uncomfortable truths,” he told me. “I felt like it was giving me power and respect and authority.”
  • YouTube’s executives announced that the recommendation algorithm would give more weight to watch time, rather than views. That way, creators would be encouraged to make videos that users would finish, users would be more satisfied and YouTube would be able to show them more ads. [A toy sketch contrasting view-count ranking with watch-time ranking follows this list.]
  • A month after its algorithm tweak, YouTube changed its rules to allow all video creators to run ads alongside their videos and earn a portion of the revenue they generated.
  • Many right-wing creators already made long video essays, or posted video versions of their podcasts. Their inflammatory messages were more engaging than milder fare. And now that they could earn money from their videos, they had a financial incentive to churn out as much material as possible.
  • Several current and former YouTube employees, who would speak only on the condition of anonymity because they had signed confidentiality agreements, said company leaders were obsessed with increasing engagement during those years. The executives, the people said, rarely considered whether the company’s algorithms were fueling the spread of extreme and hateful political content.
  • Google Brain’s researchers wondered if they could keep YouTube users engaged for longer by steering them into different parts of YouTube, rather than feeding their existing interests. And they began testing a new algorithm that incorporated a different type of A.I., called reinforcement learning. The new A.I., known as Reinforce, was a kind of long-term addiction machine. It was designed to maximize users’ engagement over time by predicting which recommendations would expand their tastes and get them to watch not just one more video but many more.
  • YouTube’s recommendations system is not set in stone. The company makes many small changes every year, and has already introduced a version of its algorithm that is switched on after major news events to promote videos from “authoritative sources” over conspiracy theories and partisan content. This past week, the company announced that it would expand that approach, so that a person who had watched a series of conspiracy theory videos would be nudged toward videos from more authoritative news sources. It also said that a January change to its algorithm to reduce the spread of so-called “borderline” videos had resulted in significantly less traffic to those videos.
  • the bulk of his media diet came from far-right channels. And after the election, he began exploring a part of YouTube with a darker, more radical group of creators. These people didn’t couch their racist and anti-Semitic views in sarcastic memes, and they didn’t speak in dog whistles. One channel run by Jared Taylor, the editor of the white nationalist magazine American Renaissance, posted videos with titles like “‘Refugee’ Invasion Is European Suicide.” Others posted clips of interviews with white supremacists like Richard Spencer and David Duke.
  • As Mr. Molyneux promoted white nationalists, his YouTube channel kept growing. He now has more than 900,000 subscribers, and his videos have been watched nearly 300 million times. Last year, he and Ms. Southern — Mr. Cain’s “fashy bae” — went on a joint speaking tour in Australia and New Zealand, where they criticized Islam and discussed what they saw as the dangers of nonwhite immigration. In March, after a white nationalist gunman killed 50 Muslims in a pair of mosques in Christchurch, New Zealand, Mr. Molyneux and Ms. Southern distanced themselves from the violence, calling the killer a left-wing “eco-terrorist” and saying that linking the shooting to far-right speech was “utter insanity.” Neither Mr. Molyneux nor Ms. Southern replied to a request for comment. The day after my request, Mr. Molyneux uploaded a video titled “An Open Letter to Corporate Reporters,” in which he denied promoting hatred or violence and said labeling him an extremist was “just a way of slandering ideas without having to engage with the content of those ideas.”
  • Unlike most progressives Mr. Cain had seen take on the right, Mr. Bonnell and Ms. Wynn were funny and engaging. They spoke the native language of YouTube, and they didn’t get outraged by far-right ideas. Instead, they rolled their eyes at them, and made them seem shallow and unsophisticated.
  • “I noticed that right-wing people were taking these old-fashioned, knee-jerk, reactionary politics and packaging them as edgy punk rock,” Ms. Wynn told me. “One of my goals was to take the excitement out of it.”
  • Ms. Wynn and Mr. Bonnell are part of a new group of YouTubers who are trying to build a counterweight to YouTube’s far-right flank. This group calls itself BreadTube, a reference to the left-wing anarchist Peter Kropotkin’s 1892 book, “The Conquest of Bread.” It also includes people like Oliver Thorn, a British philosopher who hosts the channel PhilosophyTube, where he posts videos about topics like transphobia, racism and Marxist economics.
  • The core of BreadTube’s strategy is a kind of algorithmic hijacking. By talking about many of the same topics that far-right creators do — and, in some cases, by responding directly to their videos — left-wing YouTubers are able to get their videos recommended to the same audience.
  • What is most surprising about Mr. Cain’s new life, on the surface, is how similar it feels to his old one. He still watches dozens of YouTube videos every day and hangs on the words of his favorite creators. It is still difficult, at times, to tell where the YouTube algorithm stops and his personality begins.
  • It’s possible that vulnerable young men like Mr. Cain will drift away from radical groups as they grow up and find stability elsewhere. It’s also possible that this kind of whiplash polarization is here to stay as political factions gain and lose traction online.
  • I’ve learned now that you can’t go to YouTube and think that you’re getting some kind of education, because you’re not.
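Several excerpts above describe the mechanics behind the recommendations: the algorithm drives most time spent on the site, it was re-weighted toward watch time rather than views, and the later Reinforce system used reinforcement learning to keep people watching over the long term. The sketch below illustrates only the first, simplest step of that shift, ranking by expected watch time instead of raw view counts; the Video fields, the scoring formula, and the numbers are illustrative assumptions, not YouTube's actual system.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    views: int
    avg_watch_seconds: float   # assumed field: mean time viewers spend on it
    length_seconds: float

def rank_by_views(candidates):
    """Ordering by raw view counts: whatever gets clicked most floats to the top."""
    return sorted(candidates, key=lambda v: v.views, reverse=True)

def rank_by_expected_watch_time(candidates):
    """Favour videos people actually watch through, approximated here by
    average watch time weighted by completion rate (an assumed scoring rule)."""
    def score(v):
        completion = v.avg_watch_seconds / max(v.length_seconds, 1.0)
        return v.avg_watch_seconds * completion
    return sorted(candidates, key=score, reverse=True)

if __name__ == "__main__":
    candidates = [
        Video("clickbait clip", views=1_000_000, avg_watch_seconds=20, length_seconds=600),
        Video("hour-long video essay", views=50_000, avg_watch_seconds=2100, length_seconds=3600),
    ]
    print([v.title for v in rank_by_views(candidates)])
    print([v.title for v in rank_by_expected_watch_time(candidates)])
```

Run on the sample data, the view-count ranking puts the short clickbait clip first, while the watch-time ranking promotes the hour-long video essay, which is the dynamic the article says favoured long, engrossing, and often inflammatory material.
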
Ed Webb

Where Countries Are Tinderboxes and Facebook Is a Match - The New York Times

  • they had shared and could recite the viral Facebook memes constructing an alternate reality of nefarious Muslim plots. Mr. Lal called them “the embers beneath the ashes” of Sinhalese anger
  • the forces of social disruption that have followed Facebook’s rapid expansion in the developing world, whose markets represent the company’s financial future. For months, we had been tracking riots and lynchings around the world linked to misinformation and hate speech on Facebook, which pushes whatever content keeps users on the site longest — a potentially damaging practice in countries with weak institutions.
  • Time and again, communal hatreds overrun the newsfeed — the primary portal for news and information for many users — unchecked as local media are displaced by Facebook and governments find themselves with little leverage over the company. Some users, energized by hate speech and misinformation, plot real-world attacks.
  • Facebook’s newsfeed played a central role in nearly every step from rumor to killing
  • Facebook officials, they say, ignored repeated warnings of the potential for violence, resisting pressure to hire moderators or establish emergency points of contact
  • the imagined Ampara, which exists in rumors and memes on Sinhalese-speaking Facebook, is the shadowy epicenter of a Muslim plot to sterilize and destroy Sri Lanka’s Sinhalese majority
  • The mob, hearing confirmation, beat him, destroyed the shop and set fire to the local mosque.
  • As Facebook pushes into developing countries, it tends to be initially received as a force for good. In Sri Lanka, it keeps families in touch even as many work abroad. It provides for unprecedented open expression and access to information. Government officials say it was essential for the democratic transition that swept them into office in 2015. But where institutions are weak or undeveloped, Facebook’s newsfeed can inadvertently amplify dangerous tendencies. Designed to maximize user time on site, it promotes whatever wins the most attention. Posts that tap into negative, primal emotions like anger or fear, studies have found, produce the highest engagement, and so proliferate
  • in developing countries, Facebook is often perceived as synonymous with the internet and reputable sources are scarce, allowing emotionally charged rumors to run rampant
  • Last year, in rural Indonesia, rumors spread on Facebook and WhatsApp, a Facebook-owned messaging tool, that gangs were kidnapping local children and selling their organs. Some messages included photos of dismembered bodies or fake police fliers. Almost immediately, locals in nine villages lynched outsiders they suspected of coming for their children.
  • Near-identical social media rumors have also led to attacks in India and Mexico. Lynchings are increasingly filmed and posted back to Facebook, where they go viral as grisly tutorials
  • No organization has ever had to police billions of users in a panoply of languages.
  • Before Facebook, he said, officials facing communal violence “could ask media heads to be sensible, they could have their own media strategy.”
  • Desperate, the researchers flagged the video and subsequent posts using Facebook’s on-site reporting tool. Though they and government officials had repeatedly asked Facebook to establish direct lines, the company had insisted this tool would be sufficient, they said. But nearly every report got the same response: the content did not violate Facebook’s standards. “You report to Facebook, they do nothing,” one of the researchers, Amalini De Sayrah, said. “There’s incitements to violence against entire communities and Facebook says it doesn’t violate community standards.”
  • Facebook still appears to employ few Sinhalese moderators. A call to a third-party employment service revealed that around 25 Sinhalese moderator openings, first listed last June, remain unfilled. The jobs are based in India, which has few Sinhalese speakers.
  • “We’re a society, we’re not just a market.”
  • Its gamelike interface rewards engagement, delivering a dopamine boost when users accrue likes and responses, training users to indulge behaviors that win affirmation.
  • the greatest rush comes by attacking outsiders: The other sports team. The other political party. The ethnic minority.
  • Mass media has long been used to mobilize mass violence. Facebook, by democratizing communication tools, gives anyone with a smartphone the ability to broadcast hate.
  • Mr. Weerasinghe posted a video that showed him walking the shops of a town called Digana, warning that too many were owned by Muslims, urging Sinhalese to take the town back. The researchers in Colombo reported his video to Facebook, along with his earlier posts, but all remained online.
  • the government temporarily blocked most social media. Only then did Facebook representatives get in touch with Sri Lankan officials, they say. Mr. Weerasinghe’s page was closed the same day.
  • officials rushed out statements debunking the sterilization rumors but could not match Facebook’s influence
  • Despite criticism and concerns from civil society groups, the company has done little to change its strategy of pushing into developing societies with weak institutions and histories of social instability, opening up information spaces where anger and fear often can dominate
  • From October to March, Facebook presented users in six countries, including Sri Lanka, with a separate newsfeed prioritizing content from friends and family. Posts by professional media were hidden away on another tab. “While this experiment lasted, many of us missed out on the bigger picture, on more credible news,” said Nalaka Gunawardene, a Sri Lankan media analyst. “It’s possible that this experiment inadvertently spread hate views in these six countries.”
  • government officials said, they face the same problem as before. Facebook wields enormous influence over their society, but they have little over Facebook.
  • Facebook had turned him into a national villain. It helped destroy his business, sending his family deeply into debt. And it had nearly gotten him killed. But he refused to abandon the platform. With long, empty days in hiding, he said, “I have more time and I look at Facebook much more.” “It’s not that I have more faith that social media is accurate, but you have to spend time and money to go to the market to get a newspaper,” he said. “I can just open my phone and get the news instead.” “Whether it’s wrong or right, it’s what I read.”
Ed Webb

The shoe incident continues... | Reuters.com

  • This meme has legs...