Contents contributed and discussions participated in by Ed Webb
The Misinformation Susceptibility Test
Inside the Pro-Israel Information War
- a rare public glimpse of how Israel and its American allies harness Israel’s influential tech sector and tech diaspora to run cover for the Jewish state as it endures scrutiny over the humanitarian impact of its invasion of Gaza.
- reveal the degree to which, in the tech-oriented hasbara world, the lines between government, the private sector, and the nonprofit world are blurry at best. And the tactics that these wealthy individuals, advocates, and groups use -- hounding Israel critics on social media; firing pro-Palestine employees and canceling speaking engagements; smearing Palestinian journalists; and attempting to ship military-grade equipment to the IDF -- are often heavy-handed and controversial.
- "President Biden seems incapable of using the one policy tool that may actually produce a change in Israel's actions that might limit civilian deaths, which would be to condition military aid that the United States provides to Israel,” Clifton added. He partially attributed the inability of the U.S. government to rein in Israel’s war actions to the “lobbying and advocacy efforts underway.”
Mastodon
AI Causes Real Harm. Let's Focus on That over the End-of-Humanity Hype - Scientific American
- Wrongful arrests, an expanding surveillance dragnet, defamation and deep-fake pornography are all actually existing dangers of so-called “artificial intelligence” tools currently on the market. That, and not the imagined potential to wipe out humanity, is the real threat from artificial intelligence.
- Beneath the hype from many AI firms, their technology already enables routine discrimination in housing, criminal justice and health care, as well as the spread of hate speech and misinformation in non-English languages. Already, algorithmic management programs subject workers to run-of-the-mill wage theft, and these programs are becoming more prevalent.
- The term “AI” is ambiguous, which makes clear discussion more difficult. In one sense, it is the name of a subfield of computer science. In another, it can refer to the computing techniques developed in that subfield, most of which are now focused on pattern matching based on large data sets and the generation of new media based on those patterns. Finally, in marketing copy and start-up pitch decks, the term “AI” serves as magic fairy dust that will supercharge your business.
How the media is covering ChatGPT - Columbia Journalism Review
- Some observers have felt dissatisfied with the media coverage. “Are we in a hype cycle? Absolutely. But is that entirely surprising? No,” said Paris Martineau, a tech reporter at The Information. The structural headwinds buffeting journalism—the collapse of advertising revenue, shrinking editorial budgets, smaller newsrooms, demand for SEO traffic—help explain the “breathless” coverage—and a broader sense of chasing content for web traffic. “The more you look at it, especially from a bird’s eye view, the more it [high levels of low-quality coverage] is a symptom of the state of the modern publishing and news system that we currently live in,” Martineau said, referring to the sense that newsrooms need to be covering every angle, including sensationalist ones, to gain audience attention. In a perfect world, all reporters would have the time and resources to write ethically framed stories on AI that avoid science-fiction framing. But they do not. “It is systemic,” she added.
- One story that seems to have gotten lost is the “incredible consolidation of power and money in the very small set of people who invested in this tool, are building this tool, are set to make a ton of money off of it.” We need to move away from focusing on red herrings like AI’s potential “sentience” to covering how AI is further concentrating wealth and power.
- Sensationalized coverage of generative AI “leads us away from more pressing questions,” Simon of the Oxford Internet Institute said. For instance: the potential future dependence of newsrooms on big tech companies for news production, the governance decisions of these companies, the ethics and bias questions relating to models and training, the climate impact of these tools, and so on. “Ideally, we would want a broader public to be thinking about these things as well,” Simon said, not just the engineers building these tools or the “policy wonks” interested in this space.