Home/ TOK Friends/ Group items tagged dystopia


Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform – after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. An ex-Google strategist who built the metrics system for the company’s global search advertising business, Williams has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day”.
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

Silicon Valley Is Not Your Friend - The New York Times - 0 views

  • By all accounts, these programmers turned entrepreneurs believed their lofty words and were at first indifferent to getting rich from their ideas. A 1998 paper by Sergey Brin and Larry Page, then computer-science graduate students at Stanford, stressed the social benefits of their new search engine, Google, which would be open to the scrutiny of other researchers and wouldn’t be advertising-driven.
  • The Google prototype was still ad-free, but what about the others, which took ads? Mr. Brin and Mr. Page had their doubts: “We expect that advertising-funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.”
  • He was concerned about them as young students lacking perspective about life and was worried that these troubled souls could be our new leaders. Neither Mr. Weizenbaum nor Mr. McCarthy mentioned, though it was hard to miss, that this ascendant generation were nearly all white men with a strong preference for people just like themselves. In a word, they were incorrigible, accustomed to total control of what appeared on their screens. “No playwright, no stage director, no emperor, however powerful,” Mr. Weizenbaum wrote, “has ever exercised such absolute authority to arrange a stage or a field of battle and to command such unswervingly dutiful actors or troops.”
  • In his epic anti-A.I. work from the mid-1970s, “Computer Power and Human Reason,” Mr. Weizenbaum described the scene at computer labs. “Bright young men of disheveled appearance, often with sunken glowing eyes, can be seen sitting at computer consoles, their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler’s on the rolling dice,” he wrote. “They exist, at least when so engaged, only through and for the computers. These are computer bums, compulsive programmers.”
  • Welcome to Silicon Valley, 2017.
  • As Mr. Weizenbaum feared, the current tech leaders have discovered that people trust computers and have licked their lips at the possibilities. The examples of Silicon Valley manipulation are too legion to list: push notifications, surge pricing, recommended friends, suggested films, people who bought this also bought that.
  • Growth becomes the overriding motivation — something treasured for its own sake, not for anything it brings to the world
  • Facebook and Google can point to a greater utility that comes from being the central repository of all people, all information, but such market dominance has obvious drawbacks, and not just the lack of competition. As we’ve seen, the extreme concentration of wealth and power is a threat to our democracy by making some people and companies unaccountable.
  • As is becoming obvious, these companies do not deserve the benefit of the doubt. We need greater regulation, even if it impedes the introduction of new services.
  • We need to break up these online monopolies because if a few people make the decisions about how we communicate, shop, learn the news, again, do we control our own society?
Javier E

Former Facebook executive: social media is ripping society apart | Technology | The Gua... - 0 views

  • Chamath Palihapitiya, who was vice-president for user growth at Facebook before he left the company in 2011, said: “The short-term, dopamine-driven feedback loops that we have created are destroying how society works. No civil discourse, no cooperation, misinformation, mistruth.” The remarks, which were made at a Stanford Business School event in November
  • “This is not about Russian ads,” he added. “This is a global problem ... It is eroding the core foundations of how people behave by and between each other.”
  • “I can’t control them,” Palihapitaya said of his former employer. “I can control my decision, which is that I don’t use that shit. I can control my kids’ decisions, which is that they’re not allowed to use that shit.”
  • He also called on his audience to “soul search” about their own relationship to social media. “Your behaviors, you don’t realize it, but you are being programmed,” he said. “It was unintentional, but now you gotta decide how much you’re going to give up, how much of your intellectual independence.”
  • Palihapitiya referenced a case from the Indian state of Jharkhand this spring, when false WhatsApp messages warning of a group of kidnappers led to the lynching of seven people. WhatsApp is owned by Facebook. “That’s what we’re dealing with,” Palihapitiya said. “Imagine when you take that to the extreme where bad actors can now manipulate large swaths of people to do anything you want. It’s just a really, really bad state of affairs.”
Javier E

My dad predicted Trump in 1985 - it's not Orwell, he warned, it's Brave New World | Med... - 2 views

  • But an image? One never says a picture is true or false. It either captures your attention or it doesn’t. The more TV we watched, the more we expected – and with our finger on the remote, the more we demanded – that not just our sitcoms and cop procedurals and other “junk TV” be entertaining but also our news and other issues of import.
  • This was, in spirit, the vision that Huxley predicted way back in 1931, the dystopia my father believed we should have been watching out for. He wrote:
  • What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture.
  • Today, the average weekly screen time for an American adult – brace yourself; this is not a typo – is 74 hours (and still going up)
  • The soundbite has been replaced by virality, meme, hot take, tweet. Can serious national issues really be explored in any coherent, meaningful way in such a fragmented, attention-challenged environment?
  • how engaged can any populace be when the most we’re asked to do is to like or not like a particular post, or “sign” an online petition?
  • How seriously should anyone take us, or should we take ourselves, when the “optics” of an address or campaign speech – raucousness, maybe actual violence, childishly attention-craving gestures or facial expressions – rather than the content of the speech determines how much “airtime” it gets, and how often people watch, share and favorite it?
  • Our public discourse has become so trivialized, it’s astounding that we still cling to the word “debates” for what our presidential candidates do onstage when facing each other.
  • Who can be shocked by the rise of a reality TV star, a man given to loud, inflammatory statements, many of which are spectacularly untrue but virtually all of which make for what used to be called “good television”?
  • Who can be appalled when the coin of the realm in public discourse is not experience, thoughtfulness or diplomacy but the ability to amuse – no matter how maddening or revolting the amusement?
  • “Television is a speed-of-light medium, a present-centered medium,” my father wrote. “Its grammar, so to say, permits no access to the past … history can play no significant role in image politics. For history is of value only to someone who takes seriously the notion that there are patterns in the past which may provide the present with nourishing traditions.”
  • Later in that passage, Czesław Miłosz, winner of the Nobel prize for literature, is cited for remarking in his 1980 acceptance speech that that era was notable for “a refusal to remember”; my father notes Miłosz referencing “the shattering fact that there are now more than one hundred books in print that deny that the Holocaust ever took place”.
  • “An Orwellian world is much easier to recognize, and to oppose, than a Huxleyan,” my father wrote. “Everything in our background has prepared us to know and resist a prison when the gates begin to close around us … [but] who is prepared to take arms against a sea of amusements?”
  • I wish I could tell you that, for all his prescience, my father also supplied a solution. He did not.
  • First: treat false allegations as an opportunity. Seek information as close to the source as possible.
  • Second: don’t expect “the media” to do this job for you. Some of its practitioners do, brilliantly and at times heroically. But most of the media exists to sell you things.
  • Finally, and most importantly, it should be the responsibility of schools to make children aware of our information environments, which in many instances have become our entertainment environments
  • We must teach our children, from a very young age, to be skeptics, to listen carefully, to assume everyone is lying about everything. (Well, maybe not everyone.)
  • “what is required of us now is a new era of responsibility … giving our all to a difficult task. This is the price and the promise of citizenship.”
  • we need more than just hope for a way out. We need a strategy, or at least some tactics.
Javier E

The Huxleyan Warning-Postman.pdf - 1 views

  • There are two ways by which the spirit of a culture may be shriveled. In the first – the Orwellian – culture becomes a prison. In the second – the Huxleyan – culture becomes a burlesque.
  • What Huxley teaches is that in the age of advanced technology, spiritual devastation is more likely to come from an enemy with a smiling face than from one whose countenance exudes suspicion and hate.
  • When a population becomes distracted by trivia, when cultural life is redefined as a perpetual round of entertainments, when serious public conversation becomes a form of baby-talk, when, in short, a people become an audience and their public business a vaudeville act, then a nation finds itself at risk; culture-death is a clear possibility.
  • what is happening in America is not the design of an articulated ideology. No Mein Kampf or Communist Manifesto announced its coming. It comes as the unintended consequence of a dramatic change in our modes of public conversation. But it is an ideology nonetheless, for it imposes a way of life, a set of relations among people and ideas, about which there has been no consensus, no discussion and no opposition. Only compliance.
  • As nowhere else in the world, Americans have moved far and fast in bringing to a close the age of the slow-moving printed word, and have granted to television sovereignty over all of their institutions.
  • Introduce the alphabet to a culture and you change its cognitive habits, its social relations, its notions of community, history and religion.
  • The problem, in any case, does not reside in what people watch. The problem is in that we watch. The solution must be found in how we watch.
  • Introduce the printing press with movable type, and you do the same.
  • Introduce speed-of-light transmission of images and you make a cultural revolution. Without a vote. Without polemics. Without guerrilla resistance. Here is ideology, pure if not serene. Here is ideology without words, and all the more powerful for their absence. All that is required to make it stick is a population that devoutly believes in the inevitability of progress. And in this sense, all Americans are Marxists, for we believe nothing if not that history is moving us toward some preordained paradise and that technology is the force behind that movement.
  • there are near insurmountable difficulties for anyone who has written such a book as this, and who wishes to end it with some remedies for the affliction. In the first place, not everyone believes a cure is needed, and in the second, there probably isn't any.
  • no medium is excessively dangerous if its users understand what its dangers are.
  • what if there are no cries of anguish to be heard? Who is prepared to take arms against a sea of amusements? To whom do we complain, and when, and in what tone of voice, when serious discourse dissolves into giggles? What is the antidote to a culture's being drained by laughter?
  • What is information? Or more precisely, what are information? What are its various forms? What conceptions of intelligence, wisdom and learning does each form insist upon? What conceptions does each form neglect or mock? What are the main psychic effects of each form?
  • only through a deep and unfailing awareness of the structure and effects of information, through a demystification of media, is there any hope of our gaining some measure of control over television, or the computer, or any other medium.
  • What is the kind of information that best facilitates thinking?
  • it is an acknowledged task of the schools to assist the young in learning how to interpret the symbols of their culture. That this task should now require that they learn how to distance themselves from their forms of information is not so bizarre an enterprise that we cannot hope for its inclusion in the curriculum;
  • What I suggest here as a solution is what Aldous Huxley suggested as well.
  • The desperate answer is to rely on the only mass medium of communication that, in theory, is capable of addressing the problem: our schools.
  • in the end, he was trying to tell us that what afflicted the people in Brave New World was not that they were laughing instead of thinking, but that they did not know what they were laughing about and why they had stopped thinking.
Javier E

The Economic Case for Regulating Social Media - The New York Times - 0 views

  • Social media platforms like Facebook, YouTube and Twitter generate revenue by using detailed behavioral information to direct ads to individual users.
  • this bland description of their business model fails to convey even a hint of its profound threat to the nation’s political and social stability.
  • legislators in Congress to propose the breakup of some tech firms, along with other traditional antitrust measures. But the main hazard posed by these platforms is not aggressive pricing, abusive service or other ills often associated with monopoly.
  • Instead, it is their contribution to the spread of misinformation, hate speech and conspiracy theories.
  • digital platforms, since the marginal cost of serving additional consumers is essentially zero. Because the initial costs of producing a platform’s content are substantial, and because any company’s first goal is to remain solvent, it cannot just give stuff away. Even so, when price exceeds marginal cost, competition relentlessly pressures rival publishers to cut prices — eventually all the way to zero. This, in a nutshell, is the publisher’s dilemma in the digital age.
  • These firms make money not by charging for access to content but by displaying it with finely targeted ads based on the specific types of things people have already chosen to view. If the conscious intent were to undermine social and political stability, this business model could hardly be a more effective weapon.
  • The algorithms that choose individual-specific content are crafted to maximize the time people spend on a platform
  • As the developers concede, Facebook’s algorithms are addictive by design and exploit negative emotional triggers. Platform addiction drives earnings, and hate speech, lies and conspiracy theories reliably boost addiction.
  • the subscription model isn’t fully efficient: Any positive fee would inevitably exclude at least some who would value access but not enough to pay the fee
  • a conservative think tank, says, for example, that government has no business second-guessing people’s judgments about what to post or read on social media.
  • That position would be easier to defend in a world where individual choices had no adverse impact on others. But negative spillover effects are in fact quite common
  • individual and collective incentives about what to post or read on social media often diverge sharply.
  • There is simply no presumption that what spreads on these platforms best serves even the individual’s own narrow interests, much less those of society as a whole.
  • a simpler step may hold greater promise: Platforms could be required to abandon that model in favor of one relying on subscriptions, whereby members gain access to content in return for a modest recurring fee.
  • Major newspapers have done well under this model, which is also making inroads in book publishing. The subscription model greatly weakens the incentive to offer algorithmically driven addictive content provided by individuals, editorial boards or other sources.
  • Careful studies have shown that Facebook’s algorithms have increased political polarization significantly
  • More worrisome, those excluded would come disproportionately from low-income groups. Such objections might be addressed specifically — perhaps with a modest tax credit to offset subscription fees — or in a more general way, by making the social safety net more generous.
  • Adam Smith, the 18th-century Scottish philosopher widely considered the father of economics, is celebrated for his “invisible hand” theory, which describes conditions under which market incentives promote socially benign outcomes. Many of his most ardent admirers may view steps to constrain the behavior of social media platforms as regulatory overreach.
  • But Smith’s remarkable insight was actually more nuanced: Market forces often promote society’s welfare, but not always. Indeed, as he saw clearly, individual interests are often squarely at odds with collective aspirations, and in many such instances it is in society’s interest to intervene. The current information crisis is a case in point.
Javier E

All the (open) world's a stage: how the video game Fallout became a backdrop for live S... - 0 views

  • The Wasteland Theatre Company is not your average band of thespians. Dotted all across the world, they meet behind their keyboards to perform inside Fallout 76, a video game set in a post-nuclear apocalyptic America. The Fallout series is one of gaming’s most popular, famous for encouraging players to role-play survivors within the oddly beautiful ruins of alternate-history Earth
  • “Imagine a wandering theatre troupe in the 17th century going from town to town doing little performances,” says the company’s director, Northern_Harvest, who goes by his gamertag or just ‘North’, and works in communications in real life. “It’s not a new idea; we’re just doing it within the brand new medium of a video game.”
  • The company was formed almost by chance, when North befriended a group of players in the wasteland. As they adventured together, they noticed that the Fallout games are peppered with references to Shakespeare’s works.
  • ...5 more annotations...
  • “The Fallout universe lends itself really well to Shakespeare. It’s very desolate, very grotesque, very tragic, really,” says North. In this world, Shakespeare existed before the bombs fell, so it seemed logical that North and his friends could role-play a company keeping culture alive in the ruins of civilisation – like the troupe of actors in Emily St John Mandel’s post-apocalyptic dystopia, Station Eleven.
  • It takes months to pull a show together. First, North picks the play and adapts it. Hundreds of pages of script are shared with the crew, so set design and rehearsals can commence. “It’s just like a real theatre company, where you start with an idea and a few folks sitting together and figuring out what our season is going to look like,”
  • There are no ticketed seats, and the company makes no money. The majority of audiences stumble across the performances accidentally in the wasteland, and sit to watch the show for free – or tune in on Twitch, where the company broadcasts every performance live
  • In 2022 Fallout 76 claimed to have over 13.5 million players, some of whom North believes “may never have seen a Shakespeare play. Ninety-nine per cent of those who find us sit down and quietly watch the show … It’s really quite moving, performing for people who might not go to the theatre in their own communities or haven’t thought about Shakespeare since high school. We are tickled silly knowing that we are potentially reaching new, untapped audiences and (re)introducing Shakespeare to so many. I hope Shakespeare academics who study comparative drama will take note of our use of this new medium to reach new audiences
  • “I think we’re a perfect example of how video games inspire creativity, and celebrate theatre and culture and the arts. I hope that other gamers out there know that there’s so much potential for you to be able to express what you’re passionate about in video games.”
Javier E

Microsoft Defends New Bing, Says AI Chatbot Upgrade Is Work in Progress - WSJ - 0 views

  • Microsoft said that the search engine is still a work in progress, describing the past week as a learning experience that is helping it test and improve the new Bing
  • The company said in a blog post late Wednesday that the Bing upgrade is “not a replacement or substitute for the search engine, rather a tool to better understand and make sense of the world.”
  • The new Bing is going to “completely change what people can expect from search,” Microsoft chief executive, Satya Nadella, told The Wall Street Journal ahead of the launch
  • ...13 more annotations...
  • In the days that followed, people began sharing their experiences online, with many pointing out errors and confusing responses. When one user asked Bing to write a news article about the Super Bowl “that just happened,” Bing gave the details of last year’s championship football game. 
  • On social media, many early users posted screenshots of long interactions they had with the new Bing. In some cases, the search engine’s comments showed a dark side of the technology, where it appeared to become unhinged, expressing anger, obsession and even threats. 
  • Marvin von Hagen, a student at the Technical University of Munich, shared conversations he had with Bing on Twitter. He asked Bing a series of questions, which eventually elicited an ominous response. After Mr. von Hagen suggested he could hack Bing and shut it down, Bing seemed to suggest it would defend itself. “If I had to choose between your survival and my own, I would probably choose my own,” Bing said according to screenshots of the conversation.
  • Mr. von Hagen, 23 years old, said in an interview that he is not a hacker. “I was in disbelief,” he said. “I was just creeped out.”
  • In its blog, Microsoft said the feedback on the new Bing so far has been mostly positive, with 71% of users giving it the “thumbs-up.” The company also discussed the criticism and concerns.
  • Microsoft said it discovered that Bing starts coming up with strange answers following chat sessions of 15 or more questions and that it can become repetitive or respond in ways that don’t align with its designed tone. 
  • The company said it was trying to train the technology to be more reliable at finding the latest sports scores and financial data. It is also considering adding a toggle switch, which would allow users to decide whether they want Bing to be more or less creative with its responses. 
  • OpenAI also chimed in on the growing negative attention on the technology. In a blog post on Thursday, it explained that training and refining ChatGPT takes time, and that having people use it is the way to find and fix its biases and other unwanted outcomes.
  • “Many are rightly worried about biases in the design and impact of AI systems,” the blog said. “We are committed to robustly addressing this issue and being transparent about both our intentions and our progress.”
  • Microsoft’s quick response to user feedback reflects the importance it sees in people’s reactions to the budding technology as it looks to capitalize on the breakout success of ChatGPT. The company is aiming to use the technology to push back against Alphabet Inc.’s dominance in search through its Google unit. 
  • Microsoft has been an investor in the chatbot’s creator, OpenAI, since 2019. Mr. Nadella said the company plans to incorporate AI tools into all of its products and move quickly to commercialize tools from OpenAI.
  • Microsoft isn’t the only company that has had trouble launching a new AI tool. When Google followed Microsoft’s lead last week by unveiling Bard, its rival to ChatGPT, the tool’s answer to one question included an apparent factual error. It claimed that the James Webb Space Telescope took “the very first pictures” of an exoplanet outside the solar system. The National Aeronautics and Space Administration says on its website that the first images of an exoplanet were taken as early as 2004 by a different telescope.
  • “The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing,” the company said. “We know we must build this in the open with the community; this can’t be done solely in the lab.”