QN2019 / Group items tagged controle

Aurialie Jublin

It's Time to Break Up Facebook - The New York Times - 0 views

  • Mark’s influence is staggering, far beyond that of anyone else in the private sector or in government. He controls three core communications platforms — Facebook, Instagram and WhatsApp — that billions of people use every day. Facebook’s board works more like an advisory committee than an overseer, because Mark controls around 60 percent of voting shares. Mark alone can decide how to configure Facebook’s algorithms to determine what people see in their News Feeds, what privacy settings they can use and even which messages get delivered. He sets the rules for how to distinguish violent and incendiary speech from the merely offensive, and he can choose to shut down a competitor by acquiring, blocking or copying it.
  • Mark is a good, kind person. But I’m angry that his focus on growth led him to sacrifice security and civility for clicks. I’m disappointed in myself and the early Facebook team for not thinking more about how the News Feed algorithm could change our culture, influence elections and empower nationalist leaders. And I’m worried that Mark has surrounded himself with a team that reinforces his beliefs instead of challenging them.
  • We are a nation with a tradition of reining in monopolies, no matter how well intentioned the leaders of these companies may be. Mark’s power is unprecedented and un-American. It is time to break up Facebook.
  • ...26 more annotations...
  • We already have the tools we need to check the domination of Facebook. We just seem to have forgotten about them. America was built on the idea that power should not be concentrated in any one person, because we are all fallible. That’s why the founders created a system of checks and balances. They didn’t need to foresee the rise of Facebook to understand the threat that gargantuan companies would pose to democracy. Jefferson and Madison were voracious readers of Adam Smith, who believed that monopolies prevent the competition that spurs innovation and leads to economic growth.
  • The Sherman Antitrust Act of 1890 outlawed monopolies. More legislation followed in the 20th century, creating legal and regulatory structures to promote competition and hold the biggest companies accountable. The Department of Justice broke up monopolies like Standard Oil and AT&T.
  • For many people today, it’s hard to imagine government doing much of anything right, let alone breaking up a company like Facebook. This isn’t by coincidence. Starting in the 1970s, a small but dedicated group of economists, lawyers and policymakers sowed the seeds of our cynicism. Over the next 40 years, they financed a network of think tanks, journals, social clubs, academic centers and media outlets to teach an emerging generation that private interests should take precedence over public ones. Their gospel was simple: “Free” markets are dynamic and productive, while government is bureaucratic and ineffective. By the mid-1980s, they had largely managed to relegate energetic antitrust enforcement to the history books.
  • It was this drive to compete that led Mark to acquire, over the years, dozens of other companies, including Instagram and WhatsApp in 2012 and 2014. There was nothing unethical or suspicious, in my view, in these moves.
  • Over a decade later, Facebook has earned the prize of domination. It is worth half a trillion dollars and commands, by my estimate, more than 80 percent of the world’s social networking revenue. It is a powerful monopoly, eclipsing all of its rivals and erasing competition from the social networking category. This explains why, even during the annus horribilis of 2018, Facebook’s earnings per share increased by an astounding 40 percent compared with the year before. (I liquidated my Facebook shares in 2012, and I don’t invest directly in any social media companies.)
  • Facebook’s dominance is not an accident of history. The company’s strategy was to beat every competitor in plain view, and regulators and the government tacitly — and at times explicitly — approved. In one of the government’s few attempts to rein in the company, the F.T.C. in 2011 issued a consent decree that Facebook not share any private information beyond what users already agreed to. Facebook largely ignored the decree. Last month, the day after the company predicted in an earnings call that it would need to pay up to $5 billion as a penalty for its negligence — a slap on the wrist — Facebook’s shares surged 7 percent, adding $30 billion to its value, six times the size of the fine.
  • As markets become more concentrated, the number of new start-up businesses declines. This holds true in other high-tech areas dominated by single companies, like search (controlled by Google) and e-commerce (taken over by Amazon). Meanwhile, there has been plenty of innovation in areas where there is no monopolistic domination, such as in workplace productivity (Slack, Trello, Asana), urban transportation (Lyft, Uber, Lime, Bird) and cryptocurrency exchanges (Ripple, Coinbase, Circle).
  • Facebook’s business model is built on capturing as much of our attention as possible to encourage people to create and share more information about who they are and who they want to be. We pay for Facebook with our data and our attention, and by either measure it doesn’t come cheap.
  • The most problematic aspect of Facebook’s power is Mark’s unilateral control over speech. There is no precedent for his ability to monitor, organize and even censor the conversations of two billion people. Facebook engineers write algorithms that select which users’ comments or experiences end up displayed in the News Feeds of friends and family. These rules are proprietary and so complex that many Facebook employees themselves don’t understand them.
  • Facebook has responded to many of the criticisms of how it manages speech by hiring thousands of contractors to enforce the rules that Mark and senior executives develop. After a few weeks of training, these contractors decide which videos count as hate speech or free speech, which images are erotic and which are simply artistic, and which live streams are too violent to be broadcast. (The Verge reported that some of these moderators, working through a vendor in Arizona, were paid $28,800 a year, got limited breaks and faced significant mental health risks.)
  • As if Facebook’s opaque algorithms weren’t enough, last year we learned that Facebook executives had permanently deleted their own messages from the platform, erasing them from the inboxes of recipients; the justification was corporate security concerns. When I look at my years of Facebook messages with Mark now, it’s just a long stream of my own light-blue comments, clearly written in response to words he had once sent me. (Facebook now offers this as a feature to all users.)
  • Mark used to insist that Facebook was just a “social utility,” a neutral platform for people to communicate what they wished. Now he recognizes that Facebook is both a platform and a publisher and that it is inevitably making decisions about values. The company’s own lawyers have argued in court that Facebook is a publisher and thus entitled to First Amendment protection.
  • Mark knows that this is too much power and is pursuing a twofold strategy to mitigate it. He is pivoting Facebook’s focus toward encouraging more private, encrypted messaging that Facebook’s employees can’t see, let alone control. Second, he is hoping for friendly oversight from regulators and other industry executives.
  • In an op-ed essay in The Washington Post in March, he wrote, “Lawmakers often tell me we have too much power over speech, and I agree.” And he went even further than before, calling for more government regulation — not just on speech, but also on privacy and interoperability, the ability of consumers to seamlessly leave one network and transfer their profiles, friend connections, photos and other data to another.
  • Facebook isn’t afraid of a few more rules. It’s afraid of an antitrust case and of the kind of accountability that real government oversight would bring.
  • Mark may never have a boss, but he needs to have some check on his power. The American government needs to do two things: break up Facebook’s monopoly and regulate the company to make it more accountable to the American people. First, Facebook should be separated into multiple companies. The F.T.C., in conjunction with the Justice Department, should enforce antitrust laws by undoing the Instagram and WhatsApp acquisitions and banning future acquisitions for several years. The F.T.C. should have blocked these mergers, but it’s not too late to act. There is precedent for correcting bad decisions — as recently as 2009, Whole Foods settled antitrust complaints by selling off the Wild Oats brand and stores that it had bought a few years earlier.
  • Still others worry that the breakup of Facebook or other American tech companies could be a national security problem. Because advancements in artificial intelligence require immense amounts of data and computing power, only large companies like Facebook, Google and Amazon can afford these investments, they say. If American companies become smaller, the Chinese will outpace us. While serious, these concerns do not justify inaction. Even after a breakup, Facebook would be a hugely profitable business with billions to invest in new technologies — and a more competitive market would only encourage those investments. If the Chinese did pull ahead, our government could invest in research and development and pursue tactical trade policy, just as it is doing today to hold China’s 5G technology at bay.
  • The cost of breaking up Facebook would be next to zero for the government, and lots of people stand to gain economically. A ban on short-term acquisitions would ensure that competitors, and the investors who take a bet on them, would have the space to flourish. Digital advertisers would suddenly have multiple companies vying for their dollars.
  • But the biggest winners would be the American people. Imagine a competitive market in which they could choose among one network that offered higher privacy standards, another that cost a fee to join but had little advertising and another that would allow users to customize and tweak their feeds as they saw fit. No one knows exactly what Facebook’s competitors would offer to differentiate themselves. That’s exactly the point.
  • Just breaking up Facebook is not enough. We need a new agency, empowered by Congress to regulate tech companies. Its first mandate should be to protect privacy. The Europeans have made headway on privacy with the General Data Protection Regulation, a law that guarantees users a minimal level of protection. A landmark privacy bill in the United States should specify exactly what control Americans have over their digital information, require clearer disclosure to users and provide enough flexibility to the agency to exercise effective oversight over time. The agency should also be charged with guaranteeing basic interoperability across platforms.
  • Finally, the agency should create guidelines for acceptable speech on social media. This idea may seem un-American — we would never stand for a government agency censoring speech. But we already have limits on yelling “fire” in a crowded theater, child pornography, speech intended to provoke violence and false statements to manipulate stock prices. We will have to create similar standards that tech companies can use. These standards should of course be subject to the review of the courts, just as any other limits on speech are. But there is no constitutional right to harass others or live-stream violence.
  • These are difficult challenges. I worry that government regulators will not be able to keep up with the pace of digital innovation. I worry that more competition in social networking might lead to a conservative Facebook and a liberal one, or that newer social networks might be less secure if government regulation is weak. But sticking with the status quo would be worse: If we don’t have public servants shaping these policies, corporations will.
  • Similarly, the Justice Department’s 1970s suit accusing IBM of illegally maintaining its monopoly on personal computer sales ended in a stalemate. But along the way, IBM changed many of its behaviors. It stopped bundling its hardware and software, chose an extremely open design for the operating system in its personal computers and did not exercise undue control over its suppliers. Professor Wu has written that this “policeman at the elbow” led IBM to steer clear “of anything close to anticompetitive conduct, for fear of adding to the case against it.”
  • Finally, an aggressive case against Facebook would persuade other behemoths like Google and Amazon to think twice about stifling competition in their own sectors, out of fear that they could be next. If the government were to use this moment to resurrect an effective competition standard that takes a broader view of the full cost of “free” products, it could affect a whole host of industries.
  • I take responsibility for not sounding the alarm earlier. Don Graham, a former Facebook board member, has accused those who criticize the company now of having “all the courage of the last man leaping on the pile at a football game.” The financial rewards I reaped from working at Facebook radically changed the trajectory of my life, and even after I cashed out, I watched in awe as the company grew. It took the 2016 election fallout and Cambridge Analytica to awaken me to the dangers of Facebook’s monopoly. But anyone suggesting that Facebook is akin to a pinned football player misrepresents its resilience and power.
  • This movement of public servants, scholars and activists deserves our support. Mark Zuckerberg cannot fix Facebook, but our government can.
  •  
    "Since then, Mark's personal reputation and the reputation of Facebook have taken a nose-dive. The company's mistakes - the sloppy privacy practices that dropped tens of millions of users' data into a political consulting firm's lap; the slow response to Russian agents, violent rhetoric and fake news; and the unbounded drive to capture ever more of our time and attention - dominate the headlines. It's been 15 years since I co-founded Facebook at Harvard, and I haven't worked at the company in a decade. But I feel a sense of anger and responsibility."
Aurialie Jublin

"I Was Devastated": Tim Berners-Lee, the Man Who Created the World Wide Web, Has Some R... - 2 views

  • Initially, Berners-Lee’s innovation was intended to help scientists share data across a then obscure platform called the Internet, a version of which the U.S. government had been using since the 1960s. But owing to his decision to release the source code for free—to make the Web an open and democratic platform for all—his brainchild quickly took on a life of its own.
  • He also envisioned that his invention could, in the wrong hands, become a destroyer of worlds, as Robert Oppenheimer once infamously observed of his own creation. His prophecy came to life, most recently, when revelations emerged that Russian hackers interfered with the 2016 presidential election, or when Facebook admitted it exposed data on more than 80 million users to a political research firm, Cambridge Analytica, which worked for Donald Trump’s campaign. This episode was the latest in an increasingly chilling narrative. In 2012, Facebook conducted secret psychological experiments on nearly 700,000 users. Both Google and Amazon have filed patent applications for devices designed to listen for mood shifts and emotions in the human voice.
  • This agony, however, has had a profound effect on Berners-Lee. He is now embarking on a third act—determined to fight back through both his celebrity status and, notably, his skill as a coder. In particular, Berners-Lee has, for some time, been working on a new platform, Solid, to reclaim the Web from corporations and return it to its democratic roots.
  • ...10 more annotations...
  • What made the Web powerful, and ultimately dominant, however, would also one day prove to be its greatest vulnerability: Berners-Lee gave it away for free; anyone with a computer and an Internet connection could not only access it but also build off it. Berners-Lee understood that the Web needed to be unfettered by patents, fees, royalties, or any other controls in order to thrive. This way, millions of innovators could design their own products to take advantage of it.
  • “Tim and Vint made the system so that there could be many players that didn’t have an advantage over each other.” Berners-Lee, too, remembers the quixotism of the era. “The spirit there was very decentralized. The individual was incredibly empowered. It was all based on there being no central authority that you had to go to to ask permission,” he said. “That feeling of individual control, that empowerment, is something we’ve lost.”
  • The power of the Web wasn’t taken or stolen. We, collectively, by the billions, gave it away with every signed user agreement and intimate moment shared with technology. Facebook, Google, and Amazon now monopolize almost everything that happens online, from what we buy to the news we read to who we like. Along with a handful of powerful government agencies, they are able to monitor, manipulate, and spy in once unimaginable ways.
  • The idea is simple: re-decentralize the Web. Working with a small team of developers, he spends most of his time now on Solid, a platform designed to give individuals, rather than corporations, control of their own data. “There are people working in the lab trying to imagine how the Web could be different. How society on the Web could look different. What could happen if we give people privacy and we give people control of their data,” Berners-Lee told me. “We are building a whole eco-system.”
  • For now, the Solid technology is still new and not ready for the masses. But the vision, if it works, could radically change the existing power dynamics of the Web. The system aims to give users a platform by which they can control access to the data and content they generate on the Web. This way, users can choose how that data gets used rather than, say, Facebook and Google doing with it as they please. Solid’s code and technology is open to all—anyone with access to the Internet can come into its chat room and start coding.
  • It’s still the early days for Solid, but Berners-Lee is moving fast. Those who work closely with him say he has thrown himself into the project with the same vigor and determination he employed upon the Web’s inception. Popular sentiment also appears to facilitate his time frame. In India, a group of activists successfully blocked Facebook from implementing a new service that would have effectively controlled access to the Web for huge swaths of the country’s population. In Germany, one young coder built a decentralized version of Twitter called Mastodon. In France, another group created Peertube as a decentralized alternative to YouTube.
  • Berners-Lee is not the leader of this revolution—by definition, the decentralized Web shouldn’t have one—but he is a powerful weapon in the fight. And he fully recognizes that re-decentralizing the Web is going to be a lot harder than inventing it was in the first place.
  • But laws written now don’t anticipate future technologies. Nor do lawmakers—many badgered by corporate lobbyists—always choose to protect individual rights. In December, lobbyists for telecom companies pushed the Federal Communications Commission to roll back net-neutrality rules, which protect equal access to the Internet. In January, the U.S. Senate voted to advance a bill that would allow the National Security Agency to continue its mass online-surveillance program. Google’s lobbyists are now working to modify rules on how companies can gather and store biometric data, such as fingerprints, iris scans, and facial-recognition images.
  • It’s hard to believe that anyone—even Zuckerberg—wants the 1984 version. He didn’t found Facebook to manipulate elections; Jack Dorsey and the other Twitter founders didn’t intend to give Donald Trump a digital bullhorn. And this is what makes Berners-Lee believe that this battle over our digital future can be won.
  • When asked what ordinary people can do, Berners-Lee replied, “You don’t have to have any coding skills. You just have to have a heart to decide enough is enough. Get out your Magic Marker and your signboard and your broomstick. And go out on the streets.” In other words, it’s time to rise against the machines.
  •  
    "We demonstrated that the Web had failed instead of served humanity, as it was supposed to have done, and failed in many places," he told me. The increasing centralization of the Web, he says, has "ended up producing-with no deliberate action of the people who designed the platform-a large-scale emergent phenomenon which is anti-human." "While the problems facing the web are complex and large, I think we should see them as bugs: problems with existing code and software systems that have been created by people-and can be fixed by people." Tim Berners-Lee
  •  
    "Berners-Lee has seen his creation debased by everything from fake news to mass surveillance. But he's got a plan to fix it."
Aurialie Jublin

Can Worker Co-ops Make the Tech Sector More Equitable? | The Nation - 0 views

  • Fed up with this heartless model, some tech activists are developing online workplaces that operate as worker-driven communities. Daemo, a pilot program incubated at Stanford University’s Crowd Research Collective, is one such worker-driven crowd-labor platform. Since 2015, Daemo’s developers have been building on MTurk’s interface with a communications system aimed at allowing for more equitable “matching” between work requesters and digital taskers. As a non-hierarchical, nonprofit framework where workers control the operations, Daemo is designed for fairer working conditions, with a minimum wage of $10 an hour, which is a major improvement on MTurk’s precarious labor-outsourcing system.
  • Some former participants in Daemo’s project recently aired sharp criticism of the platform in response to a largely favorable article in Wired. In a collectively authored article on Medium, they argued that, in their practical experience with the platform, decision-making power rests with a “platform team” of researchers and leading developers. Though Daemo has established a Constitution that theoretically is open to amendments and revision based on workers’ input, critics say the day-to-day management remains tightly controlled by researchers.
  • “Whenever they talk about the decentralization, they talk about technical decentralization, like block-chain or decentralized platforms, but most of the time they overlook the governance level, which is more important,” Hashim says. “So it’s about who takes the positions, it’s about who has the right to access information. If you don’t have a well-informed society, you don’t have democracy.”
  • ...4 more annotations...
  • Kristy Milland, an activist with the MTurk advocacy network We Are Dynamo, says she’s given up collaborating with Daemo because “There hasn’t been any deep, involved worker input…. It’s built by academics with the bias they bring to such a platform that they expect will provide them with free data to publish down the road. Just like Amazon built MTurk with their needs in mind, even if many of the roadblocks this caused may have been unintentional.”
  • The “platform cooperativism” concept, as articulated by technologist Trebor Scholz and other academics, is that worker control can be integrated by working with the democratic aspects of the online sphere: entrepreneurial horizontalism and a pluralistic culture of innovation. But with online workspaces proliferating at breakneck speed, it’s a race to see whether these more principled worker-led models will ever be able to compete for market share with the app-based workforce of MTurk. Similarly, small-scale cab-service cooperatives are emerging in the United States, but Uber and Lyft’s mega brands are displacing cabbies by the minute.
  • The problem with crowd labor isn’t that it’s big, or complex; it’s that workers can’t control their means of technological production. According to Joshua Danielson of the Bay Area start-up cooperative Loconomics, Daemo’s model “has the potential to provide an alternative to Amazon Turk,” if the platform combines a good product and good jobs for the producers. The key, he says via e-mail, is “creating a cooperative business model that can be self-sufficient and be able to attract clients. The latter is the more challenging one given the deep pockets of the current players. That said, it’s important to remember that workers are the product, not the platform, and they hold an immense amount of power if they can organize.”
  • The digital frontier offers endless room both for exploitation and for social transformation. But if workers can get ahead of corporations in harnessing the potential of open-source technology, they can disrupt the incumbent Silicon Valley oligarchs from below. So far, technology hasn’t emancipated labor nearly as rapidly as it has liberalized markets. Cooperative thinking can make technological power part of the solution, but only if it’s matched with people power.
  •  
    "The crowdwork sector is dominated by low-paid gigs-can communally run companies make these jobs sustainable?"
Aurialie Jublin

Privacy expert Ann Cavoukian resigns as adviser to Sidewalk Labs - The Logic - 0 views

  •  
    Ann Cavoukian, a world-leading privacy expert, has resigned as an adviser to Sidewalk Labs on its proposed Toronto smart city development. Cavoukian sent a letter advising the company of her resignation Friday. In the letter, she expressed concerns regarding Sidewalk Labs' recent digital governance proposals, specifically the possibility that not all personal data would be de-identified at the source - a concern she said she raised with Sidewalk Labs early last month. Sidewalk Labs told The Logic it is committed to de-identifying data, but that it can't control what third parties do.
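The sticking point in Cavoukian's resignation, de-identification "at the source," is a specific technical demand: identifiers are stripped and quasi-identifiers coarsened on the collecting device itself, so the record that leaves the sensor can no longer single anyone out. The sketch below is a minimal illustration under that reading; all field names and granularities are hypothetical, and real de-identification also has to weigh re-identification risk across many records.

```python
from datetime import datetime

def deidentify_at_source(event: dict) -> dict:
    """Reduce a raw sensor event to a coarse, identifier-free record
    before it is transmitted or stored anywhere off the device."""
    return {
        # Direct identifiers (device id, MAC address, license plate) are dropped
        # entirely; only what the planning use case needs is kept, coarsely:
        "area": (round(event["lat"], 2), round(event["lon"], 2)),  # roughly a 1 km grid cell
        "hour": event["timestamp"].replace(minute=0, second=0, microsecond=0),
        "mode": event["travel_mode"],  # e.g. "pedestrian", "cyclist", "car"
    }

# Usage: the raw event never leaves the device; only the reduced record does.
raw = {"device_id": "ab:cd:ef:01:23:45", "lat": 43.6453, "lon": -79.3806,
       "timestamp": datetime(2019, 6, 1, 14, 37, 12), "travel_mode": "pedestrian"}
print(deidentify_at_source(raw))
```

Cavoukian's concern, as reported, was precisely that this reduction might happen later, or not at all, once third parties hold the raw data.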
Aurialie Jublin

14 years of Mark Zuckerberg saying sorry, not sorry about Facebook - Washington Post - 0 views

  •  
    "From the moment the Facebook founder entered the public eye in 2003 for creating a Harvard student hot-or-not rating site, he's been apologizing. So we collected this abbreviated history of his public mea culpas. It reads like a record on repeat. Zuckerberg, who made "move fast and break things" his slogan, says sorry for being naive, and then promises solutions such as privacy "controls," "transparency" and better policy "enforcement." And then he promises it again the next time. You can track his sorries in orange and promises in blue in the timeline below. All the while, Facebook's access to our personal data increases and little changes about the way Zuckerberg handles it. So as Zuckerberg prepares to apologize for the first time in front of Congress, the question that lingers is: What will be different this time?"
Aurialie Jublin

Jaron Lanier: How we need to remake the internet | TED Talk - 0 views

  •  
    "In the early days of digital culture, Jaron Lanier helped craft a vision for the internet as public commons where humanity could share its knowledge -- but even then, this vision was haunted by the dark side of how it could turn out: with personal devices that control our lives, monitor our data and feed us stimuli. (Sound familiar?) In this visionary talk, Lanier reflects on a "globally tragic, astoundingly ridiculous mistake" companies like Google and Facebook made at the foundation of digital culture -- and how we can undo it. "We cannot have a society in which, if two people wish to communicate, the only way that can happen is if it's financed by a third person who wishes to manipulate them," he says."
Aurialie Jublin

The Very First Oakland Co-op DiscoTech - Danny Spitzberg - Medium - 0 views

  • Technology isn’t necessary for bars or farms to become better co-ops, but it can help. Two coalitions that embrace co-design — civic tech and online organizing — can offer lessons on how to build better tech. At the same time, co-op theory and history offer a model of how to own, control, and share the value generated by the tech we build.
  • While each area has its emphasis, each can learn from the other, too: Civic tech is a coalition for better citizenship, trying to achieve citizen engagement. An example is southbendvoices.com, an automated call-in system. Yet tweeting the city to shut off sprinklers after the rain is a far cry from building better neighborhoods. What could it add? Economic solidarity. Online organizing is a digital approach to social change, trying to achieve community power. An example is 18millionrising.org, a group running rapid-response campaigns for racial justice. What’s missing? Platforms that support lasting effort with multiple allies. The co-op movement is about democracy in the workplace, trying to achieve real ownership, control, and value for the people doing the labor. An example is the Arizmendi Association, an umbrella group supporting six worker-owned bakeries. It’s a model that only the Enspiral network has replicated in New Zealand. What potential remains untapped here? Widespread relevance. How might all three of these areas become better, together?
  • There are co-ops, and then there is cooperation. Shane from CCA asked “what counts as a co-op?” and Willow Brugh from Aspiration Tech described a multi-stakeholder project in Africa that supports self-determining small businesses. I mentioned how Enspiral exemplifies the first co-op principle of open and voluntary membership better than most legally-recognized co-ops with a quarterly auto-email to their 30 member organizations that simply asks, “Do you still feel like a member?”
  • ...2 more annotations...
  • There are parallel worlds we can learn from, if we take care not to reproduce extractive practices. Developer and counselor Kermit Goodwin suggested that the open source community might be a place to learn more, and Noah Thorp of CoMakery cautioned that while developers might play better, the open source software economy is “broken” and dominated by corporate interests — most of the people making a livelihood through open source software do so through extractive enterprises (think, Microsoft).
  • And then there is the agitation and education that leads to organizing. After Evangeline asked “Why do people stop trying?” and how we can make co-ops familiar to more people, Molly McLeod brought up relatively passive directories like cultivate.coop and showed us Co-opoly, a boardgame about starting worker-owned businesses and having all of the poignant conversations that go along with it. Jackie Mahendra from CEL said her first serious role was working with a co-op house, and then others agreed co-ops can stay relevant if they provide services more widely — housing, education, health care, consumer finance, and more. Building viable co-op platforms is exactly what the creators of Platform Cooperativism are organizing around.
  •  
    "Evangeline asked why people get involved in co-ops, and then drop it. "Why do people stop trying?" For half of the 30 people at The Very First Oakland Co-op DiscoTech, it was a tough question - they had little exposure to co-ops in the first place."
Aurialie Jublin

Redecentralize.org - 0 views

  • Motivations for decentralizing the internet vary. These three keep coming up: Resilient - whether there’s a hurricane, a severed trans-atlantic cable or just a train line with a poor connection, we’d like to be able to carry on making phone calls and sharing documents. Private - as a company keeping our commercial secrets, or as an individual concerned about criminals and overreaching governments, we'd like our most personal messages not to be held in distant data centres. Competitive - we're often forced to use one dominant provider who restricts competition; we'd like to build communities around new protocols, driving innovation of new services on top.
  •  
    "The original Internet was decentralized. Anyone could set up parts of it. That's why it won. For various reasons, control of our information technologies is increasingly falling into a few hands. Some big companies and Governments. We want it to become decentralized. Again."
Aurialie Jublin

An Apology for the Internet - From the People Who Built It - 1 views

  • There have always been outsiders who criticized the tech industry — even if their concerns have been drowned out by the oohs and aahs of consumers, investors, and journalists. But today, the most dire warnings are coming from the heart of Silicon Valley itself. The man who oversaw the creation of the original iPhone believes the device he helped build is too addictive. The inventor of the World Wide Web fears his creation is being “weaponized.” Even Sean Parker, Facebook’s first president, has blasted social media as a dangerous form of psychological manipulation. “God only knows what it’s doing to our children’s brains,” he lamented recently.
  • To keep the internet free — while becoming richer, faster, than anyone in history — the technological elite needed something to attract billions of users to the ads they were selling. And that something, it turns out, was outrage. As Jaron Lanier, a pioneer in virtual reality, points out, anger is the emotion most effective at driving “engagement” — which also makes it, in a market for attention, the most profitable one. By creating a self-perpetuating loop of shock and recrimination, social media further polarized what had already seemed, during the Obama years, an impossibly and irredeemably polarized country.
  • The Architects (In order of appearance.) Jaron Lanier, virtual-reality pioneer. Founded first company to sell VR goggles; worked at Atari and Microsoft. Antonio García Martínez, ad-tech entrepreneur. Helped create Facebook’s ad machine. Ellen Pao, former CEO of Reddit. Filed major gender-discrimination lawsuit against VC firm Kleiner Perkins. Can Duruk, programmer and tech writer. Served as project lead at Uber. Kate Losse, Facebook employee No. 51. Served as Mark Zuckerberg’s speechwriter. Tristan Harris, product designer. Wrote internal Google presentation about addictive and unethical design. Rich “Lowtax” Kyanka, entrepreneur who founded influential message board Something Awful. Ethan Zuckerman, MIT media scholar. Invented the pop-up ad. Dan McComas, former product chief at Reddit. Founded community-based platform Imzy. Sandy Parakilas, product manager at Uber. Ran privacy compliance for Facebook apps. Guillaume Chaslot, AI researcher. Helped develop YouTube’s algorithmic recommendation system. Roger McNamee, VC investor. Introduced Mark Zuckerberg to Sheryl Sandberg. Richard Stallman, MIT programmer. Created legendary software GNU and Emacs.
  • ...45 more annotations...
  • How It Went Wrong, in 15 Steps. Step 1 Start With Hippie Good Intentions …
  • I think two things are at the root of the present crisis. One was the idealistic view of the internet — the idea that this is the great place to share information and connect with like-minded people. The second part was the people who started these companies were very homogeneous. You had one set of experiences, one set of views, that drove all of the platforms on the internet. So the combination of this belief that the internet was a bright, positive place and the very similar people who all shared that view ended up creating platforms that were designed and oriented around free speech.
  • Step 2 … Then mix in capitalism on steroids. To transform the world, you first need to take it over. The planetary scale and power envisioned by Silicon Valley’s early hippies turned out to be as well suited for making money as they were for saving the world.
  • Step 3 The arrival of Wall Streeters didn’t help … Just as Facebook became the first overnight social-media success, the stock market crashed, sending money-minded investors westward toward the tech industry. Before long, a handful of companies had created a virtual monopoly on digital life.
  • Ethan Zuckerman: Over the last decade, the social-media platforms have been working to make the web almost irrelevant. Facebook would, in many ways, prefer that we didn’t have the internet. They’d prefer that we had Facebook.
  • Step 4 … And we paid a high price for keeping it free. To avoid charging for the internet — while becoming fabulously rich at the same time — Silicon Valley turned to digital advertising. But to sell ads that target individual users, you need to grow a big audience — and use advancing technology to gather reams of personal data that will enable you to reach them efficiently.
  • Harris: If you’re YouTube, you want people to register as many accounts as possible, uploading as many videos as possible, driving as many views to those videos as possible, so you can generate lots of activity that you can sell to advertisers. So whether or not the users are real human beings or Russian bots, whether or not the videos are real or conspiracy theories or disturbing content aimed at kids, you don’t really care. You’re just trying to drive engagement to the stuff and maximize all that activity. So everything stems from this engagement-based business model that incentivizes the most mindless things that harm the fabric of society.
  • Step 5 Everything was designed to be really, really addictive. The social-media giants became “attention merchants,” bent on hooking users no matter the consequences. “Engagement” was the euphemism for the metric, but in practice it evolved into an unprecedented machine for behavior modification.
  • Harris: That blue Facebook icon on your home screen is really good at creating unconscious habits that people have a hard time extinguishing. People don’t see the way that their minds are being manipulated by addiction. Facebook has become the largest civilization-scale mind-control machine that the world has ever seen.
  • Step 6 At first, it worked — almost too well. None of the companies hid their plans or lied about how their money was made. But as users became deeply enmeshed in the increasingly addictive web of surveillance, the leading digital platforms became wildly popular.
  • Pao: There’s this idea that, “Yes, they can use this information to manipulate other people, but I’m not gonna fall for that, so I’m protected from being manipulated.” Slowly, over time, you become addicted to the interactions, so it’s hard to opt out. And they just keep taking more and more of your time and pushing more and more fake news. It becomes easy just to go about your life and assume that things are being taken care of.
  • McNamee: If you go back to the early days of propaganda theory, Edward Bernays had a hypothesis that to implant an idea and make it universally acceptable, you needed to have the same message appearing in every medium all the time for a really long period of time. The notion was it could only be done by a government. Then Facebook came along, and it had this ability to personalize for every single user. Instead of being a broadcast model, it was now 2.2 billion individualized channels. It was the most effective product ever created to revolve around human emotions.
  • Step 7 No one from Silicon Valley was held accountable … No one in the government — or, for that matter, in the tech industry’s user base — seemed interested in bringing such a wealthy, dynamic sector to heel.
  • Step 8 … Even as social networks became dangerous and toxic. With companies scaling at unprecedented rates, user security took a backseat to growth and engagement. Resources went to selling ads, not protecting users from abuse.
  • Lanier: Every time there’s some movement like Black Lives Matter or #MeToo, you have this initial period where people feel like they’re on this magic-carpet ride. Social media is letting them reach people and organize faster than ever before. They’re thinking, Wow, Facebook and Twitter are these wonderful tools of democracy. But it turns out that the same data that creates a positive, constructive process like the Arab Spring can be used to irritate other groups. So every time you have a Black Lives Matter, social media responds by empowering neo-Nazis and racists in a way that hasn’t been seen in generations. The original good intention winds up empowering its opposite.
  • Chaslot: As an engineer at Google, I would see something weird and propose a solution to management. But just noticing the problem was hurting the business model. So they would say, “Okay, but is it really a problem?” They trust the structure. For instance, I saw this conspiracy theory that was spreading. It’s really large — I think the algorithm may have gone crazy. But I was told, “Don’t worry — we have the best people working on it. It should be fine.” Then they conclude that people are just stupid. They don’t want to believe that the problem might be due to the algorithm.
  • Parakilas: One time a developer who had access to Facebook’s data was accused of creating profiles of people without their consent, including children. But when we heard about it, we had no way of proving whether it had actually happened, because we had no visibility into the data once it left Facebook’s servers. So Facebook had policies against things like this, but it gave us no ability to see what developers were actually doing.
  • McComas: Ultimately the problem Reddit has is the same as Twitter: By focusing on growth and growth only, and ignoring the problems, they amassed a large set of cultural norms on their platforms that stem from harassment or abuse or bad behavior. They have worked themselves into a position where they’re completely defensive and they can just never catch up on the problem. I don’t see any way it’s going to improve. The best they can do is figure out how to hide the bad behavior from the average user.
  • Step 9 … And even as they invaded our privacy. The more features Facebook and other platforms added, the more data users willingly, if unwittingly, released to them and the data brokers who power digital advertising.
  • Richard Stallman: What is data privacy? That means that if a company collects data about you, it should somehow protect that data. But I don’t think that’s the issue. The problem is that these companies are collecting data about you, period. We shouldn’t let them do that. The data that is collected will be abused. That’s not an absolute certainty, but it’s a practical extreme likelihood, which is enough to make collection a problem.
  • Losse: I’m not surprised at what’s going on now with Cambridge Analytica and the scandal over the election. For a long time, the accepted idea at Facebook was: Giving developers as much data as possible to make these products is good. But to think that, you also have to not think about the data implications for users. That’s just not your priority.
  • Step 10 Then came 2016. The election of Donald Trump and the triumph of Brexit, two campaigns powered in large part by social media, demonstrated to tech insiders that connecting the world — at least via an advertising-surveillance scheme — doesn’t necessarily lead to that hippie utopia.
  • Chaslot: I realized personally that things were going wrong in 2011, when I was working at Google. I was working on this YouTube recommendation algorithm, and I realized that the algorithm was always giving you the same type of content. For instance, if I give you a video of a cat and you watch it, the algorithm thinks, Oh, he must really like cats. That creates these filter bubbles where people just see one type of information. But when I notified my managers at Google and proposed a solution that would give a user more control so he could get out of the filter bubble, they realized that this type of algorithm would not be very beneficial for watch time. They didn’t want to push that, because the entire business model is based on watch time.
  • Step 11 Employees are starting to revolt. Tech-industry executives aren’t likely to bite the hand that feeds them. But maybe their employees — the ones who signed up for the mission as much as the money — can rise up and make a change.
  • Harris: There’s a massive demoralizing wave that is hitting Silicon Valley. It’s getting very hard for companies to attract and retain the best engineers and talent when they realize that the automated system they’ve built is causing havoc everywhere around the world. So if Facebook loses a big chunk of its workforce because people don’t want to be part of that perverse system anymore, that is a very powerful and very immediate lever to force them to change.
  • Duruk: I was at Uber when all the madness was happening there, and it did affect recruiting and hiring. I don’t think these companies are going to go down because they can’t attract the right talent. But there’s going to be a measurable impact. It has become less of a moral positive now — you go to Facebook to write some code and then you go home. They’re becoming just another company.
  • Step 12 To fix it, we’ll need a new business model … If the problem is in the way the Valley makes money, it’s going to have to make money a different way. Maybe by trying something radical and new — like charging users for goods and services.
  • Parakilas: They’re going to have to change their business model quite dramatically. They say they want to make time well spent the focus of their product, but they have no incentive to do that, nor have they created a metric by which they would measure that. But if Facebook charged a subscription instead of relying on advertising, then people would use it less and Facebook would still make money. It would be equally profitable and more beneficial to society. In fact, if you charged users a few dollars a month, you would equal the revenue Facebook gets from advertising. It’s not inconceivable that a large percentage of their user base would be willing to pay a few dollars a month.
  • Step 13 … And some tough regulation. While we’re at it, where has the government been in all this?
  • Stallman: We need a law. Fuck them — there’s no reason we should let them exist if the price is knowing everything about us. Let them disappear. They’re not important — our human rights are important. No company is so important that its existence justifies setting up a police state. And a police state is what we’re heading toward.
  • Duruk: The biggest existential problem for them would be regulation. Because it’s clear that nothing else will stop these companies from using their size and their technology to just keep growing. Without regulation, we’ll basically just be complaining constantly, and not much will change.
  • McNamee: Three things. First, there needs to be a law against bots and trolls impersonating other people. I’m not saying no bots. I’m just saying bots have to be really clearly marked. Second, there have to be strict age limits to protect children. And third, there has to be genuine liability for platforms when their algorithms fail. If Google can’t block the obviously phony story that the kids in Parkland were actors, they need to be held accountable.
  • Stallman: We need a law that requires every system to be designed in a way that achieves its basic goal with the least possible collection of data. Let’s say you want to ride in a car and pay for the ride. That doesn’t fundamentally require knowing who you are. So services which do that must be required by law to give you the option of paying cash, or using some other anonymous-payment system, without being identified. They should also have ways you can call for a ride without identifying yourself, without having to use a cell phone. Companies that won’t go along with this — well, they’re welcome to go out of business. Good riddance.
  • Step 14 Maybe nothing will change. The scariest possibility is that nothing can be done — that the behemoths of the new internet are too rich, too powerful, and too addictive for anyone to fix.
  • García: Look, I mean, advertising sucks, sure. But as the ad tech guys say, “We’re the people who pay for the internet.” It’s hard to imagine a different business model other than advertising for any consumer internet app that depends on network effects.
  • Step 15 … Unless, at the very least, some new people are in charge. If Silicon Valley’s problems are a result of bad decision-making, it might be time to look for better decision-makers. One place to start would be outside the homogeneous group currently in power.
  • Pao: I’ve urged Facebook to bring in people who are not part of a homogeneous majority to their executive team, to every product team, to every strategy discussion. The people who are there now clearly don’t understand the impact of their platforms and the nature of the problem. You need people who are living the problem to clarify the extent of it and help solve it.
  • Things That Ruined the Internet
  • Cookies (1994) The original surveillance tool of the internet. Developed by programmer Lou Montulli to eliminate the need for repeated log-ins, cookies also enabled third parties like Google to track users across the web. The risk of abuse was low, Montulli thought, because only a “large, publicly visible company” would have the capacity to make use of such data. The result: digital ads that follow you wherever you go online. (A minimal illustrative sketch of this cross-site tracking mechanism appears at the end of this entry.)
  • The Farmville vulnerability (2007)   When Facebook opened up its social network to third-party developers, enabling them to build apps that users could share with their friends, it inadvertently opened the door a bit too wide. By tapping into user accounts, developers could download a wealth of personal data — which is exactly what a political-consulting firm called Cambridge Analytica did to 87 million Americans.
  • Algorithmic sorting (2006) It’s how the internet serves up what it thinks you want — automated calculations based on dozens of hidden metrics. Facebook’s News Feed uses it every time you hit refresh, and so does YouTube. It’s highly addictive — and it keeps users walled off in their own personalized loops. “When social media is designed primarily for engagement,” tweets Guillaume Chaslot, the engineer who designed YouTube’s algorithm, “it is not surprising that it hurts democracy and free speech.”
  • The “like” button (2009) Initially known as the “awesome” button, the icon was designed to unleash a wave of positivity online. But its addictive properties became so troubling that one of its creators, Leah Pearlman, has since renounced it. “Do you know that episode of Black Mirror where everyone is obsessed with likes?” she told Vice last year. “I suddenly felt terrified of becoming those people — as well as thinking I’d created that environment for everyone else.”
  • Pull-to-refresh (2009) Developed by software developer Loren Brichter for an iPhone app, the simple gesture — scrolling downward at the top of a feed to fetch more data — has become an endless, involuntary tic. “Pull-to-refresh is addictive,” Brichter told The Guardian last year. “I regret the downsides.”
  • Pop-up ads (1996) While working at an early blogging platform, Ethan Zuckerman came up with the now-ubiquitous tool for separating ads from content that advertisers might find objectionable. “I really did not mean to break the internet,” he told the podcast Reply All. “I really did not mean to bring this horrible thing into people’s lives. I really am extremely sorry about this.”
  • The Silicon Valley dream was born of the counterculture. A generation of computer programmers and designers flocked to the Bay Area’s tech scene in the 1970s and ’80s, embracing new technology as a tool to transform the world for good.
  •  
    The internet in 15 steps, from its construction to today: reflections and regrets from those who built it... [...] "Things That Ruined the Internet": cookies (1994) / the Farmville vulnerability (2007) / algorithmic sorting (2006) / the "like" button (2009) / pull-to-refresh (2009) / pop-up ads (1996) [...]
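The "Cookies (1994)" item above names the mechanism without showing it. Below is a minimal, purely illustrative Python sketch, with hypothetical names and no real ad-tech code, of how a single third-party cookie lets one domain recognize the same browser across unrelated sites and accumulate a browsing profile.

```python
import uuid

class ThirdPartyTracker:
    """Simulates an ad server whose content is embedded on many publisher pages."""

    def __init__(self):
        self.profiles = {}  # cookie value -> list of pages where it was seen

    def handle_request(self, cookie_jar: dict, referer: str) -> str:
        # First visit: assign the browser an id (this is what an HTTP Set-Cookie
        # response header does for the tracker's domain).
        uid = cookie_jar.get("uid")
        if uid is None:
            uid = str(uuid.uuid4())
            cookie_jar["uid"] = uid
        # Every later request carries the cookie back (the Cookie request header),
        # so visits to unrelated sites accumulate under one profile.
        self.profiles.setdefault(uid, []).append(referer)
        return uid

# Usage: one simulated browser visits two different publishers that both embed
# the same tracker; both visits end up linked to the same cookie id.
browser_cookie_jar = {}
tracker = ThirdPartyTracker()
tracker.handle_request(browser_cookie_jar, "https://news.example/article")
tracker.handle_request(browser_cookie_jar, "https://shop.example/shoes")
print(tracker.profiles)
```

Real trackers do this over HTTP rather than with an in-memory dictionary, but the linkage logic is the same: one stable identifier sent back on every request, regardless of which site embeds the tracker.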
Aurialie Jublin

Mozilla Foundation - Open letter to Facebook - 0 views

  • We are writing you today as a group of technologists, human rights defenders, academics, journalists and Facebook users who are deeply concerned about the validity of Facebook’s promises to protect European users from targeted disinformation campaigns during the European Parliamentary elections. You have promised European lawmakers and users that you will increase the transparency of political advertising on the platform to prevent abuse during the elections. But in the very same breath, you took measures to block access to transparency tools that let your users see how they are being targeted.
  • In the company’s recent Wall Street Journal op-ed, Mark Zuckerberg wrote that the most important principles around data are transparency, choice and control. By restricting access to advertising transparency tools available to Facebook users, you are undermining transparency, eliminating the choice of your users to install tools that help them analyse political ads, and wielding control over good faith researchers who try to review data on the platform. Your alternative to these third party tools provides simple keyword search functionality and does not provide the level of data access necessary for meaningful transparency.
  • Specifically, we ask that you implement the following measures by 1 April 2019 to give developers sufficient lead time to create transparency tools in advance of the elections: Roll out a functional, open Ad Archive API that enables advanced research and development of tools that analyse political ads served to Facebook users in the EU. Ensure that all political advertisements are clearly distinguished from other content and are accompanied by key targeting criteria such as sponsor identity and amount spent on the platform in all EU countries. Cease harassment of good faith researchers who are building tools to provide greater transparency into the advertising on your platform.
  • ...1 more annotation...
  • UPDATE 13 February: In response to our campaign, Facebook announced that it would open up its Ad Archive API next month
  •  
    "Political actors use disinformation campaigns that prey on our emotions and values to manipulate our behaviour. We have a right to know who is paying to influence our vote, and Facebook is responsible for making sure that happens on their platform. They have made many promises to European lawmakers and users to make political ads more transparent, but so far we've seen little action. So we decided to pen an open letter telling them to implement what they've promised in enough time to protect users during the European elections."
Aurialie Jublin

Opinion | There May Soon Be Three Internets. America's Won't Necessarily Be the Best. -... - 0 views

  • The received wisdom was once that a unified, unbounded web promoted democracy through the free flow of information. Things don’t seem quite so simple anymore. China’s tight control of the internet within its borders continues to tamp down talk of democracy, and an increasingly sophisticated system of digital surveillance plays a major role in human rights abuses, such as the persecution of the Uighurs. We’ve also seen the dark side to connecting people to one another — as illustrated by how misinformation on social media played a significant role in the violence in Myanmar.
  • There’s a world of difference between the European Union’s General Data Protection Regulation, known commonly as G.D.P.R., and China’s technologically enforced censorship regime, often dubbed “the Great Firewall.” But all three spheres — Europe, America and China — are generating sets of rules, regulations and norms that are beginning to rub up against one another.
  • The information superhighway cracks apart more easily when so much of it depends on privately owned infrastructure. An error at Amazon Web Services created losses of service across the web in 2017; a storm disrupting a data center in Northern Virginia created similar failures in 2012. These were unintentional blackouts; the corporate custodians of the internet have it within their power to do far more. Of course, nobody wants to turn off the internet completely — that wouldn’t make anyone money. But when a single company with huge market share chooses to comply with a law — or more worryingly, a mere suggestion from the authorities — a large chunk of the internet ends up falling in line.
  • ...7 more annotations...
  • But eight years later, Google is working on a search engine for China known as Dragonfly. Its launch will be conditional on the approval of Chinese officials and will therefore comply with stringent censorship requirements. An internal memo written by one of the engineers on the project described surveillance capabilities built into the engine — namely by requiring users to log in and then tracking their browsing histories. This data will be accessible by an unnamed Chinese partner, presumably the government.
  • Google says all features are speculative and no decision has been made on whether to launch Dragonfly, but a leaked transcript of a meeting inside Google later acquired by The Intercept, a news site, contradicts that line. In the transcript, Google’s head of search, Ben Gomes, is quoted as saying that it hoped to launch within six to nine months, although the unstable American-China relationship makes it difficult to predict when or even whether the Chinese government will give the go-ahead.
  • Internet censorship and surveillance were once hallmarks of oppressive governments — with Egypt, Iran and China being prime examples. It’s since become clear that secretive digital surveillance isn’t just the domain of anti-democratic forces. The Snowden revelations in 2013 knocked the United States off its high horse, and may have pushed the technology industry into an increasingly agnostic outlook on human rights.
  • If the future of the internet is a tripartite cold war, Silicon Valley wants to be making money in all three of those worlds.
  • Yet even the best possible version of the disaggregated web has serious — though still uncertain — implications for a global future: What sorts of ideas and speech will become bounded by borders? What will an increasingly disconnected world do to the spread of innovation and to scientific progress? What will consumer protections around privacy and security look like as the internets diverge? And would the partitioning of the internet precipitate a slowing, or even a reversal, of globalization?
  • What these types of sky-is-falling articles keep getting wrong is the idea that the World Wide Web is the same as the Internet. It’s not. Web sites and the browsers that access them are an application that uses the Internet for transport. The Internet transports far more than just web traffic, but the most crucial one for companies is probably VPN: companies connect to one another using site-to-site VPNs, and their employees can work from anywhere with remote-user VPN. Disconnect the EU from the US, and you’ve removed the cheapest way for companies to connect their networks together. These regulatory worlds will get along somehow. Perhaps someone will write a web app that recognizes where a user is from and applies the appropriate policy to their session (a minimal sketch of that idea follows this entry). Perhaps that web app will become wildly popular and be deployed on every website everywhere. I don’t know how it will work, but I do know the Internet will not become fragmented.
  • The internet was never meant to be a walled garden. Remember, America Online began as a walled garden until the World Wide Web came along and “tore down that wall.” So, Europe can have its Europe Wide Web and China can have its China Wide Web, but we will always be the World Wide Web – truly open and free. The “one internet led by the United States” will remain the world’s “go-to” information superhighway, just as the greenback has remained the world’s reserve currency for decades.
  •  
    "In September, Eric Schmidt, the former Google chief executive and Alphabet chairman, said that in the next 10 to 15 years, the internet would most likely be split in two - one internet led by China and one internet led by the United States. Mr. Schmidt, speaking at a private event hosted by a venture capital firm, did not seem to seriously entertain the possibility that the internet would remain global. He's correct to rule out that possibility - if anything, the flaw in Mr. Schmidt's thinking is that he too quickly dismisses the European internet that is coalescing around the European Union's ever-heightening regulation of technology platforms. All signs point to a future with three internets."
Aurialie Jublin

Ces start-up qui rêvent d'aider leurs utilisateurs à adopter des comportement... - 0 views

  • Like all start-ups, the projects studied are looking, often by trial and error, for revenue models in both B2C and B2B markets. On the B2C side, Plume Labs and Smart Citizen have a direct revenue stream from the sale of connected devices (distribution of the “Flow”, Plume Labs’ sensor, is announced for June 2018, and Smart Citizen’s sensor is out of stock pending a new version more accessible to the general public). The same goes for 90 jours with a “premium” model. As for audience-based models, all of them face the difficulty, a classic one in the web economy, of gathering a critical mass of users.
  • Compared with other start-ups, these face specific constraints. Raising funds from venture capitalists is particularly laborious, as investors are wary of companies pursuing a general-interest purpose from which they expect limited financial returns. Another constraint concerns customers: the need for consistency between the growth strategy on the one hand and the discourse about purpose on the other. So as not to upset its original community of users, Tinkuy recounts having turned down funding from Total, and describes the reactions provoked by its partnership with Nouvelles Frontières: “What? You’re flying people around on planes, what kind of community is this?”
  • The purpose pursued by these entrepreneurs, like the fragility of their business models, is reminiscent of another family of general-interest start-ups, the “civic techs” that seek to improve democracy. Unlike the latter, however, the players studied interact little with one another: they do not meet at shared events, nor do they settle in dedicated incubators. Building a collective narrative seems, for now, absent from their strategic vision. Only observing them over time will tell whether the transition techs choose to strengthen their business model in order to survive, at the risk of drifting away from their general-interest purpose, or whether, on the contrary, they collectively assert their identity to attract investors and users. Such monitoring is essential for the Orange Group, already engaged in an energy-sobriety approach, so that it can support its customers on their own path of ecological transition, where appropriate by backing the most promising companies.
  •  
    "Des start-up d'un genre nouveau, jusqu'ici peu observées, entendent aider les individus à adopter des comportements durables, en s'appuyant sur les technologies numériques et en conciliant ainsi secteur marchand et engagement sociétal. Qui sont ces entreprises que nous avons appelées « transition techs », par analogie avec les civic techs ? Quelles sont les questions environnementales dont elles entendent s'emparer, avec quel outillage technologique et quels modèles d'affaires ? Forment-elles un univers stabilisé ou s'agit-il d'un mouvement encore balbutiant ?"
Aurialie Jublin

Alexa, does AI have gender? | Research | University of Oxford - 0 views

  • Take Alexa as another example, Amazon’s voice-controlled AI assistant. Alexa is female. Why? As children and adults enthusiastically shout instructions, questions and demands at Alexa, what messages are being reinforced? Professor Neff wonders if this is how we would secretly like to treat women: ‘We are inadvertently reproducing stereotypical behaviour that we wouldn’t want to see,’ she says.
  • In the home, AI assistance with a subservient role is female. In the realm of law or finance, AI assistance is coded as male, affording the male computer voice a context of authority and professionalism.
  • These scenarios of a gendered, humanised means of interacting with technology are on some level obvious. We hear the voice and see the avatar – our own imaginations make the leap to engage on a personal level with a machine and some code. This is something tangible and in the public realm. We can choose to embrace, reject, push back. However, the issues of gender and power balances go deeper still and further from sight.
  •  
    "Professor Gina Neff has been asking questions about bias and balance of power in the development of artificial intelligence (AI) systems. She talks to Ruth Abrahams about the challenges we face in marrying futuristic solutions with values of trust, openness and equality."
Aurialie Jublin

Le contrôle des données numériques personnelles est un enjeu de liberté colle... - 0 views

  • It would be convenient to think that twenty-first-century humans have given up on their private life. Yet this is not a matter of indifference. Polls show, insistently and unambiguously, that internet users still cherish privacy in the age of social networks and smartphones. How, then, to explain this apathy?
  • For decades, and rightly so, defending privacy meant protecting the individual. Even today, we persist in looking for and measuring the individual consequences of this unbridled collection of personal data and of these repeated breaches. But the paradigm has changed: the question of personal data is not a problem of intimacy. It is a matter of collective freedom.
  • Take the Cambridge Analytica affair: the problem is not that Donald Trump and his campaign team methodically went through the friend lists of 87 million Facebook users (including more than 200,000 French people). It is that they were able to use this information, aggregated with millions of other records, to run an extremely personalised, almost individualised political campaign, making full use of the staggering message-targeting machine offered by Facebook. The impact of this leak of personal data is no longer individual; it is collective. What is at stake is not the intimacy of one’s existence vis-à-vis a political organisation, but the shared freedom to choose, in full awareness, one’s political leaders or one’s conditions of life in common.
  • ...2 more annotations...
  • In the end, if nothing changes, as these companies intrude ever more into our daily activities, gradually shifting from “suggestion” to “injunction”, we will most likely be caught in the personal-data trap. Decisions will be made for us, in a way presented to us as optimal because it is built on the analysis of data from millions of people whose lives resemble ours, confiscating part of our free will in the process. This is not about intimacy vis-à-vis some Silicon Valley company; it is about individual freedom.
  • The only solution is to limit the scattering of our personal data to the four winds. But how can we do that without withdrawing from the social and professional connections of a now digitised society? How can we give up all these advantages? The answer lies somewhere between the collective (political rules to limit the collection and exploitation of data) and the individual (turning to more frugal, more decentralised technology).
  •  
    Revelations of security flaws affecting online services keep piling up. And the collection of our data creates a collective risk on a large scale.
Aurialie Jublin

Mastercard, Microsoft to Advance Digital Identity Innovations - 0 views

  • “Today’s digital identity landscape is patchy, inconsistent and what works in one country often won’t work in another. We have an opportunity to establish a system that puts people first, giving them control of their identity data and where it is used,” says Ajay Bhalla, president, cyber and intelligence solutions, Mastercard. “Working with Microsoft brings us one step closer to making a globally interoperable digital identity service a reality, and we look forward to sharing more very soon.”
  •  
    "Mastercard (NYSE: MA) and Microsoft (Nasdaq "MSFT" @microsoft) today announced a strategic collaboration to improve how people manage and use their digital identity. Currently, verifying your identity online is still dependent on physical or digital proof managed by a central party, whether it's your passport number, your proof of address, driver's license, user credentials or other means. This dependence places a huge burden on individuals, who have to successfully remember hundreds of passwords for various identities and are increasingly being subjected to more complexity in proving their identity and managing their data. Working together, Mastercard and Microsoft aim to give people a secure, instant way to verify their digital identity with whomever they want, whenever they want."
Aurialie Jublin

affordance.info: Undocumented Men - 0 views

  • In fact, we seem today to have only two choices, based on two mutually exclusive doctrines. Doctrine no. 1: privacy is an anomaly, and if you have nothing to reproach yourself for, you have nothing to hide. That is the scenario of total, global surveillance of every one of our behaviours by the big tech firms working with states, a project at best totalitarian and at worst fascist in nature. Doctrine no. 2: privacy is a constitutional right that states and the big technology firms must guarantee. And there you will always have someone who starts talking about exceptions, always a “yes, but”: “Yes, but what about paedophilia? Terrorism? Preventing a mentally unstable airline pilot from causing the death of hundreds of people? At what point does the ‘principle of precautionary surveillance’ begin, and which aspect of the fundamental right to privacy should it trample on?” And there is no way out.
  • Only a third scenario can allow us to avoid the worst. It requires: multiplying tools that genuinely respect our privacy in the technical processing of the data that passes through them (as DuckDuckGo, Qwant and a few others do); deploying alternatives to the omnipotence of the GAFAM (as the Dégooglisons Internet initiative already does, and as the creation of an independent index of the web, combined with the already effective pooling of Creative Commons resources, would allow even more); political (and legislative) support for the deployment of free software; education that makes the stakes of surveillance explicit (Tristan Nitot’s “Surveillance://” is a magnificent example of this); and the realisation that the data we are talking about is, above all, the data of our voluntary servitude.
  • Tomorrow perhaps, once the dystopia has definitively settled in, we will no longer have to speak of the “paperless” but of the “accountless”: individuals without a digital social trace that would allow them to be documented. They will no longer be “undocumented” but “undigitized”, and therefore unsurveillable. Perhaps that will be a blessing. Probably, even. New forms of documentary resistance will emerge, and new pedagogies against surveillance will be needed. And we will be without fear, though never beyond reproach, because we will have understood that the doctrine according to which “if we have nothing to reproach ourselves for, we have nothing to hide” is the source of all totalitarianisms, and that privacy is anything but an anomaly. So yes, we will be without fear, even if that means joining the ranks of all those who are, today, without documents. For this “undocumented” humanity is much more than a humanity without papers: it is a non-documented humanity, and in that sense it is as free as it allows us to point to and measure our own subjection to the new documentary orders.
  •  
    "Deuxio, sa haine viscérale est adressée à des communautés et à des minorités (rien de nouveau sous le fascisme ordinaire ...) mais elle se caractérise en cela qu'elle stigmatise principalement ces communautés du fait de leur absence de papiers (les fameux "Undocumented Men") et qu'elle utilise, pour conduire et guider cette haine, les outils et les métriques ne pouvant s'appliquer qu'à une population "sur-documentable". C'est ainsi que Cambridge Analytica a joué un rôle majeur dans la construction de sa victoire ; c'est ainsi également que la toute puissance du discours totalitaire est d'autant plus efficiente qu'elle s'applique à des communautés ciblées parce que sur-documentées (et sur-documentables) ; c'est ainsi, enfin, qu'il faut voir la volonté affichée de récupérer à toute force les mots de passe de ces "undocumented men" car cela permet au discours totalitaire et au projet fasciste qui le sous-tend de faire entrer le monde dans les deux seules catégories binaires qu'il comprend et qui composent son horizon et son projet : d'un côté ceux qu'il est possible de contrôler et de manipuler (les "sur-documentés") et de l'autre ceux contre qui il faut diriger la haine des premiers (les "sous-documentés", les "undocumented")."
Aurialie Jublin

Les musulmans chinois, en première ligne de la surveillance - 1 views

  • The region’s Muslim population is subjected to numerous control technologies designed to measure individuals’ degree of religious belief and practice, notably through big data and facial recognition.
  • The daily life of this religious minority is thus made up of fixed cameras and intelligent dove-shaped drones, as well as official and undercover agents who may search any individual several times a day. These searches are physical or digital, carried out either directly on the smartphone or via the digital traces left on the heavily monitored social network WeChat.
  • During this hajj season, the pilgrimage to Mecca that every Muslim able to do so must make once in their life, this stifling surveillance continues for Chinese nationals. Accompanied throughout the pilgrimage by a guide from the state-run Chinese Islamic Association, the pilgrims also wear “smart cards” tied around their necks that record their geolocation data as well as their conversations.
  • ...1 more annotation...
  • In late February, Maya Wang signed a report on the “police cloud”, a predictive policing programme based on the analysis of massive volumes of data and deployed in Xinjiang. It flags people who could be threats, and those people can be sent to the “political education centres”. “For the first time,” she wrote, “we are able to demonstrate that the use of big data by the Chinese government’s intelligence services not only flagrantly violates the right to privacy, but also enables the authorities to arbitrarily detain people.”
  •  
    "Le nationalisme religieux est dans la ligne de mire du Parti Communiste chinois. Le gouvernement cible particulièrement les minorités musulmanes du Xinjiang concentrées dans une région située à l'opposé des capitales économiques et politiques chinoises. Comme le souligne un article de The Atlantic, la région fait donc office de zone de test à grande échelle pour les technologies de contrôle dernière génération et les programmes de propagande au sein de camps de rééducation."
Aurialie Jublin

Le Zimbabwe vend le visage de ses citoyens à la Chine en échange de caméras - 1 views

  • Everyone, or almost everyone: since the citizens’ consent was never contemplated, it is impossible for them to refuse this disclosure of personal data to the Chinese company, or to obtain any information about what will be done with the data after use. But perhaps these problems are still secondary when, like the people of Zimbabwe, you are on the eve of the installation of a surveillance infrastructure put in place by an authoritarian regime. And given the little importance this regime has attached to respect for human rights over the past four decades, it is hard to believe the official line justifying the arrival of these smart cameras with the eternal security excuse of wanting to reduce crime, especially when no law protects access to biometric data.
  •  
    "Le gouvernement du Zimbabwe a besoin de l'expertise chinoise en matière de surveillance. De son côté, CloudWalk Technology, start-up chinoise, a besoin d'images de visages de personnes noires pour perfectionner son logiciel de reconnaissance faciale, biaisé jusqu'ici, car reconnaissant mieux les visages blancs. L'accord, qui donne à la start-up chinoise l'accès aux informations biométriques des citoyens zimbabwéens, entrera en vigueur le 30 juillet, à la suite des élections présidentielles au Zimbabwe, raconte Foreign Policy. "
Aurialie Jublin

Microsoft Bug Testers Unionized. Then They Were Dismissed - Bloomberg - 0 views

  • In California, Uber, Lyft, TaskRabbit, and a half-dozen other companies are lobbying to defang a court ruling that could make it difficult to avoid reclassifying such workers as employees. And in Washington, the Republican-dominated National Labor Relations Board has made moves to undo an Obama-era precedent that could make big employers legally liable for contract workers even if they have only indirect control over them. The GOP takeover in Washington is one reason the Temporary Workers of America, a union of bug testers for Microsoft Corp., gave up on what had been, for people in the software world, an almost unheard-of unionization victory, says the group’s founder, Philippe Boucher.
  • Boucher and his ex-colleagues are among a growing population of tech workers, including many Uber drivers, Amazon.com warehouse loaders, and Google software engineers, who lack the rights and perks of those companies’ full-fledged employees.
  • Google parent Alphabet Inc. now has fewer direct employees than it does contract workers, some of whom write code and test self-driving cars.
  • ...2 more annotations...
  • “Companies are deciding they don’t want to make long-term commitments to people, and they’re using a variety of devices to shift that work out,” says David Weil, dean of Brandeis University’s social policy and management school who oversaw federal wage-and-hour enforcement during the Obama presidency.
  • To help demonstrate that Microsoft was a joint employer, the union provided documents such as an email appearing to show a Lionbridge manager sharing performance metrics with Microsoft counterparts and a list of Microsoft managers who worked in the same office and oversaw Lionbridge employees’ work—at least one of whom listed his management of contractors on his LinkedIn résumé.
Aurialie Jublin

https://datatransferproject.dev/ - 0 views

  • Users should be in control of their data on the web, and part of this is the ability to move their data. Currently users can download a copy of their data from most services, but that is only half the battle in terms of moving it. DTP aims to make moving data between providers significantly easier for users (a conceptual sketch of the service-to-service adapter idea follows this entry).
  • DTP is still in very active development. While we have code that works for a variety of use cases we are continually making improvements that might cause things to break occasionally. So as you are trying things please use it with caution and expect some hiccups. Both our bug list, as well as documentation in each provider’s directory are good places to look for known issues, or report problems you encounter.
  •  
    "The Data Transfer Project was formed in 2017 to create an open-source, service-to-service data portability platform so that all individuals across the web could easily move their data between online service providers whenever they want. The contributors to the Data Transfer Project believe portability and interoperability are central to innovation. Making it easier for individuals to choose among services facilitates competition, empowers individuals to try new services and enables them to choose the offering that best suits their needs."