QN2019 / Group items tagged data

Aurialie Jublin

An Apology for the Internet - From the People Who Built It - 1 views

  • There have always been outsiders who criticized the tech industry — even if their concerns have been drowned out by the oohs and aahs of consumers, investors, and journalists. But today, the most dire warnings are coming from the heart of Silicon Valley itself. The man who oversaw the creation of the original iPhone believes the device he helped build is too addictive. The inventor of the World Wide Web fears his creation is being “weaponized.” Even Sean Parker, Facebook’s first president, has blasted social media as a dangerous form of psychological manipulation. “God only knows what it’s doing to our children’s brains,” he lamented recently.
  • To keep the internet free — while becoming richer, faster, than anyone in history — the technological elite needed something to attract billions of users to the ads they were selling. And that something, it turns out, was outrage. As Jaron Lanier, a pioneer in virtual reality, points out, anger is the emotion most effective at driving “engagement” — which also makes it, in a market for attention, the most profitable one. By creating a self-perpetuating loop of shock and recrimination, social media further polarized what had already seemed, during the Obama years, an impossibly and irredeemably polarized country.
  • The Architects (In order of appearance.) Jaron Lanier, virtual-reality pioneer. Founded first company to sell VR goggles; worked at Atari and Microsoft. Antonio García Martínez, ad-tech entrepreneur. Helped create Facebook’s ad machine. Ellen Pao, former CEO of Reddit. Filed major gender-discrimination lawsuit against VC firm Kleiner Perkins. Can Duruk, programmer and tech writer. Served as project lead at Uber. Kate Losse, Facebook employee No. 51. Served as Mark Zuckerberg’s speechwriter. Tristan Harris, product designer. Wrote internal Google presentation about addictive and unethical design. Rich “Lowtax” Kyanka, entrepreneur who founded influential message board Something Awful. Ethan Zuckerman, MIT media scholar. Invented the pop-up ad. Dan McComas, former product chief at Reddit. Founded community-based platform Imzy. Sandy Parakilas, product manager at Uber. Ran privacy compliance for Facebook apps. Guillaume Chaslot, AI researcher. Helped develop YouTube’s algorithmic recommendation system. Roger McNamee, VC investor. Introduced Mark Zuckerberg to Sheryl Sandberg. Richard Stallman, MIT programmer. Created legendary software GNU and Emacs.
  • ...45 more annotations...
  • How It Went Wrong, in 15 Steps Step 1 Start With Hippie Good Intentions …
  • I think two things are at the root of the present crisis. One was the idealistic view of the internet — the idea that this is the great place to share information and connect with like-minded people. The second part was the people who started these companies were very homogeneous. You had one set of experiences, one set of views, that drove all of the platforms on the internet. So the combination of this belief that the internet was a bright, positive place and the very similar people who all shared that view ended up creating platforms that were designed and oriented around free speech.
  • Step 2 … Then mix in capitalism on steroids. To transform the world, you first need to take it over. The planetary scale and power envisioned by Silicon Valley’s early hippies turned out to be as well suited for making money as they were for saving the world.
  • Step 3 The arrival of Wall Streeters didn’t help … Just as Facebook became the first overnight social-media success, the stock market crashed, sending money-minded investors westward toward the tech industry. Before long, a handful of companies had created a virtual monopoly on digital life.
  • Ethan Zuckerman: Over the last decade, the social-media platforms have been working to make the web almost irrelevant. Facebook would, in many ways, prefer that we didn’t have the internet. They’d prefer that we had Facebook.
  • Step 4 … And we paid a high price for keeping it free. To avoid charging for the internet — while becoming fabulously rich at the same time — Silicon Valley turned to digital advertising. But to sell ads that target individual users, you need to grow a big audience — and use advancing technology to gather reams of personal data that will enable you to reach them efficiently.
  • Harris: If you’re YouTube, you want people to register as many accounts as possible, uploading as many videos as possible, driving as many views to those videos as possible, so you can generate lots of activity that you can sell to advertisers. So whether or not the users are real human beings or Russian bots, whether or not the videos are real or conspiracy theories or disturbing content aimed at kids, you don’t really care. You’re just trying to drive engagement to the stuff and maximize all that activity. So everything stems from this engagement-based business model that incentivizes the most mindless things that harm the fabric of society.
  • Step 5 Everything was designed to be really, really addictive. The social-media giants became “attention merchants,” bent on hooking users no matter the consequences. “Engagement” was the euphemism for the metric, but in practice it evolved into an unprecedented machine for behavior modification.
  • Harris: That blue Facebook icon on your home screen is really good at creating unconscious habits that people have a hard time extinguishing. People don’t see the way that their minds are being manipulated by addiction. Facebook has become the largest civilization-scale mind-control machine that the world has ever seen.
  • Step 6 At first, it worked — almost too well. None of the companies hid their plans or lied about how their money was made. But as users became deeply enmeshed in the increasingly addictive web of surveillance, the leading digital platforms became wildly popular.
  • Pao: There’s this idea that, “Yes, they can use this information to manipulate other people, but I’m not gonna fall for that, so I’m protected from being manipulated.” Slowly, over time, you become addicted to the interactions, so it’s hard to opt out. And they just keep taking more and more of your time and pushing more and more fake news. It becomes easy just to go about your life and assume that things are being taken care of.
  • McNamee: If you go back to the early days of propaganda theory, Edward Bernays had a hypothesis that to implant an idea and make it universally acceptable, you needed to have the same message appearing in every medium all the time for a really long period of time. The notion was it could only be done by a government. Then Facebook came along, and it had this ability to personalize for every single user. Instead of being a broadcast model, it was now 2.2 billion individualized channels. It was the most effective product ever created to revolve around human emotions.
  • Step 7 No one from Silicon Valley was held accountable … No one in the government — or, for that matter, in the tech industry’s user base — seemed interested in bringing such a wealthy, dynamic sector to heel.
  • Step 8 … Even as social networks became dangerous and toxic. With companies scaling at unprecedented rates, user security took a backseat to growth and engagement. Resources went to selling ads, not protecting users from abuse.
  • Lanier: Every time there’s some movement like Black Lives Matter or #MeToo, you have this initial period where people feel like they’re on this magic-carpet ride. Social media is letting them reach people and organize faster than ever before. They’re thinking, Wow, Facebook and Twitter are these wonderful tools of democracy. But it turns out that the same data that creates a positive, constructive process like the Arab Spring can be used to irritate other groups. So every time you have a Black Lives Matter, social media responds by empowering neo-Nazis and racists in a way that hasn’t been seen in generations. The original good intention winds up empowering its opposite.
  • Chaslot: As an engineer at Google, I would see something weird and propose a solution to management. But just noticing the problem was hurting the business model. So they would say, “Okay, but is it really a problem?” They trust the structure. For instance, I saw this conspiracy theory that was spreading. It’s really large — I think the algorithm may have gone crazy. But I was told, “Don’t worry — we have the best people working on it. It should be fine.” Then they conclude that people are just stupid. They don’t want to believe that the problem might be due to the algorithm.
  • Parakilas: One time a developer who had access to Facebook’s data was accused of creating profiles of people without their consent, including children. But when we heard about it, we had no way of proving whether it had actually happened, because we had no visibility into the data once it left Facebook’s servers. So Facebook had policies against things like this, but it gave us no ability to see what developers were actually doing.
  • McComas: Ultimately the problem Reddit has is the same as Twitter: By focusing on growth and growth only, and ignoring the problems, they amassed a large set of cultural norms on their platforms that stem from harassment or abuse or bad behavior. They have worked themselves into a position where they’re completely defensive and they can just never catch up on the problem. I don’t see any way it’s going to improve. The best they can do is figure out how to hide the bad behavior from the average user.
  • Step 9 … And even as they invaded our privacy. The more features Facebook and other platforms added, the more data users willingly, if unwittingly, released to them and the data brokers who power digital advertising.
  • Richard Stallman: What is data privacy? That means that if a company collects data about you, it should somehow protect that data. But I don’t think that’s the issue. The problem is that these companies are collecting data about you, period. We shouldn’t let them do that. The data that is collected will be abused. That’s not an absolute certainty, but it’s a practical extreme likelihood, which is enough to make collection a problem.
  • Losse: I’m not surprised at what’s going on now with Cambridge Analytica and the scandal over the election. For a long time, the accepted idea at Facebook was: Giving developers as much data as possible to make these products is good. But to think that, you also have to not think about the data implications for users. That’s just not your priority.
  • Step 10 Then came 2016. The election of Donald Trump and the triumph of Brexit, two campaigns powered in large part by social media, demonstrated to tech insiders that connecting the world — at least via an advertising-surveillance scheme — doesn’t necessarily lead to that hippie utopia.
  • Chaslot: I realized personally that things were going wrong in 2011, when I was working at Google. I was working on this YouTube recommendation algorithm, and I realized that the algorithm was always giving you the same type of content. For instance, if I give you a video of a cat and you watch it, the algorithm thinks, Oh, he must really like cats. That creates these filter bubbles where people just see one type of information. But when I notified my managers at Google and proposed a solution that would give a user more control so he could get out of the filter bubble, they realized that this type of algorithm would not be very beneficial for watch time. They didn’t want to push that, because the entire business model is based on watch time.
  • Step 11 Employees are starting to revolt. Tech-industry executives aren’t likely to bite the hand that feeds them. But maybe their employees — the ones who signed up for the mission as much as the money — can rise up and make a change.
  • Harris: There’s a massive demoralizing wave that is hitting Silicon Valley. It’s getting very hard for companies to attract and retain the best engineers and talent when they realize that the automated system they’ve built is causing havoc everywhere around the world. So if Facebook loses a big chunk of its workforce because people don’t want to be part of that perverse system anymore, that is a very powerful and very immediate lever to force them to change.
  • Duruk: I was at Uber when all the madness was happening there, and it did affect recruiting and hiring. I don’t think these companies are going to go down because they can’t attract the right talent. But there’s going to be a measurable impact. It has become less of a moral positive now — you go to Facebook to write some code and then you go home. They’re becoming just another company.
  • Step 12 To fix it, we’ll need a new business model … If the problem is in the way the Valley makes money, it’s going to have to make money a different way. Maybe by trying something radical and new — like charging users for goods and services.
  • Parakilas: They’re going to have to change their business model quite dramatically. They say they want to make time well spent the focus of their product, but they have no incentive to do that, nor have they created a metric by which they would measure that. But if Facebook charged a subscription instead of relying on advertising, then people would use it less and Facebook would still make money. It would be equally profitable and more beneficial to society. In fact, if you charged users a few dollars a month, you would equal the revenue Facebook gets from advertising. It’s not inconceivable that a large percentage of their user base would be willing to pay a few dollars a month.
  • Step 13 … And some tough regulation. [Photo caption: Mark Zuckerberg testifying before Congress on April 10.] While we’re at it, where has the government been in all this?
  • Stallman: We need a law. Fuck them — there’s no reason we should let them exist if the price is knowing everything about us. Let them disappear. They’re not important — our human rights are important. No company is so important that its existence justifies setting up a police state. And a police state is what we’re heading toward.
  • Duruk: The biggest existential problem for them would be regulation. Because it’s clear that nothing else will stop these companies from using their size and their technology to just keep growing. Without regulation, we’ll basically just be complaining constantly, and not much will change.
  • McNamee: Three things. First, there needs to be a law against bots and trolls impersonating other people. I’m not saying no bots. I’m just saying bots have to be really clearly marked. Second, there have to be strict age limits to protect children. And third, there has to be genuine liability for platforms when their algorithms fail. If Google can’t block the obviously phony story that the kids in Parkland were actors, they need to be held accountable.
  • Stallman: We need a law that requires every system to be designed in a way that achieves its basic goal with the least possible collection of data. Let’s say you want to ride in a car and pay for the ride. That doesn’t fundamentally require knowing who you are. So services which do that must be required by law to give you the option of paying cash, or using some other anonymous-payment system, without being identified. They should also have ways you can call for a ride without identifying yourself, without having to use a cell phone. Companies that won’t go along with this — well, they’re welcome to go out of business. Good riddance.
  • Step 14 Maybe nothing will change. The scariest possibility is that nothing can be done — that the behemoths of the new internet are too rich, too powerful, and too addictive for anyone to fix.
  • García: Look, I mean, advertising sucks, sure. But as the ad tech guys say, “We’re the people who pay for the internet.” It’s hard to imagine a different business model other than advertising for any consumer internet app that depends on network effects.
  • Step 15 … Unless, at the very least, some new people are in charge. If Silicon Valley’s problems are a result of bad decision-making, it might be time to look for better decision-makers. One place to start would be outside the homogeneous group currently in power.
  • Pao: I’ve urged Facebook to bring in people who are not part of a homogeneous majority to their executive team, to every product team, to every strategy discussion. The people who are there now clearly don’t understand the impact of their platforms and the nature of the problem. You need people who are living the problem to clarify the extent of it and help solve it.
  • Things That Ruined the Internet
  • Cookies (1994) The original surveillance tool of the internet. Developed by programmer Lou Montulli to eliminate the need for repeated log-ins, cookies also enabled third parties like Google to track users across the web. The risk of abuse was low, Montulli thought, because only a “large, publicly visible company” would have the capacity to make use of such data. The result: digital ads that follow you wherever you go online.
  • The Farmville vulnerability (2007)   When Facebook opened up its social network to third-party developers, enabling them to build apps that users could share with their friends, it inadvertently opened the door a bit too wide. By tapping into user accounts, developers could download a wealth of personal data — which is exactly what a political-consulting firm called Cambridge Analytica did to 87 million Americans.
  • Algorithmic sorting (2006) It’s how the internet serves up what it thinks you want — automated calculations based on dozens of hidden metrics. Facebook’s News Feed uses it every time you hit refresh, and so does YouTube. It’s highly addictive — and it keeps users walled off in their own personalized loops. “When social media is designed primarily for engagement,” tweets Guillaume Chaslot, the engineer who designed YouTube’s algorithm, “it is not surprising that it hurts democracy and free speech.”
  • The “like” button (2009) Initially known as the “awesome” button, the icon was designed to unleash a wave of positivity online. But its addictive properties became so troubling that one of its creators, Leah Pearlman, has since renounced it. “Do you know that episode of Black Mirror where everyone is obsessed with likes?” she told Vice last year. “I suddenly felt terrified of becoming those people — as well as thinking I’d created that environment for everyone else.”
  • Pull-to-refresh (2009) Developed by software developer Loren Brichter for an iPhone app, the simple gesture — scrolling downward at the top of a feed to fetch more data — has become an endless, involuntary tic. “Pull-to-refresh is addictive,” Brichter told The Guardian last year. “I regret the downsides.”
  • Pop-up ads (1996) While working at an early blogging platform, Ethan Zuckerman came up with the now-ubiquitous tool for separating ads from content that advertisers might find objectionable. “I really did not mean to break the internet,” he told the podcast Reply All. “I really did not mean to bring this horrible thing into people’s lives. I really am extremely sorry about this.”
  • The Silicon Valley dream was born of the counterculture. A generation of computer programmers and designers flocked to the Bay Area’s tech scene in the 1970s and ’80s, embracing new technology as a tool to transform the world for good.
  •  
    The internet in 15 steps, from its construction to today: the views and regrets of those who built it... [...] "Things That Ruined the Internet": cookies (1994), the Farmville vulnerability (2007), algorithmic sorting (2006), the "like" button (2009), pull-to-refresh (2009), pop-up ads (1996) [...]
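The Harris and Chaslot annotations above describe the same mechanism from two sides: a recommender whose only objective is engagement (watch time), which both pulls users into filter bubbles and favors whatever holds attention. The toy sketch below illustrates that dynamic only; the catalog, the scores and the scoring formula are invented for illustration and are not YouTube's or Facebook's actual system.

```python
# Illustrative sketch only: a toy "watch-time" recommender of the kind the
# annotations above describe. All data and scoring choices are hypothetical.
from collections import defaultdict

# Hypothetical catalog: video id -> (topic, average watch time in minutes)
CATALOG = {
    "cat_1": ("cats", 4.0),
    "cat_2": ("cats", 6.5),
    "news_1": ("news", 2.0),
    "conspiracy_1": ("conspiracy", 11.0),  # long, "engaging" content
}

def recommend(history, k=2):
    """Rank videos by expected watch time, boosted by topic affinity.

    The only objective is predicted engagement; nothing in the score asks
    whether the content is accurate or healthy, which is the point the
    quoted engineers are making.
    """
    affinity = defaultdict(float)
    for vid in history:
        topic, _ = CATALOG[vid]
        affinity[topic] += 1.0

    def score(item):
        vid, (topic, avg_watch) = item
        return avg_watch * (1.0 + affinity[topic])

    ranked = sorted(CATALOG.items(), key=score, reverse=True)
    return [vid for vid, _ in ranked if vid not in history][:k]

if __name__ == "__main__":
    # After one cat video, cat content dominates; the longest "engaging"
    # video also ranks high regardless of its merit.
    print(recommend(["cat_1"]))
```

Running it after a single cat video already fills the top slots with more cat content plus the longest, most "engaging" item, which is exactly the feedback loop the quoted engineers regret.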
Aurialie Jublin

The Very First Oakland Co-op DiscoTech - Danny Spitzberg - Medium - 0 views

  • Technology isn’t necessary for bars or farms to become better co-ops, but it can help. Two coalitions that embrace co-design — civic tech and online organizing — can offer lessons on how to build better tech. At the same time, co-op theory and history offer a model of how to own, control, and share the value generated by the tech we build.
  • While each area has its emphasis, each can learn from the other, too: Civic tech is a coalition for better citizenship, trying to achieve citizen engagement. An example is southbendvoices.com, an automated call-in system. Yet tweeting the city to shut off sprinklers after the rain is a far cry from building better neighborhoods. What could it add? Economic solidarity. Online organizing is a digital approach to social change, trying to achieve community power. An example is 18millionrising.org, a group running rapid-response campaigns for racial justice. What’s missing? Platforms that support lasting effort with multiple allies. The co-op movement is about democracy in the workplace, trying to achieve real ownership, control, and value for the people doing the labor. An example is the Arizmendi Association, an umbrella group supporting six worker-owned bakeries. It’s a model that only the Enspiral network has replicated in New Zealand. What potential remains untapped here? Widespread relevance. How might all three of these areas become better, together?
  • There are co-ops, and then there is cooperation. Shane from CCA asked “what counts as a co-op?” and Willow Brugh from Aspiration Tech described a multi-stakeholder project in Africa that supports self-determining small businesses. I mentioned how Enspiral exemplifies the first co-op principle of open and voluntary membership better than most legally-recognized co-ops with a quarterly auto-email to their 30 member organizations that simply asks, “Do you still feel like a member?”
  • ...2 more annotations...
  • There are parallel worlds we can learn from, if we take care not to reproduce extractive practices. Developer and counselor Kermit Goodwin suggested that the open source community might be a place to learn more, and Noah Thorp of CoMakery cautioned that while developers might play better, the open source software economy is “broken” and dominated by corporate interests — most of the people making a livelihood through open source software do so through extractive enterprises (think, Microsoft).
  • And then there is the agitation and education that leads to organizing. After Evangeline asked “Why do people stop trying?” and how we can make co-ops familiar to more people, Molly McLeod brought up relatively passive directories like cultivate.coop and showed us Co-opoly, a boardgame about starting worker-owned businesses and having all of the poignant conversations that go along with it. Jackie Mahendra from CEL said her first serious role was working with a co-op house, and then others agreed co-ops can stay relevant if they provide services more widely — housing, education, health care, consumer finance, and more. Building viable co-op platforms is exactly what the creators of Platform Cooperativism are organizing around.
  •  
    "Evangeline asked why people get involved in co-ops, and then drop it. "Why do people stop trying?" For half of the 30 people at The Very First Oakland Co-op DiscoTech, it was a tough question - they had little exposure to co-ops in the first place."
Aurialie Jublin

Des impacts énergétiques et sociaux de ces data-centers qu'on ne voit pas - M... - 0 views

  • Starting with consumption: the most pessimistic studies suggest that data centers could account for up to 13% of global electricity by 2030 (with an IT sector that would consume up to 51% of total global electricity). These figures are not universally accepted; The Shift Project instead forecasts 25% for the IT sector and 5% for data centers by 2025 (that 5% would still be equivalent to the entire current electricity consumption of the digital sector).
  • Even though many efforts are being made to improve the energy efficiency of the infrastructure, for example by the "big tech" companies, ADEME reports that the digital giants do not put forward a genuine discourse on the "energy and digital sobriety needed to stay within the prospect of a 1.5° rise in global temperatures."
  • The report then insists on the imbalances that result from some of these sitings in local territories. First observation: these social impacts are very poorly documented. Very often, data centers arrive without announcing themselves, on the outskirts of cities, in the United States as in France, at Saclay for example, or at Plaine Commune. This stealthiness of the buildings makes challenges or requests for participation from local populations all the more difficult.
  • ...14 more annotations...
  • Public authorities treat them as mere warehouses, even though their electricity consumption has repercussions at the scale of an entire territory.
  • Another important phenomenon: data centers attract data centers, for reasons of pooling energy and telecommunications networks. These mechanisms very often reinforce the existing urban hierarchies. Elected officials, for their part, struggle to push back against powerful players who impose their terms "in an asymmetric negotiation that pushes some territories to over-size energy, water and road infrastructure in order to host a few dozen jobs, or a few hundred if the construction phases are included."
  • Today, it is mostly to install logistics warehouses and server farms that land is being paved over. Another non-negligible effect that would no doubt deserve a broader discussion with local populations: urban sprawl.
  • The report then highlights possible synergies between digital infrastructure and the territory. Reusing the heat generated by data centers is a well-known use case in this regard. In Bailly-Romainvilliers, for example, BNP Paribas's data center heats the neighboring aquatic center. Céleste's data center in Noisy-Champs heats its own offices. Other heat-producing systems, such as Stimergy's digital boilers, heat part of the water of the Butte-aux-Cailles swimming pool in the 13th arrondissement of Paris.
  • These examples remain anecdotal, however. In the vast majority of cases the heat is not recovered, first of all for reasons of cost and economic profitability: data-center developers expect returns over short periods, which is incompatible with contracting for district heating networks (commitments that run over 25 to 30 years).
  • There is also a technical obstacle: it is preferable to plan for such contracts when the data center is built, because modifying it afterwards can entail risks that developers are not prepared to take.
  • The fifth part of the report, which I particularly liked, gives pride of place to citizen, nonprofit and public initiatives around "alternative digital infrastructures." On the access-provider side, many nonprofit actors such as franciliens.net or Aquilenet in the Southwest are grouped within the Fédération FFDN. They complement the offering of the main providers (Bouygues, Free, Orange and SFR). The great asset of these solutions is that they bet on the social, on education and on democracy: "They take part in a shared governance of the commons that is the Internet, carrying values of transparency, inclusion, social ties, technical learning, and encouragement to take part in civic life."
  • The socio-anthropologist of technology Laure Dobigny argues that when this management includes and involves people, consumers move toward greater sobriety: "setting up smaller-scale technical systems has, by changing usage patterns, made it possible to reduce consumption." The question is then how to move from shared management of networks to shared management of data centers.
  • The report presents a number of solutions, such as the peer-to-peer cloud: "the central idea underlying these systems is that the files and content uploaded by users into the system are stored, in whole or in part, on a storage cloud made up of a portion of each user's hard drive, linked together in a P2P architecture." The idea is rather simple: re-decentralize the internet, reduce the need for large data centers and mitigate the spatial impact of these infrastructures. The limits of these solutions are of course numerous: data loss, errors, failure to reach critical mass... There are also "local" data centers such as the CHATONS ("Collective of Alternative, Transparent, Open, Neutral and Solidarity-based Hosters") or SCANI in the Yonne and Tetaneutral in Toulouse.
  • To conclude, the report sketches three "possible digital worlds." The first scenario bets on extreme growth and digital ultra-centralization. Put simply, it is today but worse: everything is digitized, platformized, big-data-fied and concentrated in the hands of the GAFAM or other similar players. The city conforms to smart-city digital models, data consumption explodes. It is a headlong rush, the belief that an infinite world exists. On the CO2 side it is a catastrophe, with global temperature rising by 2.5° by 2050. Heat peaks in cities, social problems, and so on.
  • The second scenario is a halfway picture. The digital technical system is stabilized by letting two worlds coexist: that of big tech and that of smaller-scale, more decentralized infrastructure. The European Union taxes the "Net Gluttons," which changes behavior: people exchange fewer cat photos and tend to store them on their own devices rather than in the cloud, and likewise for music. On the consumption side, the sector's CO2 emissions are cut by 5% per year between 2025 and 2050, which brings us back to the 2013 level.
  • The last scenario proposes a form of ultimate decentralization of the digital world that more or less spells the end of data centers as we know them. The internet becomes more local and dependent on renewable energy, which no longer guarantees its continuity. The Greenstar project in Canada follows these principles and accepts network intermittency (follow the wind / follow the sun); likewise, the Low Tech Magazine blog stops working when the wind stops blowing (the nuclear scenario is not really considered, since the foresight exercise is global). This "collapse" scenario rests on entirely low-tech infrastructure (that is, infrastructure with low energy costs) and, rather ironically, allows a "return to the founding principles of the internet (horizontal and distributed)." On the service side, people make do with what is local, and the international becomes the exception.
  • ADEME also calls for supporting independent ISPs and for creating a "public digital service and public data centers," in particular to improve the spatial integration of these infrastructures. Energy questions are also the subject of proposals: sobriety, heat recovery, decentralization.
  • The researcher Clément Marquet, quoted earlier in the article, reminds me that these various reports (Shift, ADEME) arrive at a particular moment, since in October 2018 the government passed a law reducing energy taxation in order to attract large data centers. To quote him: "there is a tension between the project of digital sovereignty through infrastructure (and the economic benefits that would of course come with it) and that of reducing energy consumption."
  •  
    "L'ADEME, via le projet Enernum, vient de publier une étude approfondie sur les impacts à la fois énergétiques et sociétaux de l'installation et du déploiement des data centers, en France et ailleurs. Ce travail vient confirmer les conclusions du Think tank The Shift Project qui alertait déjà des coûts carbone importants de ces infrastructures qui soutiennent tous nos usages numériques. Par delà les chiffres, l'ADEME replace les data-centers dans leur contexte humain et géographique et interroge la gestion de ce type d'infrastructure sur le territoire."
Asso Fing

« On cherche à éveiller la conscience éthique des développeurs et data scient... - 0 views

  • How does the Data For Good community use algorithms? What sets you apart from the big tech companies? With Data For Good, we want to give nonprofits, citizen projects and institutions access to data science, which for now is used only by large companies like the GAFA and by start-ups, because it is a very expensive service. Our volunteer community supports projects such as that of the Frateli association, which works on mentoring. We built them a matching platform, with algorithms, to match a mentor with a mentee in one click, whereas before everything was done by hand in an Excel spreadsheet. Humans keep the final decision and can override the results given by the algorithm, but it remains much faster.
  • Our goal is not to collect data; we do not build algorithms that use users' data to serve advertising. Moreover, the source code produced by Data For Good is all open source, so if another nonprofit wants to use it, it can do so freely and at no cost. The GAFA open up some of their source code, but above all to attract developers, and they only open fragments of their code. And they know very well that without the data they hold, those code excerpts are useless.
  • You also work at AlgoTransparency, a platform that tries to decipher the mechanisms of YouTube's algorithm: have you managed to find out how this algorithm, a closely guarded YouTube secret, is built? On YouTube, we are locked into a spiral of recommendations that does not necessarily show the best of humanity... At AlgoTransparency, we set up a bot capable of measuring which video is recommended from which video. So we have data about these videos, but understanding how the algorithm works is very hard, because it is very complex and it evolves constantly, being retrained every day. We decided to study our own data, entering keywords and seeing what comes out.
  • ...4 more annotations...
  • The platform's algorithm, and YouTube says so itself, is about maximizing the time spent on the platform, the "watch time." And when the algorithm sees that people spend more time on the platform when they watch conspiracy videos, for example, it will recommend that content more. It is just doing the job it was programmed to do.
  • And as a second step, a hybrid status could be created for the platform, somewhere between the host, which bears no responsibility for the content it hosts, and the media outlet, which bears full responsibility for what it shares. For now, YouTube claims to be a host, because it cannot editorialize all the content on the platform. Yet the algorithms do play an editorial role: when they recommend a video a million times to human beings, there is a choice being made behind the scenes: the algorithm has favored one piece of content over another.
  • Where there is cause for fear, on the other hand, is when these algorithms, machine-learning algorithms in particular (that is, algorithms that learn from the data they have been given in order to make predictions), affect human lives: for example, when they are used in public space for facial recognition, or when decisions about prison sentences are made by algorithms. If we do not know what criteria were chosen to define the algorithms, that is where it becomes dangerous. That is why we are calling for all source code used in public administrations to be opened up (as the Le Maire law recommends).
  • Isn't the problem with algorithms that they decontextualize data? Normally it is up to data scientists to keep in mind the context of the data they study, and to know that it can be biased: for example, knowing the neighborhood where users live can act as a proxy for their social status. In Data For Good's Hippocratic oath, we try as best we can to awaken the ethical awareness of data scientists, by "informing stakeholders about (...) the use of data," by ensuring that "individuals are not discriminated against on the basis of illegal or illegitimate criteria," by "respecting privacy and human dignity" and by "accepting one's responsibilities in case of difficulty or conflicts of interest."
  •  
    "Et si les développeurs et data scientists prêtaient serment, comme les médecins, pour une utilisation des données plus éthique ? Data For Good, une communauté de data scientists et de développeurs bénévoles au service de projets d'intérêt général, en a fait une de ses missions. Ce « serment d'Hippocrate » des data scientists sera mis en ligne le 26 juin, lors d'un « Demo Day » qui présentera les projets soutenus par Data For Good. Curieux d'en savoir plus, nous avons interrogé Frédéric Bardolle, membre de Data For Good et d'Algotransparency, une plateforme qui décrypte les mécanismes de l'algorithme de YouTube. "
Aurialie Jublin

Charter for Building a Data Commons for a Free, Fair, and Sustainable Future - HackMD - 0 views

  • This Charter/Carta provides practical guidance and political orientation for mapping, modelling, managing and sharing data as a Commons. If you follow these guidelines, you will contribute to a Global Data Commons. That is, you will govern your mapping community and manage data differently than people who centralize data control for profit. The Charter does not describe the vision, scope or values of a specific mapping project, but Data Commons principles. It will help you reimagine how you protect the animating spirit of your mapping project and prevent your data from being co-opted or enclosed. The Charter as a whole is the maximum „commons denominator“ of mapping projects that aspire to share data for the common good. Help commonize maps and data! For the people, by the people.
  •  
    "Charter for Building a Data Commons for a Free, Fair, and Sustainable Future"
Aurialie Jublin

Where Next for #platformcoop? - Danny Spitzberg - Medium - 0 views

  • But in terms of vocational shifts or better business practices, it’s unclear what #platformcoop has produced. The original excitement is giving way to unmet expectations.
  • In Baltimore, a group of returning citizens — men and women who were formerly incarcerated — faced ridiculously unfair barriers to employment. And so, over the past two years, they formed Core Staffing, a staffing agency with 12 members.
  • When I asked about where Core Staffing is investing its energy lately, Joseph said, “we’re trying to figure out how to foster close relationships and community while managing a distributed workforce. What will keep people engaged outside of profit?” Core Staffing currently has 15 members, fewer than virtually any other platform co-operative, but these tensions are already very present.
  • ...8 more annotations...
  • A second question for Joseph revolves around money. Core Staffing is considering a large loan to build its platform. Joseph says, “We’re not going to be profitable for at least a year. I don’t want us to take on a huge amount of debt. But we have unanimous decisions for financial decisions that affect everyone’s equity, and right now, the members are voting ‘yes’ to take it.”
  • We all should obsess less over which organizations may or may not be “real” platform co-operatives, and reframe the conversation to assess how they model cooperativism — such as democratic governance, collective decision-making, and how their capital is made accountable to workers. This approach would cultivate the curiosity and humility necessary to help new participants.
  • Yet their basic questions remained unanswered. Where did “platform cooperativism” come from? How is a “platform” different than a website — don’t most co-ops have an online presence?
  • The more it can succeed with disintermediation, removing links in their supply chains and systems, the more we all learn and benefit. This begins with meeting people where they’re at.
  • While writing this article, I surveyed people about trends and challenges in the co-op movement. I polled groups obsessed with digital tools and online platforms, and yet the responses were far more grounded than I expected. In fact, the two challenges Joseph sees for Core Staffing echo everything: appropriate finance, and engagement at scale.
  • I believe radical technologist Micky Metts said it best: “I have run into many people that have existed in the corporate world but really do not understand co-operative engagement, even if they are caring and loving individuals.” In her mind, removing the fear we’re conditioned with is the first step for co-ops to grow. Data Commons co-founder Noemi Giszpenc takes it a step further by urging “true participation” among all users.
  • After two days of prototyping with Sylvia Morse and Up & Go, a New York platform for home cleaning, I learned that all platforms compete on quality. The members, mainly latinx women, are cleaning professionals. They provide reliable, consistent, five-star service. One user said, “I honestly don’t care if workers own the app.” Ironically, that indifference is a reason why platform co-ops like Up & Go can change the narrative about on-demand labor. It’s a labor of love, and it’s much more than an idea.
  • Robin Hood Co-op, a Finnish activist hedge fund, illustrates the diversity of co-operative models we have in this movement. Ana Fradique, a culture worker and community coordinator with Robin Hood, told me that all bureaucratic structures familiar to co-ops “need to be revised and supported by more flexible, faster technologies and modes of co-operation.” Her perspective comes from “experience with the different gears of operating online networks, where power and decision is faster than what the formal structure allows.” In a listserv, emails rarely achieve the escape velocity required to move from discussion to action.
  •  
    "I wrote this article to examine the idea of "platform cooperativism," where it's headed, and what it needs in order to use technology for economic justice."
Aurialie Jublin

Charter for Building a Data Commons for a Free, Fair and Sustainable Future | CommonsBlog - 0 views

  • 1. **Reflect on your intentions together** Discuss the core of your project again and again. Everybody involved should always feel in resonance with the direction in which it’s heading.
  • 2. **Make your community thrive** For the project to be successful, a reliable community is more important than anything else. Care for those who might support you when you need them most.
  • 3. **Separate commons and commerce** Mapping for the commons is different from producing services or products to compete on the map-market.
  • ...9 more annotations...
  • 4. **Design for interoperability** Think of your map as a node in a network of many maps. Talk with other contributors to the Data Commons to find out if you can use the same data model, licence and approach to mapping.
  • 5. **Care for a living vocabulary** Vocabularies as entry points to complex social worlds are always incomplete. Learn from other mappers‘ vocabularies. Make sure your vocabulary can be adjusted. Make it explicit and publish it openly, so that others can learn from it too.
  • 6. **Document transparently** Sharing your working process, learnings and failures allow others to replicate, join and contribute. Don’t leave documentation for after. Do it often and make it understandable. Use technologies designed for open cooperation.
  • 7. **Crowdsource what you can** Sustain your project whenever possible with money, time, knowledge, storing space, hardware or monitoring from your community or public support. Stay independent!
  • 8. **Use FLOSS tools** It gives you the freedom to further develop your own project and software according to your needs. And it enables you to contribute to the development of these tools.
  • 9. **Build upon the open web platform** Open web standards ensure your map, its data and associated applications cannot be enclosed and are prepared for later remixing and integration with other sources.
  • 10. **Own your data** In the short run, it seems to be a nightmare to refrain from importing or copying what you are not legally entitled to. In the long run, it is the only way to prevent you from being sued or your data being enclosed. Ban Google.
  • 11. **Protect your data** To own your data is important, but not enough. Make sure nobody dumps your data back into the world of marketization and enclosures. Use appropriate licenses to protect your collective work!
  • 12. **Archive your project** When it doesn’t work anymore for you, others still might want to build on it in the future.
  •  
    "Nations-States rely on constitutions. Common(er)s find common ground through a Charter. If you are part of the co-creation of a powerful Data Commons - through mapping, coding, data modelling or other activities - this is for you. It is an fundamental building-block for online and offline cooperation. The following is version 0.6 of what has been called in previously: Charter for Building a Data Commons of Alternative Economies or Mapping for the Commons Manifesto. We, the participants of the Intermapping meeting (March 2017 in Florence), hope to hereby publish a version that provides orientation to the countless mapping processes for a free, fair and sustainable world. We invite you to work together on the practical issues: how to implement the principles outlined in the Charter (see below)? Let's federate our efforts to make the Commons thrive!"
Aurialie Jublin

The pregnancy-tracking app Ovia lets women record their most sensitive data for themsel... - 0 views

  • But someone else was regularly checking in, too: her employer, which paid to gain access to the intimate details of its workers’ personal lives, from their trying-to-conceive months to early motherhood. Diller’s bosses could look up aggregate data on how many workers using Ovia’s fertility, pregnancy and parenting apps had faced high-risk pregnancies or gave birth prematurely; the top medical questions they had researched; and how soon the new moms planned to return to work.
  • “Maybe I’m naive, but I thought of it as positive reinforcement: They’re trying to help me take care of myself,” said Diller, 39, an event planner in Los Angeles for the video game company Activision Blizzard. The decision to track her pregnancy had been made easier by the $1 a day in gift cards the company paid her to use the app: That’s “diaper and formula money,” she said.
  • But Ovia also has become a powerful monitoring tool for employers and health insurers, which under the banner of corporate wellness have aggressively pushed to gather more data about their workers’ lives than ever before.
  • ...13 more annotations...
  • Employers who pay the apps’ developer, Ovia Health, can offer their workers a special version of the apps that relays their health data — in a “de-identified,” aggregated form — to an internal employer website accessible by human resources personnel. The companies offer it alongside other health benefits and incentivize workers to input as much about their bodies as they can, saying the data can help the companies minimize health-care spending, discover medical problems and better plan for the months ahead.
  • By giving counseling and feedback on mothers’ progress, executives said, Ovia has helped women conceive after months of infertility and even saved the lives of women who wouldn’t otherwise have realized they were at risk.
  • But health and privacy advocates say this new generation of “menstrual surveillance” tools is pushing the limits of what women will share about one of the most sensitive moments of their lives. The apps, they say, are designed largely to benefit not the women but their employers and insurers, who gain a sweeping new benchmark on which to assess their workers as they consider the next steps for their families and careers.
  • Experts worry that companies could use the data to bump up the cost or scale back the coverage of health-care benefits, or that women’s intimate information could be exposed in data breaches or security risks. And though the data is made anonymous, experts also fear that the companies could identify women based on information relayed in confidence, particularly in workplaces where few women are pregnant at any given time.
  • The rise of pregnancy-tracking apps shows how some companies increasingly view the human body as a technological gold mine, rich with a vast range of health data their algorithms can track and analyze. Women’s bodies have been portrayed as especially lucrative: The consulting firm Frost & Sullivan said the “femtech” market — including tracking apps for women’s menstruation, nutrition and sexual wellness — could be worth as much as $50 billion by 2025.
  • Companies pay for Ovia’s “family benefits solution” package on a per-employee basis, but Ovia also makes money off targeted in-app advertising, including from sellers of fertility-support supplements, life insurance, cord-blood banking and cleaning products.
  • In 2014, when the company rolled out incentives for workers who tracked their physical activity with a Fitbit, some employees voiced concerns over what they called a privacy-infringing overreach. But as the company offered more health tracking — including for mental health, sleep, diet, autism and cancer care — Ezzard said workers grew more comfortable with the trade-off and enticed by the financial benefits.
  • But a key element of Ovia’s sales pitch is how companies can cut back on medical costs and help usher women back to work. Pregnant women who track themselves, the company says, will live healthier, feel more in control and be less likely to give birth prematurely or via a C-section, both of which cost more in medical bills — for the family and the employer.
  • Women wanting to get pregnant are told they can rely on Ovia’s “fertility algorithms,” which analyze their menstrual data and suggest good times to try to conceive, potentially saving money on infertility treatments. “An average of 33 hours of productivity are lost for every round of treatment,” an Ovia marketing document says.
  • Ovia, in essence, promises companies a tantalizing offer: lower costs and fewer surprises. Wallace gave one example in which a woman had twins prematurely, received unneeded treatments and spent three months in intensive care. “It was a million-dollar birth … so the company comes to us: How can you help us with this?” he said.
  • “The fact that women’s pregnancies are being tracked that closely by employers is very disturbing,” said Deborah C. Peel, a psychiatrist and founder of the Texas nonprofit Patient Privacy Rights. “There’s so much discrimination against mothers and families in the workplace, and they can’t trust their employer to have their best interests at heart.” Federal law forbids companies from discriminating against pregnant women and mandates that pregnancy-related health-care expenses be covered in the same way as other medical conditions. Ovia said the data helps employers provide “better benefits, health coverage and support.”
  • Companies can also see which articles are most read in Ovia’s apps, offering them a potential road map to their workers’ personal questions or anxieties. The how-to guides touch on virtually every aspect of a woman’s changing body, mood, financial needs and lifestyle in hyper-intimate detail, including filing for disability, treating bodily aches and discharges, and suggestions for sex positions during pregnancy.
  • The coming years, however, will probably see companies pushing for more pregnancy data to come straight from the source. The Israeli start-up Nuvo advertises a sensor band strapped around a woman’s belly that can send real-time data on fetal heartbeat and uterine activity “across the home, the workplace, the doctor’s office and the hospital.” Nuvo executives said its “remote pregnancy monitoring platform” is undergoing U.S. Food and Drug Administration review.
  •  
    "As apps to help moms monitor their health proliferate, employers and insurers pay to keep tabs on the vast and valuable data"
Aurialie Jublin

Let's make private data into a public good - MIT Technology Review - 0 views

  • Why is this a problem? Well, maybe because these giants are making huge profits from technologies originally created with taxpayer money. Google’s algorithm was developed with funding from the National Science Foundation, and the internet came from DARPA funding. The same is true for touch-screen displays, GPS, and Siri. From this the tech giants have created de facto monopolies while evading the type of regulation that would rein in monopolies in any other industry. And their business model is built on taking advantage of the habits and private information of the taxpayers who funded the technologies in the first place.
  • Apologists like to portray the internet giants as forces for good. They praise the sharing economy in which digital platforms empower people via free access to everything from social networking to GPS navigation to health monitoring. But Google doesn’t give us anything for free. It’s really the other way around—we’re handing over to Google exactly what it needs. When you use Google’s services it might feel as if you’re getting something for nothing, but you’re not even the customer—you’re the product. The bulk of Google’s profits come from selling advertising space and users’ data to firms. Facebook’s and Google’s business models are built on the commodification of personal data, transforming our friendships, interests, beliefs, and preferences into sellable propositions.
  • And because of network effects, the new gig economy doesn’t spread the wealth so much as concentrate it even more in the hands of a few firms (see Rein in the Data Barons). Like the internal-combustion engine or the QWERTY keyboard, a company that establishes itself as the leader in a market achieves a dominance that becomes self-perpetuating almost automatically.
  • ...5 more annotations...
  • The low tax rates that technology companies are typically paying on these large rewards are also perverse, given that their success was built on technologies funded and developed by high-risk public investments: if anything, companies that owe their fortunes to taxpayer-funded investment should be repaying the taxpayer, not seeking tax breaks.
  • We should ask how the value of these companies has been created, how that value has been measured, and who benefits from it. If we go by national accounts, the contribution of internet platforms to national income (as measured, for example, by GDP) is represented by the advertisement-related services they sell. But does that make sense? It’s not clear that ads really contribute to the national product, let alone to social well-being—which should be the aim of economic activity. Measuring the value of a company like Google or Facebook by the number of ads it sells is consistent with standard neoclassical economics, which interprets any market-based transaction as signaling the production of some kind of output—in other words, no matter what the thing is, as long as a price is received, it must be valuable. But in the case of these internet companies, that’s misleading: if online giants contribute to social well-being, they do it through the services they provide to users, not through the accompanying advertisements.
  • This way we have of ascribing value to what the internet giants produce is completely confusing, and it’s generating a paradoxical result: their advertising activities are counted as a net contribution to national income, while the more valuable services they provide to users are not.
  • Let’s not forget that a large part of the technology and necessary data was created by all of us, and should thus belong to all of us. The underlying infrastructure that all these companies rely on was created collectively (via the tax dollars that built the internet), and it also feeds off network effects that are produced collectively. There is indeed no reason why the public’s data should not be owned by a public repository that sells the data to the tech giants, rather than vice versa. But the key issue here is not just sending a portion of the profits from data back to citizens but also allowing them to shape the digital economy in a way that satisfies public needs. Using big data and AI to improve the services provided by the welfare state—from health care to social housing—is just one example.
  • Only by thinking about digital platforms as collective creations can we construct a new model that offers something of real value, driven by public purpose. We’re never far from a media story that stirs up a debate about the need to regulate tech companies, which creates a sense that there’s a war between their interests and those of national governments. We need to move beyond this narrative. The digital economy must be subject to the needs of all sides; it’s a partnership of equals where regulators should have the confidence to be market shapers and value creators. 
  •  
    "The internet giants depend on our data. A new relationship between us and them could deliver real value to society."
Aurialie Jublin

There is a leftwing way to challenge big tech for our data. Here it is | Evgeny Morozov... - 0 views

  • There is also an emerging leftwing undercurrent to this movement. The idea of a new approach to data ownership, including the possibility of a national data trust, has gained some currency with the Labour party. Writing in Handelsblatt, Germany’s leading business paper, the Social Democrat leader, Andrea Nahles, argued that technology firms should be forced to share their data with the rest of society, so as not to impede social progress. She compared big tech to big pharma, which, thanks to legal interventions, cannot enjoy indefinite exclusive rights over its intellectual property.
  • A reasonable position, it seems. Yet to be credible and effective, the leftwing distributist agenda needs to overcome a great obstacle: citizens’ falling trust in the state as a vehicle of advancing their interests.
  • Handing more data to state institutions that already thrive on excessive surveillance would not restore that trust. Then there’s always the temptation that such data will be used for state-approved social engineering, otherwise known as “nudging” and “behavioural change”. Giving government institutions even more data will only fuel the “deep state” conspiracy theories of fringe rightwing groups.
  • ...1 more annotation...
  • The true challenge for the data distributist left is, thus, to find a way to distribute power, not just data. It must mobilise the nation state to turn cities into the harbingers of a new, radical democracy keen on deploying socialised big data and artificial intelligence in the interests of citizens. Without such an emphasis on radical empowerment, the data distributism of the left will only be a boon to the loony far right.
  •  
    "Only with radical empowerment can we as citizens halt further intrusion from Google and co"
Aurialie Jublin

https://datatransferproject.dev/ - 0 views

  • Users should be in control of their data on the web, and part of that control is the ability to move it. Today, users can download a copy of their data from most services, but that is only half the battle when it comes to actually moving it. DTP aims to make moving data between providers significantly easier for users.
  • DTP is still in very active development. While we have code that works for a variety of use cases, we are continually making improvements that might occasionally break things, so please use it with caution and expect some hiccups. Our bug list and the documentation in each provider’s directory are good places to look for known issues or to report problems you encounter. (A minimal sketch of the provider-adapter idea follows this entry.)
  •  
    "The Data Transfer Project was formed in 2017 to create an open-source, service-to-service data portability platform so that all individuals across the web could easily move their data between online service providers whenever they want. The contributors to the Data Transfer Project believe portability and interoperability are central to innovation. Making it easier for individuals to choose among services facilitates competition, empowers individuals to try new services and enables them to choose the offering that best suits their needs."
Aurialie Jublin
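A minimal sketch of the provider-to-provider adapter idea behind a portability tool like DTP, mentioned in the entry above. This is an illustration only: the Photo data model, the PhotoExporter/PhotoImporter interfaces and transfer_photos are hypothetical names, not code from the Data Transfer Project itself.

    # Hypothetical sketch: two services exchange data through a shared model,
    # so neither needs to know anything about the other's internal storage.
    from dataclasses import dataclass
    from typing import Iterable, Protocol

    @dataclass
    class Photo:
        id: str
        title: str
        url: str

    class PhotoExporter(Protocol):
        def export_photos(self) -> Iterable[Photo]: ...

    class PhotoImporter(Protocol):
        def import_photo(self, photo: Photo) -> None: ...

    def transfer_photos(source: PhotoExporter, destination: PhotoImporter) -> int:
        """Copy every photo from source to destination and return the count."""
        count = 0
        for photo in source.export_photos():
            destination.import_photo(photo)
            count += 1
        return count

Each real provider would supply its own authenticated implementation of the two interfaces; the shared data model is what makes the transfer service-agnostic.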

Welcome to the Age of Privacy Nihilism - The Atlantic - 0 views

  • But more importantly, the velocity of acquisition and correlation of information has increased dramatically. Web browsers and smartphones contribute to that, in volume and value.
  • The process of correlation has become more sophisticated, too.
  • The centralization of information has also increased. With billions of users globally, organizations like Facebook and Google have a lot more data to offer—and from which to benefit. Enterprise services have also decentralized, and more data has moved to the Cloud—which often just means into the hands of big tech firms like Microsoft, Google, and Amazon. Externalizing that data creates data-privacy risk. But then again, so does storing it locally, where it is susceptible to breaches like the one Equifax experienced last year.
  • ...1 more annotation...
  • The real difference between the old and the new ages of data-intelligence-driven consumer marketing, and the invasion of privacy they entail, is that lots of people are finally aware that it is taking place.
  •  
    "Google and Facebook are easy scapegoats, but companies have been collecting, selling, and reusing your personal data for decades, and now that the public has finally noticed, it's too late. The personal-data privacy war is long over, and you lost."
Aurialie Jublin

J'ai testé pour vous : 8 jours de data detox challenge - 0 views

  • “To adopt a balanced data lifestyle, it is crucial to diversify the services you use.” Like any self-respecting diet, the Data Detox Challenge insists on the importance of varying your food groups. Five search engines and browsers a day? That’s going to be hard.
  • But how do I slim down on data without losing my audience curves on every platform? The flaws of the defaults. I start by changing my advertising profile on Facebook: no, I can no longer be targeted by gender. Then I put on my modern-day cleaning-crew outfit and erase the tags on my photos and on my friends’ photos. From now on we travel incognito, or almost.
  • In a detox, every day is a challenge: today, tracker-hunting season opens. Like bad sugars, they lurk everywhere, invisibly. Flushing out the Facebook and Twitter tracking pixels (hidden in the Like and Share buttons of a great many pages) is not so simple. Since they cannot be made to disappear, it is up to us to slip away. I turn on private browsing by default, block trackers and check the security of my connections with the help of my new favorite browser add-ons: Privacy Badger, Panopticlick and HTTPS Everywhere. My new favorite shortcut? Cmd + Shift + N (for private browsing).
  • ...4 more annotations...
  • The challenge invites you to download the Architecture of Radio app, which maps the electromagnetic waves emitted by cell towers, Wi-Fi routers and satellites, and makes them visible in real time. No login is required to use it, but there is no getting around the credit card. The demo video won me over and I pay the €2.49 to satisfy my curiosity. At least this time I won’t be paying with my data.
  • I submit to the calculation of my Information Mass Index. The equation is easy: to measure your exposure to data collection, just count your apps.
  • Suddenly I wonder what Data Selfie does with my data. I’m learning the ropes! I check: the code is openly available on GitHub, and the creators clearly state that the data is neither stored nor used anywhere other than in Data Selfie itself.
  • The free open-source extension AdNauseam, which drowns out the user’s activity by clicking on random ads in the background.
  •  
    "Scandale Cambridge Analytica, Règlement Général sur la Protection des Données (RGPD)... La mode est à la diète des datas. Chacun y va de son petit écrémage électronique, de sa politique anti-cookie et tout le monde ne jure plus que par son indice de masse d'info personnelle. Sortez de la boulimie et relevez comme Millie Servant, en charge de la communication numérique pour Cap Digital & Futur.e.s Festival, le seul défi slim qui vaille : le Data Detox Challenge. Un parcours « détox » proposé sur 8 jours par la fondation Mozilla et le Tactical Technology Collective. "
Aurialie Jublin

La fibre, « facteur clé » de la survie des FAI associatifs - 0 views

  • For the federation, having its members shut out of fiber is "deeply worrying". Today the market is dominated by the incumbent operator (see our analysis), which accounts for more than two thirds of new fiber customers, in particular by taking on competitors' ADSL subscribers. "We are closing down a lively market, one that allowed a host of local players to exist, around a few big players that do not have the same capacity for innovation," FFDN insists.
  • It is therefore necessary to defend these small players' access to FTTH networks, both for their "neutral" offerings and for their role as "pilot fish" with specific skills. The federation prides itself on bringing a unique perspective, grounded in fundamental rights and long experience. It also steps in, on a volunteer basis, to solve local problems (such as wireless networks in dead zones) and to make up for the failings of commercial operators. "The health of the associations I represent is one of the markers of whether or not the general interest is being respected in this market. We are a bit like canaries," says Oriane Piquer-Louis. "If the smallest ones die, the bigger ones had better leave the mine."
  • Public-initiative networks (RIP), launched by French départements and regions, are the most obvious path to fiber for non-profit ISPs. These networks are bound by a non-discrimination obligation, meaning the same pricing conditions for every operator wishing to offer services on them. Yet, for FFDN, those conditions do not suit players with a few hundred subscribers across the whole of France, whereas Bouygues Telecom or Free negotiate with the public networks' concession holders over millions of lines.
  • ...1 more annotation...
  • What do the public networks say? For Avicca, the main association of local authorities on digital matters, part of the non-profit ISPs' difficulties reportedly stems from the absence of activated (wholesale bitstream) offers on some public networks. A situation that should change.
  •  
    "Alors que la fibre jusqu'à l'abonné grignote chaque trimestre des dizaines de milliers d'abonnés à l'ADSL. Au sein de FFDN, des fournisseurs d'accès associatifs cherchent une porte d'entrée vers ces nouveaux réseaux, encore inaccessibles pour ces petits acteurs."
Aurialie Jublin

Ouvrir, ce n'est pas juste partager des données : simple, basique - 0 views

  • Les cas d’Airbnb ou d’Uber qui proclament faire de l’open data alors que les données ne répondent pas aux principes essentiels de l’open data ne créent aucun usage réel. Cela revient à faire de l’open washing (inspiré du «greenwashing» ou éco-blanchiment) : les producteurs proclament leurs données ouvertes, même si dans les faits, l’accès et la réutilisation des données sont trop limités.
  • Lorsque les données sont disponibles uniquement à travers une API demandant inscription, l’usager des données n’a aucune garantie que la base de données pourra être téléchargée et n’est pas assuré d’avoir effectivement accès aux données. Pour l’usager, cela crée une incertitude sur la pérennité des services réutilisant des données : l’organisation qui partage ses données reste libre d’exclure l’usager si elle considère que le service ne va pas dans le sens de ses intérêts. Ce n’est donc qu’un partage contrôlé là où l’ouverture consiste à laisser place à l’inattendu et à laisser libre cours à chacun de créer des services auquel nous n’aurions pas pensé quitte à parfois concurrencer les services développés par l’organisation qui a ouvert des données.
  • Distinguer l’ouverture du partage de données permet aussi de souligner l’essence même des principes de l’open data qui consiste à réduire les asymétries d’information et à créer une situation équitable entre tous les acteurs.
  •  
    "Durant l'été, Frédéric Charles, directeur Stratégie & Innovation chez SUEZ Smart Solutions, lançait sur Twitter une polémique sur la définition de l'open data qui a abouti sur un billet sur son blog sur ZDNet. Ce billet demande, rien de moins, que de redéfinir les principes de l'open data. Il nous paraît essentiel d'y répondre et de clarifier un point : ouvrir et partager des données sont deux choses différentes."
Aurialie Jublin
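To make the open-versus-shared distinction above concrete, here is a small sketch contrasting the two access modes. The URLs, the token and the endpoint are made up for illustration; the point is only that a bulk file can be fetched and archived by anyone, while API access depends on credentials the publisher can throttle or revoke at any time.

    # Hypothetical comparison: truly open data vs. API-gated "shared" data.
    import urllib.request

    # Open data: a bulk file anyone can download, archive and reuse,
    # with no account and no revocable credentials.
    OPEN_DUMP_URL = "https://example.org/opendata/stations.csv"  # made-up URL

    def fetch_open_dump(path: str = "stations.csv") -> str:
        urllib.request.urlretrieve(OPEN_DUMP_URL, path)
        return path

    # Controlled sharing: access mediated by a registration token, with no
    # guarantee of a bulk download and no guarantee the door stays open.
    API_URL = "https://api.example.org/v1/stations"    # made-up URL
    API_TOKEN = "replace-with-your-registered-token"   # issued on sign-up

    def fetch_via_api() -> bytes:
        request = urllib.request.Request(
            API_URL, headers={"Authorization": f"Bearer {API_TOKEN}"}
        )
        with urllib.request.urlopen(request) as response:
            return response.read()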

Liberté, Libertés chéries: Menace sur l'Open Data par défaut - 0 views

  • The case shows that procedural questions should never be underestimated, because they are what guarantee that rights and freedoms are effective. If the administrative court were to accept the argument put forward by the interior minister, it would de facto abolish open data by default, surreptitiously sliding back, in defiance of the Lemaire law, from a supply-driven to a demand-driven logic. Will the free communication of public data be killed off by government departments seeking to dodge their duty of transparency through efficient, discreet procedural nibbling? It remains to be seen whether the administrative courts will be complicit in this dirty trick.
  •  
    "L'Open Data peut être défini comme la mise à disposition des données produites et détenues par les administrations. L'Open Data par défaut, formule un peu obscure, signifie que toute administration de plus de cinquante salariés est désormais tenue de mettre à la disposition du public les documents administratifs qui ont déjà été individuellement communiqués à une personne, à la suite d'un avis favorable de la Commission d'accès aux documents administratifs (CADA). Les responsables du site spécialisé Next INPact ont entrepris de tester la procédure d'Open Data par défaut, en allant jusqu'au recours contentieux. En mettant en ligne à la fois le texte de sa requête et le mémoire en défense communiqué par le ministère de l'intérieur, ils mettent en lumière les difficultés techniques qui surgissent lorsque l'on veut faire respecter le principe d'ouverture des données publiques et les réticences d'une administration qui affirme la transparence en s'efforçant autant que possible d'en réduire le champ."
Aurialie Jublin

"I Was Devastated": Tim Berners-Lee, the Man Who Created the World Wide Web, Has Some R... - 2 views

  • Initially, Berners-Lee’s innovation was intended to help scientists share data across a then obscure platform called the Internet, a version of which the U.S. government had been using since the 1960s. But owing to his decision to release the source code for free—to make the Web an open and democratic platform for all—his brainchild quickly took on a life of its own.
  • He also envisioned that his invention could, in the wrong hands, become a destroyer of worlds, as Robert Oppenheimer once infamously observed of his own creation. His prophecy came to life, most recently, when revelations emerged that Russian hackers interfered with the 2016 presidential election, or when Facebook admitted it exposed data on more than 80 million users to a political research firm, Cambridge Analytica, which worked for Donald Trump’s campaign. This episode was the latest in an increasingly chilling narrative. In 2012, Facebook conducted secret psychological experiments on nearly 700,000 users. Both Google and Amazon have filed patent applications for devices designed to listen for mood shifts and emotions in the human voice.
  • This agony, however, has had a profound effect on Berners-Lee. He is now embarking on a third act—determined to fight back through both his celebrity status and, notably, his skill as a coder. In particular, Berners-Lee has, for some time, been working on a new platform, Solid, to reclaim the Web from corporations and return it to its democratic roots.
  • ...10 more annotations...
  • What made the Web powerful, and ultimately dominant, however, would also one day prove to be its greatest vulnerability: Berners-Lee gave it away for free; anyone with a computer and an Internet connection could not only access it but also build off it. Berners-Lee understood that the Web needed to be unfettered by patents, fees, royalties, or any other controls in order to thrive. This way, millions of innovators could design their own products to take advantage of it.
  • “Tim and Vint made the system so that there could be many players that didn’t have an advantage over each other.” Berners-Lee, too, remembers the quixotism of the era. “The spirit there was very decentralized. The individual was incredibly empowered. It was all based on there being no central authority that you had to go to to ask permission,” he said. “That feeling of individual control, that empowerment, is something we’ve lost.”
  • The power of the Web wasn’t taken or stolen. We, collectively, by the billions, gave it away with every signed user agreement and intimate moment shared with technology. Facebook, Google, and Amazon now monopolize almost everything that happens online, from what we buy to the news we read to who we like. Along with a handful of powerful government agencies, they are able to monitor, manipulate, and spy in once unimaginable ways.
  • The idea is simple: re-decentralize the Web. Working with a small team of developers, he spends most of his time now on Solid, a platform designed to give individuals, rather than corporations, control of their own data. “There are people working in the lab trying to imagine how the Web could be different. How society on the Web could look different. What could happen if we give people privacy and we give people control of their data,” Berners-Lee told me. “We are building a whole eco-system.”
  • For now, the Solid technology is still new and not ready for the masses. But the vision, if it works, could radically change the existing power dynamics of the Web. The system aims to give users a platform by which they can control access to the data and content they generate on the Web. This way, users can choose how that data gets used rather than, say, Facebook and Google doing with it as they please. Solid’s code and technology are open to all—anyone with access to the Internet can come into its chat room and start coding. (A hypothetical sketch of this kind of user-controlled access check follows this entry.)
  • It’s still the early days for Solid, but Berners-Lee is moving fast. Those who work closely with him say he has thrown himself into the project with the same vigor and determination he employed upon the Web’s inception. Popular sentiment also appears to facilitate his time frame. In India, a group of activists successfully blocked Facebook from implementing a new service that would have effectively controlled access to the Web for huge swaths of the country’s population. In Germany, one young coder built a decentralized version of Twitter called Mastodon. In France, another group created Peertube as a decentralized alternative to YouTube.
  • Berners-Lee is not the leader of this revolution—by definition, the decentralized Web shouldn’t have one—but he is a powerful weapon in the fight. And he fully recognizes that re-decentralizing the Web is going to be a lot harder than inventing it was in the first place.
  • But laws written now don’t anticipate future technologies. Nor do lawmakers—many badgered by corporate lobbyists—always choose to protect individual rights. In December, lobbyists for telecom companies pushed the Federal Communications Commission to roll back net-neutrality rules, which protect equal access to the Internet. In January, the U.S. Senate voted to advance a bill that would allow the National Security Agency to continue its mass online-surveillance program. Google’s lobbyists are now working to modify rules on how companies can gather and store biometric data, such as fingerprints, iris scans, and facial-recognition images.
  • It’s hard to believe that anyone—even Zuckerberg—wants the 1984 version. He didn’t found Facebook to manipulate elections; Jack Dorsey and the other Twitter founders didn’t intend to give Donald Trump a digital bullhorn. And this is what makes Berners-Lee believe that this battle over our digital future can be won.
  • When asked what ordinary people can do, Berners-Lee replied, “You don’t have to have any coding skills. You just have to have a heart to decide enough is enough. Get out your Magic Marker and your signboard and your broomstick. And go out on the streets.” In other words, it’s time to rise against the machines.
  •  
    "We demonstrated that the Web had failed instead of served humanity, as it was supposed to have done, and failed in many places," he told me. The increasing centralization of the Web, he says, has "ended up producing-with no deliberate action of the people who designed the platform-a large-scale emergent phenomenon which is anti-human." "While the problems facing the web are complex and large, I think we should see them as bugs: problems with existing code and software systems that have been created by people-and can be fixed by people." Tim Berners-Lee
  •  
    "Berners-Lee has seen his creation debased by everything from fake news to mass surveillance. But he's got a plan to fix it."
Aurialie Jublin
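The core idea in the Solid entry above — data lives in a store the user controls, and applications read it only on the user's terms — can be illustrated with a deliberately simplified sketch. This is not Solid's actual API: the PersonalDataStore class, its methods and the app names are hypothetical stand-ins for the access-control pattern, nothing more.

    # Hypothetical illustration of user-controlled data access: the user's
    # store decides which application may read which category of data.
    class PersonalDataStore:
        def __init__(self) -> None:
            self._records: dict[str, list[str]] = {}  # category -> entries
            self._grants: dict[str, set[str]] = {}    # app -> allowed categories

        def add(self, category: str, entry: str) -> None:
            self._records.setdefault(category, []).append(entry)

        def grant(self, app: str, category: str) -> None:
            self._grants.setdefault(app, set()).add(category)

        def revoke(self, app: str, category: str) -> None:
            self._grants.get(app, set()).discard(category)

        def read(self, app: str, category: str) -> list[str]:
            if category not in self._grants.get(app, set()):
                raise PermissionError(f"{app} has no access to {category}")
            return list(self._records.get(category, []))

    # The user, not the application, decides who reads what.
    store = PersonalDataStore()
    store.add("photos", "holiday.jpg")
    store.grant("photo-printing-app", "photos")
    print(store.read("photo-printing-app", "photos"))  # ['holiday.jpg']
    store.revoke("photo-printing-app", "photos")
    # A further read by "photo-printing-app" would now raise PermissionError.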

Europe Lagging Behind: It's Not The Safety Net - 0 views

  • Interestingly, the developing startup crisis in the US might precisely be due to Americans not having a decent safety net. For one, in the US a steady job is the only sure way to access healthcare coverage. If founding a business means your family loses their health insurance, most people are going to think twice, and most won’t make the leap. And then there are ballooning student loans. If you start your working life with hundreds of thousands of dollars to pay off, no one should be surprised that you hunt for a steady salary instead of opting for the risks of entrepreneurship.
  • Still, it’s true that Europe is lagging behind. But rather than the reasons suggested above, it’s because Europe is an extremely fragmented market. Unlike in the US and China, it is almost impossible for European entrepreneurs to target a large market at an early stage. Instead, they launch in their home country, secure a dominant position locally, and only then raise more funds and expand to neighboring countries. Then they need to start all over again because everything’s different there: the language, the culture, and the laws and regulations (yes, even with the EU “single market”).
  • Their relatively smaller size also means that European countries have had to be more open to foreign trade. That openness has long meant more government intervention in the economy. In 1998, Harvard University’s Dani Rodrik explained the odd correlation between free trade and big government: being open to foreign trade leads to increased market instability, which in turn calls for more government intervention to stabilize markets and provide economic security to both households and domestic businesses. Free trade works better with a broad safety net!
  • ...1 more annotation...
  • So Europe’s fragmentation has meant being more open to foreign trade, a broader safety net, and an easier time getting people to go along with it all. But that same fragmentation explains why European entrepreneurs have a harder time building large tech companies. As ever, however, correlation is not causation. Domestic safety nets are not the reason why Europe is lagging behind in technology. Indeed, there are reasons to think that broad and strong safety nets will be a key asset if Europe makes a strong move toward a comeback.
  •  
    "There's an ongoing discussion about why Europe is lagging behind in technology. In a recent article, Bloomberg's Jeremy Kahn discusses various reasons, including a propensity to "think small" and the "pressure to cash out". Then on social media, people like to claim that Europeans are just unwilling to take risks, evidenced by their broad and strong social safety nets. But the truth is that a proper safety net fosters risk-taking more than it harms it. Just look at the statistics on entrepreneurship in the US. In recent years the number of startups in the US has been going down, not up. This suggests that the US has been racing ahead of Europe not because more Americans are creating startups, but rather because healthy ecosystems-Silicon Valley and a few other places such as Seattle-lift the best entrepreneurs up and help them turn tiny startups into global empires."
Aurialie Jublin

[Genre] Data Feminism · MIT Press Open - 0 views

  • Intersectional feminism isn't just about women nor even just about gender. Feminism is about power – who has it and who doesn’t. And in a world in which data is power, and that power is wielded unequally, data feminism can help us understand how it can be challenged and changed.
  •  
    "Welcome to the community review site for Data Feminism. Thank you for your generosity and time in choosing to read and comment on this manuscript draft. The review period for this draft will close on January 7, 2019, although the ability to leave comments will still be available after that point. We have chosen to put this draft online because of a foundational principle of this project: that all knowledge is incomplete, and that the best knowledge is gained by bringing together multiple partial perspectives. A corollary to this principle is that our own perspectives are limited, especially with respect to the topics and issues that we have not personally experienced. As we describe more fully in our values statement, we recognize that the people who are most directly affected by specific topics and issues are the ones who know the most about them. In our book, we have attempted to elevate their voices, and amplify their ideas. In our attempt to do so, we have also likely made mistakes. We strive to be reflexive and accountable in our work, and we hope to learn from you about places where we've gotten things wrong, and about how we can do better."
Aurialie Jublin

A viewpoint on Craft and the Internet - Ding Magazine - 0 views

  •  
    "On a recent visit to Barcelona, I was charmed by the Institute for Advanced Architecture of Catalonia's Smart Citizen platform that enables citizens to monitor levels of air or noise pollution around their home or business. The system connects data, people and knowledge based on their location; the device's low power consumption allows it to be placed on balconies and windowsills where power is provided by a solar panel or battery. Smart Citizen is just one among a growing array of devices and platforms that can sense everything from the health of a tomato in Brazil, to bacteria in the stomach of a cow in Perthshire - remotely. This innovation is welcome, but it leaves a difficult question unanswered: Under what circumstances will possession of this data contribute to the system transformation that we so urgently need? What's missing, so far - from the Internet of Things in general, and remote sensing in particular - is a value benchmark against which to analyze the data being generated. We've created a global infrastructure that is brilliant on means, but unambitious when it comes to ends"