
Future of the Web: Group items tagged “create”


Paul Merrell

Sick Of Facebook? Read This. - 2 views

  • In 2012, The Guardian reported on Facebook’s arbitrary and ridiculous nudity and violence guidelines which allow images of crushed limbs but – dear god spare us the image of a woman breastfeeding. Still, people stayed – and Facebook grew. In 2014, Facebook admitted to mind control games via positive or negative emotional content tests on unknowing and unwilling platform users. Still, people stayed – and Facebook grew. Following the 2016 election, Facebook responded to the Harpy shrieks from the corporate Democrats by setting up a so-called “fake news” task force to weed out those dastardly commies (or socialists or anarchists or leftists or libertarians or dissidents or…). And since then, I’ve watched my reach on Facebook drain like water in a bathtub – hard to notice at first and then a spastic swirl while people bicker about how to plug the drain. And still, we stayed – and the censorship tightened. Roughly a year ago, my show Act Out! reported on both the censorship we were experiencing and the cramped filter bubbling that Facebook employs in order to keep the undesirables out of everyone’s news feed. Still, I stayed – and the censorship tightened. 2017 into 2018 saw more and more activist organizers, particularly black and brown, thrown into Facebook jail for questioning systemic violence and demanding better. In August, puss bag ass hat in a human suit Alex Jones was banned from Facebook – YouTube, Apple and Twitter followed suit shortly thereafter. Some folks celebrated. Some others of us skipped the party because we could feel what was coming.
  • On Thursday, October 11th of this year, Facebook purged more than 800 pages including The Anti-Media, Police the Police, Free Thought Project and many other social justice and alternative media pages. Their explanation rested on the painfully flimsy foundation of “inauthentic behavior.” Meanwhile, their fake-news checking team is stacked with the likes of the Atlantic Council and the Weekly Standard, neocon junk organizations that peddle such drivel as “The Character Assassination of Brett Kavanaugh.” Soon after, on the Monday before the Midterm elections, Facebook blocked another 115 accounts citing once again, “inauthentic behavior.” Then, in mid November, a massive New York Times piece chronicled Facebook’s long road to not only save its image amid rising authoritarian behavior, but “to discredit activist protesters, in part by linking them to the liberal financier George Soros.” (I consistently find myself waiting for those Soros and Putin checks in the mail that just never appear.)
  • What we need is an open source, non-surveillance platform. And right now, that platform is Minds. Before you ask, I’m not being paid to write that.
  • ...2 more annotations...
  • Fashioned as an alternative to the closed and creepy Facebook behemoth, Minds advertises itself as “an open source and decentralized social network for Internet freedom.” Minds prides itself on being hands-off with regards to any content that falls in line with what’s permitted by law, which has elicited critiques from some on the left who say Minds is a safe haven for fascists and right-wing extremists. Yet, Minds co-founder Bill Ottman has himself stated openly that he wants ideas on content moderation and ways to make Minds a better place for social network users as well as radical content creators. What a few fellow journos and I are calling #MindsShift is an important step in not only moving away from our gagged existence on Facebook but in building a social network that can serve up the real news folks are now aching for.
  • To be clear, we aren’t advocating that you delete your Facebook account – unless you want to. For many, Facebook is still an important tool and our goal is to add to the outreach toolkit, not suppress it. We have set January 1st, 2019 as the ultimate date for this #MindsShift. Several outlets with a combined reach of millions of users will be making the move – and asking their readerships/viewerships to move with them. Along with fellow journalists, I am working with Minds to brainstorm new user-friendly functions and ways to make this #MindsShift a loud and powerful move. We ask that you, the reader, add to the conversation by joining the #MindsShift and spreading the word to your friends and family. (Join Minds via this link) We have created the #MindsShift open group on Minds.com so that you can join and offer up suggestions and ideas to make this platform a new home for radical and progressive media.
Paul Merrell

US spy lab hopes to geotag every outdoor photo on social media | Ars Technica - 0 views

  • Imagine if someone could scan every image on Facebook, Twitter, and Instagram, then instantly determine where each was taken. The ability to combine this location data with information about who appears in those photos—and any social media contacts tied to them—would make it possible for government agencies to quickly track terrorist groups posting propaganda photos. (And, really, just about anyone else.) That's precisely the goal of Finder, a research program of the Intelligence Advanced Research Projects Agency (IARPA), the Office of the Director of National Intelligence's dedicated research organization. For many photos taken with smartphones (and with some consumer cameras), geolocation information is saved with the image by default. The location is stored in the Exif (Exchangeable Image File Format) data of the photo itself unless geolocation services are turned off. If you have used Apple's iCloud photo store or Google Photos, you've probably created a rich map of your pattern of life through geotagged metadata. However, this location data is pruned off for privacy reasons when images are uploaded to some social media services, and privacy-conscious photographers (particularly those concerned about potential drone strikes) will purposely disable geotagging on their devices and social media accounts.
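The Exif mechanism described above is easy to inspect for yourself. The sketch below uses the Pillow imaging library to read the GPS block out of a photo and to re-save a copy without metadata; the file names are placeholders, and this is only an illustration of how geotags travel inside image files, not anything from IARPA's Finder program.

```python
# Minimal sketch: inspect (and strip) the GPS block inside a photo's Exif data.
# Assumes Pillow is installed; "photo.jpg" is a placeholder file name.
from PIL import Image
from PIL.ExifTags import GPSTAGS

GPS_IFD = 0x8825  # Exif pointer to the GPS information block


def read_gps(path):
    """Return the raw GPS tags embedded in an image, or {} if there are none."""
    exif = Image.open(path).getexif()
    gps = exif.get_ifd(GPS_IFD)
    return {GPSTAGS.get(tag, tag): value for tag, value in gps.items()}


def save_without_metadata(src, dst):
    """Re-encode the image without passing Exif along, dropping geotags."""
    Image.open(src).save(dst)  # Pillow does not copy Exif to the new file unless asked to


if __name__ == "__main__":
    print(read_gps("photo.jpg"))  # e.g. {'GPSLatitude': (...), 'GPSLongitude': (...)}
    save_without_metadata("photo.jpg", "photo_clean.jpg")
```

Social platforms that “prune” location data on upload are doing roughly the second step server-side before the image is published.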
Paul Merrell

The Senate has its own insincere net neutrality bill - 0 views

  • Now that the House of Representatives has floated a superficial net neutrality bill, it's the Senate's turn. Louisiana Senator John Kennedy has introduced a companion version of the Open Internet Preservation Act that effectively replicates the House measure put forward by Tennessee Representative Marsha Blackburn. As before, it supports net neutrality only on a basic level -- and there are provisions that would make it difficult to combat other abuses. The legislation would technically forbid internet providers from blocking and throttling content, but it wouldn't bar paid prioritization. Theoretically, ISPs could create de facto "slow lanes" for competing services by offering mediocre speeds unless they pay for faster connections. The bill would also curb the FCC's ability to deal with other violations, and would prevent states from passing their own net neutrality laws. In short, the bill is much more about limiting regulation than protecting open access and competition. Kennedy's bill isn't expected to go far in the Senate, just as Blackburn's hasn't done much in the House. However, his proposal comes mere days after senators put forward a Congressional Review Act measure that would undo the FCC's decision to kill net neutrality. Kennedy had claimed he was considering support for the CRA, but his proposal contradicts that -- why push a heavily watered-down bill if you were willing to revert to the stronger legislation? It's not a completely surprising move and is largely symbolic, but it's disappointing for those who hoped there would be truly bipartisan support for a return to net neutrality.
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 1 views

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant
  • ...17 more annotations...
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying between three and 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. (A short code sketch of this split appears after these notes.) In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explains in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
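One of the notes above draws the line GCHQ uses between content and metadata for web visits: the full address (www.gchq.gov.uk/what_we_do) counts as content, while the site alone (www.gchq.gov.uk) counts as metadata. The toy sketch below simply reproduces that split with Python's standard URL parser; it illustrates the distinction only and is not any agency's actual processing code.

```python
# Toy illustration of the content/metadata distinction described above:
# the host portion of a visited URL is the "metadata"-style record, while
# the full address with its path is treated as "content".
from urllib.parse import urlsplit


def split_visit(url):
    parts = urlsplit(url)
    metadata_record = parts.netloc               # e.g. "www.gchq.gov.uk"
    content_record = parts.netloc + parts.path   # e.g. "www.gchq.gov.uk/what_we_do"
    return metadata_record, content_record


if __name__ == "__main__":
    meta, content = split_visit("https://www.gchq.gov.uk/what_we_do")
    print("metadata-style record:", meta)
    print("content-style record: ", content)
```

The loophole described in the notes turns on exactly this distinction: the richer "content" records require stricter authorization, while host-level "metadata" records can be swept up and searched under external warrants.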
Paul Merrell

Gov. Mills signs nation's strictest internet privacy protection bill - Portland Press H... - 0 views

  • Maine internet service providers will face the strictest consumer privacy protections in the nation under a bill signed Thursday by Gov. Janet Mills, but the new law will almost certainly be challenged in court. Several technology and communication trade groups warned in testimony before the Legislature that the measure may be in conflict with federal law and would likely be the subject of legal action.
  • The new law, which goes into effect on July 1, 2020, will require providers to ask for permission before they sell or share any of their customers’ data with a third party. It will also apply to telecommunications companies that provide access to the internet via their cellular networks.
  • The law is modeled on a Federal Communications Commission rule, adopted under the administration of President Obama but overturned by the administration of President Trump in 2017. The rule blocked an ISP from selling a customer’s personal data, a practice that is otherwise not prohibited under federal law.
  • ...1 more annotation...
  • The law is unlike any in the nation, as it requires an ISP to obtain consent from a consumer before sharing any data. Only California has a similar law on the books, but it requires consumers to “opt out”  by asking their ISP to protect their data. Maine’s new law does not allow an ISP to offer a discounted rate to customers who agree to share or sell their data.
patelvaishali

Best 15 Famous Mahatma Gandhi Quotes On Life - 0 views

  • Best 15 Famous Mahatma Gandhi Quotes On Life: Mohandas Karamchand Gandhi is well known for his quotes on love, friendship, life, happiness, wealth, leadership, etc. Here we created images for the 15 famous Mahatma Gandhi quotes on life. 1) "Happiness is when what you think, what you say, and what you do are in harmony."
Paul Merrell

Evidence of Google blacklisting of left and progressive sites continues to mount - Worl... - 0 views

  • A growing number of leading left-wing websites have confirmed that their search traffic from Google has plunged in recent months, adding to evidence that Google, under the cover of a fraudulent campaign against fake news, is implementing a program of systematic and widespread censorship. Truthout, a not-for-profit news website that focuses on political, social, and ecological developments from a left progressive standpoint, has had its readership plunge by 35 percent since April. The Real News, a nonprofit video news and documentary service, has had its search traffic fall by 37 percent. Another site, Common Dreams, last week told the WSWS that its search traffic had fallen by up to 50 percent. As extreme as these sudden drops in search traffic are, they do not equal the nearly 70 percent drop in traffic from Google seen by the WSWS. “This is political censorship of the worst sort; it’s just an excuse to suppress political viewpoints,” said Robert Epstein, a former editor in chief of Psychology Today and noted expert on Google. Epstein said that at this point, the question was whether the WSWS had been flagged specifically by human evaluators employed by the search giant, or whether those evaluators had influenced the Google Search engine to demote left-wing sites. “What you don’t know is whether this was the human evaluators who are demoting you, or whether it was the new algorithm they are training,” Epstein said.
  • Richard Stallman, the world-renowned technology pioneer and a leader of the free software movement, said he had read the WSWS’s coverage on Google’s censorship of left-wing sites. He warned about the immense control exercised by Google over the Internet, saying, “For people’s main way of finding articles about a topic to be run by a giant corporation creates an obvious potential for abuse.” According to data from the search optimization tool SEMRush, search traffic to Mr. Stallman’s personal website, Stallman.org, fell by 24 percent, while traffic to gnu.org, operated by the Free Software Foundation, fell 19 percent. Eric Maas, a search engine optimization consultant working in the San Francisco Bay area, said his team has surveyed a wide range of alternative news sites affected by changes in Google’s algorithms since April.  “While the update may be targeting specific site functions, there is evidence that this update is promoting only large mainstream news organizations. What I find problematic with this is that it appears that some sites have been targeted and others have not.” The massive drop in search traffic to the WSWS and other left-wing sites followed the implementation of changes in Google’s search evaluation protocols. In a statement issued on April 25, Ben Gomes, the company’s vice president for engineering, stated that Google’s update of its search engine would block access to “offensive” sites, while working to surface more “authoritative content.” In a set of guidelines issued to Google evaluators in March, the company instructed its search evaluators to flag pages returning “conspiracy theories” or “upsetting” content unless “the query clearly indicates the user is seeking an alternative viewpoint.”
Paul Merrell

FCC Turns Itself into a Deregulatory Agency - WhoWhatWhy - 2 views

  • Since taking office, President Donald Trump has wasted no time in proposing rollbacks to Obama-era federal regulations. So, it should come as no surprise that the Federal Communications Commission (FCC) voted last month to propose changes to current regulations on Internet service providers. Spearheaded by Ajit Pai — the Trump-appointed FCC chairman and former lawyer for Verizon — the 2-1 vote is the first step in dismantling the Open Internet Order. The lone FCC Democrat, Mignon Clyburn, was overruled by Pai and fellow commissioner Michael O’Rielly. The 2015 order classified broadband internet as a utility under Title II of the Communications Act of 1934. Opponents of the current state of net neutrality argue that the rules are archaic and place unnecessary — even harmful — restrictions on internet service providers (ISPs), leading to lack of innovation and investment. While it’s true that policies conceived in the 1930s could hardly anticipate the complexities of the modern Internet, a complete rollback of Title II protections would leave ISPs free to favor their own services and whichever company pays for upgraded service. Considering relaxed FCC rules on media ownership and lack of antitrust enforcement, some could argue that a rollback of net neutrality is even more toxic to innovation and affordable pricing. That is, fast lanes could be created for companies with deeper pockets, effectively giving them an advantage over companies and individuals who can’t pay extra. This approach effectively penalizes small businesses, nonprofits and innovative start-ups. Today’s Internet is so vast and so pervasive that it’s hard to grasp the impact that an abandonment of net neutrality would have on every aspect of our culture.
  • While the FCC’s proposed change will touch most Americans, net neutrality remains a mystifying concept to non-techies. To help our readers better understand the issue, we have compiled some videos that explain net neutrality and its importance. The FCC will be accepting comments from the public on their website until August 16, 2017.
Paul Merrell

Forget About Siri and Alexa - When It Comes to Voice Identification, the "NSA Reigns Su... - 0 views

  • These and other classified documents provided by former NSA contractor Edward Snowden reveal that the NSA has developed technology not just to record and transcribe private conversations but to automatically identify the speakers. Americans most regularly encounter this technology, known as speaker recognition, or speaker identification, when they wake up Amazon’s Alexa or call their bank. But a decade before voice commands like “Hey Siri” and “OK Google” became common household phrases, the NSA was using speaker recognition to monitor terrorists, politicians, drug lords, spies, and even agency employees. The technology works by analyzing the physical and behavioral features that make each person’s voice distinctive, such as the pitch, shape of the mouth, and length of the larynx. An algorithm then creates a dynamic computer model of the individual’s vocal characteristics. This is what’s popularly referred to as a “voiceprint.” The entire process — capturing a few spoken words, turning those words into a voiceprint, and comparing that representation to other “voiceprints” already stored in the database — can happen almost instantaneously. Although the NSA is known to rely on finger and face prints to identify targets, voiceprints, according to a 2008 agency document, are “where NSA reigns supreme.” It’s not difficult to see why. By intercepting and recording millions of overseas telephone conversations, video teleconferences, and internet calls — in addition to capturing, with or without warrants, the domestic conversations of Americans — the NSA has built an unrivaled collection of distinct voices. Documents from the Snowden archive reveal that analysts fed some of these recordings to speaker recognition algorithms that could connect individuals to their past utterances, even when they had used unknown phone numbers, secret code words, or multiple languages. (A generic sketch of this comparison step appears after these notes.)
  • The classified documents, dating from 2004 to 2012, show the NSA refining increasingly sophisticated iterations of its speaker recognition technology. They confirm the uses of speaker recognition in counterterrorism operations and overseas drug busts. And they suggest that the agency planned to deploy the technology not just to retroactively identify spies like Ronald Pelton, the former NSA analyst who sold secrets to the Soviet Union, but to prevent whistleblowers like Snowden.
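The "voiceprint" pipeline described above can be illustrated with a generic, textbook-style sketch: reduce a recording to a fixed-length feature vector and score it against enrolled vectors with cosine similarity. Averaged MFCCs stand in here for the trained speaker-embedding models real systems use; the file names are placeholders, and nothing below reflects the NSA's actual algorithms.

```python
# Generic sketch of speaker identification: derive a fixed-length "voiceprint"
# from audio features and compare it against enrolled prints. Averaged MFCCs
# are a crude stand-in for a real speaker-embedding model.
import numpy as np
import librosa


def voiceprint(wav_path):
    """Rough fixed-length representation of the speaker in a recording."""
    audio, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)  # shape (20, frames)
    return mfcc.mean(axis=1)                                # 20-dim vector


def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify(unknown_wav, enrolled):
    """Return the enrolled name whose stored voiceprint best matches, plus scores."""
    probe = voiceprint(unknown_wav)
    scores = {name: cosine(probe, vp) for name, vp in enrolled.items()}
    return max(scores, key=scores.get), scores


if __name__ == "__main__":
    enrolled = {"speaker_a": voiceprint("speaker_a.wav"),
                "speaker_b": voiceprint("speaker_b.wav")}
    print(identify("unknown.wav", enrolled))
```

The "almost instantaneous" matching the excerpt describes comes from the same basic shape: the expensive step is building the print once; comparing it against a database of stored prints is just vector arithmetic.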
Paul Merrell

"In 10 Years, the Surveillance Business Model Will Have Been Made Illegal" - - 1 views

  • The opening panel of the Stigler Center’s annual antitrust conference discussed the source of digital platforms’ power and what, if anything, can be done to address the numerous challenges their ability to shape opinions and outcomes present. 
  • Google CEO Sundar Pichai caused a worldwide sensation earlier this week when he unveiled Duplex, an AI-driven digital assistant able to mimic human speech patterns (complete with vocal tics) to such a convincing degree that it managed to have real conversations with ordinary people without them realizing they were actually talking to a robot.   While Google presented Duplex as an exciting technological breakthrough, others saw something else: a system able to deceive people into believing they were talking to a human being, an ethical red flag (and a surefire way to get to robocall hell). Following the backlash, Google announced on Thursday that the new service will be designed “with disclosure built-in.” Nevertheless, the episode created the impression that ethical concerns were an “after-the-fact consideration” for Google, despite the fierce public scrutiny it and other tech giants faced over the past two months. “Silicon Valley is ethically lost, rudderless and has not learned a thing,” tweeted Zeynep Tufekci, a professor at the University of North Carolina at Chapel Hill and a prominent critic of tech firms.   The controversial demonstration was not the only sign that the global outrage has yet to inspire the profound rethinking critics hoped it would bring to Silicon Valley firms. In Pichai’s speech at Google’s annual I/O developer conference, the ethical concerns regarding the company’s data mining, business model, and political influence were briefly addressed with a general, laconic statement: “The path ahead needs to be navigated carefully and deliberately and we feel a deep sense of responsibility to get this right.”
  • Google’s fellow FAANGs also seem eager to put the “techlash” of the past two years behind them. Facebook, its shares now fully recovered from the Cambridge Analytica scandal, is already charging full-steam ahead into new areas like dating and blockchain.   But the techlash likely isn’t going away soon. The rise of digital platforms has had profound political, economic, and social effects, many of which are only now becoming apparent, and their sheer size and power makes it virtually impossible to exist on the Internet without using their services. As Stratechery’s Ben Thompson noted in the opening panel of the Stigler Center’s annual antitrust conference last month, Google and Facebook—already dominating search and social media and enjoying a duopoly in digital advertising—own many of the world’s top mobile apps. Amazon has more than 100 million Prime members, for whom it is usually the first and last stop for shopping online.   Many of the mechanisms that allowed for this growth are opaque and rooted in manipulation. What are those mechanisms, and how should policymakers and antitrust enforcers address them? These questions, and others, were the focus of the Stigler Center panel, which was moderated by the Economist’s New York bureau chief, Patrick Foulis.
Paul Merrell

The De-Americanization of Internet Freedom - Lawfare - 0 views

  • Why did the internet freedom agenda fail? Goldsmith’s essay tees up, but does not fully explore, a range of explanatory hypotheses. The most straightforward have to do with unrealistic expectations and unintended consequences. The idea that a minimally regulated internet would usher in an era of global peace, prosperity, and mutual understanding, Goldsmith tells us, was always a fantasy. As a project of democracy and human rights promotion, the internet freedom agenda was premised on a wildly overoptimistic view about the capacity of information flows, on their own, to empower oppressed groups and effect social change. Embracing this market-utopian view led the United States to underinvest in cybersecurity, social media oversight, and any number of other regulatory tools. In suggesting this interpretation of where U.S. policymakers and their civil society partners went wrong, Goldsmith’s essay complements recent critiques of the neoliberal strains in the broader human rights and transparency movements. Perhaps, however, the internet freedom agenda has faltered not because it was so naïve and unrealistic, but because it was so effective at achieving its realist goals. The seeds of this alternative account can be found in Goldsmith’s concession that the commercial non-regulation principle helped companies like Apple, Google, Facebook, and Amazon grab “huge market share globally.” The internet became an increasingly valuable cash cow for U.S. firms and an increasingly potent instrument of U.S. soft power over the past two decades; foreign governments, in due course, felt compelled to fight back. If the internet freedom agenda is understood as fundamentally a national economic project, rather than an international political or moral crusade, then we might say that its remarkable early success created the conditions for its eventual failure. Goldsmith’s essay also points to a third set of possible explanations for the collapse of the internet freedom agenda, involving its internal contradictions. Magaziner’s notion of a completely deregulated marketplace, if taken seriously, is incoherent. As Goldsmith and Tim Wu have discussed elsewhere, it takes quite a bit of regulation for any market, including markets related to the internet, to exist and to work. And indeed, even as Magaziner proposed “complete deregulation” of the internet, he simultaneously called for new legal protections against computer fraud and copyright infringement, which were soon followed by extensive U.S. efforts to penetrate foreign networks and to militarize cyberspace. Such internal dissonance was bound to invite charges of opportunism, and to render the American agenda unstable.
Paul Merrell

How a "location API" allows cops to figure out where we all are in real time | Ars Tech... - 0 views

  • The digital privacy world was rocked late Thursday evening when The New York Times reported on Securus, a prison telecom company that has a service enabling law enforcement officers to locate most American cell phones within seconds. The company does this via a basic Web interface leveraging a location API—creating a way to effectively access a massive real-time database of cell-site records. Securus’ location ability relies on other data brokers and location aggregators that obtain that information directly from mobile providers, usually for the purposes of providing some commercial service like an opt-in product discount triggered by being near a certain location. ("You’re near a Carl’s Jr.! Stop in now for a free order of fries with purchase!") The Texas-based Securus reportedly gets its data from 3CInteractive, which in turn buys data from LocationSmart. Ars reached 3CInteractive's general counsel, Scott Elk, who referred us to a spokesperson. The spokesperson did not immediately respond to our query. But currently, anyone can get a sense of the power of a location API by trying out a demo from LocationSmart itself. Currently, the Supreme Court is set to rule on the case of Carpenter v. United States, which asks whether police can obtain more than 120 days' worth of cell-site location information of a criminal suspect without a warrant. In that case, as is common in many investigations, law enforcement presented a cell provider with a court order to obtain such historical data. But the ability to obtain real-time location data that Securus reportedly offers skips that entire process, and it's potentially far more invasive. Securus’ location service as used by law enforcement is also currently being scrutinized. The service is at the heart of an ongoing federal prosecution of a former Missouri sheriff’s deputy who allegedly used it at least 11 times against a judge and other law enforcement officers. On Friday, Sen. Ron Wyden (D-Ore.) publicly released his formal letters to AT&T and also to the Federal Communications Commission demanding detailed answers regarding these Securus revelations.
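To make the phrase "location API" concrete, here is a purely hypothetical sketch of what such a lookup looks like from the client side: submit a phone number, get back approximate coordinates derived from carrier cell-site data. The endpoint, parameters, and response fields are invented for illustration; they do not correspond to Securus', 3CInteractive's, or LocationSmart's real interfaces.

```python
# Hypothetical sketch of a "location API" client. The base URL, fields, and
# auth scheme are invented for illustration only and do not correspond to any
# real aggregator's interface.
import requests

API_BASE = "https://location-aggregator.example.com/v1"  # placeholder


def locate(phone_number, api_key):
    """Ask the (hypothetical) aggregator for a handset's approximate position."""
    resp = requests.post(
        f"{API_BASE}/locate",
        json={"msisdn": phone_number},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"lat": ..., "lon": ..., "accuracy_m": ...}


if __name__ == "__main__":
    print(locate("+15555550123", api_key="EXAMPLE_KEY"))
```

The point of the Securus story is that a thin interface of roughly this shape, backed by real carrier data, was enough to turn cell-site records into a near-real-time tracking tool for anyone with credentials.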
Paul Merrell

The Internet May Be Underwater in 15 Years - 1 views

  • When the internet goes down, life as the modern American knows it grinds to a halt. Gone are the cute kitten photos and the Facebook status updates—but also gone are the signals telling stoplights to change from green to red, and doctors’ access to online patient records. A vast web of physical infrastructure undergirds the internet connections that touch nearly every aspect of modern life. Delicate fiber optic cables, massive data transfer stations, and power stations create a patchwork of literal nuts and bolts that facilitates the flow of zeros and ones. Now, research shows that a whole lot of that infrastructure sits squarely in the path of rising seas. Scientists mapped out the threads and knots of internet infrastructure in the U.S. and layered that on top of maps showing future sea level rise. What they found was ominous: Within 15 years, thousands of miles of fiber optic cable—and hundreds of pieces of other key infrastructure—are likely to be swamped by the encroaching ocean. And while some of that infrastructure may be water resistant, little of it was designed to live fully underwater. “So much of the infrastructure that's been deployed is right next to the coast, so it doesn't take much more than a few inches or a foot of sea level rise for it to be underwater,” says study coauthor Paul Barford, a computer scientist at the University of Wisconsin, Madison. “It was all deployed 20ish years ago, when no one was thinking about the fact that sea levels might come up.” (A schematic sketch of this kind of map overlay appears after these notes.)
  • “This will be a big problem,” says Rae Zimmerman, an expert on urban adaptation to climate change at NYU. Large parts of internet infrastructure soon “will be underwater, unless they're moved back pretty quickly.”
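The analysis described above is, at heart, a map overlay: take the point locations of cables and data centers and keep those that fall inside projected inundation polygons. The schematic sketch below shows that kind of spatial join with geopandas; the file names and layers are placeholders, not the study's actual data or code.

```python
# Schematic version of the overlay described above: intersect infrastructure
# point locations with projected sea-level-rise polygons. File names and
# layers are placeholders; this is not the study's data or code.
import geopandas as gpd

infra = gpd.read_file("internet_infrastructure_points.geojson")  # placeholder
flood = gpd.read_file("sea_level_rise_projection.geojson")       # placeholder

# Put both layers in the same coordinate reference system before joining.
flood = flood.to_crs(infra.crs)

# Keep only the infrastructure points that fall within a projected flood area.
# (Older geopandas versions use op="within" instead of predicate="within".)
at_risk = gpd.sjoin(infra, flood, how="inner", predicate="within")

print(f"{len(at_risk)} of {len(infra)} assets sit inside projected inundation areas")
```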
Paul Merrell

Facebook's Cryptocurrency: Stop It Before It Starts - Lawfare - 0 views

  • On Tuesday, Facebook announced its forthcoming cryptocurrency, Libra. The company says it intends to integrate it into Facebook’s Messenger and WhatsApp products. Although Facebook says it’s created an “independent” subsidiary, Calibra, and purports that the currency itself will be controlled by an independent Libra Foundation, the coin is really a Facebook project. It is not live yet, giving governments the opportunity to kill this project before it actually gets off the ground and opens doors to cybercriminals that existing cryptocurrencies couldn’t. In particular, the IRS and FinCEN should take action now.
Paul Merrell

UK Government Approves Net Censorship - British Free Speech Dies | Zero Hedge - 0 views

  • The United Kingdom has become the first Western nation to move ahead with large-scale censorship of the internet, effectively creating regulation that will limit freedom on the last frontier of digital liberty. In a move that has the nation reeling, Prime Minister Boris Johnson has unveiled rules that will punish internet companies with fines, and even imprisonment, if they fail to protect users from “harmful and illegal content.”
  • Couched in language that suggests this is being done to protect children from pedophiles and vulnerable people from cyberbullying, the proposals will place a massive burden on small companies. Further, they will ultimately make it impossible for those not of the pervasive politically correct ideology to produce and share content.
Paul Merrell

Comcast hints at plan for paid fast lanes after net neutrality repeal | Ars Technica - 0 views

  • For years, Comcast has been promising that it won't violate the principles of net neutrality, regardless of whether the government imposes any net neutrality rules. That meant that Comcast wouldn't block or throttle lawful Internet traffic and that it wouldn't create fast lanes in order to collect tolls from Web companies that want priority access over the Comcast network. This was one of the ways in which Comcast argued that the Federal Communications Commission should not reclassify broadband providers as common carriers, a designation that forces ISPs to treat customers fairly in other ways. The Title II common carrier classification that makes net neutrality rules enforceable isn't necessary because ISPs won't violate net neutrality principles anyway, Comcast and other ISPs have claimed. But with Republican Ajit Pai now in charge at the Federal Communications Commission, Comcast's stance has changed. While the company still says it won't block or throttle Internet content, it has dropped its promise about not instituting paid prioritization.
  • Instead, Comcast now vaguely says that it won't "discriminate against lawful content" or impose "anti-competitive paid prioritization." The change in wording suggests that Comcast may offer paid fast lanes to websites or other online services, such as video streaming providers, after Pai's FCC eliminates the net neutrality rules next month.
Paul Merrell

Do Not Track Implementation Guide Launched | Electronic Frontier Foundation - 1 views

  • Today we are releasing the implementation guide for EFF’s Do Not Track (DNT) policy. For years users have been able to set a Do Not Track signal in their browser, but there has been little guidance for websites as to how to honor that request. EFF’s DNT policy sets out a meaningful response for servers to follow, and this guide provides details about how to apply it in practice. At its core, DNT protects user privacy by excluding the use of unique identifiers for cross-site tracking, and by limiting the retention period of log data to ten days. This short retention period gives sites the time they need for debugging and security purposes, and to generate aggregate statistical data. From this baseline, the policy then allows exceptions when the user's interactions with the site—e.g., to post comments, make a purchase, or click on an ad—necessitate collecting more information. The site is then free to retain any data necessary to complete the transaction. We believe this approach balances users’ privacy expectations with the ability of websites to deliver the functionality users want. Websites often integrate third-party content and rely on third-party services (like content delivery networks or analytics), and this creates the potential for user data to be leaked despite the best intentions of the site operator. The guide identifies potential pitfalls and catalogs providers of compliant services. It is common, for example, to embed media from platforms like YouTube, SoundCloud, and Twitter, all of which track users whenever their widgets are loaded. Fortunately, Embedly, which offers control over the appearance of embeds, also supports DNT via its API, displaying a poster instead and loading the widget only if the user clicks on it knowingly. (A minimal server-side sketch of honoring the DNT signal follows these notes.)
  • Knowledge makes the difference between willing tracking and non-consensual tracking. Users should be able to choose whether they want to give up their privacy in exchange for using a site or a particular feature. This means sites need to be transparent about their practices. A great example of this is our biggest adopter, Medium, which does not track DNT users who browse the site and gives clear information about tracking to users when they choose to log in. This is their previous log-in panel; the DNT language is currently being added to their new interface.
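For a site operator, the first step of the policy described above is simply to honor the header browsers already send. The minimal sketch below, written with Flask, skips assigning a cross-site identifier to visitors who send DNT: 1; it is an illustrative assumption-laden sketch, not EFF's reference implementation, and the policy's other obligations (such as the ten-day cap on log retention) live in server and log configuration rather than application code.

```python
# Minimal sketch of honoring "DNT: 1": only assign a unique visitor identifier
# to users who have not opted out of tracking. Illustrative only; not EFF's
# reference implementation of its Do Not Track policy.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)


@app.route("/")
def index():
    dnt_enabled = request.headers.get("DNT") == "1"
    resp = make_response("hello")
    if not dnt_enabled and "visitor_id" not in request.cookies:
        # Unique identifiers for cross-site tracking are exactly what the
        # policy excludes for DNT users, so set one only when DNT is absent.
        resp.set_cookie("visitor_id", uuid.uuid4().hex, max_age=86400 * 30)
    return resp


if __name__ == "__main__":
    app.run()
```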
Paul Merrell

It's Time to Nationalize the Internet - 0 views

  • Such profiteering tactics have disproportionately affected low-income and rural communities. ISPs have long redlined these demographic groups, creating what’s commonly known as the “digital divide.” Thirty-nine percent of Americans lack access to service fast enough to meet the federal definition of broadband. Barely more than 50 percent of adults with household incomes below $30,000 have home broadband—a gap that plagues users of color most acutely. In contrast, internet access is near-universal for households with an annual income of $100,000 or more. The reason for such chasms is simple: Private network providers prioritize only those they expect to provide a return on investment, thus excluding poor and sparsely populated areas.
  • Chattanooga, Tennessee, has seen more success in addressing redlining. Since 2010, the city has offered public broadband via its municipal power organization, Electric Power Board (EPB). The project has become a rousing success: At half the price, its service is approximately 85 percent faster than that of Comcast, the region’s primary ISP prior to EPB’s inception. Coupled with a discounted program for low-income residents, Chattanooga’s publicly run broadband reaches about 82,000 residents—more than half of the area’s Internet users—and is only expected to grow. Chattanooga’s achievements have radiated to other locales. More than 450 communities have introduced publicly-owned broadband. And more than 110 communities in 24 states have access to publicly owned networks with one gigabit-per-second (Gbps) service. (AT&T, for example, has yet to introduce speeds this high.) Seattle City Councilmember Kshama Sawant proposed a pilot project in 2015 and has recently urged her city to invest in municipal broadband. Hawaii congressperson Kaniela Ing is drafting a bill for publicly-owned Internet for the state legislature to consider next year. In November, residents of Fort Collins, Colo. voted to authorize the city to build municipal broadband infrastructure.
Paul Merrell

Facebook agrees to $650M settlement to end Illinois privacy lawsuit | AppleInsider - 0 views

  • A judge has approved a settlement valued at $650 million from Facebook to end a privacy lawsuit, one which alleged the social network used facial recognition technology on user photos stored on its iPhone app without permission. The lawsuit, which started in April 2015, alleged Facebook did not gain consent from users to use its facial tagging features on their photographs. Originally filed by Chicago attorney Jay Edelson on behalf of plaintiff Carlo Licata, the complaint claimed the consent-less tagging was not allowed under privacy laws in Illinois. The case originated in Cook County Circuit Court before moving to Chicago federal court and then to California, reports the Chicago Tribune. On reaching California, the lawsuit attained class-action status. The class in question constitutes approximately 6.9 million Facebook users in Illinois for whom Facebook created and stored a face template after June 7, 2011. Close to 1.6 million claim forms were filed ahead of the November 23 deadline for joining, making up roughly 22% of potential class members. The complaint alleged that Facebook violated the Illinois Biometric Information Privacy Act, which is among the toughest privacy laws in the United States. Part of the act requires companies to gain permission from users before being able to start using biometric systems with their data, which includes facial recognition systems.
Paul Merrell

Homepage - Contract for the Web - 0 views

  • The Web was designed to bring people together and make knowledge freely available. It has changed the world for good and improved the lives of billions. Yet, many people are still unable to access its benefits and, for others, the Web comes with too many unacceptable costs. Everyone has a role to play in safeguarding the future of the Web. The Contract for the Web was created by representatives from over 80 organizations, representing governments, companies and civil society, and sets out commitments to guide digital policy agendas. To achieve the Contract’s goals, governments, companies, civil society and individuals must commit to sustained policy development, advocacy, and implementation of the Contract text.