Socialism and the End of the American Dream — Group items tagged Internet

Gary Edwards

Birth of an Internet independence movement | CIO - 0 views

  • The arrogance and utter incongruity of declaring Internet and telephone networks equivalent has led a group of friends, all of them reluctant activists, to convene an effort to restore Internet independence. So far, the group of “Tech Innovators” includes John Perry Barlow, Mark Cuban, Tim Draper, Tom Evslin, Dave Farber, Charlie Giancarlo, George Gilder, John Gilmore, Brian Martin, Bob Metcalfe, Ray Ozzie, Jeff Pulver, Michael Robertson, Scott McNealy and Les Vadasz. Through this civic initiative, we hope to defend the remarkable success of the Internet and lead a conversation toward the future — not the past, where laws enacted under FDR must inevitably lead us. The open Internet rules from the FCC end the “permissionless innovation” they purport to protect by inviting the commission to regulate computer networks for the first time. The uncertain benefits and certain unintended consequences of the policy reversal expose the communicating public to unnecessary risk and threaten to upend the success of the past 20 years. The Tech Innovators believe that by recognizing “Internet Independence Day,” Congress can help initiate and advance bipartisan legislation to restore the private-sector framework responsible for the success of the Internet.
  • Americans today enjoy a thousand-fold improvement from the 56Kbps dial-up modems that 15 million Internet early adopters relied on in the ’90s. The Internet now reaches 3 billion people, and a proliferation of services push communication options far beyond the long-distance phone call of 1995. The FCC plan to impose public utility Title II provisions ends the policies responsible for these accomplishments. Domains subject to telephone-style regulations suffer stagnation without exception. A routine 10Mbps connection available as a nonregulated information service prior to the Open Internet Order would have cost $10,000 per month as a Title II data service in 1995. The insertion of fiat regulatory powers will prove fatal to the entrepreneurial energies responsible for building what FCC Chairman Wheeler calls “the most powerful network in the history of mankind” — a network built beyond the reach of FCC regulatory jurisdiction.
  • The Open Internet Order invents artificial distinctions between content companies, Internet providers and end users for the purposes of regulation. This will lead to the same types of regulatory arbitrage and innovation-deadening consequences as prior distinctions such as “long distance” or “intra-LATA.” History demonstrates that asserting artificial market distinctions for purposes of regulation always invites arbitrage and unintended consequences. The commission obtains jurisdiction by changing the definition of “public switched network” to include networks with IP addresses. The complete transformation of a policy landscape represents a decision the Constitution grants exclusively to Congress.
  • ...1 more annotation...
  • The coming litigation leaves the Internet ecosystem in jeopardy without regard to the outcome. The preference for a congressional action addressing current conditions and issues relative to the prospects of an 80-year-old regulatory framework should not be controversial. The privatization of the Internet represented an experiment. Restoring Internet independence merely recognizes the remarkable success of the commercial Internet.
  •  
    "The 20th anniversary of the privatization of the Internet deserves recognition by the U.S. Congress and celebration by all Americans as 'Internet Independence Day.' Two decades ago, on April 30, 1995, the Internet was privatized with the decommissioning of the NSFNET backbone. The past two decades of Internet-driven success were set in motion with the passage of the High Performance Computing Act of 1991, championed by Sen. Al Gore and signed into law by President George H.W. Bush. That decision of the U.S. government to step back and privatize the Internet led to a thriving and open Internet that provides a remarkable platform for innovation. Ironically, the Federal Communications Commission's recently announced Open Internet Order reasserts government control over the Internet by means of repurposing Depression-era industrial policy meant to address a monopoly in voice-transmission technology. The FCC went down the dangerous and uncertain legal path of reverting to traditional, utility-style regulation under Title II of the Communications Act of 1934."
Gary Edwards

The US government has betrayed the internet. We need to take it back | Bruce Schneier |... - 0 views

  •  
    "The US government has betrayed the internet. We need to take it back. The NSA has undermined a fundamental social contract. We engineers built the Internet - and now we have to fix it and take it back." "Government and industry have betrayed the internet, and us. By subverting the internet at every level to make it a vast, multi-layered and robust surveillance platform, the NSA has undermined a fundamental social contract. The companies that build and manage our internet infrastructure, the companies that create and sell us our hardware and software, or the companies that host our data: we can no longer trust them to be ethical internet stewards. This is not the internet the world needs, or the internet its creators envisioned. We need to take it back. And by we, I mean the engineering community. Yes, this is primarily a political problem, a policy matter that requires political intervention. But this is also an engineering problem, and there are several things engineers can - and should - do."
Gary Edwards

XKeyscore: NSA tool collects 'nearly everything a user does on the internet' | World ne... - 1 views

  • The latest revelations will add to the intense public and congressional debate around the extent of NSA surveillance programs. They come as senior intelligence officials testify to the Senate judiciary committee on Wednesday, releasing classified documents in response to the Guardian's earlier stories on bulk collection of phone records and Fisa surveillance court oversight.
  • The files shed light on one of Snowden's most controversial statements, made in his first video interview published by the Guardian on June 10
  • "I, sitting at my desk," said Snowden, could "wiretap anyone, from you or your accountant, to a federal judge or even the president, if I had a personal email".
  • ...23 more annotations...
  • US officials vehemently denied this specific claim. Mike Rogers, the Republican chairman of the House intelligence committee, said of Snowden's assertion: "He's lying. It's impossible for him to do what he was saying he could do."
  • But training materials for XKeyscore detail how analysts can use it and other systems to mine enormous agency databases by filling in a simple on-screen form giving only a broad justification for the search. The request is not reviewed by a court or any NSA personnel before it is processed.
  • XKeyscore, the documents boast, is the NSA's "widest reaching" system developing intelligence from computer networks – what the agency calls Digital Network Intelligence (DNI). One presentation claims the program covers "nearly everything a typical user does on the internet", including the content of emails, websites visited and searches, as well as their metadata.
  • Analysts can also use XKeyscore and other NSA systems to obtain ongoing "real-time" interception of an individual's internet activity.
  • Under US law, the NSA is required to obtain an individualized Fisa warrant only if the target of their surveillance is a 'US person', though no such warrant is required for intercepting the communications of Americans with foreign targets.
  • But XKeyscore provides the technological capability, if not the legal authority, to target even US persons for extensive electronic surveillance without a warrant provided that some identifying information, such as their email or IP address, is known to the analyst.
  • One training slide illustrates the digital activity constantly being collected by XKeyscore and the analyst's ability to query the databases at any time.
  • The purpose of XKeyscore is to allow analysts to search the metadata as well as the content of emails and other internet activity, such as browser history, even when there is no known email account (a "selector" in NSA parlance) associated with the individual being targeted.
  • Analysts can also search by name, telephone number, IP address, keywords, the language in which the internet activity was conducted or the type of browser used.
  • One document notes that this is because "strong selection [search by email address] itself gives us only a very limited capability" because "a large amount of time spent on the web is performing actions that are anonymous."
  • Email monitoring
  • One top-secret document describes how the program "searches within bodies of emails, webpages and documents", including the "To, From, CC, BCC lines" and the "Contact Us" pages on websites.
  • To search for emails, an analyst using XKS enters the individual's email address into a simple online search form, along with the "justification" for the search and the time period for which the emails are sought.
  • One document, a top secret 2010 guide describing the training received by NSA analysts for general surveillance under the Fisa Amendments Act of 2008, explains that analysts can begin surveillance on anyone by clicking a few simple pull-down menus designed to provide both legal and targeting justifications.
  • Once options on the pull-down menus are selected, their target is marked for electronic surveillance and the analyst is able to review the content of their communications:
  • Chats, browsing history and other internet activity
  • Beyond emails, the XKeyscore system allows analysts to monitor a virtually unlimited array of other internet activities, including those within social media.
  • An NSA tool called DNI Presenter, used to read the content of stored emails, also enables an analyst using XKeyscore to read the content of Facebook chats or private messages.
  • The XKeyscore program also allows an analyst to learn the IP addresses of every person who visits any website the analyst specifies.
  • The quantity of communications accessible through programs such as XKeyscore is staggeringly large. One NSA report from 2007 estimated that there were 850bn "call events" collected and stored in the NSA databases, and close to 150bn internet records. Each day, the document says, 1-2bn records were added.
  • William Binney, a former NSA mathematician, said last year that the agency had "assembled on the order of 20tn transactions about US citizens with other US citizens", an estimate, he said, that "only was involving phone calls and emails". A 2010 Washington Post article reported that "every day, collection systems at the [NSA] intercept and store 1.7bn emails, phone calls and other types of communications."
  • The ACLU's deputy legal director, Jameel Jaffer, told the Guardian last month that national security officials expressly said that a primary purpose of the new law was to enable them to collect large amounts of Americans' communications without individualized warrants.
  • "The government doesn't need to 'target' Americans in order to collect huge volumes of their communications," said Jaffer. "The government inevitably sweeps up the communications of many Americans" when targeting foreign nationals for surveillance.
  •  
    "One presentation claims the XKeyscore program covers 'nearly everything a typical user does on the internet' ... A top secret National Security Agency program allows analysts to search with no prior authorization through vast databases containing emails, online chats and the browsing histories of millions of individuals, according to documents provided by whistleblower Edward Snowden. The NSA boasts in training materials that the program, called XKeyscore, is its "widest-reaching" system for developing intelligence from the internet. The latest revelations will add to the intense public and congressional debate around the extent of NSA surveillance programs. They come as senior intelligence officials testify to the Senate judiciary committee on Wednesday, releasing classified documents in response to the Guardian's earlier stories on bulk collection of phone records and Fisa surveillance court oversight. The files shed light on one of Snowden's most controversial statements, made in his first video interview published by the Guardian on June 10. "I, sitting at my desk," said Snowden, could "wiretap anyone, from you or your accountant, to a federal judge or even the president, if I had a personal email". US officials vehemently denied this specific claim. Mike Rogers, the Republican chairman of the House intelligence committee, said of Snowden's assertion: "He's lying. It's impossible for him to do what he was saying he could do." But training materials for XKeyscore detail how analysts can use it and other systems to mine enormous agency databases by filling in a simple on-screen form giving only a broad justification for the search. The request is not reviewed by a court or any NSA personnel before it is processed. XKeyscore, the documents boast, is the NSA's "widest reaching" system developing intelligence from computer networks - what the agency calls Digital Network Intelligence (DNI) ..."
  •  
    "But training materials for XKeyscore detail how analysts can use it and other systems to mine enormous agency databases by filling in a simple on-screen form giving only a broad justification for the search. The request is not reviewed by a court or any NSA personnel before it is processed." Note in that regard that Snowden said in an earlier interview that use of this system rarely was audited, and that when an audit did request changes, the most common request was to beef up the justification for the search. The XKeyscore system puts the lie to just about everything the Administration has claimed about intense oversight by all three branches of federal government and about not reading emails or listening to (Skype) phone calls. The lies keep stacking up in an ever-deepening pile.
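The documents' point that "strong selection [search by email address] itself gives us only a very limited capability" can be illustrated with a toy sketch. This is purely illustrative Python over invented sample data, not any agency's actual code or schema: an event with no email "selector" (anonymous browsing) is invisible to a selector search but still matched by a metadata query on IP address.

```python
# Toy illustration: why a "strong selector" (email) search misses anonymous
# activity that a broader metadata query (by IP) still catches.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Event:
    ip: str
    url: str
    email: Optional[str] = None  # None when the action carried no selector

# Invented sample records for the illustration.
events: List[Event] = [
    Event(ip="203.0.113.7", url="mail.example.com/inbox", email="target@example.com"),
    Event(ip="203.0.113.7", url="forum.example.org/thread/42"),  # anonymous visit
]

def by_selector(email: str) -> List[Event]:
    """Search only records tied to a known email selector."""
    return [e for e in events if e.email == email]

def by_ip(ip: str) -> List[Event]:
    """Metadata query: match every record from an IP address."""
    return [e for e in events if e.ip == ip]

print(len(by_selector("target@example.com")))  # 1 hit: misses the anonymous visit
print(len(by_ip("203.0.113.7")))               # 2 hits: catches both records
```

The anonymous forum visit only surfaces in the IP query, which is the capability gap the training document describes.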
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 0 views

  • There was a simple aim at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant
  • ...17 more annotations...
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH, a tool for sifting intercepted cookie data, in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying between three and 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explains in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
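The metadata-versus-content line the documents draw around web addresses (the bare domain is "metadata"; the full address including the path is "content") can be sketched in a few lines. This is an illustrative Python sketch of that classification rule only, not any agency tooling; the function name and output shape are invented for the example.

```python
# Sketch of the reported GCHQ classification: the host part of a visited URL
# alone counts as metadata, while the full address (host + path) is content.
from urllib.parse import urlsplit

def classify_record(url: str) -> dict:
    """Split a visited URL into its 'metadata' and 'content' views."""
    # urlsplit needs a scheme to parse the hostname reliably.
    parts = urlsplit(url if "://" in url else "http://" + url)
    return {
        "metadata": parts.hostname,              # e.g. www.gchq.gov.uk
        "content": parts.hostname + parts.path,  # e.g. www.gchq.gov.uk/what_we_do
    }

record = classify_record("www.gchq.gov.uk/what_we_do")
print(record["metadata"])  # www.gchq.gov.uk
print(record["content"])   # www.gchq.gov.uk/what_we_do
```

As Zuckerman's point above suggests, even the "metadata" half of this split becomes extremely revealing once many such records are accumulated over time.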
Paul Merrell

How the NSA is still harvesting your online data | World news | guardian.co.uk - 0 views

  • A review of top-secret NSA documents suggests that the surveillance agency still collects and sifts through large quantities of Americans' online data – despite the Obama administration's insistence that the program that began under Bush ended in 2011. Shawn Turner, the Obama administration's director of communications for National Intelligence, told the Guardian that "the internet metadata collection program authorized by the Fisa court was discontinued in 2011 for operational and resource reasons and has not been restarted." But the documents indicate that the amount of internet metadata harvested, viewed, processed and overseen by the Special Source Operations (SSO) directorate inside the NSA is extensive. While there is no reference to any specific program currently collecting purely domestic internet metadata in bulk, it is clear that the agency collects and analyzes significant amounts of data from US communications systems in the course of monitoring foreign targets.
  • On December 26 2012, SSO announced what it described as a new capability to allow it to collect far more internet traffic and data than ever before. With this new system, the NSA is able to direct more than half of the internet traffic it intercepts from its collection points into its own repositories. One end of the communications collected are inside the United States. The NSA called it the "One-End Foreign (1EF) solution". It intended the program, codenamed EvilOlive, for "broadening the scope" of what it is able to collect. It relied, legally, on "FAA Authority", a reference to the 2008 Fisa Amendments Act that relaxed surveillance restrictions. This new system, SSO stated in December, enables vastly increased collection by the NSA of internet traffic. "The 1EF solution is allowing more than 75% of the traffic to pass through the filter," the SSO December document reads. "This milestone not only opened the aperture of the access but allowed the possibility for more traffic to be identified, selected and forwarded to NSA repositories."
  • It continued: "After the EvilOlive deployment, traffic has literally doubled." The scale of the NSA's metadata collection is highlighted by references in the documents to another NSA program, codenamed ShellTrumpet. On December 31, 2012, an SSO official wrote that ShellTrumpet had just "processed its One Trillionth metadata record".
  • ...4 more annotations...
  • An SSO entry dated September 21, 2012, announced that "Transient Thurible, a new Government Communications Head Quarters (GCHQ) managed XKeyScore (XKS) Deep Dive was declared operational." The entry states that GCHQ "modified" an existing program so the NSA could "benefit" from what GCHQ harvested. "Transient Thurible metadata [has been] flowing into NSA repositories since 13 August 2012," the entry states.
  • Another SSO entry, dated February 6, 2013, described ongoing plans to expand metadata collection. A joint surveillance collection operation with an unnamed partner agency yielded a new program "to query metadata" that was "turned on in the Fall 2012". Two others, called MoonLightPath and Spinneret, "are planned to be added by September 2013." A substantial portion of the internet metadata still collected and analyzed by the NSA comes from allied governments, including its British counterpart, GCHQ.
  • Explaining that the five-year old program "began as a near-real-time metadata analyzer … for a classic collection system", the SSO official noted: "In its five year history, numerous other systems from across the Agency have come to use ShellTrumpet's processing capabilities for performance monitoring" and other tasks, such as "direct email tip alerting." Almost half of those trillion pieces of internet metadata were processed in 2012, the document detailed: "though it took five years to get to the one trillion mark, almost half of this volume was processed in this calendar year".
Paul Merrell

Crypto-Gram: December 15, 2014 - 0 views

  • There's a new international survey on Internet security and trust, of "23,376 Internet users in 24 countries," including "Australia, Brazil, Canada, China, Egypt, France, Germany, Great Britain, Hong Kong, India, Indonesia, Italy, Japan, Kenya, Mexico, Nigeria, Pakistan, Poland, South Africa, South Korea, Sweden, Tunisia, Turkey and the United States." Amongst the findings, 60% of Internet users have heard of Edward Snowden, and 39% of those "have taken steps to protect their online privacy and security as a result of his revelations." The press is mostly spinning this as evidence that Snowden has not had an effect: "merely 39%," "only 39%," and so on. (Note that these articles are completely misunderstanding the data. It's not 39% of people who are taking steps to protect their privacy post-Snowden, it's 39% of the 60% of Internet users -- which is not everybody -- who have heard of him. So it's much less than 39%.)
  • Even so, I disagree with the "Edward Snowden Revelations Not Having Much Impact on Internet Users" headline. He's having an enormous impact. I ran the actual numbers country by country, combining data on Internet penetration with data from this survey. Multiplying everything out, I calculate that *706 million people* have changed their behavior on the Internet because of what the NSA and GCHQ are doing. (For example, 17% of Indonesians use the Internet, 64% of them have heard of Snowden and 62% of them have taken steps to protect their privacy, which equals 17 million people out of its total 250-million population.)
  • Note that the countries in this survey only cover 4.7 billion out of a total 7 billion world population. Taking the conservative estimates that 20% of the remaining population uses the Internet, 40% of them have heard of Snowden, and 25% of those have done something about it, that's an additional 46 million people around the world. It's certainly true that most of those people took steps that didn't make any appreciable difference against an NSA level of surveillance, and probably not even against the even more pervasive corporate variety of surveillance. It's probably even true that some of those people didn't take steps at all, and just wish they did or wish they knew what to do. But it is absolutely extraordinary that *750 million people* are disturbed enough about their online privacy that they would represent to a survey taker that they did something about it.
  • ...2 more annotations...
  • Name another issue that has caused over ten percent of the world's population to change their behavior in the past year? Cory Doctorow is right: we have reached "peak indifference to surveillance." From now on, this issue is going to matter more and more, and policymakers around the world need to start paying attention. https://www.cigionline.org/internet-survey Press mentions: http://www.ibtimes.co.uk/... http://www.theguardian.com/technology/2014/nov/25/... Internet penetration by country: http://www.internetlivestats.com/... Cory Doctorow: http://boingboing.net/2014/11/12/...
  • Related: a recent Pew Research Internet Project survey on Americans' perceptions of privacy, commented on by Ben Wittes. http://www.pewinternet.org/2014/11/12/... http://www.lawfareblog.com/2014/11/...
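The per-country arithmetic Schneier describes above can be reproduced in a few lines. The helper function below is a hypothetical illustration, using only the figures quoted in the annotations:

```python
# Sketch of the country-by-country estimate quoted above. The function
# name is invented; the Indonesia figures (population, Internet
# penetration, awareness rate, action rate) are the ones cited in the text.

def behavior_changers(population, internet_rate, heard_rate, acted_rate):
    """Estimated people who changed online behavior post-Snowden."""
    return population * internet_rate * heard_rate * acted_rate

# Indonesia: 250M people, 17% online, 64% heard of Snowden, 62% acted.
indonesia = behavior_changers(250_000_000, 0.17, 0.64, 0.62)
print(round(indonesia / 1e6))  # -> 17 (million), matching the annotation

# The press's "39%" applies only to the 60% who had heard of Snowden,
# so the share of all Internet users taking action is smaller:
print(round(0.39 * 0.60, 3))  # -> 0.234, i.e. about 23%
```

Summing the same product over all 24 surveyed countries is what yields the 706-million figure in the annotation.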
Paul Merrell

Canadian Spies Collect Domestic Emails in Secret Security Sweep - The Intercept - 0 views

  • Canada’s electronic surveillance agency is covertly monitoring vast amounts of Canadians’ emails as part of a sweeping domestic cybersecurity operation, according to top-secret documents. The surveillance initiative, revealed Wednesday by CBC News in collaboration with The Intercept, is sifting through millions of emails sent to Canadian government agencies and departments, archiving details about them on a database for months or even years. The data mining operation is carried out by the Communications Security Establishment, or CSE, Canada’s equivalent of the National Security Agency. Its existence is disclosed in documents obtained by The Intercept from NSA whistleblower Edward Snowden. The emails are vacuumed up by the Canadian agency as part of its mandate to defend against hacking attacks and malware targeting government computers. It relies on a system codenamed PONY EXPRESS to analyze the messages in a bid to detect potential cyber threats.
  • Last year, CSE acknowledged it collected some private communications as part of cybersecurity efforts. But it refused to divulge the number of communications being stored or to explain for how long any intercepted messages would be retained. Now, the Snowden documents shine a light for the first time on the huge scope of the operation — exposing the controversial details the government withheld from the public. Under Canada’s criminal code, CSE is not allowed to eavesdrop on Canadians’ communications. But the agency can be granted special ministerial exemptions if its efforts are linked to protecting government infrastructure — a loophole that the Snowden documents show is being used to monitor the emails. The latest revelations will trigger concerns about how Canadians’ private correspondence with government employees are being archived by the spy agency and potentially shared with police or allied surveillance agencies overseas, such as the NSA. Members of the public routinely communicate with government employees when, for instance, filing tax returns, writing a letter to a member of parliament, applying for employment insurance benefits or submitting a passport application.
  • Chris Parsons, an internet security expert with the Toronto-based internet think tank Citizen Lab, told CBC News that “you should be able to communicate with your government without the fear that what you say … could come back to haunt you in unexpected ways.” Parsons said that there are legitimate cybersecurity purposes for the agency to keep tabs on communications with the government, but he added: “When we collect huge volumes, it’s not just used to track bad guys. It goes into data stores for years or months at a time and then it can be used at any point in the future.” In a top-secret CSE document on the security operation, dated from 2010, the agency says it “processes 400,000 emails per day” and admits that it is suffering from “information overload” because it is scooping up “too much data.” The document outlines how CSE built a system to handle a massive 400 terabytes of data from Internet networks each month — including Canadians’ emails — as part of the cyber operation. (A single terabyte of data can hold about a billion pages of text, or about 250,000 average-sized mp3 files.)
  • ...1 more annotation...
  • The agency notes in the document that it is storing large amounts of “passively tapped network traffic” for “days to months,” encompassing the contents of emails, attachments and other online activity. It adds that it stores some kinds of metadata — data showing who has contacted whom and when, but not the content of the message — for “months to years.” The document says that CSE has “excellent access to full take data” as part of its cyber operations and is receiving policy support on “use of intercepted private communications.” The term “full take” is surveillance-agency jargon that refers to the bulk collection of both content and metadata from Internet traffic. Another top-secret document on the surveillance dated from 2010 suggests the agency may be obtaining at least some of the data by covertly mining it directly from Canadian Internet cables. CSE notes in the document that it is “processing emails off the wire.”
  •  
    "CANADIAN SPIES COLLECT DOMESTIC EMAILS IN SECRET SECURITY SWEEP, by Ryan Gallagher and Glenn Greenwald: Canada's electronic surveillance agency is covertly monitoring vast amounts of Canadians' emails as part of a sweeping domestic cybersecurity operation, according to top-secret documents. The surveillance initiative, revealed Wednesday by CBC News in collaboration with The Intercept, is sifting through millions of emails sent to Canadian government agencies and departments, archiving details about them on a database for months or even years. The data mining operation is carried out by the Communications Security Establishment, or CSE, Canada's equivalent of the National Security Agency. Its existence is disclosed in documents obtained by The Intercept from NSA whistleblower Edward Snowden. The emails are vacuumed up by the Canadian agency as part of its mandate to defend against hacking attacks and malware targeting government computers. It relies on a system codenamed PONY EXPRESS to analyze the messages in a bid to detect potential cyber threats. Last year, CSE acknowledged it collected some private communications as part of cybersecurity efforts. But it refused to divulge the number of communications being stored or to explain for how long any intercepted messages would be retained. Now, the Snowden documents shine a light for the first time on the huge scope of the operation - exposing the controversial details the government withheld from the public. Under Canada's criminal code, CSE is no
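For scale, the data volumes cited above can be sanity-checked with simple arithmetic. The per-page and per-file sizes below are assumptions chosen to match the article's parenthetical comparison, not figures from the CSE documents:

```python
# Back-of-the-envelope check of the volumes quoted above.
TB = 10 ** 12  # one decimal terabyte, in bytes

pages_per_tb = TB / 1_000       # assuming ~1 KB of plain text per page
mp3s_per_tb = TB / 4_000_000    # assuming ~4 MB per average mp3

print(f"{pages_per_tb:.0e} pages per TB")  # -> 1e+09, "about a billion"
print(f"{mp3s_per_tb:,.0f} mp3s per TB")   # -> 250,000

# At 400 TB/month of passively tapped traffic, the system described
# would be handling on the order of 13 TB per day:
print(round(400 / 30, 1))  # -> 13.3
```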
Gary Edwards

The Ultimate Net Monitoring Tool: NARUS - 0 views

  •  
    Chilling stuff. Note that Mark Klein is an important whistleblower whose testimony has helped expose the Federal Government/NSA domestic dragnet that has violated the constitutional rights of hundreds of thousands of law-abiding American citizens. The question I have concerns cooperation between NSA NARUS spying and the IRS. We know that the IRS used key words such as "TEA PARTY", "PATRIOT", "Constitution", and "Tenth Amendment" to target American citizens. Does NSA NARUS surveillance target Americans in the same way? Are there political enemy lists with background surveillance information now circulating through different government agencies based on this targeted and illegal spying? The first thing we need to do is protect whistleblowers who are risking it all to protect the constitutional rights of American citizens and save our country. "The equipment that technician Mark Klein learned was installed in the National Security Agency's "secret room" inside AT&T's San Francisco switching office isn't some sinister Big Brother box designed solely to help governments eavesdrop on citizens' internet communications. Rather, it's a powerful commercial network-analysis product with all sorts of valuable uses for network operators. It just happens to be capable of doing things that make it one of the best internet spy tools around. "Anything that comes through (an internet protocol network), we can record," says Steve Bannerman, marketing vice president of Narus, a Mountain View, California, company. "We can reconstruct all of their e-mails along with attachments, see what web pages they clicked on, we can reconstruct their (voice over internet protocol) calls."" Narus' product, the Semantic Traffic Analyzer, is a software application that runs on standard IBM or Dell servers using the Linux operating system. It's renowned within certain circles for its ability to inspect traffic in real time on high-bandwidth pipes, identifying packets of interest as they r
Paul Merrell

How the NSA Plans to Infect 'Millions' of Computers with Malware - The Intercept - 0 views

  • Top-secret documents reveal that the National Security Agency is dramatically expanding its ability to covertly hack into computers on a mass scale by using automated systems that reduce the level of human oversight in the process. The classified files – provided previously by NSA whistleblower Edward Snowden – contain new details about groundbreaking surveillance technology the agency has developed to infect potentially millions of computers worldwide with malware “implants.” The clandestine initiative enables the NSA to break into targeted computers and to siphon out data from foreign Internet and phone networks. The covert infrastructure that supports the hacking efforts operates from the agency’s headquarters in Fort Meade, Maryland, and from eavesdropping bases in the United Kingdom and Japan. GCHQ, the British intelligence agency, appears to have played an integral role in helping to develop the implants tactic.
  • The NSA began rapidly escalating its hacking efforts a decade ago. In 2004, according to secret internal records, the agency was managing a small network of only 100 to 150 implants. But over the next six to eight years, as an elite unit called Tailored Access Operations (TAO) recruited new hackers and developed new malware tools, the number of implants soared to tens of thousands. To penetrate foreign computer networks and monitor communications that it did not have access to through other means, the NSA wanted to go beyond the limits of traditional signals intelligence, or SIGINT, the agency’s term for the interception of electronic communications. Instead, it sought to broaden “active” surveillance methods – tactics designed to directly infiltrate a target’s computers or network devices. In the documents, the agency describes such techniques as “a more aggressive approach to SIGINT” and says that the TAO unit’s mission is to “aggressively scale” these operations. But the NSA recognized that managing a massive network of implants is too big a job for humans alone.
  • “One of the greatest challenges for active SIGINT/attack is scale,” explains the top-secret presentation from 2009. “Human ‘drivers’ limit ability for large-scale exploitation (humans tend to operate within their own environment, not taking into account the bigger picture).” The agency’s solution was TURBINE. Developed as part of TAO unit, it is described in the leaked documents as an “intelligent command and control capability” that enables “industrial-scale exploitation.”
  • ...10 more annotations...
  • TURBINE was designed to make deploying malware much easier for the NSA’s hackers by reducing their role in overseeing its functions. The system would “relieve the user from needing to know/care about the details,” the NSA’s Technology Directorate notes in one secret document from 2009. “For example, a user should be able to ask for ‘all details about application X’ and not need to know how and where the application keeps files, registry entries, user application data, etc.” In practice, this meant that TURBINE would automate crucial processes that previously had to be performed manually – including the configuration of the implants as well as surveillance collection, or “tasking,” of data from infected systems. But automating these processes was about much more than a simple technicality. The move represented a major tactical shift within the NSA that was expected to have a profound impact – allowing the agency to push forward into a new frontier of surveillance operations. The ramifications are starkly illustrated in one undated top-secret NSA document, which describes how the agency planned for TURBINE to “increase the current capability to deploy and manage hundreds of Computer Network Exploitation (CNE) and Computer Network Attack (CNA) implants to potentially millions of implants.” (CNE mines intelligence from computers and networks; CNA seeks to disrupt, damage or destroy them.)
  • But not all of the NSA’s implants are used to gather intelligence, the secret files show. Sometimes, the agency’s aim is disruption rather than surveillance. QUANTUMSKY, a piece of NSA malware developed in 2004, is used to block targets from accessing certain websites. QUANTUMCOPPER, first tested in 2008, corrupts a target’s file downloads. These two “attack” techniques are revealed on a classified list that features nine NSA hacking tools, six of which are used for intelligence gathering. Just one is used for “defensive” purposes – to protect U.S. government networks against intrusions.
  • The NSA has a diverse arsenal of malware tools, each highly sophisticated and customizable for different purposes. One implant, codenamed UNITEDRAKE, can be used with a variety of “plug-ins” that enable the agency to gain total control of an infected computer. An implant plug-in named CAPTIVATEDAUDIENCE, for example, is used to take over a targeted computer’s microphone and record conversations taking place near the device. Another, GUMFISH, can covertly take over a computer’s webcam and snap photographs. FOGGYBOTTOM records logs of Internet browsing histories and collects login details and passwords used to access websites and email accounts. GROK is used to log keystrokes. And SALVAGERABBIT exfiltrates data from removable flash drives that connect to an infected computer. The implants can enable the NSA to circumvent privacy-enhancing encryption tools that are used to browse the Internet anonymously or scramble the contents of emails as they are being sent across networks. That’s because the NSA’s malware gives the agency unfettered access to a target’s computer before the user protects their communications with encryption. It is unclear how many of the implants are being deployed on an annual basis or which variants of them are currently active in computer systems across the world.
  • Eventually, the secret files indicate, the NSA’s plans for TURBINE came to fruition. The system has been operational in some capacity since at least July 2010, and its role has become increasingly central to NSA hacking operations. Earlier reports based on the Snowden files indicate that the NSA has already deployed between 85,000 and 100,000 of its implants against computers and networks across the world, with plans to keep on scaling up those numbers. The intelligence community’s top-secret “Black Budget” for 2013, obtained by Snowden, lists TURBINE as part of a broader NSA surveillance initiative named “Owning the Net.” The agency sought $67.6 million in taxpayer funding for its Owning the Net program last year. Some of the money was earmarked for TURBINE, expanding the system to encompass “a wider variety” of networks and “enabling greater automation of computer network exploitation.”
  • Infiltrating cellphone networks, however, is not all that the malware can be used to accomplish. The NSA has specifically tailored some of its implants to infect large-scale network routers used by Internet service providers in foreign countries. By compromising routers – the devices that connect computer networks and transport data packets across the Internet – the agency can gain covert access to monitor Internet traffic, record the browsing sessions of users, and intercept communications. Two implants the NSA injects into network routers, HAMMERCHANT and HAMMERSTEIN, help the agency to intercept and perform “exploitation attacks” against data that is sent through a Virtual Private Network, a tool that uses encrypted “tunnels” to enhance the security and privacy of an Internet session.
  • Before it can extract data from an implant or use it to attack a system, the NSA must first install the malware on a targeted computer or network. According to one top-secret document from 2012, the agency can deploy malware by sending out spam emails that trick targets into clicking a malicious link. Once activated, a “back-door implant” infects their computers within eight seconds. There’s only one problem with this tactic, codenamed WILLOWVIXEN: According to the documents, the spam method has become less successful in recent years, as Internet users have become wary of unsolicited emails and less likely to click on anything that looks suspicious. Consequently, the NSA has turned to new and more advanced hacking techniques. These include performing so-called “man-in-the-middle” and “man-on-the-side” attacks, which covertly force a user’s internet browser to route to NSA computer servers that try to infect them with an implant.
  • To perform a man-on-the-side attack, the NSA observes a target’s Internet traffic using its global network of covert “accesses” to data as it flows over fiber optic cables or satellites. When the target visits a website that the NSA is able to exploit, the agency’s surveillance sensors alert the TURBINE system, which then “shoots” data packets at the targeted computer’s IP address within a fraction of a second. In one man-on-the-side technique, codenamed QUANTUMHAND, the agency disguises itself as a fake Facebook server. When a target attempts to log in to the social media site, the NSA transmits malicious data packets that trick the target’s computer into thinking they are being sent from the real Facebook. By concealing its malware within what looks like an ordinary Facebook page, the NSA is able to hack into the targeted computer and covertly siphon out data from its hard drive. A top-secret animation demonstrates the tactic in action.
  • The TURBINE implants system does not operate in isolation. It is linked to, and relies upon, a large network of clandestine surveillance “sensors” that the agency has installed at locations across the world.
  • The NSA’s headquarters in Maryland are part of this network, as are eavesdropping bases used by the agency in Misawa, Japan and Menwith Hill, England. The sensors, codenamed TURMOIL, operate as a sort of high-tech surveillance dragnet, monitoring packets of data as they are sent across the Internet. When TURBINE implants exfiltrate data from infected computer systems, the TURMOIL sensors automatically identify the data and return it to the NSA for analysis. And when targets are communicating, the TURMOIL system can be used to send alerts or “tips” to TURBINE, enabling the initiation of a malware attack. The NSA identifies surveillance targets based on a series of data “selectors” as they flow across Internet cables. These selectors, according to internal documents, can include email addresses, IP addresses, or the unique “cookies” containing a username or other identifying information that are sent to a user’s computer by websites such as Google, Facebook, Hotmail, Yahoo, and Twitter. Other selectors the NSA uses can be gleaned from unique Google advertising cookies that track browsing habits, unique encryption key fingerprints that can be traced to a specific user, and computer IDs that are sent across the Internet when a Windows computer crashes or updates.
  • Documents published with this article: Menwith Hill Station Leverages XKeyscore for Quantum Against Yahoo and Hotmail; Five Eyes Hacking Large Routers; NSA Technology Directorate Analysis of Converged Data Selector Types; There Is More Than One Way to Quantum; NSA Phishing Tactics and Man in the Middle Attacks; Quantum Insert Diagrams; The NSA and GCHQ’s QUANTUMTHEORY Hacking Tactics; TURBINE and TURMOIL; VPN and VOIP Exploitation With HAMMERCHANT and HAMMERSTEIN; Industrial-Scale Exploitation; Thousands of Implants
  •  
    *Very* long article. Only small portions quoted.
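As a purely illustrative sketch of the selector mechanism described above (this is invented demonstration code, not actual agency tooling; every identifier and record here is fictitious), matching amounts to intersecting each metadata record's identifiers with a tasked selector set:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Record:
    """Toy metadata record, as might flow past a network sensor."""
    src_ip: str
    email: Optional[str] = None
    cookie: Optional[str] = None

def match_selectors(records, selectors):
    """Return records with at least one identifier in the selector set."""
    hits = []
    for r in records:
        identifiers = {r.src_ip, r.email, r.cookie} - {None}
        if identifiers & selectors:
            hits.append(r)
    return hits

# Per the article, selectors can be email addresses, IPs, cookie values, etc.
selectors = {"target@example.org", "203.0.113.7"}
traffic = [
    Record("198.51.100.2", email="alice@example.com"),   # no match
    Record("203.0.113.7"),                               # matches by IP
    Record("192.0.2.9", email="target@example.org"),     # matches by email
]
print(len(match_selectors(traffic, selectors)))  # -> 2
```

The TURMOIL-to-TURBINE "tipping" the documents describe is, conceptually, this kind of match triggering a follow-on action.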
Paul Merrell

Reset The Net - Privacy Pack - 0 views

  • This June 5th, I pledge to take strong steps to protect my freedom from government mass surveillance. I expect the services I use to do the same.
  • Fight for the Future and Center for Rights will contact you about future campaigns. Privacy Policy
  •  
    I wound up joining this campaign at the urging of the ACLU after checking the Privacy Policy. The Reset the Net campaign seems to be endorsed by a lot of change-oriented groups, from the ACLU to Greenpeace to the Pirate Party. A fair number of groups with a Progressive agenda, but certainly not limited to them. The right answer to that situation is to urge other groups to endorse, not to avoid the campaign. Single-issue coalition-building is all about focusing on an area of agreement rather than worrying about who you are rubbing elbows with. I have been looking for a bipartisan group that's tackling government surveillance issues via mass actions but has no corporate sponsors. This might be the one. The reason: Corporate types like Google have no incentive to really butt heads with the government voyeurs. They are themselves engaged in massive surveillance of their users and certainly will not carry the battle for digital privacy over to the private sector. But this *is* a battle over digital privacy, and legally defining user privacy rights in the private sector is just as important as cutting back on government surveillance. As we have learned through the Snowden disclosures, what the private internet companies have, the NSA can and does get. The big internet services successfully pushed in the U.S. for authorization to publish more numbers about how many times they pass private data to the government, but went no farther. They wanted to be able to say they did something, but there's a revolving door of staffers between NSA and the big internet companies, and the internet service companies' data is an open book to the NSA. The big internet services are not champions of their users' privacy. If they were, they would be featuring end-to-end encryption with encryption keys unique to each user and unknown to the companies. Like some startups in Europe are doing. E.g., the Wuala.com filesync service in Switzerland (first 5 GB of storage free). Compare tha
Gary Edwards

Sony leaks reveal Hollywood is trying to break DNS, the backbone of the internet | The ... - 0 views

  • You could still type http://www.piratebay.se into your browser, but without a working DNS record, you wouldn't be able to find the site itself. If a takedown notice could blacklist a site from every available DNS provider, the URL would be effectively erased from the internet.
  • No one's ever tried to issue a takedown notice like that, but this latest memo suggests the MPAA is looking into it as a potentially powerful new tool in the fight against piracy. "A takedown notice program, therefore, could threaten ISPs with potential secondary liability in the event that they do not cease connecting users to known infringing material through their own DNS servers," the letter reads. "While not making it impossible for users to reach pirate sites (i.e., a user could still use a third-party DNS server), it could make it substantially more complicated for casual infringers to reach pirate sites if their ISPs decline to assist in the routing of communications to those sites." The full document is embedded below.
  • The MPAA’s legal argument centers on the claim that DNS records are working as an index or directory rather than simply routing data. If that argument holds, then the DNS links could be vulnerable to the same takedown notices used to strike torrent links from Google searches. The net effect would be similar to site-blocking, making it as easy to unplug a URL as it is to take down a YouTube video. It would also cast DNS providers as legally responsible for all the sites on the web, the same way YouTube is responsible for every video uploaded to its network. For many providers, simply managing the flood of notices might create a logistical nightmare.
  •  
    "Most anti-piracy tools take one of two paths: they either target the server that's sharing the files (pulling videos off YouTube or taking down sites like The Pirate Bay) or they make it harder to find (delisting offshore sites that share infringing content). But leaked documents reveal a frightening line of attack that's currently being considered by the MPAA: What if you simply erased any record that the site was there in the first place? To do that, the MPAA's lawyers would target the Domain Name System (DNS) that directs traffic across the internet. The tactic was first proposed as part of the Stop Online Piracy Act (SOPA) in 2011, but three years after the law failed in Congress, the MPAA has been looking for legal justification for the practice in existing law and working with ISPs like Comcast to examine how a system might work technically. If the system works, DNS-blocking could be the key to the MPAA's long-standing goal of blocking sites from delivering content to the US. At the same time, it represents a bold challenge to the basic engineering of the internet, threatening to break the very backbone of the web and drawing the industry into an increasingly nasty fight with Google. The Domain Name System is a kind of phone book for the internet, translating URLs like http://www.theverge.com into IP addresses like 192.5.151.3. Given a URL string, your computer will turn to a DNS server (often run by a local ISP or a third party like Google) to find the IP address of the corresponding server. Much like the phone book, that function is usually treated as a simple engineering task - but a memo commissioned by the MPAA this August sketches out a legal case for blocking infringing sites from the DNS records entirely, like wiping unsavory addresses out of the phone book. You could still type http://www.piratebay.se into your browser, but without a working DNS record, you wouldn't be able to find the
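The phone-book analogy in the excerpt can be modeled in miniature. The www.theverge.com mapping is the one quoted above; the piratebay.se address and the local dict are invented stand-ins for the real DNS hierarchy:

```python
# Toy DNS "phone book" and the effect of a record-level takedown.
dns_records = {
    "www.theverge.com": "192.5.151.3",    # mapping quoted in the article
    "www.piratebay.se": "203.0.113.42",   # invented address for illustration
}

def resolve(hostname):
    """Return the IP for a hostname, or None if no record exists."""
    return dns_records.get(hostname)

print(resolve("www.piratebay.se"))  # -> 203.0.113.42, reachable by name

# A DNS-level takedown deletes the record; the server still runs, but
# the name no longer leads anywhere:
del dns_records["www.piratebay.se"]
print(resolve("www.piratebay.se"))  # -> None, "erased from the internet"
```

As the article notes, a user could still point at a third-party DNS server that kept the record, which is why the MPAA memo frames this as raising the cost for casual infringers rather than making sites unreachable.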
Paul Merrell

Brazil Looks to Break from U.S.-Centric Internet | TIME.com - 0 views

  • Brazil plans to divorce itself from the U.S.-centric Internet over Washington’s widespread online spying, a move that many experts fear will be a potentially dangerous first step toward fracturing a global network built with minimal interference by governments. President Dilma Rousseff ordered a series of measures aimed at greater Brazilian online independence and security following revelations that the U.S. National Security Agency intercepted her communications, hacked into the state-owned Petrobras oil company’s network and spied on Brazilians who entrusted their personal data to U.S. tech companies such as Facebook and Google. The leader is so angered by the espionage that on Tuesday she postponed next month’s scheduled trip to Washington, where she was to be honored with a state dinner. Internet security and policy experts say the Brazilian government’s reaction to information leaked by former NSA contractor Edward Snowden is understandable, but warn it could set the Internet on a course of Balkanization.
  • “The global backlash is only beginning and will get far more severe in coming months,” said Sascha Meinrath, director of the Open Technology Institute at the Washington-based New America Foundation think tank. “This notion of national privacy sovereignty is going to be an increasingly salient issue around the globe.” While Brazil isn’t proposing to bar its citizens from U.S.-based Web services, it wants their data to be stored locally as the nation assumes greater control over Brazilians’ Internet use to protect them from NSA snooping. The danger of mandating that kind of geographic isolation, Meinrath said, is that it could render inoperable popular software applications and services and endanger the Internet’s open, interconnected structure.
  • The effort by Latin America’s biggest economy to digitally isolate itself from U.S. spying not only could be costly and difficult, it could encourage repressive governments to seek greater technical control over the Internet to crush free expression at home, experts say. In December, countries advocating greater “cyber-sovereignty” pushed for such control at an International Telecommunications Union meeting in Dubai, with Western democracies led by the United States and the European Union in opposition.
  • Rousseff says she intends to push for international rules on privacy and security in hardware and software during the U.N. General Assembly meeting later this month. Among Snowden revelations: the NSA has created backdoors in software and Web-based services. Brazil is now pushing more aggressively than any other nation to end U.S. commercial hegemony on the Internet. More than 80 percent of online search, for example, is controlled by U.S.-based companies. Most of Brazil’s global Internet traffic passes through the United States, so Rousseff’s government plans to lay underwater fiber optic cable directly to Europe and also link to all South American nations to create what it hopes will be a network free of U.S. eavesdropping.
  • More communications integrity protection is expected when Telebras, the state-run telecom company, works with partners to oversee the launch in 2016 of Brazil’s first communications satellite, for military and public Internet traffic. Brazil’s military currently relies on a satellite run by Embratel, which Mexican billionaire Carlos Slim controls. Rousseff is urging Brazil’s Congress to compel Facebook, Google and all companies to store data generated by Brazilians on servers physically located inside Brazil in order to shield it from the NSA. If that happens, and other nations follow suit, Silicon Valley’s bottom line could be hit by lost business and higher operating costs: Brazilians rank No. 3 on Facebook and No. 2 on Twitter and YouTube. An August study by a respected U.S. technology policy nonprofit estimated the fallout from the NSA spying scandal could cost the U.S. cloud computing industry, which stores data remotely to give users easy access from any device, as much as $35 billion by 2016 in lost business.
  • Brazil also plans to build more Internet exchange points, places where vast amounts of data are relayed, in order to route Brazilians’ traffic away from potential interception. And its postal service plans by next year to create an encrypted email service that could serve as an alternative to Gmail and Yahoo!, which according to Snowden-leaked documents are among U.S. tech giants that have collaborated closely with the NSA. “Brazil intends to increase its independent Internet connections with other countries,” Rousseff’s office said in an emailed response to questions from The Associated Press on its plans. It cited a “common understanding” between Brazil and the European Union on data privacy, and said “negotiations are underway in South America for the deployment of land connections between all nations.” It said Brazil plans to boost investment in home-grown technology and buy only software and hardware that meet government data privacy specifications.
  • While the plans’ technical details are pending, experts say they will be costly for Brazil and ultimately can be circumvented. Just as people in China and Iran defeat government censors with tools such as “proxy servers,” so could Brazilians bypass their government’s controls. International spies, not just from the United States, also will adjust, experts said. Laying cable to Europe won’t make Brazil safer, they say. The NSA has reportedly tapped into undersea telecoms cables for decades. Meinrath and others argue that what’s needed instead are strong international laws that hold nations accountable for guaranteeing online privacy.
  • “There’s nothing viable that Brazil can really do to protect its citizenry without changing what the U.S. is doing,” he said. Matthew Green, a Johns Hopkins computer security expert, said Brazil won’t protect itself from intrusion by isolating itself digitally. It will also be discouraging technological innovation, he said, by encouraging the entire nation to use a state-sponsored encrypted email service. “It’s sort of like a Soviet socialism of computing,” he said, adding that the U.S. “free-for-all model works better.”
  •  
    So both Brazil and the European Union are planning to boycott the U.S.-based cloud industry, seizing on the NSA's activities as legal grounds. Under the various GATT series of trade agreements, otherwise forbidden discriminatory actions that restrict trade in aid of national security are exempt from redress through the World Trade Organization's dispute resolution process. So the NSA voyeurs can add legalizing economic digital discrimination against the U.S. to their scorecard.
Paul Merrell

The US is Losing Control of the Internet…Oh, Really? | Global Research - 0 views

  • All of the major internet organisations have pledged, at a summit in Uruguay, to free themselves of the influence of the US government. The directors of ICANN, the Internet Engineering Task Force, the Internet Architecture Board, the World Wide Web Consortium, the Internet Society and all five of the regional Internet address registries have vowed to break their associations with the US government. In a statement, the group called for “accelerating the globalization of ICANN and IANA functions, towards an environment in which all stakeholders, including all governments, participate on an equal footing”. That’s a distinct change from the current situation, where the US department of commerce has oversight of ICANN. In another part of the statement, the group “expressed strong concern over the undermining of the trust and confidence of Internet users globally due to recent revelations of pervasive monitoring and surveillance”. Meanwhile, it was announced that the next Internet Governance Summit would be held in Brazil, whose president has been extremely critical of the US over web surveillance. In a statement announcing the location of the summit, Brazilian president Dilma Rousseff said: “The United States and its allies must urgently end their spying activities once and for all.”
Paul Merrell

Profiled From Radio to Porn, British Spies Track Web Users' Online Identities | Global ... - 0 views

  • One system builds profiles showing people’s web browsing histories. Another analyzes instant messenger communications, emails, Skype calls, text messages, cell phone locations, and social media interactions. Separate programs were built to keep tabs on “suspicious” Google searches and usage of Google Maps. The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens, all without a court order or judicial warrant.
  • The power of KARMA POLICE was illustrated in 2009, when GCHQ launched a top-secret operation to collect intelligence about people using the Internet to listen to radio shows. The agency used a sample of nearly 7 million metadata records, gathered over a period of three months, to observe the listening habits of more than 200,000 people across 185 countries, including the U.S., the U.K., Ireland, Canada, Mexico, Spain, the Netherlands, France, and Germany.
  • GCHQ’s documents indicate that the plans for KARMA POLICE were drawn up between 2007 and 2008. The system was designed to provide the agency with “either (a) a web browsing profile for every visible user on the Internet, or (b) a user profile for every visible website on the Internet.” The origin of the surveillance system’s name is not discussed in the documents. But KARMA POLICE is also the name of a popular song released in 1997 by the Grammy Award-winning British band Radiohead, suggesting the spies may have been fans. A verse repeated throughout the hit song includes the lyric, “This is what you’ll get, when you mess with us.”
  • GCHQ vacuums up the website browsing histories using “probes” that tap into the international fiber-optic cables that transport Internet traffic across the world. A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” (a term the agency uses to refer to metadata records), with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held (41 percent) was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.” There was a simple aim at the heart of the top-secret program: record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs.
  • The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
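The per-user browsing profiles that KARMA POLICE was built to produce amount to grouping metadata "events" by a shared identifier. A minimal Python sketch of that aggregation, using an invented event layout (the real record format is not public, and these addresses and sites are made up):

```python
from collections import defaultdict

# Hypothetical metadata "events": (timestamp, source IP, website visited).
events = [
    ("2009-03-01T10:00", "198.51.100.7", "news.example.com"),
    ("2009-03-01T10:05", "198.51.100.7", "radio.example.org"),
    ("2009-03-01T11:20", "203.0.113.9",  "forum.example.net"),
    ("2009-03-02T09:12", "198.51.100.7", "radio.example.org"),
]

def browsing_profiles(events):
    """Group events by source IP: a web browsing profile per visible user."""
    profiles = defaultdict(list)
    for timestamp, ip, site in events:
        profiles[ip].append((timestamp, site))
    return profiles

profiles = browsing_profiles(events)
# One "user" (IP) visited the radio site twice, the kind of repeated pattern
# the 2009 radio-listening case study tallied across 200,000 people.
assert len(profiles["198.51.100.7"]) == 3
```

The same grouping run in the other direction, keying on the site instead of the IP, yields the documents' alternative goal of "a user profile for every visible website."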
Paul Merrell

U.S. surveillance architecture includes collection of revealing Internet, phone metadat... - 0 views

  • On March 12, 2004, acting attorney general James B. Comey and the Justice Department’s top leadership reached the brink of resignation over electronic surveillance orders that they believed to be illegal. President George W. Bush backed down, halting secret foreign-intelligence-gathering operations that had crossed into domestic terrain. That morning marked the beginning of the end of STELLARWIND, the cover name for a set of four surveillance programs that brought Americans and American territory within the domain of the National Security Agency for the first time in decades. It was also a prelude to new legal structures that allowed Bush and then President Obama to reproduce each of those programs and expand their reach. What exactly STELLARWIND did has never been disclosed in an unclassified form. Which parts of it did Comey approve? Which did he shut down? What became of the programs when the crisis passed and Comey, now Obama’s expected nominee for FBI director, returned to private life? Authoritative new answers to those questions, drawing upon a classified NSA history of STELLARWIND and interviews with high-ranking intelligence officials, offer the clearest map yet of the Bush-era programs and the NSA’s contemporary U.S. operations. STELLARWIND was succeeded by four major lines of intelligence collection in the territorial United States, together capable of spanning the full range of modern telecommunications, according to the interviews and documents.
  • Two of the four collection programs, one each for telephony and the Internet, process trillions of “metadata” records for storage and analysis in systems called MAINWAY and MARINA, respectively. Metadata includes highly revealing information about the times, places, devices and participants in electronic communication, but not its contents. The bulk collection of telephone call records from Verizon Business Services, disclosed this month by the British newspaper the Guardian, is one source of raw intelligence for MAINWAY. The other two types of collection, which operate on a much smaller scale, are aimed at content. One of them intercepts telephone calls and routes the spoken words to a system called NUCLEON. For Internet content, the most important source collection is the PRISM project reported on June 6 by The Washington Post and the Guardian. It draws from data held by Google, Yahoo, Microsoft and other Silicon Valley giants, collectively the richest depositories of personal information in history.
  • The debate has focused on two of the four U.S.-based collection programs: PRISM, for Internet content, and the comprehensive collection of telephone call records, foreign and domestic, that the Guardian revealed by posting a classified order from the Foreign Intelligence Surveillance Court to Verizon Business Services. The Post has learned that similar orders have been renewed every three months for other large U.S. phone companies, including Bell South and AT&T, since May 24, 2006. On that day, the surveillance court made a fundamental shift in its approach to Section 215 of the Patriot Act, which permits the FBI to compel production of “business records” that are relevant to a particular terrorism investigation and to share those in some circumstances with the NSA. Henceforth, the court ruled, it would define the relevant business records as the entirety of a telephone company’s call database. The Bush administration, by then, had been taking “bulk metadata” from the phone companies under voluntary agreements for more than four years. The volume of information overwhelmed the MAINWAY database, according to a classified report from the NSA inspector general in 2009. The agency spent $146 million in supplemental counterterrorism funds to buy new hardware and contract support, and to make unspecified payments to the phone companies for “collaborative partnerships.” When the New York Times revealed the warrantless surveillance of voice calls, in December 2005, the telephone companies got nervous. One of them, unnamed in the report, approached the NSA with a request. Rather than volunteer the data, at a price, the “provider preferred to be compelled to do so by a court order,” the report said. Other companies followed suit. The surveillance court order that recast the meaning of business records “essentially gave NSA the same authority to collect bulk telephony metadata from business records that it had” under Bush’s asserted authority alone.
  • Telephone metadata was not the issue that sparked a rebellion at the Justice Department, first by Jack Goldsmith of the Office of Legal Counsel and then by Comey, who was acting attorney general because John D. Ashcroft was in intensive care with acute gallstone pancreatitis. It was Internet metadata. At Bush’s direction, in orders prepared by David Addington, the counsel to Vice President Richard B. Cheney, the NSA had been siphoning e-mail metadata and technical records of Skype calls from data links owned by AT&T, Sprint and MCI, which later merged with Verizon. For reasons unspecified in the report, Goldsmith and Comey became convinced that Bush had no lawful authority to do that. MARINA and the collection tools that feed it are probably the least known of the NSA’s domestic operations, even among experts who follow the subject closely. Yet they probably capture information about more American citizens than any other, because the volume of e-mail, chats and other Internet communications far exceeds the volume of standard telephone calls. The NSA calls Internet metadata “digital network information.” Sophisticated analysis of those records can reveal unknown associates of known terrorism suspects. Depending on the methods applied, it can also expose medical conditions, political or religious affiliations, confidential business negotiations and extramarital affairs. What permits the former and prevents the latter is a complex set of policies that the public is not permitted to see.
  • In the urgent aftermath of Sept. 11, 2001, with more attacks thought to be imminent, analysts wanted to use “contact chaining” techniques to build what the NSA describes as network graphs of people who represented potential threats. The legal challenge for the NSA was that its practice of collecting high volumes of data from digital links did not seem to meet even the relatively low requirements of Bush’s authorization, which allowed collection of Internet metadata “for communications with at least one communicant outside the United States or for which no communicant was known to be a citizen of the United States,” the NSA inspector general’s report said. Lawyers for the agency came up with an interpretation that said the NSA did not “acquire” the communications, a term with formal meaning in surveillance law, until analysts ran searches against it. The NSA could “obtain” metadata in bulk, they argued, without meeting the required standards for acquisition. Goldsmith and Comey did not buy that argument, and a high-ranking U.S. intelligence official said the NSA does not rely on it today. As soon as surveillance data “touches us, we’ve got it, whatever verbs you choose to use,” the official said in an interview. “We’re not saying there’s a magic formula that lets us have it without having it.”
  • When Comey finally ordered a stop to the program, Bush signed an order renewing it anyway. Comey, Goldsmith, FBI Director Robert S. Mueller III and most of the senior Bush appointees in the Justice Department began drafting letters of resignation. Then-NSA Director Michael V. Hayden was not among them. According to the inspector general’s classified report, Cheney’s lawyer, Addington, placed a phone call and “General Hayden had to decide whether NSA would execute the Authorization without the Attorney General’s signature.” He decided to go along. The following morning, when Mueller told Bush that he and Comey intended to resign, the president reversed himself. Three months later, on July 15, the secret surveillance court allowed the NSA to resume bulk collection under the court’s own authority. The opinion, which remains highly classified, was based on a provision of electronic surveillance law, known as “pen register, trap and trace,” that was written to allow law enforcement officers to obtain the phone numbers of incoming and outgoing calls from a single telephone line.
  •  
    Note particularly the mention that the FISA Court decision to throw the doors open for government snooping was based on "pen register, trap and trace" law. As suspected, now we are into territory dealt with by the Supreme Court in the pre-internet days of 1979, in Smith v. Maryland, 442 U.S. 735 (1979). More about that next, in a bookmark also tagged with "pen-register".
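The "contact chaining" that analysts wanted to apply after Sept. 11, building network graphs outward from a known suspect, is at its core a breadth-first traversal over communication metadata. A minimal Python sketch using invented phone records (the NSA's actual analytics are not public; this only illustrates the graph idea):

```python
from collections import deque

# Hypothetical call-metadata records: pairs of numbers that communicated.
calls = [("555-0101", "555-0202"), ("555-0202", "555-0303"),
         ("555-0303", "555-0404"), ("555-0505", "555-0606")]

def contact_chain(seed, records, hops):
    """Return every number reachable from `seed` within `hops` links."""
    graph = {}
    for a, b in records:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    seen, queue = {seed}, deque([(seed, 0)])
    while queue:
        number, depth = queue.popleft()
        if depth == hops:
            continue  # do not expand beyond the hop limit
        for contact in graph.get(number, ()):
            if contact not in seen:
                seen.add(contact)
                queue.append((contact, depth + 1))
    return seen - {seed}

# Two hops out from a seed number reach its contacts and their contacts.
assert contact_chain("555-0101", calls, 2) == {"555-0202", "555-0303"}
```

The hop limit matters: each additional hop can sweep in exponentially more people, which is why chaining over bulk metadata reaches so far beyond the original suspects.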
Paul Merrell

Group Thinks Anonymity Should Be Baked Into the Internet Itself Using Tor - Slashdot - 0 views

  • "David Talbot writes at MIT Technology Review that engineers on the Internet Engineering Task Force (IETF), an informal organization of engineers that changes Internet code and operates by rough consensus, have asked the architects of Tor to consider turning the technology into an Internet standard. If widely adopted, such a standard would make it easy to include the technology in consumer and business products ranging from routers to apps and would allow far more people to browse the Web without being identified by anyone who might be spying on Internet traffic. The IETF is already working to make encryption standard in all web traffic. Stephen Farrell believes that forging Tor into a standard that interoperates with other parts of the Internet could be better than leaving Tor as a separate tool that requires people to take special action to implement. 'I think there are benefits that might flow in both directions,' says Farrell. 'I think other IETF participants could learn useful things about protocol design from the Tor people, who've faced interesting challenges that aren't often seen in practice. And the Tor people might well get interest and involvement from IETF folks who've got a lot of experience with large-scale systems.' Andrew Lewman, executive director of Tor, says the group is considering it. 'We're basically at the stage of 'Do we even want to go on a date together?' It's not clear we are going to do it, but it's worth exploring to see what is involved. It adds legitimacy, it adds validation of all the research we've done.'"
Paul Merrell

Canada Casts Global Surveillance Dragnet Over File Downloads - The Intercept - 0 views

  • Canada’s leading surveillance agency is monitoring millions of Internet users’ file downloads in a dragnet search to identify extremists, according to top-secret documents. The covert operation, revealed Wednesday by CBC News in collaboration with The Intercept, taps into Internet cables and analyzes records of up to 15 million downloads daily from popular websites commonly used to share videos, photographs, music, and other files. The revelations about the spying initiative, codenamed LEVITATION, are the first from the trove of files provided by National Security Agency whistleblower Edward Snowden to show that the Canadian government has launched its own globe-spanning Internet mass surveillance system. According to the documents, the LEVITATION program can monitor downloads in several countries across Europe, the Middle East, North Africa, and North America. It is led by the Communications Security Establishment, or CSE, Canada’s equivalent of the NSA. (The Canadian agency was formerly known as “CSEC” until a recent name change.)
  • The latest disclosure sheds light on Canada’s broad existing surveillance capabilities at a time when the country’s government is pushing for a further expansion of security powers following attacks in Ottawa and Quebec last year. Ron Deibert, director of University of Toronto-based Internet security think tank Citizen Lab, said LEVITATION illustrates the “giant X-ray machine over all our digital lives.” “Every single thing that you do – in this case uploading/downloading files to these sites – that act is being archived, collected and analyzed,” Deibert said, after reviewing documents about the online spying operation for CBC News. David Christopher, a spokesman for Vancouver-based open Internet advocacy group OpenMedia.ca, said the surveillance showed “robust action” was needed to rein in the Canadian agency’s operations.
  • In a top-secret PowerPoint presentation, dated from mid-2012, an analyst from the agency jokes about how, while hunting for extremists, the LEVITATION system gets clogged with information on innocuous downloads of the musical TV series Glee. CSE finds some 350 “interesting” downloads each month, the presentation notes, a number that amounts to less than 0.0001 per cent of the total collected data. The agency stores details about downloads and uploads to and from 102 different popular file-sharing websites, according to the 2012 document, which describes the collected records as “free file upload,” or FFU, “events.” Only three of the websites are named: RapidShare, SendSpace, and the now defunct MegaUpload.
  • “The specific uses that they talk about in this [counter-terrorism] context may not be the problem, but it’s what else they can do,” said Tamir Israel, a lawyer with the University of Ottawa’s Canadian Internet Policy and Public Interest Clinic. Picking which downloads to monitor is essentially “completely at the discretion of CSE,” Israel added. The file-sharing surveillance also raises questions about the number of Canadians whose downloading habits could have been swept up as part of LEVITATION’s dragnet. By law, CSE isn’t allowed to target Canadians. In the LEVITATION presentation, however, two Canadian IP addresses that trace back to a web server in Montreal appear on a list of suspicious downloads found across the world. The same list includes downloads that CSE monitored in closely allied countries, including the United Kingdom, United States, Spain, Brazil, Germany and Portugal. It is unclear from the document whether LEVITATION has ever prevented any terrorist attacks. The agency cites only two successes of the program in the 2012 presentation: the discovery of a hostage video through a previously unknown target, and an uploaded document that contained the hostage strategy of a terrorist organization. The hostage in the discovered video was ultimately killed, according to public reports.
  • LEVITATION does not rely on cooperation from any of the file-sharing companies. A separate secret CSE operation codenamed ATOMIC BANJO obtains the data directly from internet cables that it has tapped into, and the agency then sifts out the unique IP address of each computer that downloaded files from the targeted websites. The IP addresses are valuable pieces of information to CSE’s analysts, helping to identify people whose downloads have been flagged as suspicious. The analysts use the IP addresses as a kind of search term, entering them into other surveillance databases that they have access to, such as the vast repositories of intercepted Internet data shared with the Canadian agency by the NSA and its British counterpart Government Communications Headquarters. If successful, the searches will return a list of results showing other websites visited by the people downloading the files – in some cases revealing associations with Facebook or Google accounts. In turn, these accounts may reveal the names and the locations of individual downloaders, opening the door for further surveillance of their activities.
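The analyst workflow described above, using a flagged IP address as a search key against other intercept databases, is essentially a join across logs. A toy Python sketch with invented data (none of these addresses or records come from the documents; real agency databases are obviously far larger and more structured):

```python
# Hypothetical flagged IPs from download ("FFU") events, plus a second
# database of intercepted web-visit records shared by partner agencies.
flagged_ips = {"192.0.2.44"}

web_visits = [
    ("192.0.2.44",  "mail.example.com/login?user=jdoe"),
    ("192.0.2.44",  "social.example.net/profile/jdoe"),
    ("198.51.100.3", "news.example.org"),
]

def correlate(flagged, visits):
    """For each flagged IP, list the other sites it was seen visiting;
    account URLs in the results can expose names and locations."""
    return {ip: [url for visit_ip, url in visits if visit_ip == ip]
            for ip in flagged}

hits = correlate(flagged_ips, web_visits)
assert "social.example.net/profile/jdoe" in hits["192.0.2.44"]
```

This is why a bare IP address is such a valuable pivot: one match against a download log can unlock a person's entire cross-database trail.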
Paul Merrell

GCHQ taps fibre-optic cables for secret access to world's communications | UK news | gu... - 0 views

  • Britain's spy agency GCHQ has secretly gained access to the network of cables which carry the world's phone calls and internet traffic and has started to process vast streams of sensitive personal information which it is sharing with its American partner, the National Security Agency (NSA). The sheer scale of the agency's ambition is reflected in the titles of its two principal components: Mastering the Internet and Global Telecoms Exploitation, aimed at scooping up as much online and telephone traffic as possible. This is all being carried out without any form of public acknowledgement or debate. One key innovation has been GCHQ's ability to tap into and store huge volumes of data drawn from fibre-optic cables for up to 30 days so that it can be sifted and analysed. That operation, codenamed Tempora, has been running for some 18 months.
  • GCHQ and the NSA are consequently able to access and process vast quantities of communications between entirely innocent people, as well as targeted suspects. This includes recordings of phone calls, the content of email messages, entries on Facebook and the history of any internet user's access to websites – all of which is deemed legal, even though the warrant system was supposed to limit interception to a specified range of targets. The existence of the programme has been disclosed in documents shown to the Guardian by the NSA whistleblower Edward Snowden as part of his attempt to expose what he has called "the largest programme of suspicionless surveillance in human history". "It's not just a US problem. The UK has a huge dog in this fight," Snowden told the Guardian. "They [GCHQ] are worse than the US."
  • However, on Friday a source with knowledge of intelligence argued that the data was collected legally under a system of safeguards, and had provided material that had led to significant breakthroughs in detecting and preventing serious crime. Britain's technical capacity to tap into the cables that carry the world's communications – referred to in the documents as special source exploitation – has made GCHQ an intelligence superpower. By 2010, two years after the project was first trialled, it was able to boast it had the "biggest internet access" of any member of the Five Eyes electronic eavesdropping alliance, comprising the US, UK, Canada, Australia and New Zealand. UK officials could also claim GCHQ "produces larger amounts of metadata than NSA". (Metadata describes basic information on who has been contacting whom, without detailing the content.) By May last year 300 analysts from GCHQ, and 250 from the NSA, had been assigned to sift through the flood of data. The Americans were given guidelines for its use, but were told in legal briefings by GCHQ lawyers: "We have a light oversight regime compared with the US".
  • When it came to judging the necessity and proportionality of what they were allowed to look for, would-be American users were told it was "your call". The Guardian understands that a total of 850,000 NSA employees and US private contractors with top secret clearance had access to GCHQ databases.
  • For the 2 billion users of the world wide web, Tempora represents a window on to their everyday lives, sucking up every form of communication from the fibre-optic cables that ring the world. The NSA has meanwhile opened a second window, in the form of the Prism operation, revealed earlier this month by the Guardian, from which it secured access to the internal systems of global companies that service the internet. The GCHQ mass tapping operation has been built up over five years by attaching intercept probes to transatlantic fibre-optic cables where they land on British shores carrying data to western Europe from telephone exchanges and internet servers in North America. This was done under secret agreements with commercial companies, described in one document as "intercept partners". The papers seen by the Guardian suggest some companies have been paid for the cost of their co-operation and GCHQ went to great lengths to keep their names secret. They were assigned "sensitive relationship teams" and staff were urged in one internal guidance paper to disguise the origin of "special source" material in their reports for fear that the role of the companies as intercept partners would cause "high-level political fallout".
  • "The criteria are security, terror, organised crime. And economic well-being. There's an auditing process to go back through the logs and see if it was justified or not. The vast majority of the data is discarded without being looked at … we simply don't have the resources." However, the legitimacy of the operation is in doubt. According to GCHQ's legal advice, it was given the go-ahead by applying old law to new technology. The 2000 Regulation of Investigatory Powers Act (Ripa) requires the tapping of defined targets to be authorised by a warrant signed by the home secretary or foreign secretary. However, an obscure clause allows the foreign secretary to sign a certificate for the interception of broad categories of material, as long as one end of the monitored communications is abroad. But the nature of modern fibre-optic communications means that a proportion of internal UK traffic is relayed abroad and then returns through the cables.
  • The categories of material have included fraud, drug trafficking and terrorism, but the criteria at any one time are secret and are not subject to any public debate. GCHQ's compliance with the certificates is audited by the agency itself, but the results of those audits are also secret. An indication of how broad the dragnet can be was laid bare in advice from GCHQ's lawyers, who said it would be impossible to list the total number of people targeted because "this would be an infinite list which we couldn't manage". There is an investigatory powers tribunal to look into complaints that the data gathered by GCHQ has been improperly used, but the agency reassured NSA analysts in the early days of the programme, in 2009: "So far they have always found in our favour".
  • Historically, the spy agencies have intercepted international communications by focusing on microwave towers and satellites. The NSA's intercept station at Menwith Hill in North Yorkshire played a leading role in this. One internal document quotes the head of the NSA, Lieutenant General Keith Alexander, on a visit to Menwith Hill in June 2008, asking: "Why can't we collect all the signals all the time? Sounds like a good summer project for Menwith." By then, however, satellite interception accounted for only a small part of the network traffic. Most of it now travels on fibre-optic cables, and the UK's position on the western edge of Europe gave it natural access to cables emerging from the Atlantic.
  • The processing centres apply a series of sophisticated computer programmes in order to filter the material through what is known as MVR – massive volume reduction. The first filter immediately rejects high-volume, low-value traffic, such as peer-to-peer downloads, which reduces the volume by about 30%. Others pull out packets of information relating to "selectors" – search terms including subjects, phone numbers and email addresses of interest. Some 40,000 of these were chosen by GCHQ and 31,000 by the NSA. Most of the information extracted is "content", such as recordings of phone calls or the substance of email messages. The rest is metadata.
  • The GCHQ documents that the Guardian has seen illustrate a constant effort to build up storage capacity at the stations at Cheltenham, Bude and at one overseas location, as well as a search for ways to maintain the agency's comparative advantage as the world's leading communications companies increasingly route their cables through Asia to cut costs. Meanwhile, technical work is ongoing to expand GCHQ's capacity to ingest data from new super cables carrying data at 100 gigabits a second. As one training slide told new users: "You are in an enviable position – have fun and make the most of it."
  • British spy agency collects and stores vast quantities of global email messages, Facebook posts, internet histories and calls, and shares them with NSA, latest documents from Edward Snowden reveal
  •  
    Note particularly that the British criteria add economic data to the list of categories the NSA trawls for, and that GCHQ shares its data with the U.S. NSA. Both agencies claim to be targeting foreigners, so now we're into the "we surveil your citizens; you surveil our citizens; then we'll share the results" scenario that leaves both sides of the pond with a superficial excuse to say "we don't surveil our own citizens, just foreigners." But it's just ring-around-the-rosy. 850,000 NSA employees and U.S. private contractors have access to GCHQ surveillance databases. Lots more in the article that I didn't highlight.
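The "massive volume reduction" processing described in the annotations above — a first filter that bulk-rejects high-volume, low-value traffic, then selector matching against phone numbers, email addresses and subjects of interest — is, in outline, a conventional two-stage stream filter. A minimal sketch in Python; the record format, protocol labels and selectors here are invented purely for illustration and come from nothing in the documents:

```python
# Illustrative two-stage filter in the spirit of the "MVR" pipeline
# described above. All field names, protocols and selectors are
# hypothetical; the real systems are classified.

LOW_VALUE_PROTOCOLS = {"p2p"}                        # stage 1: bulk reject
SELECTORS = {"alice@example.org", "+441234567890"}   # stage 2: terms of interest

def stage_one(records):
    """Discard high-volume, low-value traffic (e.g. peer-to-peer)."""
    return (r for r in records if r["protocol"] not in LOW_VALUE_PROTOCOLS)

def stage_two(records):
    """Keep only records matching at least one selector."""
    for r in records:
        if SELECTORS & set(r.get("identifiers", [])):
            yield r

def mvr(records):
    """Run both stages; only selector matches survive."""
    return list(stage_two(stage_one(records)))

traffic = [
    {"protocol": "p2p",  "identifiers": ["bob@example.net"]},     # rejected at stage 1
    {"protocol": "smtp", "identifiers": ["alice@example.org"]},   # matches a selector
    {"protocol": "http", "identifiers": ["carol@example.com"]},   # no selector match
]
print(mvr(traffic))  # only the smtp record remains
```

The point of the two-stage ordering is economy: the cheap bulk-reject pass shrinks the stream before the more expensive per-record selector matching runs.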
Paul Merrell

N.S.A. Able to Foil Basic Safeguards of Privacy on Web - NYTimes.com - 1 views

  • The National Security Agency is winning its long-running secret war on encryption, using supercomputers, technical trickery, court orders and behind-the-scenes persuasion to undermine the major tools protecting the privacy of everyday communications in the Internet age, according to newly disclosed documents.
  • The agency has circumvented or cracked much of the encryption, or digital scrambling, that guards global commerce and banking systems, protects sensitive data like trade secrets and medical records, and automatically secures the e-mails, Web searches, Internet chats and phone calls of Americans and others around the world, the documents show.
  • The N.S.A. hacked into target computers to snare messages before they were encrypted. In some cases, companies say they were coerced by the government into handing over their master encryption keys or building in a back door. And the agency used its influence as the world’s most experienced code maker to covertly introduce weaknesses into the encryption standards followed by hardware and software developers around the world.
  • “For the past decade, N.S.A. has led an aggressive, multipronged effort to break widely used Internet encryption technologies,” said a 2010 memo describing a briefing about N.S.A. accomplishments for employees of its British counterpart, Government Communications Headquarters, or GCHQ. “Cryptanalytic capabilities are now coming online. Vast amounts of encrypted Internet data which have up till now been discarded are now exploitable.”
  • Some of the agency’s most intensive efforts have focused on the encryption in universal use in the United States, including Secure Sockets Layer, or SSL; virtual private networks, or VPNs; and the protection used on fourth-generation, or 4G, smartphones. Many Americans, often without realizing it, rely on such protection every time they send an e-mail, buy something online, consult with colleagues via their company’s computer network, or use a phone or a tablet on a 4G network.
  • For at least three years, one document says, GCHQ, almost certainly in collaboration with the N.S.A., has been looking for ways into protected traffic of popular Internet companies: Google, Yahoo, Facebook and Microsoft’s Hotmail. By 2012, GCHQ had developed “new access opportunities” into Google’s systems, according to the document. (Google denied giving any government access and said it had no evidence its systems had been breached).
  • Paul Kocher, a leading cryptographer who helped design the SSL protocol, recalled how the N.S.A. lost the heated national debate in the 1990s about inserting into all encryption a government back door called the Clipper Chip. “And they went and did it anyway, without telling anyone,” Mr. Kocher said. He said he understood the agency’s mission but was concerned about the danger of allowing it unbridled access to private information.
  • The documents are among more than 50,000 shared by The Guardian with The New York Times and ProPublica, the nonprofit news organization. They focus on GCHQ but include thousands from or about the N.S.A. Intelligence officials asked The Times and ProPublica not to publish this article, saying it might prompt foreign targets to switch to new forms of encryption or communications that would be harder to collect or read. The news organizations removed some specific facts but decided to publish the article because of the value of a public debate about government actions that weaken the most powerful privacy tools.
  • The files show that the agency is still stymied by some encryption, as Mr. Snowden suggested in a question-and-answer session on The Guardian’s Web site in June. “Properly implemented strong crypto systems are one of the few things that you can rely on,” he said, though cautioning that the N.S.A. often bypasses the encryption altogether by targeting the computers at one end or the other and grabbing text before it is encrypted or after it is decrypted.
  • Because strong encryption can be so effective, classified N.S.A. documents make clear, the agency’s success depends on working with Internet companies — by getting their voluntary collaboration, forcing their cooperation with court orders or surreptitiously stealing their encryption keys or altering their software or hardware.
  • At Microsoft, as The Guardian has reported, the N.S.A. worked with company officials to get pre-encryption access to Microsoft’s most popular services, including Outlook e-mail, Skype Internet phone calls and chats, and SkyDrive, the company’s cloud storage service.
  • Simultaneously, the N.S.A. has been deliberately weakening the international encryption standards adopted by developers. One goal in the agency’s 2013 budget request was to “influence policies, standards and specifications for commercial public key technologies,” the most common encryption method. Cryptographers have long suspected that the agency planted vulnerabilities in a standard adopted in 2006 by the National Institute of Standards and Technology and later by the International Organization for Standardization, which has 163 countries as members. Classified N.S.A. memos appear to confirm that the fatal weakness, discovered by two Microsoft cryptographers in 2007, was engineered by the agency. The N.S.A. wrote the standard and aggressively pushed it on the international group, privately calling the effort “a challenge in finesse.” “Eventually, N.S.A. became the sole editor,” the memo says.
  • But the agencies’ goal was to move away from decrypting targets’ tools one by one and instead decode, in real time, all of the information flying over the world’s fiber optic cables and through its Internet hubs, only afterward searching the decrypted material for valuable intelligence. A 2010 document calls for “a new approach for opportunistic decryption, rather than targeted.” By that year, a Bullrun briefing document claims that the agency had developed “groundbreaking capabilities” against encrypted Web chats and phone calls. Its successes against Secure Sockets Layer and virtual private networks were gaining momentum.
  • Ladar Levison, the founder of Lavabit, wrote a public letter to his disappointed customers, offering an ominous warning. “Without Congressional action or a strong judicial precedent,” he wrote, “I would strongly recommend against anyone trusting their private data to a company with physical ties to the United States.”
  •  
    Lengthy article, lots of new information on NSA decryption capabilities, none of it good for those who value their data privacy.
  •  
    Thanks Paul - nice job cutting this monster down to size :)
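Snowden's caveat in the article — that properly implemented strong crypto still holds up, which is why the agency targets the endpoints instead — can be made concrete with a toy example. The sketch below uses a one-time pad, the textbook case of provably unbreakable encryption; it is not any system named in the documents, just an illustration of where the weak point sits:

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    """Toy one-time pad: XOR with a random key of equal length.
    With a truly random, never-reused key this is unbreakable on
    the wire -- the mathematics is not the weak point."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"              # plaintext exists here, at the endpoint
key = secrets.token_bytes(len(message))
ciphertext = otp(message, key)

# An eavesdropper tapping the cable sees only `ciphertext`.
# A compromised endpoint, however, sees `message` (and `key`)
# before any encryption happens -- which is why the documents
# describe grabbing text "before it is encrypted or after it is
# decrypted" rather than attacking the cipher itself.
assert otp(ciphertext, key) == message  # XOR decryption round-trips
```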
Paul Merrell

How a false witness helped the CIA make a case for torture | Al Jazeera America - 0 views

  • Buried amid details of “rectal rehydration” and waterboarding that dominated the headlines over last week’s Senate Intelligence Committee findings was an alarming detail: Both the committee’s summary report and its rebuttal by the CIA admit that a source whose claims were central to the July 2004 resumption of the torture program  — and, almost certainly, to authorizing the Internet dragnet collecting massive amounts of Americans’ email metadata — fabricated claims about an election year plot. Both the torture program and President Bush's warrantless wiretap program, Stellar Wind, were partly halted from March through June of 2004. That March, Assistant Attorney General Jack Goldsmith prepared to withdraw Pentagon authorization for torture, amid growing concern following the publication of pictures of detainee abuse at Iraq's Abu Ghraib, and a May 2004 CIA inspector general report criticizing a number of aspects of the Agency's interrogation program. On June 4, 2004, CIA Director George Tenet suspended the use of torture techniques.
  • During the same period, the DOJ lawyers who pushed to stop torture were also persuading President George W. Bush to halt aspects of Stellar Wind, a program that conducted warrantless wiretapping of Americans’ communications inside the U.S., on top of the Internet metadata. After a dramatic confrontation in the hospital room of Attorney General John Ashcroft on March 10, 2004, acting Attorney General Jim Comey and Goldsmith informed Bush there was no legal basis for parts of the program. Ultimately, Bush agreed to modify aspects of it, in part by halting the collection of Internet metadata. But even as Bush officials suspended that part of the program on March 26, they quickly set about finding legal cover for its resumption. One way they did so was by pointing to imminent threats — such as a planned election-season attack — in the United States.
  • The CIA in March 2004 received reporting from a source the torture report calls "Asset Y,” who said a known Al-Qaeda associate in Pakistan, Janat Gul — who the CIA at the time believed was a key facilitator — had set up a meeting between Asset Y and Al-Qaeda's finance chief, and was helping plan attacks inside the United States timed to coincide with the November 2004 elections. According to the report, CIA officers immediately expressed doubts about the veracity of the information they’d been given by Asset Y. A senior CIA officer called the report "vague" and "worthless in terms of actionable intelligence." He noted that Al Qaeda had already issued a statement “emphasizing a lack of desire to strike before the U.S. election” and suggested that since Al-Qaeda was aware that “threat reporting causes panic in Washington” and inevitably results in leaks, planting a false claim of an election season attack would be a good way for the network to test whether Asset Y was working for its enemies. Another officer, assigned to the group hunting Osama bin Laden, also expressed doubts. In its rebuttal to the Senate report, the CIA argues the agency was right to take seriously Asset Y’s reporting, in spite of those initial doubts. The CIA wrote numerous reports about the claim “even as we worked to resolve the inconsistencies.” Reports from detainee Hassan Ghul, who was captured in January 2004, supported the possibility that a cell of Al-Qaeda members in Pakistan’s tribal areas might be planning a plot of which he was unaware. And the CIA corroborated other parts of Asset Y's reporting.
  • Still, the CIA had one further reason for doubting claims that Gul was at the center of an Al-Qaeda election-year plot. Ghul told the CIA about an attempt by Gul, in the fall of 2003, to sell anti-aircraft missiles to Al-Qaeda; the Qaeda figure in Ghul’s story didn't even want to work with Gul. And Ghul later learned Gul was probably lying about his ability to acquire the missiles.
  • Nevertheless, the CIA took seriously Asset Y’s claim that Gul was involved in an election plot and moved quickly to gain custody of him after his arrest by Pakistan in June 2004. Even before CIA rendered Gul to its custody, Tenet started lobbying to get torture techniques reapproved for his interrogation. On June 29, Tenet wrote National Security Adviser Condoleezza Rice seeking approval to once again use some of the techniques whose use he had suspended less than four weeks earlier, in the hope of gathering information on the election season plot. "Given the magnitude of the danger posed by the pre-election plot and Gul's almost certain knowledge of any intelligence about that plot,” Tenet wrote, relying on Asset Y's claims, “I request the fastest possible resolution of the above issues." On July 20, according to the report, top administration officials gave CIA verbal approval to get back into the torture business. On July 22, Ashcroft stated that most previously approved interrogation techniques (though not waterboarding) would not violate U.S. law. And by the end of July, CIA started coaxing DOJ to approve other techniques — such as slapping someone in the stomach, hosing them down with cold water or limiting their food — which had already been used by the CIA but never officially approved by DOJ.
  • At the same time, the government was also using the ostensible election-season plot, among others, to persuade the Foreign Intelligence Surveillance Court (FISC) – the secret court that approves domestic spying on Americans – to authorize the Internet dragnet. After Bush halted the Internet dragnet on March 26, his aides began working with FISC presiding judge Colleen Kollar-Kotelly to find a way to use FISA authority -- normally used to access records for a single phone or Internet account -- to collect Internet metadata in bulk. They provided a series of briefings, including one attended by Terrorist Threat Integration Center head John Brennan and CIA Director George Tenet, to explain the threat. In addition, they provided what – under Stellar Wind – analysts called a “scary memo,” summarizing all the threats facing the country to underscore the urgency of the program. Tenet's declaration, included as an appendix to an application submitted in the days before July 14, 2004, laid out the threats CIA and others were fighting that summer.
  • Judge Kollar-Kotelly invoked Tenet's material in a redacted section of her opinion authorizing the Internet dragnet, pointing to it as a key reason to permit collection of what she called “enormous” amounts of data from innocent Americans.
  • Soon after the reauthorization of the torture program and the Internet dragnet, the CIA realized Asset Y's story wasn't true. By September, an officer involved in Janat Gul's interrogation observed, “we lack credible information that ties him to pre-election threat information or direct operational planning against the United States, at home or abroad.” In October, CIA reassessed Asset Y and found him to be deceptive. When pressured, Asset Y admitted he had made up the story of a meeting set up by Gul. Asset Y blamed his CIA handler for pressuring him for intelligence, leading him to lie about the meeting. By 2005, CIA had concluded that Asset Y was a fabricator, and Janat Gul was a “rather poorly educated village man [who is] quite lazy [who] was looking to make some easy money for little work and he was easily persuaded to move people and run errands for folks on our target list” (though the Agency wasn't always forthright with DOJ about that judgment). The torture program, which was resumed in part because of a perceived urgency of extracting information from Gul on a plot that didn't exist, continued for several more years. The Internet dragnet continued under FISC authorization, on and off, until December 2011. And several other still-active NSA programs, including the phone dragnet, relied on Kollar-Kotelly's earlier authorization as precedent – the case for which had also been derived, in part, from one long-discredited fabricator.