


Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant.
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
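The workflow described in that annotation — a selector plus a short free-text justification, with no further authorization step — can be sketched in a few lines of Python. This is a hypothetical illustration of the described process, not actual GCHQ tooling; the function names, record layout, and audit behaviour are all assumptions.

```python
# Hypothetical sketch of the search workflow described above: an analyst
# supplies a "selector" (an email address, IP address, username, or phone
# number) plus a short free-text justification, and the query runs against
# the metadata store with no further authorization check. All names and
# structures here are illustrative assumptions.
audit_log = []  # each search leaves only a (selector, justification) pair behind

def search_events(events, selector, justification):
    """events: list of metadata records (dicts). Returns matching records."""
    if not justification.strip():
        raise ValueError("a justification must be recorded for each search")
    audit_log.append((selector, justification))
    # Any record containing the selector in any field is returned.
    return [e for e in events if selector in e.values()]
```

The point the sketch makes is how thin the control is: the justification string is recorded but never evaluated, so it gates nothing.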
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying between three and 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements; your email, instant messenger, and social networking “buddy lists”; logs showing who you have communicated with by phone or email; the passwords you use to access “communications services” (such as an email account); and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
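The content/metadata split described above — full address is content, host alone is metadata — can be shown concretely. The snippet below is a sketch of that stated rule only, not of GCHQ’s actual processing pipeline.

```python
from urllib.parse import urlsplit

def classify_url(url):
    """Split a visited URL the way the documents describe:
    the host alone is treated as metadata, while the full
    address (host plus path) is treated as content."""
    parts = urlsplit(url)
    metadata = parts.netloc                  # e.g. "www.gchq.gov.uk"
    content = parts.netloc + parts.path      # e.g. "www.gchq.gov.uk/what_we_do"
    return metadata, content

meta, content = classify_url("https://www.gchq.gov.uk/what_we_do")
print(meta)     # www.gchq.gov.uk
print(content)  # www.gchq.gov.uk/what_we_do
```

Even the “metadata” half is revealing on its own: the host names a person visits over weeks already trace interests, beliefs, and habits, which is Zuckerman’s point.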
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explains in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
Paul Merrell

FCC Prepares to Re-Regulate Broadband Providers | Epicenter | Wired.com

  • Reversing a controversial deregulation decision made by the Bush administration, the FCC will seek to force broadband internet providers to adhere to some of the rules that have long applied to the nation’s landline phone providers. The decision will be announced officially tomorrow by FCC Chairman Julius Genachowski, according to a senior FCC official’s statement Wednesday, and will likely set off a firestorm of protests from the nation’s well-connected telecommunications industry. The FCC says the move is a response to a recent court ruling that called into question whether the FCC had authority to regulate how the nation’s broadband providers run their networks, including whether providers can block content. The ruling came in a case where Comcast appealed an FCC order that forbade the carrier from blocking peer-to-peer file sharing.
Paul Merrell

As Belgium threatens fines, Facebook's defence of tracking visitors rings hollow | nsnb...

  • Facebook has been ordered by a Belgian court to stop tracking non-Facebook users when they visit the Facebook site. Facebook has been given 48 hours to stop the tracking or face possible fines of up to 250,000 euros a day.
  • Facebook has said that it will appeal the ruling, claiming that since its European headquarters are situated in Ireland, it should be bound only by the Irish Data Protection Regulator. Facebook’s chief of security Alex Stamos has posted an explanation of why non-Facebook users are tracked when they visit the site. The tracking issue centres on the creation of a “cookie” called “datr” whenever anyone visits a Facebook page. This cookie contains an identification number that identifies the same browser returning each time to different Facebook pages. Once created, the cookie lasts two years unless the user explicitly deletes it. The cookie is created for all visitors to Facebook, irrespective of whether they are a Facebook user or whether they are logged into Facebook at the time. According to Stamos, the measure is needed to:
    - Prevent the creation of fake and spammy accounts
    - Reduce the risk of someone’s account being taken over by someone else
    - Protect people’s content from being stolen
    - Stop denial of service attacks against Facebook
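Based only on the behaviour described in that annotation, a “datr”-style cookie can be sketched in a few lines: a random browser identifier set on first visit and kept for two years. The names, lifetime handling, and server logic below are illustrative assumptions, not Facebook’s actual implementation.

```python
# Illustrative sketch of a "datr"-style identifier cookie, based only on
# the behaviour described above. Not Facebook's actual code.
import secrets
from http.cookies import SimpleCookie

TWO_YEARS = 60 * 60 * 24 * 365 * 2  # Max-Age in seconds

def set_datr_cookie(request_cookie_header: str) -> str:
    """Return a Set-Cookie header for first-time browsers, else ""."""
    existing = SimpleCookie(request_cookie_header)
    if "datr" in existing:
        return ""  # returning browser: identifier already present
    cookie = SimpleCookie()
    cookie["datr"] = secrets.token_urlsafe(16)  # random per-browser ID
    cookie["datr"]["max-age"] = TWO_YEARS
    cookie["datr"]["httponly"] = True           # not readable by page scripts
    return cookie.output(header="Set-Cookie:")
```

Note that nothing in this mechanism depends on having an account or being logged in, which is exactly why the Belgian court objected: the identifier is assigned to every visitor.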
  • The principle behind this is that if you can identify requests that arrive at the site for whatever reason, abnormal patterns may unmask people creating fake accounts, hijacking a real account, or issuing so many requests that they overwhelm the site. Stamos’ defence of tracking users is that Facebook has been using the cookie for the past five years and nobody had complained until now, that it is common practice, and that there is little harm because the data is not collected for any purpose other than security. The dilemma raised by Facebook’s actions is a common one in the conflicting spheres of maintaining privacy and maintaining security. It is obvious that if you can identify all visitors to a site, then it is possible to determine more about what they are doing than if they were anonymous. The problem with this from a moral perspective is that everyone is being tagged, irrespective of whether their intent is malicious or not. It is essentially compromising the privacy of the vast majority for the sake of a much smaller likelihood of bad behaviour.
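The security rationale in that annotation — stable identifiers make abnormal request patterns stand out — reduces to a simple frequency check. The threshold, data shape, and function below are made-up illustrations of the idea, not anyone’s production abuse detection.

```python
from collections import Counter

# Illustrative sketch only: once every visitor carries a stable identifier
# (such as a "datr"-style cookie value), unusually heavy request patterns
# stand out against ordinary browsing. Threshold and data are invented.
def flag_abnormal(request_log, threshold=100):
    """request_log: list of (identifier, url) tuples from one time window.
    Returns the identifiers whose request volume exceeds the threshold."""
    counts = Counter(ident for ident, _url in request_log)
    return {ident for ident, n in counts.items() if n > threshold}
```

The sketch also makes the moral trade-off concrete: the counter only works because every visitor, benign or not, is tagged and counted.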
  •  
    I checked and sure enough: five Facebook cookies even though I have no Facebook account. They're gone now, and I've created an exception blocking Facebook from planting more cookies on my systems. 
Paul Merrell

Google Caves to Russian Federal Antimonopoly Service, Agrees to Pay Fine - nsnbc intern...

  • Google ultimately caved to Russia’s Federal Antimonopoly Service, agreeing to pay $7.8 million (438 million rubles) for violating antitrust laws. The corporate Colossus will also pay two other fines totaling an additional $18,000 (1 million rubles) for failing to comply with past orders issued by state regulators. Last year Google caved to similar demands by the European Union.
  • In August 2016 Russia’s Federal Antimonopoly Service responded to a complaint by Russian search engine operator Yandex and fined the U.S.-based Google 438 million rubles for abusing its dominant market position to force manufacturers to make Google applications the default services on devices using Android. Regulators set the fine at 9 percent of Google’s reported profits on the Russian market in 2014, plus inflation. As in the European Union’s case, Google challenged the penalty in several appellate courts before finally agreeing this week to meet the government’s demands. The corporation also agreed to stop requiring manufacturers to install Google services as the default applications on Android-powered devices. The agreement is valid for six years and nine months, Russia’s Antimonopoly Service reported. Last year Google, after a protracted battle, caved to similar antitrust regulations by the European Union, but the internet giant has also come under fire elsewhere. In 2015 Australian treasurer Joe Hockey included Google in his list of corporate tax thieves. In January 2016 British lawmakers decided to grill Google over tax evasion. Google’s tax arrangements were compared to the Bermuda Triangle. One year ago the dispute between the European Union’s competition watchdog and Google culminated in the European Commission formally charging Google with abusing the dominant position of its Android mobile phone operating system, having launched an investigation in April 2015.
Paul Merrell

Google will 'de-rank' RT articles to make them harder to find - Eric Schmidt - RT World...

  • Eric Schmidt, the Executive Chairman of Google’s parent company Alphabet, says the company will “engineer” specific algorithms for RT and Sputnik to make their articles less prominent on the search engine’s news delivery services. “We are working on detecting and de-ranking those kinds of sites – it’s basically RT and Sputnik,” Schmidt said during a Q & A session at the Halifax International Security Forum in Canada on Saturday, when asked about whether Google facilitates “Russian propaganda.”
  • “We are well aware of it, and we are trying to engineer the systems to prevent that [the content being delivered to wide audiences]. But we don’t want to ban the sites – that’s not how we operate.” The discussion focused on the company’s popular Google News service, which clusters the news by stories, then ranks the various media outlets depending on their reach, article length and veracity, and Google Alerts, which proactively informs subscribers of new publications.
  • The Alphabet chief, who has been referred to by Hillary Clinton as a “longtime friend,” added that the experience of “the last year” showed that audiences could not be trusted to distinguish fake and real news for themselves. “We started with the default American view that ‘bad’ speech would be replaced with ‘good’ speech, but the problem found in the last year is that this may not be true in certain situations, especially when you have a well-funded opponent who is trying to actively spread this information,” he told the audience.
  • RT America registered under FARA earlier this month, after being threatened by the US Department of Justice with arrests and confiscations of property if it failed to comply. The broadcaster is fighting the order in court.