
Home/ Open Web/ Group items tagged Tor


Paul Merrell

No, Department of Justice, 80 Percent of Tor Traffic Is Not Child Porn | WIRED - 0 views

  • The debate over online anonymity, and all the whistleblowers, trolls, anarchists, journalists and political dissidents it enables, is messy enough. It doesn’t need the US government making up bogus statistics about how much that anonymity facilitates child pornography.
  • At the State of the Net conference in Washington on Tuesday, US assistant attorney general Leslie Caldwell discussed what she described as the dangers of encryption and cryptographic anonymity tools like Tor, and how those tools can hamper law enforcement. Her statements are the latest in a growing drumbeat of federal criticism of tech companies and software projects that provide privacy and anonymity at the expense of surveillance. And as an example of the grave risks presented by that privacy, she cited a study she said claimed an overwhelming majority of Tor’s anonymous traffic relates to pedophilia. “Tor obviously was created with good intentions, but it’s a huge problem for law enforcement,” Caldwell said in comments reported by Motherboard and confirmed to me by others who attended the conference. “We understand 80 percent of traffic on the Tor network involves child pornography.” That statistic is horrifying. It’s also baloney.
  • In a series of tweets that followed Caldwell’s statement, a Department of Justice flack said Caldwell was citing a University of Portsmouth study WIRED covered in December. He included a link to our story. But I made clear at the time that the study claimed 80 percent of traffic to Tor hidden services related to child pornography, not 80 percent of all Tor traffic. That is a huge, and important, distinction. The vast majority of Tor’s users run the free anonymity software while visiting conventional websites, using it to route their traffic through encrypted hops around the globe to avoid censorship and surveillance. But Tor also allows websites themselves to run within the network, as what’s known as a Tor hidden service. This collection of hidden sites, which comprises what’s often referred to as the “dark web,” uses Tor to obscure the physical location of the servers that run them. Visits to those dark web sites account for only 1.5 percent of all Tor traffic, according to the software’s creators at the non-profit Tor Project. The University of Portsmouth study dealt exclusively with visits to hidden services. In contrast to Caldwell’s 80 percent claim, the Tor Project’s director Roger Dingledine pointed out last month that the study’s pedophilia findings refer to something closer to a single percent of Tor’s overall traffic.
  • ...1 more annotation...
  • So to whoever at the Department of Justice is preparing these talking points for public consumption: Thanks for citing my story. Next time, please try reading it.
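The arithmetic behind the correction is worth making explicit. Using the figures quoted in the article (hidden-service visits are roughly 1.5 percent of all Tor traffic, and the Portsmouth study attributed about 80 percent of those visits to child-abuse material), a rough back-of-the-envelope sketch:

```python
# Back-of-the-envelope arithmetic using the figures quoted in the article.
# These are rough published estimates, not precise measurements.
hidden_service_share = 0.015  # share of all Tor traffic going to hidden services
study_share = 0.80            # study's share of hidden-service visits

overall_share = hidden_service_share * study_share
print(f"{overall_share:.1%}")  # prints 1.2%
```

Which lands at "closer to a single percent" of overall Tor traffic, as Dingledine said, not 80 percent.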
Paul Merrell

Use Tor or 'EXTREMIST' Tails Linux? Congrats, you're on the NSA's list * The Register - 0 views

  • Alleged leaked documents about the NSA's XKeyscore snooping software appear to show the paranoid agency is targeting Tor and Tails users, Linux Journal readers – and anyone else interested in online privacy. Apparently, this configuration file for XKeyscore is in the divulged data, which was obtained and studied by members of the Tor project and security specialists for German broadcasters NDR and WDR. In their analysis of the alleged top-secret documents, they claim the NSA is, among other things: specifically targeting Tor directory servers; reading email contents for mentions of Tor bridges; logging IP addresses used to search for privacy-focused websites and software; and possibly breaking international law in doing so. We already know from leaked Snowden documents that Western intelligence agents hate Tor for its anonymizing abilities. But what the aforementioned leaked source code, written in a rather strange custom language, shows is that not only is the NSA targeting the anonymizing network Tor specifically, it is also taking digital fingerprints of any netizens who are remotely interested in privacy.
  • These include readers of the Linux Journal site, anyone visiting the website for the Tor-powered Linux operating system Tails – described by the NSA as "a comsec mechanism advocated by extremists on extremist forums" – and anyone looking into combining Tails with the encryption tool Truecrypt. If something as innocuous as Linux Journal is on the NSA's hit list, it's a distinct possibility that El Reg is too, particularly in light of our recent exclusive report on GCHQ – which led to a Ministry of Defence advisor coming round our London office for a chat.
  • If you take even the slightest interest in online privacy or have Googled a Linux Journal article about a broken package, you are earmarked in an NSA database for further surveillance, according to these latest leaks. This is assuming the leaked file is genuine, of course. Other monitored sites, we're told, include HotSpotShield, FreeNet, Centurian, FreeProxies.org, MegaProxy, privacy.li and an anonymous email service called MixMinion. The IP addresses of computer users even looking at these sites are recorded and stored on the NSA's servers for further analysis, and it's up to the agency how long it keeps that data. The XKeyscore code, we're told, includes microplugins that target Tor servers in Germany, at MIT in the United States, in Sweden, in Austria, and in the Netherlands. In doing so it may not only fall foul of German law but also the US's Fourth Amendment.
  • ...2 more annotations...
  • The nine Tor directory servers receive especially close monitoring from the NSA's spying software, which states the "goal is to find potential Tor clients connecting to the Tor directory servers." Tor clients linking into the directory servers are also logged. "This shows that Tor is working well enough that Tor has become a target for the intelligence services," said Sebastian Hahn, who runs one of the key Tor servers. "For me this means that I will definitely go ahead with the project."
  • While the German reporting team has published part of the XKeyscore scripting code, it doesn't say where it comes from. NSA whistleblower Edward Snowden would be a logical pick, but security experts are not so sure. "I do not believe that this came from the Snowden documents," said security guru Bruce Schneier. "I also don't believe the TAO catalog came from the Snowden documents. I think there's a second leaker out there." If so, the NSA is in for much more scrutiny than it ever expected.
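The published XKeyscore snippets are written in a custom rule language, but the fingerprinting mechanism the reporting describes (flagging any observed request whose URL mentions a privacy tool, then storing the requester's IP address) amounts to simple pattern matching. A hedged illustration of that mechanism follows; the labels and site patterns below are stand-ins chosen from sites named in the article, not the leaked rules themselves:

```python
import re

# Illustrative stand-ins for the kind of URL fingerprints described in the
# reporting; these are NOT the actual leaked XKeyscore rules.
FINGERPRINTS = {
    "anonymizer/tor": re.compile(r"tails\.boum\.org|torproject\.org", re.I),
    "anonymizer/proxy": re.compile(r"hotspotshield|megaproxy|freeproxies", re.I),
    "anonymizer/email": re.compile(r"mixminion", re.I),
}

def fingerprint(url: str) -> list:
    """Return the label of every fingerprint pattern the URL matches."""
    return [label for label, pattern in FINGERPRINTS.items() if pattern.search(url)]

print(fingerprint("https://tails.boum.org/index.en.html"))  # prints ['anonymizer/tor']
print(fingerprint("https://news.example.com/"))             # prints []
```

The point the leak makes is how little it takes: a single substring match against observed traffic is enough to land a visitor's IP address in a database for "further surveillance."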
Paul Merrell

Edward Snowden Explains How To Reclaim Your Privacy - 0 views

  • Micah Lee: What are some operational security practices you think everyone should adopt? Just useful stuff for average people. Edward Snowden: [Opsec] is important even if you’re not worried about the NSA. Because when you think about who the victims of surveillance are, on a day-to-day basis, you’re thinking about people who are in abusive spousal relationships, you’re thinking about people who are concerned about stalkers, you’re thinking about children who are concerned about their parents overhearing things. It’s to reclaim a level of privacy. The first step that anyone could take is to encrypt their phone calls and their text messages. You can do that through the smartphone app Signal, by Open Whisper Systems. It’s free, and you can just download it immediately. And anybody you’re talking to now, their communications, if it’s intercepted, can’t be read by adversaries. [Signal is available for iOS and Android, and, unlike a lot of security tools, is very easy to use.] You should encrypt your hard disk, so that if your computer is stolen the information isn’t obtainable to an adversary — pictures, where you live, where you work, where your kids are, where you go to school. [I’ve written a guide to encrypting your disk on Windows, Mac, and Linux.] Use a password manager. One of the main things that gets people’s private information exposed, not necessarily to the most powerful adversaries, but to the most common ones, are data dumps. Your credentials may be revealed because some service you stopped using in 2007 gets hacked, and your password that you were using for that one site also works for your Gmail account. A password manager allows you to create unique passwords for every site that are unbreakable, but you don’t have the burden of memorizing them. [The password manager KeePassX is free, open source, cross-platform, and never stores anything in the cloud.]
  • The other thing there is two-factor authentication. The value of this is if someone does steal your password, or it’s left or exposed somewhere … [two-factor authentication] allows the provider to send you a secondary means of authentication — a text message or something like that. [If you enable two-factor authentication, an attacker needs both your password as the first factor and a physical device, like your phone, as your second factor, to login to your account. Gmail, Facebook, Twitter, Dropbox, GitHub, Battle.net, and tons of other services all support two-factor authentication.]
  • We should armor ourselves using systems we can rely on every day. This doesn’t need to be an extraordinary lifestyle change. It doesn’t have to be something that is disruptive. It should be invisible, it should be atmospheric, it should be something that happens painlessly, effortlessly. This is why I like apps like Signal, because they’re low friction. It doesn’t require you to re-order your life. It doesn’t require you to change your method of communications. You can use it right now to talk to your friends.
  • ...4 more annotations...
  • Lee: What do you think about Tor? Do you think that everyone should be familiar with it, or do you think that it’s only a use-it-if-you-need-it thing? Snowden: I think Tor is the most important privacy-enhancing technology project being used today. I use Tor personally all the time. We know it works from at least one anecdotal case that’s fairly familiar to most people at this point. That’s not to say that Tor is bulletproof. What Tor does is it provides a measure of security and allows you to disassociate your physical location. … But the basic idea, the concept of Tor that is so valuable, is that it’s run by volunteers. Anyone can create a new node on the network, whether it’s an entry node, a middle router, or an exit point, on the basis of their willingness to accept some risk. The voluntary nature of this network means that it is survivable, it’s resistant, it’s flexible. [Tor Browser is a great way to selectively use Tor to look something up and not leave a trace that you did it. It can also help bypass censorship when you’re on a network where certain sites are blocked. If you want to get more involved, you can volunteer to run your own Tor node, as I do, and support the diversity of the Tor network.]
  • Lee: So that is all stuff that everybody should be doing. What about people who have exceptional threat models, like future intelligence-community whistleblowers, and other people who have nation-state adversaries? Maybe journalists, in some cases, or activists, or people like that? Snowden: So the first answer is that you can’t learn this from a single article. The needs of every individual in a high-risk environment are different. And the capabilities of the adversary are constantly improving. The tooling changes as well. What really matters is to be conscious of the principles of compromise. How can the adversary, in general, gain access to information that is sensitive to you? What kinds of things do you need to protect? Because of course you don’t need to hide everything from the adversary. You don’t need to live a paranoid life, off the grid, in hiding, in the woods in Montana. What we do need to protect are the facts of our activities, our beliefs, and our lives that could be used against us in manners that are contrary to our interests. So when we think about this for whistleblowers, for example, if you witnessed some kind of wrongdoing and you need to reveal this information, and you believe there are people that want to interfere with that, you need to think about how to compartmentalize that.
  • Tell no one who doesn’t need to know. [Lindsay Mills, Snowden’s girlfriend of several years, didn’t know that he had been collecting documents to leak to journalists until she heard about it on the news, like everyone else.] When we talk about whistleblowers and what to do, you want to think about tools for protecting your identity, protecting the existence of the relationship from any type of conventional communication system. You want to use something like SecureDrop, over the Tor network, so there is no connection between the computer that you are using at the time — preferably with a non-persistent operating system like Tails, so you’ve left no forensic trace on the machine you’re using, which hopefully is a disposable machine that you can get rid of afterward, that can’t be found in a raid, that can’t be analyzed or anything like that — so that the only outcome of your operational activities are the stories reported by the journalists. [SecureDrop is a whistleblower submission system. Here is a guide to using The Intercept’s SecureDrop server as safely as possible.]
  • And this is to be sure that whoever has been engaging in this wrongdoing cannot distract from the controversy by pointing to your physical identity. Instead they have to deal with the facts of the controversy rather than the actors that are involved in it. Lee: What about for people who are, like, in a repressive regime and are trying to … Snowden: Use Tor. Lee: Use Tor? Snowden: If you’re not using Tor you’re doing it wrong. Now, there is a counterpoint here where the use of privacy-enhancing technologies in certain areas can actually single you out for additional surveillance through the exercise of repressive measures. This is why it’s so critical for developers who are working on security-enhancing tools to not make their protocols stand out.
  •  
    Lots more in the interview that I didn't highlight. This is a must-read.
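The two-factor mechanism Snowden and Lee describe is, for most authenticator apps, the time-based one-time-password algorithm of RFC 6238 (TOTP over RFC 4226 HOTP): the server and your phone share a secret key, and the code changes every 30 seconds. A minimal stdlib sketch of the algorithm, not any particular provider's implementation:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, at_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password: HOTP over a 30-second counter."""
    t = time.time() if at_time is None else at_time
    return hotp(key, int(t) // step, digits)

# RFC 6238 test vector: SHA-1, ASCII key "12345678901234567890", T=59s
print(totp(b"12345678901234567890", at_time=59, digits=8))  # prints 94287082
```

This is why the second factor helps even when a password leaks in a data dump: an attacker would also need the device holding the shared key to produce the current code.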
Paul Merrell

ExposeFacts - For Whistleblowers, Journalism and Democracy - 0 views

  • Launched by the Institute for Public Accuracy in June 2014, ExposeFacts.org represents a new approach for encouraging whistleblowers to disclose information that citizens need to make truly informed decisions in a democracy. From the outset, our message is clear: “Whistleblowers Welcome at ExposeFacts.org.” ExposeFacts aims to shed light on concealed activities that are relevant to human rights, corporate malfeasance, the environment, civil liberties and war. At a time when key provisions of the First, Fourth and Fifth Amendments are under assault, we are standing up for a free press, privacy, transparency and due process as we seek to reveal official information—whether governmental or corporate—that the public has a right to know. While no software can provide an ironclad guarantee of confidentiality, ExposeFacts—assisted by the Freedom of the Press Foundation and its “SecureDrop” whistleblower submission system—is utilizing the latest technology on behalf of anonymity for anyone submitting materials via the ExposeFacts.org website. As journalists we are committed to the goal of protecting the identity of every source who wishes to remain anonymous.
  • The seasoned editorial board of ExposeFacts will be assessing all the submitted material and, when deemed appropriate, will arrange for journalistic release of information. In exercising its judgment, the editorial board is able to call on the expertise of the ExposeFacts advisory board, which includes more than 40 journalists, whistleblowers, former U.S. government officials and others with wide-ranging expertise. We are proud that Pentagon Papers whistleblower Daniel Ellsberg was the first person to become a member of the ExposeFacts advisory board. The icon below links to a SecureDrop implementation for ExposeFacts overseen by the Freedom of the Press Foundation and is only accessible using the Tor browser. As the Freedom of the Press Foundation notes, no one can guarantee 100 percent security, but this provides a “significantly more secure environment for sources to get information than exists through normal digital channels, but there are always risks.” ExposeFacts follows all guidelines as recommended by Freedom of the Press Foundation, and whistleblowers should too; the SecureDrop onion URL should only be accessed with the Tor browser — and, for added security, while running the Tails operating system. Whistleblowers should not log in to SecureDrop from a home or office Internet connection, but rather from public wifi, preferably a connection they do not frequent. Whistleblowers should keep their interaction with whistleblowing-related websites to a minimum unless they are using such secure software.
  •  
    A new resource site for whistle-blowers, somewhat in the tradition of Wikileaks but designed for encrypted communications between whistleblowers and journalists. This one has an impressive board of advisors that includes several names I know and tend to trust, among them former whistle-blowers Daniel Ellsberg, Ray McGovern, Thomas Drake, William Binney, and Ann Wright. Leaked records can only be dropped from a web browser running the Tor anonymizer software, via the SecureDrop system originally developed by Aaron Swartz. They strongly recommend using the Tails secure operating system, which can be installed to a thumb drive and leaves no tracks on the host machine. https://tails.boum.org/index.en.html

    Curious, I downloaded Tails and installed it to a virtual machine. It's a heavily customized version of Debian with a very nice Gnome desktop, and it blocks any attempt to connect to an external network by means other than installed software that demands encrypted communications. For example, web sites can only be viewed via the Tor anonymizing proxy network. Pages do take longer to load because they move over a chain of proxies, but even so it's faster than pages loaded in the dial-up modem days, even for pages heavy with graphics, javascript, and other cruft; e.g., about 2 seconds for New York Times pages. All cookies are treated by default as session cookies, so they disappear when you close the page or the browser.

    I love my Linux Mint desktop, but I am thinking hard about switching that box to Tails. I've been looking for methods to send a lot more encrypted stuff down the pipe for NSA to store, and Tails looks to make that not only easy but unavoidable. From what I've gathered so far, if you want to install more software on Tails, it takes about an hour to create a customized version and then update your Tails installation from a new ISO file. Tails has a wonderful odor of having been designed for secure computing.
Paul Merrell

Internet privacy, funded by spooks: A brief history of the BBG | PandoDaily - 0 views

  • For the past few months I’ve been covering U.S. government funding of popular Internet privacy tools like Tor, CryptoCat and Open Whisper Systems. During my reporting, one agency in particular keeps popping up: An agency with one of those really bland names that masks its wild, bizarre history: the Broadcasting Board of Governors, or BBG. The BBG was formed in 1999 and runs on a $721 million annual budget. It reports directly to Secretary of State John Kerry and operates like a holding company for a host of Cold War-era CIA spinoffs and old school “psychological warfare” projects: Radio Free Europe, Radio Free Asia, Radio Martí, Voice of America, Radio Liberation from Bolshevism (since renamed “Radio Liberty”) and a dozen other government-funded radio stations and media outlets pumping out pro-American propaganda across the globe. Today, the Congressionally-funded federal agency is also one of the biggest backers of grassroots and open-source Internet privacy technology. These investments started in 2012, when the BBG launched the “Open Technology Fund” (OTF) — an initiative housed within and run by Radio Free Asia (RFA), a premier BBG property that broadcasts into communist countries like North Korea, Vietnam, Laos, China and Myanmar. The BBG endowed Radio Free Asia’s Open Technology Fund with a multimillion dollar budget and a single task: “to fulfill the U.S. Congressional global mandate for Internet freedom.”
  • Here’s a small sample of what the Broadcasting Board of Governors funded (through Radio Free Asia and then through the Open Technology Fund) between 2012 and 2014: Open Whisper Systems, maker of free encrypted text and voice mobile apps like TextSecure and Signal/RedPhone, got a generous $1.35-million infusion. (Facebook recently started using Open Whisper Systems to secure its WhatsApp messages.) CryptoCat, an encrypted chat app made by Nadim Kobeissi and promoted by EFF, received $184,000. LEAP, an email encryption startup, got just over $1 million. LEAP is currently being used to run secure VPN services at RiseUp.net, the radical anarchist communication collective. A Wikileaks alternative called GlobaLeaks (which was endorsed by the folks at Tor, including Jacob Appelbaum) received just under $350,000. The Guardian Project — which makes an encrypted chat app called ChatSecure, as well a mobile version of Tor called Orbot — got $388,500. The Tor Project received over $1 million from OTF to pay for security audits, traffic analysis tools and set up fast Tor exit nodes in the Middle East and South East Asia.
  •  
    But can we trust them?
Paul Merrell

After Paris Attacks, French Cops Want to Block Tor and Forbid Free Wi-Fi | Motherboard - 0 views

  • After the recent Paris terror attacks, French law enforcement wants to have several powers added to a proposed law, including the move to forbid and block the use of the Tor anonymity network, according to an internal document from the Ministry of Interior seen by French newspaper Le Monde. That document talks about two proposed pieces of legislation, one around the state of emergency, and the other concerning counter-terrorism. Regarding the former, French law enforcement wish to “Forbid free and shared wi-fi connections” during a state of emergency. This comes from a police opinion included in the document: the reason being that it is apparently difficult to track individuals who use public wi-fi networks. As for the latter, law enforcement would like “to block or forbid communications of the Tor network.” The legislation, according to Le Monde, could be presented as early as January 2016.
Paul Merrell

He Was a Hacker for the NSA and He Was Willing to Talk. I Was Willing to Listen. - 0 views

  • The message arrived at night and consisted of three words: “Good evening sir!” The sender was a hacker who had written a series of provocative memos at the National Security Agency. His secret memos had explained — with an earthy use of slang and emojis that was unusual for an operative of the largest eavesdropping organization in the world — how the NSA breaks into the digital accounts of people who manage computer networks, and how it tries to unmask people who use Tor to browse the web anonymously. Outlining some of the NSA’s most sensitive activities, the memos were leaked by Edward Snowden, and I had written about a few of them for The Intercept. There is no Miss Manners for exchanging pleasantries with a man the government has trained to be the digital equivalent of a Navy SEAL. Though I had initiated the contact, I was wary of how he might respond. The hacker had publicly expressed a visceral dislike for Snowden and had accused The Intercept of jeopardizing lives by publishing classified information. One of his memos outlined the ways the NSA reroutes (or “shapes”) the internet traffic of entire countries, and another memo was titled “I Hunt Sysadmins.” I felt sure he could hack anyone’s computer, including mine.
  • I got lucky with the hacker, because he recently left the agency for the cybersecurity industry; it would be his choice to talk, not the NSA’s. Fortunately, speaking out is his second nature.
  • ...7 more annotations...
  • The Lamb’s memos on cool ways to hunt sysadmins triggered a strong reaction when I wrote about them in 2014 with my colleague Ryan Gallagher. The memos explained how the NSA tracks down the email and Facebook accounts of systems administrators who oversee computer networks. After plundering their accounts, the NSA can impersonate the admins to get into their computer networks and pilfer the data flowing through them. As the Lamb wrote, “sys admins generally are not my end target. My end target is the extremist/terrorist or government official that happens to be using the network … who better to target than the person that already has the ‘keys to the kingdom’?” Another of his NSA memos, “Network Shaping 101,” used Yemen as a theoretical case study for secretly redirecting the entirety of a country’s internet traffic to NSA servers.
  • In recent years, two developments have helped make hacking for the government a lot more attractive than hacking for yourself. First, the Department of Justice has cracked down on freelance hacking, whether it be altruistic or malignant. If the DOJ doesn’t like the way you hack, you are going to jail. Meanwhile, hackers have been warmly invited to deploy their transgressive impulses in service to the homeland, because the NSA and other federal agencies have turned themselves into licensed hives of breaking into other people’s computers. For many, it’s a techno sandbox of irresistible delights, according to Gabriella Coleman, a professor at McGill University who studies hackers. “The NSA is a very exciting place for hackers because you have unlimited resources, you have some of the best talent in the world, whether it’s cryptographers or mathematicians or hackers,” she said. “It is just too intellectually exciting not to go there.”
  • He agreed to a video chat that turned into a three-hour discussion sprawling from the ethics of surveillance to the downsides of home improvements and the difficulty of securing your laptop.
  • “If I turn the tables on you,” I asked the Lamb, “and say, OK, you’re a target for all kinds of people for all kinds of reasons. How do you feel about being a target and that kind of justification being used to justify getting all of your credentials and the keys to your kingdom?” The Lamb smiled. “There is no real safe, sacred ground on the internet,” he replied. “Whatever you do on the internet is an attack surface of some sort and is just something that you live with. Any time that I do something on the internet, yeah, that is on the back of my mind. Anyone from a script kiddie to some random hacker to some other foreign intelligence service, each with their different capabilities — what could they be doing to me?”
  • “You know, the situation is what it is,” he said. “There are protocols that were designed years ago before anybody had any care about security, because when they were developed, nobody was foreseeing that they would be taken advantage of. … A lot of people on the internet seem to approach the problem [with the attitude of] ‘I’m just going to walk naked outside of my house and hope that nobody looks at me.’ From a security perspective, is that a good way to go about thinking? No, horrible … There are good ways to be more secure on the internet. But do most people use Tor? No. Do most people use Signal? No. Do most people use insecure things that most people can hack? Yes. Is that a bash against the intelligence community that people use stuff that’s easily exploitable? That’s a hard argument for me to make.”
  • I mentioned that lots of people, including Snowden, are now working on the problem of how to make the internet more secure, yet he seemed to do the opposite at the NSA by trying to find ways to track and identify people who use Tor and other anonymizers. Would he consider working on the other side of things? He wouldn’t rule it out, he said, but dismally suggested the game was over as far as having a liberating and safe internet, because our laptops and smartphones will betray us no matter what we do with them. “There’s the old adage that the only secure computer is one that is turned off, buried in a box ten feet underground, and never turned on,” he said. “From a user perspective, someone trying to find holes by day and then just live on the internet by night, there’s the expectation [that] if somebody wants to have access to your computer bad enough, they’re going to get it. Whether that’s an intelligence agency or a cybercrimes syndicate, whoever that is, it’s probably going to happen.”
  • There are precautions one can take, and I did that with the Lamb. When we had our video chat, I used a computer that had been wiped clean of everything except its operating system and essential applications. Afterward, it was wiped clean again. My concern was that the Lamb might use the session to obtain data from or about the computer I was using; there are a lot of things he might have tried, if he was in a scheming mood. At the end of our three hours together, I mentioned to him that I had taken these precautions—and he approved. “That’s fair,” he said. “I’m glad you have that appreciation. … From a perspective of a journalist who has access to classified information, it would be remiss to think you’re not a target of foreign intelligence services.” He was telling me the U.S. government should be the least of my worries. He was trying to help me. Documents published with this article: Tracking Targets Through Proxies & Anonymizers Network Shaping 101 Shaping Diagram I Hunt Sys Admins (first published in 2014)
Paul Merrell

Visit the Wrong Website, and the FBI Could End Up in Your Computer | Threat Level | WIRED - 0 views

  • Security experts call it a “drive-by download”: a hacker infiltrates a high-traffic website and then subverts it to deliver malware to every single visitor. It’s one of the most powerful tools in the black hat arsenal, capable of delivering thousands of fresh victims into a hacker’s clutches within minutes. Now the technique is being adopted by a different kind of a hacker—the kind with a badge. For the last two years, the FBI has been quietly experimenting with drive-by hacks as a solution to one of law enforcement’s knottiest Internet problems: how to identify and prosecute users of criminal websites hiding behind the powerful Tor anonymity system. The approach has borne fruit—over a dozen alleged users of Tor-based child porn sites are now headed for trial as a result. But it’s also engendering controversy, with charges that the Justice Department has glossed over the bulk-hacking technique when describing it to judges, while concealing its use from defendants. Critics also worry about mission creep, the weakening of a technology relied on by human rights workers and activists, and the potential for innocent parties to wind up infected with government malware because they visited the wrong website. “This is such a big leap, there should have been congressional hearings about this,” says ACLU technologist Chris Soghoian, an expert on law enforcement’s use of hacking tools. “If Congress decides this is a technique that’s perfectly appropriate, maybe that’s OK. But let’s have an informed debate about it.”
  • The FBI’s use of malware is not new. The bureau calls the method an NIT, for “network investigative technique,” and the FBI has been using it since at least 2002 in cases ranging from computer hacking to bomb threats, child porn to extortion. Depending on the deployment, an NIT can be a bulky full-featured backdoor program that gives the government access to your files, location, web history and webcam for a month at a time, or a slim, fleeting wisp of code that sends the FBI your computer’s name and address, and then evaporates. What’s changed is the way the FBI uses its malware capability, deploying it as a driftnet instead of a fishing line. And the shift is a direct response to Tor, the powerful anonymity system endorsed by Edward Snowden and the State Department alike.
Paul Merrell

The Latest Rules on How Long NSA Can Keep Americans' Encrypted Data Look Too Familiar |... - 0 views

  • Does the National Security Agency (NSA) have the authority to collect and keep all encrypted Internet traffic for as long as is necessary to decrypt that traffic? That was a question first raised in June 2013, after the minimization procedures governing telephone and Internet records collected under Section 702 of the Foreign Intelligence Surveillance Act were disclosed by Edward Snowden. The issue quickly receded into the background, however, as the world struggled to keep up with the deluge of surveillance disclosures. The Intelligence Authorization Act of 2015, which passed Congress this last December, should bring the question back to the fore. It established retention guidelines for communications collected under Executive Order 12333 and included an exception that allows NSA to keep ‘incidentally’ collected encrypted communications for an indefinite period of time. This creates a massive loophole in the guidelines. NSA’s retention of encrypted communications deserves further consideration today, now that these retention guidelines have been written into law. It has become increasingly clear over the last year that surveillance reform will be driven by technological change—specifically by the growing use of encryption technologies. Therefore, any legislation touching on encryption should receive close scrutiny.
  • Section 309 of the intel authorization bill describes “procedures for the retention of incidentally acquired communications.” It establishes retention guidelines for surveillance programs that are “reasonably anticipated to result in the acquisition of [telephone or electronic communications] to or from a United States person.” Communications to or from a United States person are ‘incidentally’ collected because the U.S. person is not the actual target of the collection. Section 309 states that these incidentally collected communications must be deleted after five years unless they meet a number of exceptions. One of these exceptions is that “the communication is enciphered or reasonably believed to have a secret meaning.” This exception appears to be directly lifted from NSA’s minimization procedures for data collected under Section 702 of FISA, which were declassified in 2013. 
  • While Section 309 specifically applies to collection taking place under E.O. 12333, not FISA, several of the exceptions described in Section 309 closely match exceptions in the FISA minimization procedures. That includes the exception for “enciphered” communications. Those minimization procedures almost certainly served as a model for these retention guidelines and will likely shape how this new language is interpreted by the Executive Branch. Section 309 also asks the heads of each relevant member of the intelligence community to develop procedures to ensure compliance with new retention requirements. I expect those procedures to look a lot like the FISA minimization guidelines.
  • ...6 more annotations...
  • This language is broad, circular, and technically incoherent, so it takes some effort to parse appropriately. When the minimization procedures were disclosed in 2013, this language was interpreted by outside commentators to mean that NSA may keep all encrypted data that has been incidentally collected under Section 702 for at least as long as is necessary to decrypt that data. Is this the correct interpretation? I think so. It is important to realize that the language above isn’t just broad. It seems purposefully broad. The part regarding relevance seems to mirror the rationale NSA has used to justify its bulk phone records collection program. Under that program, all phone records were relevant because some of those records could be valuable to terrorism investigations and (allegedly) it isn’t possible to collect only those valuable records. This is the “to find a needle in a haystack, you first have to have the haystack” argument. The same argument could be applied to encrypted data and might be at play here.
  • This exception doesn’t just apply to encrypted data that might be relevant to a current foreign intelligence investigation. It also applies to cases in which the encrypted data is likely to become relevant to a future intelligence requirement. This is some remarkably generous language. It seems one could justify keeping any type of encrypted data under this exception. Upon close reading, it is difficult to avoid the conclusion that these procedures were written carefully to allow NSA to collect and keep a broad category of encrypted data under the rationale that this data might contain the communications of NSA targets and that it might be decrypted in the future. If NSA isn’t doing this today, then whoever wrote these minimization procedures wanted to at least ensure that NSA has the authority to do this tomorrow.
  • There are a few additional observations that are worth making regarding these nominally new retention guidelines and Section 702 collection. First, the concept of incidental collection as it has typically been used makes very little sense when applied to encrypted data. The way that NSA’s Section 702 upstream “about” collection is understood to work is that technology installed on the network does some sort of pattern match on Internet traffic; say that an NSA target uses example@gmail.com to communicate. NSA would then search content of emails for references to example@gmail.com. This could notionally result in a lot of incidental collection of U.S. persons’ communications whenever the email that references example@gmail.com is somehow mixed together with emails that have nothing to do with the target. This type of incidental collection isn’t possible when the data is encrypted because it won’t be possible to search and find example@gmail.com in the body of an email. Instead, example@gmail.com will have been turned into some alternative, indecipherable string of bits on the network. Incidental collection shouldn’t occur because the pattern match can’t occur in the first place. This demonstrates that, when communications are encrypted, it will be much harder for NSA to search Internet traffic for a unique ID associated with a specific target.
  • This lends further credence to the conclusion above: rather than doing targeted collection against specific individuals, NSA is collecting, or plans to collect, a broad class of data that is encrypted. For example, NSA might collect all PGP encrypted emails or all Tor traffic. In those cases, NSA could search Internet traffic for patterns associated with specific types of communications, rather than specific individuals’ communications. This would technically meet the definition of incidental collection because such activity would result in the collection of communications of U.S. persons who aren’t the actual targets of surveillance. Collection of all Tor traffic would entail a lot of this “incidental” collection because the communications of NSA targets would be mixed with the communications of a large number of non-target U.S. persons. However, this “incidental” collection is inconsistent with how the term is typically used, which is to refer to over-collection resulting from targeted surveillance programs. If NSA were collecting all Tor traffic, that activity wouldn’t actually be targeted, and so any resulting over-collection wouldn’t actually be incidental. Moreover, greater use of encryption by the general public would result in an ever-growing amount of this type of incidental collection.
  • This type of collection would also be inconsistent with representations of Section 702 upstream collection that have been made to the public and to Congress. Intelligence officials have repeatedly suggested that search terms used as part of this program have a high degree of specificity. They have also argued that the program is an example of targeted rather than bulk collection. ODNI General Counsel Robert Litt, in a March 2014 meeting before the Privacy and Civil Liberties Oversight Board, stated that “there is either a misconception or a mischaracterization commonly repeated that Section 702 is a form of bulk collection. It is not bulk collection. It is targeted collection based on selectors such as telephone numbers or email addresses where there’s reason to believe that the selector is relevant to a foreign intelligence purpose.” The collection of Internet traffic based on patterns associated with types of communications would be bulk collection, more akin to NSA’s collection of phone records en masse than it is to targeted collection focused on specific individuals. Moreover, this type of collection would certainly fall within the definition of bulk collection provided just last week by the National Academy of Sciences: “collection in which a significant portion of the retained data pertains to identifiers that are not targets at the time of collection.”
  • The Section 702 minimization procedures, which will serve as a template for any new retention guidelines established for E.O. 12333 collection, create a large loophole for encrypted communications. With everything from email to Internet browsing to real-time communications moving to encrypted formats, an ever-growing amount of Internet traffic will fall within this loophole.
  •  
    Tucked into a budget authorization act in December without press notice. Section 309 (the Act is linked from the article) appears to be very broad authority for the NSA to intercept any form of telephone or other electronic information in bulk. There are far more exceptions from the five-year retention limitation than the encrypted information exception. When reading this, keep in mind that the U.S. intelligence community plays semantic games to obfuscate what it does. One of its word plays is that communications are not "collected" until an analyst looks at or listens to particular data, even though the data will be searched to find information countless times before it becomes "collected." That searching was the major basis for a decision by the U.S. District Court in Washington, D.C. that bulk collection of telephone communications was unconstitutional: Under the Fourth Amendment, a "search" or "seizure" requiring a judicial warrant occurs no later than when the information is intercepted. That case is on appeal, has been briefed and argued, and a decision could come any time now. Similar cases are pending in two other courts of appeals. Also, an important definition from the new Intelligence Authorization Act: "(a) DEFINITIONS.-In this section: (1) COVERED COMMUNICATION.-The term ''covered communication'' means any nonpublic telephone or electronic communication acquired without the consent of a person who is a party to the communication, including communications in electronic storage."
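The article's point about why "about" collection breaks down on encrypted traffic can be shown concretely. The sketch below uses a toy XOR keystream cipher (illustration only, not a vetted cipher and not any agency's actual system): a substring scan for a selector like example@gmail.com succeeds on the plaintext but finds nothing in the ciphertext, so the pattern match that incidental collection depends on never fires.

```python
import hashlib
from itertools import count

def _keystream(key: bytes, n: int) -> bytes:
    # Toy keystream: concatenated SHA-256 blocks of key || counter.
    out = b""
    for i in count():
        if len(out) >= n:
            break
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()
    return out[:n]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR against the keystream; applying it twice round-trips the data.
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

email = b"To: example@gmail.com\r\nSubject: meeting\r\n\r\nSee you at noon."
ciphertext = xor_cipher(b"session key", email)

# An upstream "about" filter can match the selector in the plaintext...
assert b"example@gmail.com" in email
# ...but the same substring scan finds nothing in the ciphertext.
assert b"example@gmail.com" not in ciphertext
# Only a party holding the key recovers the searchable form.
assert xor_cipher(b"session key", ciphertext) == email
```

This is exactly why retaining the encrypted blob indefinitely, rather than filtering it at collection time, is the loophole the commentary describes.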
Gary Edwards

Opt out of PRISM, the NSA's global data surveillance program - PRISM BREAK - 0 views

  •  
    "Opt out of PRISM, the NSA's global data surveillance program. Stop reporting your online activities to the American government with these free alternatives to proprietary software." A designer named Peng Zhong is so strongly opposed to PRISM, the NSA's domestic spying program, that he created a site to educate people on how to "opt out" of it. According to the original report that brought PRISM to public attention, the nine companies that "participate knowingly" with the NSA are Microsoft, Yahoo, Google, Facebook, PalTalk, AOL, Skype, YouTube, and Apple. Zhong's approach is to replace your workflow with open-source tools that aren't attached to these companies, since they easily stay off the government's radar. If you want to drop totally off the map, it'll take quite a commitment.   Are you ready to give up your operating system?  The NSA tracks everything on Windows, OSX and Google Chrome.  You will need to switch to Debian or some other brand of GNU Linux!  Like Mint!!!!! Personally I have switched from Google Chrome Browser to Mozilla Firefox using the TOR Browser Bundle - Private mode.
Gary Edwards

Google's uProxy could help fight Internet censorship - 0 views

  •  
    "At its Ideas Summit in New York, Google has announced that it is working on developing a browser extension that will act as an easy-to-use way to bypass country-specific Internet censorship and make connections safer and more private. Safer connections The tool, which was developed by the University of Washington and seeded by Google, is at its core a peer-to-peer personalized virtual private network (VPN) that redirects Internet traffic coming from an initial, less secure connection through a second, trusted connection, and then encrypts the pathway between the two terminals. Whenever you access the Internet, the connection is routed through a number of terminals. At each step of the way the connection may be blocked, surveilled, or even tampered with (especially if the data is not encrypted). On the whole, the safety and privacy of your data is only as good as the weakest link in the chain. Google's solution with uProxy was to develop a tool that makes it much easier to make an unsafe connection more secure, with the help of a trusted friend. The software, which will be available as a Chrome and Firefox extension to begin with, can use existing social networks like Facebook or Google Hangouts to help find users who already have uProxy installed on their system. If two users agree to use the service in tandem, the software can begin to make data connections safer. How it works Let's assume that Alice, who lives in a country with an Internet censorship problem such as China or Iran, contacts Bob, who has much safer, or uncensored, or unmonitored access to the Internet. Bob agrees to act as a proxy for Alice, and as long as his browser is open, Alice's outgoing web traffic will now be routed through Bob's connection, and so she'll now be able to access websites that she wouldn't otherwise be able to reach on her own. The connection between Alice and Bob is also encrypted. To an external observer looking at Bob's connection, it would appear that he is simply s
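At its core, the Alice-and-Bob arrangement described above is a bidirectional byte relay running on the trusted friend's machine. Here is a minimal stdlib-only sketch of that relay role; the names `pipe` and `relay` are illustrative, and uProxy itself adds peer discovery, authentication, and encryption of the Alice-Bob leg, none of which is modeled here.

```python
import socket
import threading

def pipe(src: socket.socket, dst: socket.socket) -> None:
    # Copy bytes one way until src reaches EOF, then signal EOF downstream.
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)
    try:
        dst.shutdown(socket.SHUT_WR)
    except OSError:
        pass  # peer already closed

def relay(client: socket.socket, upstream: socket.socket) -> None:
    # Bob's role: shuttle traffic both ways between Alice (client) and
    # the destination she cannot reach directly (upstream).
    threads = [
        threading.Thread(target=pipe, args=(client, upstream)),
        threading.Thread(target=pipe, args=(upstream, client)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Alice's traffic enters on the `client` end, exits through Bob's ordinary uncensored connection as `upstream`, and responses flow back the same way, which is why, to the censor, Alice's requests appear to originate from Bob.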
Paul Merrell

Google Chrome Listening In To Your Room Shows The Importance Of Privacy Defense In Depth - 0 views

  • Yesterday, news broke that Google has been stealth downloading audio listeners onto every computer that runs Chrome, and transmits audio data back to Google. Effectively, this means that Google had taken for itself the right to listen to every conversation in every room that runs Chrome somewhere, without any kind of consent from the people eavesdropped on. In official statements, Google shrugged off the practice with what amounts to “we can do that”. It looked like just another bug report. "When I start Chromium, it downloads something." Followed by strange status information that notably included the lines "Microphone: Yes" and "Audio Capture Allowed: Yes".
  • Without consent, Google’s code had downloaded a black box of code that – according to itself – had turned on the microphone and was actively listening to your room. A brief explanation of the Open-source / Free-software philosophy is needed here. When you’re installing a version of GNU/Linux like Debian or Ubuntu onto a fresh computer, thousands of really smart people have analyzed every line of human-readable source code before that operating system was built into computer-executable binary code, to make it common and open knowledge what the machine actually does instead of trusting corporate statements on what it’s supposed to be doing. Therefore, you don’t install black boxes onto a Debian or Ubuntu system; you use software repositories that have gone through this source-code audit-then-build process. Maintainers of operating systems like Debian and Ubuntu use many so-called “upstreams” of source code to build the final product. Chromium, the open-source version of Google Chrome, had abused its position as trusted upstream to insert lines of source code that bypassed this audit-then-build process, and which downloaded and installed a black box of unverifiable executable code directly onto computers, essentially rendering them compromised. We don’t know and can’t know what this black box does. But we see reports that the microphone has been activated, and that Chromium considers audio capture permitted.
  • This was supposedly to enable the “Ok, Google” behavior – that when you say certain words, a search function is activated. Certainly a useful feature. Certainly something that enables eavesdropping of every conversation in the entire room, too. Obviously, your own computer isn’t the one to analyze the actual search command. Google’s servers do. Which means that your computer had been stealth configured to send what was being said in your room to somebody else, to a private company in another country, without your consent or knowledge, an audio transmission triggered by… an unknown and unverifiable set of conditions. Google had two responses to this. The first was to introduce a practically-undocumented switch to opt out of this behavior, which is not a fix: the default install will still wiretap your room without your consent, unless you opt out, and more importantly, know that you need to opt out, which is nowhere a reasonable requirement. But the second was more of an official statement following technical discussions on Hacker News and other places. That official statement amounted to three parts (paraphrased, of course):
  • ...4 more annotations...
  • Of course, people were quick to downplay the alarm. “It only listens when you say ‘Ok, Google’.” (Ok, so how does it know to start listening just before I’m about to say ‘Ok, Google?’) “It’s no big deal.” (A company stealth installs an audio listener that listens to every room in the world it can, and transmits audio data to the mothership when it encounters an unknown, possibly individually tailored, list of keywords – and it’s no big deal!?) “You can opt out. It’s in the Terms of Service.” (No. Just no. This is not something that is the slightest amount of permissible just because it’s hidden in legalese.) “It’s opt-in. It won’t really listen unless you check that box.” (Perhaps. We don’t know, Google just downloaded a black box onto my computer. And it may not be the same black box as was downloaded onto yours.) Early last decade, privacy activists practically yelled and screamed that the NSA’s taps of various points of the Internet and telecom networks had the technical potential for enormous abuse against privacy. Everybody else dismissed those points as basically tinfoilhattery – until the Snowden files came out, and it was revealed that precisely everybody involved had abused their technical capability for invasion of privacy as far as was possible. Perhaps it would be wise to not repeat that exact mistake. Nobody, and I really mean nobody, is to be trusted with a technical capability to listen to every room in the world, with listening profiles customizable at the identified-individual level, on the mere basis of “trust us”.
  • If you think this is an excusable and responsible statement, raise your hand now. Now, it should be noted that this was Chromium, the open-source version of Chrome. If somebody downloads the Google product Google Chrome, as in the prepackaged binary, you don’t even get a theoretical choice. You’re already downloading a black box from a vendor. In Google Chrome, this is all included from the start. This episode highlights the need for hard, not soft, switches to all devices – webcams, microphones – that can be used for surveillance. A software on/off switch for a webcam is no longer enough, a hard shield in front of the lens is required. A software on/off switch for a microphone is no longer enough, a physical switch that breaks its electrical connection is required. That’s how you defend against this in depth.
  • 1) Yes, we’re downloading and installing a wiretapping black-box to your computer. But we’re not actually activating it. We did take advantage of our position as trusted upstream to stealth-insert code into open-source software that installed this black box onto millions of computers, but we would never abuse the same trust in the same way to insert code that activates the eavesdropping-blackbox we already downloaded and installed onto your computer without your consent or knowledge. You can look at the code as it looks right now to see that the code doesn’t do this right now. 2) Yes, Chromium is bypassing the entire source code auditing process by downloading a pre-built black box onto people’s computers. But that’s not something we care about, really. We’re concerned with building Google Chrome, the product from Google. As part of that, we provide the source code for others to package if they like. Anybody who uses our code for their own purpose takes responsibility for it. When this happens in a Debian installation, it is not Google Chrome’s behavior, this is Debian Chromium’s behavior. It’s Debian’s responsibility entirely. 3) Yes, we deliberately hid this listening module from the users, but that’s because we consider this behavior to be part of the basic Google Chrome experience. We don’t want to show all modules that we install ourselves.
  • Privacy remains your own responsibility.
  •  
    And of course, Google would never succumb to a subpoena requiring it to turn over the audio stream to the NSA. The Tor Browser just keeps looking better and better. https://www.torproject.org/projects/torbrowser.html.en
Paul Merrell

'Let's Encrypt' Project Strives To Make Encryption Simple - Slashdot - 0 views

  •  
    The blurb is a bit misleading. This is a project that's been under way since last year; what's new is that they're moving under the Linux Foundation umbrella for various non-technical support purposes. By sometime this summer, encrypting web site data and broadcasting it over https is slated to become a two-click process. Or on the Linux command line: $ sudo apt-get install lets-encrypt $ lets-encrypt example.com This is a project that grew out of public disgust with NSA surveillance, designed to flood the NSA (and other bad actors) with so much encrypted data that they will be able to decrypt only a tiny fraction (decryption without the decryption key takes gobs of computer cycles).  The other half of the solution is already available, the HTTPS Everywhere extension for the Chrome, Firefox, and Opera web browsers by the Electronic Frontier Foundation and the TOR Project that translates your every request for an http address into an effort to connect to an https address preferentially before establishing an http connection if https is not available. HTTPS Everywhere is fast and does not noticeably add to your page loading time. If you'd like to effortlessly improve your online security and help burden NSA, install HTTPS Everywhere. Get it at https://www.eff.org/https-everywhere
Paul Merrell

What to Do About Lawless Government Hacking and the Weakening of Digital Security | Ele... - 0 views

  • In our society, the rule of law sets limits on what government can and cannot do, no matter how important its goals. To give a simple example, even when chasing a fleeing murder suspect, the police have a duty not to endanger bystanders. The government should pay the same care to our safety in pursuing threats online, but right now we don’t have clear, enforceable rules for government activities like hacking and “digital sabotage.” And this is no abstract question—these actions increasingly endanger everyone’s security.
  • The problem became especially clear this year during the San Bernardino case, involving the FBI’s demand that Apple rewrite its iOS operating system to defeat security features on a locked iPhone. Ultimately the FBI exploited an existing vulnerability in iOS and accessed the contents of the phone with the help of an "outside party." Then, with no public process or discussion of the tradeoffs involved, the government refused to tell Apple about the flaw. Despite the obvious fact that the security of the computers and networks we all use is both collective and interwoven—other iPhones used by millions of innocent people presumably have the same vulnerability—the government chose to withhold information Apple could have used to improve the security of its phones. Other examples include intelligence activities like Stuxnet and Bullrun, and law enforcement investigations like the FBI’s mass use of malware against Tor users engaged in criminal behavior. These activities are often disproportionate to stopping legitimate threats, resulting in unpatched software for millions of innocent users, overbroad surveillance, and other collateral effects.  That’s why we’re working on a positive agenda to confront governmental threats to digital security. Put more directly, we’re calling on lawyers, advocates, technologists, and the public to demand a public discussion of whether, when, and how governments can be empowered to break into our computers, phones, and other devices; sabotage and subvert basic security protocols; and stockpile and exploit software flaws and vulnerabilities.  
  • Smart people in academia and elsewhere have been thinking and writing about these issues for years. But it’s time to take the next step and make clear, public rules that carry the force of law to ensure that the government weighs the tradeoffs and reaches the right decisions. This long post outlines some of the things that can be done. It frames the issue, then describes some of the key areas where EFF is already pursuing this agenda—in particular formalizing the rules for disclosing vulnerabilities and setting out narrow limits for the use of government malware. Finally it lays out where we think the debate should go from here.   
  •  
    It's not often that I disagree with EFF's positions, but on this one I do. The government should be prohibited from exploiting computer vulnerabilities and should be required to immediately report all vulnerabilities discovered to the relevant developers of hardware or software. It's been one long slippery slope since the Supreme Court first approved wiretapping in Olmstead v. United States, 277 U.S. 438 (1928), https://goo.gl/NJevsr. Left undecided to this day is whether we have a right to whisper privately, a right that is undeniable. All communications intercept cases since Olmstead fly directly in the face of that right.
Paul Merrell

We're Halfway to Encrypting the Entire Web | Electronic Frontier Foundation - 0 views

  • The movement to encrypt the web has reached a milestone. As of earlier this month, approximately half of Internet traffic is now protected by HTTPS. In other words, we are halfway to a web safer from the eavesdropping, content hijacking, cookie stealing, and censorship that HTTPS can protect against. Mozilla recently reported that the average volume of encrypted web traffic on Firefox now surpasses the average unencrypted volume.
  • Google Chrome’s figures on HTTPS usage are consistent with that finding, showing that over 50% of all pages loaded are protected by HTTPS across different operating systems.
  • This milestone is a combination of HTTPS implementation victories: from tech giants and large content providers, from small websites, and from users themselves.
  • ...4 more annotations...
  • Starting in 2010, EFF members have pushed tech companies to follow crypto best practices. We applauded when Facebook and Twitter implemented HTTPS by default, and when Wikipedia and several other popular sites later followed suit. Google has also put pressure on the tech community by using HTTPS as a signal in search ranking algorithms and, starting this year, showing security warnings in Chrome when users load HTTP sites that request passwords or credit card numbers. EFF’s Encrypt the Web Report also played a big role in tracking and encouraging specific practices. Recently other organizations have followed suit with more sophisticated tracking projects. For example, Secure the News and Pulse track HTTPS progress among news media sites and U.S. government sites, respectively.
  • But securing large, popular websites is only one part of a much bigger battle. Encrypting the entire web requires HTTPS implementation to be accessible to independent, smaller websites. Let’s Encrypt and Certbot have changed the game here, making what was once an expensive, technically demanding process into an easy and affordable task for webmasters across a range of resource and skill levels. Let’s Encrypt is a Certificate Authority (CA) run by the Internet Security Research Group (ISRG) and founded by EFF, Mozilla, and the University of Michigan, with Cisco and Akamai as founding sponsors. As a CA, Let’s Encrypt issues and maintains digital certificates that help web users and their browsers know they’re actually talking to the site they intended to. CAs are crucial to secure, HTTPS-encrypted communication, as these certificates verify the association between an HTTPS site and a cryptographic public key. Through EFF’s Certbot tool, webmasters can get a free certificate from Let’s Encrypt and automatically configure their server to use it. Since we announced that Let’s Encrypt was the web’s largest certificate authority last October, it has exploded from 12 million certs to over 28 million. Most of Let’s Encrypt’s growth has come from giving previously unencrypted sites their first-ever certificates. A large share of these leaps in HTTPS adoption are also thanks to major hosting companies and platforms--like WordPress.com, Squarespace, and dozens of others--integrating Let’s Encrypt and providing HTTPS to their users and customers.
  • Unfortunately, you can only use HTTPS on websites that support it--and about half of all web traffic is still with sites that don’t. However, when sites partially support HTTPS, users can step in with the HTTPS Everywhere browser extension. A collaboration between EFF and the Tor Project, HTTPS Everywhere makes your browser use HTTPS wherever possible. Some websites offer inconsistent support for HTTPS, use unencrypted HTTP as a default, or link from secure HTTPS pages to unencrypted HTTP pages. HTTPS Everywhere fixes these problems by rewriting requests to these sites to HTTPS, automatically activating encryption and HTTPS protection that might otherwise slip through the cracks.
  • Our goal is a universally encrypted web that makes a tool like HTTPS Everywhere redundant. Until then, we have more work to do. Protect your own browsing and websites with HTTPS Everywhere and Certbot, and spread the word to your friends, family, and colleagues to do the same. Together, we can encrypt the entire web.
  •  
    HTTPS connections don't work for you if you don't use them. If you're not using HTTPS Everywhere in your browser, you should be; it's your privacy that is at stake. And every encrypted communication you make adds to the backlog of encrypted data that NSA and other internet voyeurs must process as encrypted traffic; because cracking encrypted messages is computer resource intensive, the voyeurs do not have the resources to crack more than a tiny fraction. HTTPS Everywhere is a free extension for Firefox, Chrome, and Opera. You can get it here. https://www.eff.org/HTTPS-everywhere
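The rewriting step HTTPS Everywhere performs can be sketched with a toy version of its ruleset idea: upgrade http:// requests for hosts known to serve the same content over HTTPS, and leave everything else alone. The hard-coded `HTTPS_CAPABLE` set below merely stands in for the extension's curated per-site rulesets.

```python
import re

# Stand-in for HTTPS Everywhere's curated rulesets: hosts we "know"
# serve the same content over HTTPS. Purely illustrative.
HTTPS_CAPABLE = {"www.eff.org", "en.wikipedia.org", "twitter.com"}

def upgrade(url: str) -> str:
    """Rewrite an http:// URL to https:// when the host has a rule."""
    m = re.match(r"http://([^/]+)(/.*)?$", url)
    if not m:
        return url                       # already https://, or not http at all
    host, path = m.group(1), m.group(2) or "/"
    if host in HTTPS_CAPABLE:
        return f"https://{host}{path}"   # upgraded before the request leaves
    return url                           # no rule: fall back to plain http

print(upgrade("http://www.eff.org/https-everywhere"))
# -> https://www.eff.org/https-everywhere
```

The fallback branch is the weakness the article notes: for the roughly half of the web without HTTPS support, no client-side rewrite can help, which is why a universally encrypted web remains the goal.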
Paul Merrell

Revealed: How DOJ Gagged Google over Surveillance of WikiLeaks Volunteer - The Intercept - 0 views

  • The Obama administration fought a legal battle against Google to secretly obtain the email records of a security researcher and journalist associated with WikiLeaks. Newly unsealed court documents obtained by The Intercept reveal the Justice Department won an order forcing Google to turn over more than one year’s worth of data from the Gmail account of Jacob Appelbaum, a developer for the Tor online anonymity project who has worked with WikiLeaks as a volunteer. The order also gagged Google, preventing it from notifying Appelbaum that his records had been provided to the government. The surveillance of Appelbaum’s Gmail account was tied to the Justice Department’s long-running criminal investigation of WikiLeaks, which began in 2010 following the transparency group’s publication of a large cache of U.S. government diplomatic cables. According to the unsealed documents, the Justice Department first sought details from Google about a Gmail account operated by Appelbaum in January 2011, triggering a three-month dispute between the government and the tech giant. Government investigators demanded metadata records from the account showing email addresses of those with whom Appelbaum had corresponded between the period of November 2009 and early 2011; they also wanted to obtain information showing the unique IP addresses of the computers he had used to log in to the account.
  • The Justice Department argued in the case that Appelbaum had “no reasonable expectation of privacy” over his email records under the Fourth Amendment, which protects against unreasonable searches and seizures. Rather than seeking a search warrant that would require it to show probable cause that he had committed a crime, the government instead sought and received an order to obtain the data under a lesser standard, requiring only “reasonable grounds” to believe that the records were “relevant and material” to an ongoing criminal investigation. Google repeatedly attempted to challenge the demand, and wanted to immediately notify Appelbaum that his records were being sought so he could have an opportunity to launch his own legal defense. Attorneys for the tech giant argued in a series of court filings that the government’s case raised “serious First Amendment concerns.” They noted that Appelbaum’s records “may implicate journalistic and academic freedom” because they could “reveal confidential sources or information about WikiLeaks’ purported journalistic or academic activities.” However, the Justice Department asserted that “journalists have no special privilege to resist compelled disclosure of their records, absent evidence that the government is acting in bad faith,” and refused to concede Appelbaum was in fact a journalist. It claimed it had acted in “good faith throughout this criminal investigation, and there is no evidence that either the investigation or the order is intended to harass the … subscriber or anyone else.” Google’s attempts to fight the surveillance gag order angered the government, with the Justice Department stating that the company’s “resistance to providing the records” had “frustrated the government’s ability to efficiently conduct a lawful criminal investigation.”
  • The Justice Department wanted to keep the surveillance secret largely because of an earlier public backlash over its WikiLeaks investigation. In January 2011, Appelbaum and other WikiLeaks volunteers – including Icelandic parliamentarian Birgitta Jonsdottir – were notified by Twitter that the Justice Department had obtained data about their accounts. This disclosure generated widespread news coverage and controversy; the government says in the unsealed court records that it “failed to anticipate the degree of damage that would be caused” by the Twitter disclosure and did not want to “exacerbate this problem” when it went after Appelbaum’s Gmail data. The court documents show the Justice Department said the disclosure of its Twitter data grab “seriously jeopardized the [WikiLeaks] investigation” because it resulted in efforts to “conceal evidence” and put public pressure on other companies to resist similar surveillance orders. It also claimed that officials named in the subpoena ordering Twitter to turn over information were “harassed” after a copy was published by Intercept co-founder Glenn Greenwald at Salon in 2011. (The only specific evidence of the alleged harassment cited by the government is an email that was sent to an employee of the U.S. Attorney’s office that purportedly said: “You guys are fucking nazis trying to controll [sic] the whole fucking world. Well guess what. WE DO NOT FORGIVE. WE DO NOT FORGET. EXPECT US.”)
  • Google accused the government of hyperbole and argued that the backlash over the Twitter order did not justify secrecy related to the Gmail surveillance. “Rather than demonstrating how unsealing the order will harm its well-publicized investigation, the government lists a parade of horribles that have allegedly occurred since it unsealed the Twitter order, yet fails to establish how any of these developments could be further exacerbated by unsealing this order,” wrote Google’s attorneys. “The proverbial toothpaste is out of the tube, and continuing to seal a materially identical order will not change it.” But Google’s attempt to overturn the gag order was denied by magistrate judge Ivan D. Davis in February 2011. The company launched an appeal against that decision, but this too was rebuffed, in March 2011, by District Court judge Thomas Selby Ellis, III.
  • The government agreed to unseal some of the court records on Apr. 1 this year, and they were apparently turned over to Appelbaum on May 14 through a notification sent to his Gmail account. The files were released on condition that they would contain some redactions, which are bizarre and inconsistent, in some cases censoring the name of “WikiLeaks” from cited public news reports. Not all of the documents in the case – such as the original surveillance orders contested by Google – were released as part of the latest disclosure. Some contain “specific and sensitive details of the investigation” and “remain properly sealed while the grand jury investigation continues,” according to the court records from April this year. Appelbaum, an American citizen who is based in Berlin, called the case “a travesty that continues at a slow pace” and said he felt it was important to highlight “the absolute madness in these documents.”
  • He told The Intercept: “After five years, receiving such legal documents is neither a shock nor a needed confirmation. … Will we ever see the full documents about our respective cases? Will we even learn the names of those signing so-called legal orders against us in secret sealed documents? Certainly not in a timely manner and certainly not in a transparent, just manner.” The 32-year-old, who has recently collaborated with Intercept co-founder Laura Poitras to report revelations about National Security Agency surveillance for German news magazine Der Spiegel, said he plans to remain in Germany “in exile, rather than returning to the U.S. to experience more harassment of a less than legal kind.”
  • “My presence in Berlin ensures that the cost of physically harassing me or politically harassing me is much higher than when I last lived on U.S. soil,” Appelbaum said. “This allows me to work as a journalist freely from daily U.S. government interference. It also ensures that any further attempts to continue this will be forced into the open through [a Mutual Legal Assistance Treaty] and other international processes. The German government is less likely to allow the FBI to behave in Germany as they do on U.S. soil.” The Justice Department’s WikiLeaks investigation is headed by prosecutors in the Eastern District of Virginia. Since 2010, the secretive probe has seen activists affiliated with WikiLeaks compelled to appear before a grand jury and the FBI attempting to infiltrate the group with an informant. Earlier this year, it was revealed that the government had obtained the contents of three core WikiLeaks staffers’ Gmail accounts as part of the investigation.
Paul Merrell

DOJ Pushes to Expand Hacking Abilities Against Cyber-Criminals - Law Blog - WSJ - 0 views

  • The U.S. Department of Justice is pushing to make it easier for law enforcement to get warrants to hack into the computers of criminal suspects across the country. The move, which would alter federal court rules governing search warrants, comes amid increases in cases related to computer crimes. Investigators say they need more flexibility to get warrants to allow hacking in such cases, especially when multiple computers are involved or the government doesn’t know where the suspect’s computer is physically located. The Justice Department effort is raising questions among some technology advocates, who say the government should focus on fixing the holes in computer software that allow such hacking instead of exploiting them. Privacy advocates also warn government spyware could end up on innocent people’s computers if remote attacks are authorized against equipment whose ownership isn’t clear.
  • The government’s push for rule changes sheds light on law enforcement’s use of remote hacking techniques, which are being deployed more frequently but have been protected behind a veil of secrecy for years. In documents submitted by the government to the judicial system’s rule-making body this year, the government discussed using software to find suspected child pornographers who visited a U.S. site and concealed their identity using a strong anonymization tool called Tor. The government’s hacking tools—such as sending an email embedded with code that installs spying software—resemble those used by criminal hackers. The government doesn’t describe these methods as hacking, preferring instead to use terms like “remote access” and “network investigative techniques.” Right now, investigators who want to search property, including computers, generally need to get a warrant from a judge in the district where the property is located, according to federal court rules. In a computer investigation, that might not be possible, because criminals can hide behind anonymizing technologies. In cases involving botnets—groups of hijacked computers—investigators might also want to search many machines at once without getting that many warrants.
  • Some judges have already granted warrants in cases when authorities don’t know where the machine is. But at least one judge has denied an application in part because of the current rules. The department also wants warrants to be allowed for multiple computers at the same time, as well as for searches of many related storage, email and social media accounts at once, as long as those accounts are accessed by the computer being searched. “Remote searches of computers are often essential to the successful investigation” of computer crimes, Acting Assistant Attorney General Mythili Raman wrote in a letter to the judicial system’s rulemaking authority requesting the change in September. The government tries to obtain these “remote access warrants” mainly to “combat Internet anonymizing techniques,” the department said in a memo to the authority in March. Some groups have raised questions about law enforcement’s use of hacking technologies, arguing that such tools mean the government is failing to help fix software problems exploited by criminals. “It is crucial that we have a robust public debate about how the Fourth Amendment and federal law should limit the government’s use of malware and spyware within the U.S.,” said Nathan Wessler, a staff attorney at the American Civil Liberties Union who focuses on technology issues.
  • A Texas judge who denied a warrant application last year cited privacy concerns associated with sending malware when the location of the computer wasn’t known. He pointed out that a suspect opening an email infected with spyware could be doing so on a public computer, creating risk of information being collected from innocent people. A former computer crimes prosecutor serving on an advisory committee of the U.S. Judicial Conference, which is reviewing the request, said he was concerned that allowing the search of multiple computers under a single warrant would violate the Fourth Amendment’s protections against overly broad searches. The proposed rule is set to be debated by the Judicial Conference’s Advisory Committee on Criminal Rules in early April, after which it would be opened to public comment.
Paul Merrell

EFF Pries More Information on Zero Days from the Government's Grasp | Electronic Fronti... - 0 views

  • Until just last week, the U.S. government kept up the charade that its use of a stockpile of security vulnerabilities for hacking was a closely held secret. In fact, in response to EFF’s FOIA suit to get access to the official U.S. policy on zero days, the government redacted every single reference to “offensive” use of vulnerabilities. To add insult to injury, the government’s claim was that even admitting to offensive use would cause damage to national security. Now, in the face of EFF’s brief marshaling overwhelming evidence to the contrary, the charade is over. In response to EFF’s motion for summary judgment, the government has disclosed a new version of the Vulnerabilities Equities Process, minus many of the worst redactions. First and foremost, it now admits that the “discovery of vulnerabilities in commercial information technology may present competing ‘equities’ for the [government’s] offensive and defensive mission.” That might seem painfully obvious—a flaw or backdoor in a Juniper router is dangerous for anyone running a network, whether that network is in the U.S. or Iran. But the government’s failure to adequately weigh these “competing equities” was so severe that in 2013 a group of experts appointed by President Obama recommended that the policy favor disclosure “in almost all instances for widely used code.”
  • The newly disclosed version of the Vulnerabilities Equities Process (VEP) also officially confirms what everyone already knew: the use of zero days isn’t confined to the spies. Rather, the policy states that the “law enforcement community may want to use information pertaining to a vulnerability for similar offensive or defensive purposes but for the ultimate end of law enforcement.” Similarly it explains that “counterintelligence equities can be defensive, offensive, and/or law enforcement-related” and may “also have prosecutorial responsibilities.” Given that the government is currently prosecuting users for committing crimes over Tor hidden services, and that it identified these individuals using vulnerabilities called a “Network Investigative Technique”, this too doesn’t exactly come as a shocker. Just a few weeks ago, the government swore that even acknowledging the mere fact that it uses vulnerabilities offensively “could be expected to cause serious damage to the national security.” That’s a standard move in FOIA cases involving classified information, even though the government unnecessarily classifies documents at an astounding rate. In this case, the government relented only after nearly a year and a half of litigation by EFF. The government would be well advised to stop relying on such weak secrecy claims—it only risks undermining its own credibility.
  • The new version of the VEP also reveals significantly more information about the general process the government follows when a vulnerability is identified. In a nutshell, an agency that discovers a zero day is responsible for invoking the VEP, which then provides for centralized coordination and weighing of equities among all affected agencies. Along with a declaration from an official at the Office of the Director of National Intelligence, this new information provides more background on the reasons why the government decided to develop an overarching zero day policy in the first place: it “recognized that not all organizations see the entire picture of vulnerabilities, and each organization may have its own equities and concerns regarding the prioritization of patches and fixes, as well as its own distinct mission obligations.” We now know the VEP was finalized in February 2010, but the government apparently failed to implement it in any substantial way, prompting the presidential review group’s recommendation to prioritize disclosure over offensive hacking. We’re glad to have forced a little more transparency on this important issue, but the government is still foolishly holding on to a few last redactions, including refusing to name which agencies participate in the VEP. That’s just not supportable, and we’ll be in court next month to argue that the names of these agencies must be disclosed. 