
Home/ Socialism and the End of the American Dream/ Group items tagged encryption-backdoors


Paul Merrell

Cy Vance's Proposal to Backdoor Encrypted Devices Is Riddled With Vulnerabilities | Jus... - 0 views

  • Less than a week after the attacks in Paris — while the public and policymakers were still reeling, and the investigation had barely gotten off the ground — Cy Vance, Manhattan’s District Attorney, released a policy paper calling for legislation requiring companies to provide the government with backdoor access to their smartphones and other mobile devices. This is the first concrete proposal of this type since September 2014, when FBI Director James Comey reignited the “Crypto Wars” in response to Apple’s and Google’s decisions to use default encryption on their smartphones. Though Comey seized on Apple’s and Google’s decisions to encrypt their devices by default, his concerns are primarily related to end-to-end encryption, which protects communications that are in transit. Vance’s proposal, on the other hand, is only concerned with device encryption, which protects data stored on phones. It is still unclear whether encryption played any role in the Paris attacks, though we do know that the attackers were using unencrypted SMS text messages on the night of the attack, and that some of them were even known to intelligence agencies and had previously been under surveillance. But regardless of whether encryption was used at some point during the planning of the attacks, as I lay out below, prohibiting companies from selling encrypted devices would not prevent criminals or terrorists from being able to access unbreakable encryption. Vance’s primary complaint is that Apple’s and Google’s decisions to provide their customers with more secure devices through encryption interferes with criminal investigations. He claims encryption prevents law enforcement from accessing stored data like iMessages, photos and videos, Internet search histories, and third party app data. He makes several arguments to justify his proposal to build backdoors into encrypted smartphones, but none of them hold water.
  • Before addressing the major privacy, security, and implementation concerns that his proposal raises, it is worth noting that while an increase in use of fully encrypted devices could interfere with some law enforcement investigations, it will help prevent far more crimes — especially smartphone theft, and the consequent potential for identity theft. According to Consumer Reports, in 2014 there were more than two million victims of smartphone theft, and nearly two-thirds of all smartphone users either took no steps to secure their phones or their data or failed to implement passcode access for their phones. Default encryption could reduce instances of theft because perpetrators would no longer be able to break into the phone to steal the data.
  • Vance argues that creating a weakness in encryption to allow law enforcement to access data stored on devices does not raise serious concerns for security and privacy, since in order to exploit the vulnerability one would need access to the actual device. He considers this an acceptable risk, claiming it would not be the same as creating a widespread vulnerability in encryption protecting communications in transit (like emails), and that it would be cheap and easy for companies to implement. But Vance seems to be underestimating the risks involved with his plan. It is increasingly important that smartphones and other devices are protected by the strongest encryption possible. Our devices and the apps on them contain astonishing amounts of personal information, so much that an unprecedented level of harm could be caused if a smartphone or device with an exploitable vulnerability is stolen, not least in the forms of identity fraud and credit card theft. We bank on our phones, and have access to credit card payments with services like Apple Pay. Our contact lists are stored on our phones, including phone numbers, emails, social media accounts, and addresses. Passwords are often stored on people’s phones. And phones and apps are often full of personal details about their owners’ lives, from food diaries to logs of favorite places to personal photographs. Symantec conducted a study in which it planted 50 “lost” phones in public to see what the people who picked them up would do with them. The company found that 95 percent of those people tried to access the phone, and while nearly 90 percent tried to access private information stored on the phone or in other private accounts such as banking services and email, only 50 percent attempted to contact the owner.
  • In addition to his weak reasoning for why it would be feasible to create backdoors to encrypted devices without creating undue security risks or harming privacy, Vance makes several flawed policy-based arguments in favor of his proposal. He argues that criminals benefit from devices that are protected by strong encryption. That may be true, but strong encryption is also a critical tool used by billions of average people around the world every day to protect their transactions, communications, and private information. Lawyers, doctors, and journalists rely on encryption to protect their clients, patients, and sources. Government officials, from the President to the directors of the NSA and FBI, and members of Congress, depend on strong encryption for cybersecurity and data security. There are far more innocent Americans who benefit from strong encryption than there are criminals who exploit it. Encryption is also essential to our economy. Device manufacturers could suffer major economic losses if they are prohibited from competing with foreign manufacturers who offer more secure devices. Encryption also protects major companies from corporate and nation-state espionage. As more daily business activities are done on smartphones and other devices, they may now hold highly proprietary or sensitive information. Those devices could be targeted even more than they are now if all that has to be done to access that information is to steal an employee’s smartphone and exploit a vulnerability the manufacturer was required to create.
  • Privacy is another concern that Vance dismisses too easily. Despite Vance’s arguments otherwise, building backdoors into device encryption undermines privacy. Our government does not impose a similar requirement in any other context. Police can enter homes with warrants, but there is no requirement that people record their conversations and interactions just in case they someday become useful in an investigation. The conversations that we once had through disposable letters and in-person conversations now happen over the Internet and on phones. Just because the medium has changed does not mean our right to privacy has.
  • Vance attempts to downplay this serious risk by asserting that anyone can use the “Find My iPhone” or Android Device Manager services that allow owners to delete the data on their phones if stolen. However, this does not stand up to scrutiny. These services are effective only when an owner realizes their phone is missing and can take swift action on another computer or device. This delay ensures some period of vulnerability. Encryption, on the other hand, protects everyone immediately and always. Additionally, Vance argues that it is safer to build backdoors into encrypted devices than it is to do so for encrypted communications in transit. It is true that there is a difference in the threats posed by the two types of encryption backdoors that are being debated. However, some manner of widespread vulnerability will inevitably result from a backdoor to encrypted devices. Indeed, the NSA and GCHQ reportedly hacked into a database to obtain cell phone SIM card encryption keys in order to defeat the security protecting users’ communications and activities and to conduct surveillance. The threat of such a breach, whether from a hacker or a nation-state actor, is very real. Even if companies go the extra mile and create a different means of access for every phone, such as a separate access key for each phone, significant vulnerabilities will be created. It would still be possible for a malicious actor to gain access to the database containing those keys, which would enable them to defeat the encryption on any smartphone they took possession of. Additionally, the cost of implementation and maintenance of such a complex system could be high.
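The per-device key-escrow problem described above can be made concrete with a toy sketch. Everything here is invented for illustration: the device names, the escrow database, and the throwaway XOR-keystream cipher (which is not real cryptography). The point is structural: whoever breaches the central key database defeats the encryption on every phone at once, no matter how carefully each phone got its own key.

```python
# Toy model of a manufacturer-held "access key" database (hypothetical).
import hashlib
import os

def keystream_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Derive a keystream by hashing key || counter, then XOR with the data.
    # Illustrative only; do not use as a real cipher.
    out = bytearray()
    counter = 0
    while len(out) < len(plaintext):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(p ^ k for p, k in zip(plaintext, out))

keystream_decrypt = keystream_encrypt  # XOR stream ciphers are symmetric

# One escrow key per phone, as the proposal implies (invented device IDs).
escrow_db = {device: os.urandom(32) for device in ("phone-A", "phone-B", "phone-C")}

# Each phone's stored data is encrypted under its own escrow key.
stored = {dev: keystream_encrypt(key, b"private data of " + dev.encode())
          for dev, key in escrow_db.items()}

# Stealing phone-B alone yields only ciphertext. But an attacker who
# breaches the escrow database can decrypt every phone in one pass:
for dev, ciphertext in stored.items():
    print(dev, keystream_decrypt(escrow_db[dev], ciphertext))
```

Separate keys per phone change nothing about the concentration of risk; the database itself remains the single point of failure the annotation warns about.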
  • Vance also suggests that the US would be justified in creating such a requirement since other Western nations are contemplating requiring encryption backdoors as well. Regardless of whether other countries are debating similar proposals, we cannot afford a race to the bottom on cybersecurity. Heads of the intelligence community regularly warn that cybersecurity is the top threat to our national security. Strong encryption is our best defense against cyber threats, and following in the footsteps of other countries by weakening that critical tool would do incalculable harm. Furthermore, even if the US or other countries did implement such a proposal, criminals could gain access to devices with strong encryption through the black market. Thus, only innocent people would be negatively affected, and some of those innocent people might even become criminals simply by trying to protect their privacy by securing their data and devices. Finally, Vance argues that David Kaye, UN Special Rapporteur for Freedom of Expression and Opinion, supported the idea that court-ordered decryption doesn’t violate human rights, provided certain criteria are met, in his report on the topic. However, in the context of Vance’s proposal, this seems to conflate the concepts of court-ordered decryption and of government-mandated encryption backdoors. The Kaye report was unequivocal about the importance of encryption for free speech and human rights. The report concluded that:
  • States should promote strong encryption and anonymity. National laws should recognize that individuals are free to protect the privacy of their digital communications by using encryption technology and tools that allow anonymity online. … States should not restrict encryption and anonymity, which facilitate and often enable the rights to freedom of opinion and expression. Blanket prohibitions fail to be necessary and proportionate. States should avoid all measures that weaken the security that individuals may enjoy online, such as backdoors, weak encryption standards and key escrows. Additionally, the group of intelligence experts that was hand-picked by the President to issue a report and recommendations on surveillance and technology, concluded that: [R]egarding encryption, the U.S. Government should: (1) fully support and not undermine efforts to create encryption standards; (2) not in any way subvert, undermine, weaken, or make vulnerable generally available commercial software; and (3) increase the use of encryption and urge US companies to do so, in order to better protect data in transit, at rest, in the cloud, and in other storage.
  • The clear consensus among human rights experts and several high-ranking intelligence experts, including the former directors of the NSA, the Office of the Director of National Intelligence, and DHS, is that mandating encryption backdoors is dangerous.
    Unaddressed Concerns: Preventing Encrypted Devices from Entering the US and the Slippery Slope
    In addition to the significant faults in Vance’s arguments in favor of his proposal, he fails to address the question of how such a restriction would be effectively implemented. There is no effective mechanism for preventing code from becoming available for download online, even if it is illegal. One critical issue the Vance proposal fails to address is how the government would prevent, or even identify, encrypted smartphones when individuals bring them into the United States. DHS would have to train customs agents to search the contents of every person’s phone in order to identify whether it is encrypted, and then confiscate the phones that are. Legal and policy considerations aside, this kind of policy is, at the very least, impractical. Preventing strong encryption from entering the US is not like preventing guns or drugs from entering the country — encrypted phones aren’t immediately recognizable as contraband. Millions of people use encrypted devices, and tens of millions more devices are shipped to and sold in the US each year.
  • Finally, there is a real concern that if Vance’s proposal were accepted, it would be the first step down a slippery slope. Right now, his proposal only calls for access to smartphones and devices running mobile operating systems. While this policy in and of itself would cover a number of commonplace devices, it may eventually be expanded to cover laptop and desktop computers, as well as communications in transit. The expansion of this kind of policy is even more worrisome when taking into account the speed at which technology evolves and becomes widely adopted. Ten years ago, the iPhone did not even exist. Who is to say what technology will be commonplace in 10 or 20 years that is not even around today? There is a very real question about how far law enforcement will go to gain access to information. Things that once seemed like merely science fiction, such as wearable technology and artificial intelligence that could be implanted in and work with the human nervous system, are now available. If and when there comes a time when our “smart phone” is not really a device at all, but is rather an implant, surely we would not grant law enforcement access to our minds.
  • Policymakers should dismiss Vance’s proposal to prohibit the use of strong encryption to protect our smartphones and devices in order to ensure law enforcement access. Undermining encryption, regardless of whether it is protecting data in transit or at rest, would take us down a dangerous and harmful path. Instead, law enforcement and the intelligence community should be working to alter their skills and tactics in a fast-evolving technological world so that they are not so dependent on information that will increasingly be protected by encryption.
Paul Merrell

NSA Director Finally Admits Encryption Is Needed to Protect Public's Privacy - 0 views

  • NSA Director Finally Admits Encryption Is Needed to Protect Public’s Privacy. The new stance denotes a growing awareness within the government that Americans are not comfortable with the State’s grip on their data. By Carey Wedler | AntiMedia | January 22, 2016
  • Rogers cited the recent Office of Personnel Management hack of over 20 million users as a reason to increase encryption rather than scale it back. “What you saw at OPM, you’re going to see a whole lot more of,” he said, referring to the massive hack that compromised the personal data of about 20 million people who obtained background checks. Rogers’ comments, while forward-thinking, signify an about-face in his stance on encryption. In February 2015, he said he “shares [FBI] Director [James] Comey’s concern” about cell phone companies’ decision to add encryption features to their products. Comey has been one of the loudest critics of encryption. However, Rogers’ comments on Thursday now directly conflict with Comey’s stated position. The FBI director has publicly chastised encryption, as well as the companies that provide it. In 2014, he claimed Apple’s then-new encryption feature could lead the world to “a very dark place.” At a Department of Justice hearing in November, Comey testified that “Increasingly, the shadow that is ‘going dark’ is falling across more and more of our work.” Though he claimed, “We support encryption,” he insisted “we have a problem that encryption is crashing into public safety and we have to figure out, as people who care about both, to resolve it. So, I think the conversation’s in a healthier place.”
  • At the same hearing, Comey and Attorney General Loretta Lynch declined to comment on whether they had proof the Paris attackers used encryption. Even so, Comey recently lobbied for tech companies to do away with end-to-end encryption. However, his crusade has fallen on unsympathetic ears, both from the private companies he seeks to control — and from the NSA. Prior to Rogers’ statements in support of encryption Thursday, former NSA chief Michael Hayden said, “I disagree with Jim Comey. I actually think end-to-end encryption is good for America.” Still another former NSA director has criticized calls for backdoor access to information. In October, Mike McConnell told a panel at an encryption summit that the United States is “better served by stronger encryption, rather than baking in weaker encryption.” Former Department of Homeland Security Secretary Michael Chertoff has also spoken out against the government being able to bypass encryption.
  • Regardless of these individual defenses of encryption, the Intercept explained why these statements may be irrelevant: “Left unsaid is the fact that the FBI and NSA have the ability to circumvent encryption and get to the content too — by hacking. Hacking allows law enforcement to plant malicious code on someone’s computer in order to gain access to the photos, messages, and text before they were ever encrypted in the first place, and after they’ve been decrypted. The NSA has an entire team of advanced hackers, possibly as many as 600, camped out at Fort Meade.”
  • Rogers’ statements, of course, are not a full-fledged endorsement of privacy, nor can the NSA be expected to make it a priority. Even so, his new stance denotes a growing awareness within the government that Americans are not comfortable with the State’s grip on their data. “So spending time arguing about ‘hey, encryption is bad and we ought to do away with it’ … that’s a waste of time to me,” Rogers said Thursday. “So what we’ve got to ask ourselves is, with that foundation, what’s the best way for us to deal with it? And how do we meet those very legitimate concerns from multiple perspectives?”
Paul Merrell

Your Computer May Already be Hacked - NSA Inside? | Steve Blank - 1 views

  • But while the interviewer focused on the Skype revelation, I thought the most interesting part was the other claim, “that the National Security Agency already had pre-encryption stage access to email on Outlook.”  Say what??  They can see the plaintext on my computer before I encrypt it? That defeats any/all encryption methods. How could they do that? Bypass Encryption While most outside observers think the NSA’s job is cracking encrypted messages, as the Prism disclosures have shown, the actual mission is simply to read all communications. Cracking codes is a last resort.
  • The NSA has a history of figuring out how to get to messages before or after they are encrypted, whether by putting keyloggers on keyboards and recording the keystrokes or by detecting the images of the characters as they were being drawn on a CRT. Today every desktop and laptop computer has another way for the NSA to get inside. Intel Inside It’s inevitable that complex microprocessors have bugs in them when they ship. When the first microprocessors shipped, the only thing you could hope was that the bug didn’t crash your computer. The only way the chip vendor could fix the problem was to physically revise the chip and put out a new version, and computer manufacturers and users were stuck if they had an old chip. After a particularly embarrassing math bug in 1994 that cost Intel $475 million, the company decided to fix the problem by allowing its microprocessors to load fixes automatically when your computer starts.
  • Starting in 1996 with the Intel P6 (Pentium Pro) to today’s P7 chips (Core i7), these processors contain instructions that are reprogrammable in what is called microcode. Intel can fix bugs on the chips by reprogramming a microprocessor’s microcode with a patch. This patch, called a microcode update, can be loaded into a processor by using special CPU instructions reserved for this purpose. These updates are not permanent, which means each time you turn the computer on, its microprocessor is reset to its built-in microcode, and the update needs to be applied again (through the computer’s BIOS). Since 2000, Intel has put out 29 microcode updates to their processors. The microcode is distributed 1) by Intel, 2) by Microsoft integrated into a BIOS, or 3) as part of a Windows update. Unfortunately, the microcode update format is undocumented and the code is encrypted. This allows Intel to make sure that third parties can’t make unauthorized add-ons to their chips. But it also means that no one can look inside to understand the microcode, which makes it impossible to know whether anyone is loading a backdoor into your computer.
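Because the update is volatile and reapplied at every boot, the currently loaded microcode revision is something you can observe. On Linux it is reported per core in /proc/cpuinfo. The sketch below parses the `microcode` field from a made-up /proc/cpuinfo excerpt so it is self-contained; on a real machine you would read the actual file (and the revision values would differ).

```python
# Invented /proc/cpuinfo excerpt for illustration; real files list many
# more fields and one block per logical core.
SAMPLE_CPUINFO = """\
processor\t: 0
vendor_id\t: GenuineIntel
model name\t: Intel(R) Core(TM) i7 CPU
microcode\t: 0x28
processor\t: 1
vendor_id\t: GenuineIntel
model name\t: Intel(R) Core(TM) i7 CPU
microcode\t: 0x28
"""

def microcode_revisions(cpuinfo_text: str) -> set:
    """Collect the distinct microcode revisions reported per core."""
    revisions = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("microcode"):
            revisions.add(line.split(":", 1)[1].strip())
    return revisions

print(microcode_revisions(SAMPLE_CPUINFO))
```

Seeing the revision tells you *that* an update was applied this boot, but, as the annotation notes, the encrypted, undocumented payload means it tells you nothing about *what* the update actually changed.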
  • Or perhaps the NSA, working with Intel and/or Microsoft, has wittingly put backdoors in the microcode updates. A backdoor is a way of gaining illegal remote access to a computer by getting around the normal security built into the computer. Typically someone trying to sneak malicious software on to a computer would try to install a rootkit (software that tries to conceal the malicious code). A rootkit tries to hide itself and its code, but security-conscious sites can discover rootkits with tools that check kernel code and data for changes. But what if you could use the configuration and state of microprocessor hardware in order to hide? You’d be invisible to all rootkit detection techniques that check the operating system. Or what if you could make the microprocessor random number generator (the basis of encryption) not so random for a particular machine? (The NSA’s biggest coup was inserting backdoors in crypto equipment the Swiss sold to other countries.) Rather than risk getting caught messing with everyone’s updates, my bet is that the NSA has compromised the microcode update signing keys, giving the NSA the ability to selectively target specific computers. (Your operating system ensures security of updates by checking downloaded update packages against the signing key.) The NSA then can send out backdoors disguised as a Windows update for “security.” (Ironic but possible.) That means you don’t need backdoors baked into the hardware, don’t need Intel’s buy-in, don’t have discoverable rootkits, and you can target specific systems without impacting the public at large.
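The signing-key scenario above can be sketched as follows. The real microcode format and signature scheme are undocumented, so a keyed HMAC stands in for the vendor's signature, and the key name and payloads are invented. The point the sketch captures is only this: verification proves possession of the signing key, not that the update is benign, so anyone holding a compromised key can produce a backdoored update that every machine accepts as genuine.

```python
# Toy stand-in for update signing; the real vendor scheme is undocumented.
import hmac
import hashlib

SIGNING_KEY = b"hypothetical-vendor-signing-key"  # invented for the sketch

def sign(key: bytes, update: bytes) -> bytes:
    """Produce a tag over the update payload with the holder's key."""
    return hmac.new(key, update, hashlib.sha256).digest()

def machine_accepts(update: bytes, signature: bytes) -> bool:
    # What the loader checks: the signature matches, nothing more.
    return hmac.compare_digest(sign(SIGNING_KEY, update), signature)

legit = b"fix FDIV-style math bug"
backdoored = b"fix FDIV-style math bug; also weaken RNG on target machine"

# A genuine update verifies, and a tampered payload with a stale
# signature is rejected...
assert machine_accepts(legit, sign(SIGNING_KEY, legit))
assert not machine_accepts(backdoored, sign(SIGNING_KEY, legit))

# ...but an attacker holding the compromised key signs whatever they like,
# and the machine cannot tell the difference:
assert machine_accepts(backdoored, sign(SIGNING_KEY, backdoored))
```

This is also why a stolen signing key enables *selective* targeting: the forged update can be served only to chosen machines while everyone else keeps receiving the legitimate one.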
  • A few months ago, this kind of discussion would have been theory at best, if not paranoia.
  • The Prism disclosures prove otherwise – the National Security Agency has decided it needs the ability to capture all communications in all forms. Getting inside a target computer and weakening its encryption, or having access to the plaintext of encrypted communication, seems likely. Given the technical sophistication of the other parts of their surveillance net, the surprise would be if they haven’t implemented a microcode backdoor. The downsides are that 1) backdoors can be hijacked by others with even worse intent (so if the NSA has a microcode backdoor, who else is using it?), and 2) other pieces of our infrastructure (routers, smartphones, military computers, satellites, etc.) may also use processors with uploadable microcode. —— And that may be why the Russian president is now using a typewriter rather than a personal computer.
Paul Merrell

Transcript: Comey Says Authors of Encryption Letter Are Uninformed or Not Fair-Minded |... - 0 views

  • Earlier today, FBI Director James Comey implied that a broad coalition of technology companies, trade associations, civil society groups, and security experts were either uninformed or were not “fair-minded” in a letter they sent to the President yesterday urging him to reject any legislative proposals that would undermine the adoption of strong encryption by US companies. The letter was signed by dozens of organizations and companies in the latest part of the debate over whether the government should be given built-in access to encrypted data (see, for example, here, here, here, and here for previous iterations). The comments were made at the Third Annual Cybersecurity Law Institute held at Georgetown University Law Center. The transcript of his encryption-related discussion is below (emphasis added).
  • Increasingly, communications at rest sitting on a device or in motion are encrypted. The device is encrypted or the communication is encrypted and therefore unavailable to us even with a court order. So I make a showing of probable cause to a judge in a criminal case or in an intelligence case to the Foreign Intelligence Surveillance Court judge that the content of a particular device or a particular communication stream should be collected pursuant to our statutory authority, and the judge approves, and increasingly we are finding ourselves unable to read what we find or we’re unable to open a device. And that is a serious concern. I am actually — I think encryption is a good thing. I think there are tremendous societal benefits to encryption. That’s one of the reasons the FBI tells people not only lock your cars, but you should encrypt things that are important to you to make it harder for thieves to take them.
  • A group of tech companies and some prominent folks wrote a letter to the President yesterday that I frankly found depressing. Because their letter contains no acknowledgment that there are societal costs to universal encryption. Look, I recognize the challenges facing our tech companies. Competitive challenges, regulatory challenges overseas, all kinds of challenges. I recognize the benefits of encryption, but I think fair-minded people also have to recognize the costs associated with that. And I read this letter and I think, “Either these folks don’t see what I see or they’re not fair-minded.” And either one of those things is depressing to me. So I’ve just got to continue to have the conversation. I don’t know the answer, but I don’t think a democracy should drift to a place where suddenly law enforcement people say, “Well, actually we — the Fourth Amendment is an awesome thing, but we actually can’t access any information.”
  • But we have a collision going on in this country that’s getting closer and closer to an actual head-on, which is our important interest in privacy — which I am passionate about — and our important interest in public safety. The logic of universal encryption is inexorable: our authority under the Fourth Amendment — an amendment that I think is critical to ordered liberty — with the right predication and the right oversight to obtain information is going to become increasingly irrelevant. As all of our lives become digital, the logic of encryption is that all of our lives will be covered by strong encryption, therefore all of our lives — I know there are no criminals here, but including the lives of criminals and terrorists and spies — will be in a place that is utterly unavailable to court ordered process. And that, I think, to a democracy should be very, very concerning. I think we need to have a conversation about it. Again, how do we strike the right balance? Privacy matters tremendously. Public safety, I think, matters tremendously to everybody. I think fair-minded people have to recognize that there are tremendous benefits to a society from encryption. There are tremendous costs to a society from universal strong encryption. And how do we think about that?
  • We’ve got to have a conversation long before the logic of strong encryption takes us to that place. And smart people, reasonable people will disagree mightily. Technical people will say it’s too hard. My reaction to that is: Really? Too hard? Too hard for the people we have in this country to figure something out? I’m not that pessimistic. I think we ought to have a conversation.
  •  
    Considering that I'm over 10 times as likely to die from a police shooting as I am from a terrorist attack, how about we begin this conversation, Mr. Comey, by you providing formal notice to everyone who's had their telephone metadata gathered or searched, with all dates on which such gatherings and searches were conducted, so citizens can file suit for violation of their privacy rights? Note that the Second U.S. Circuit Court of Appeals held last week that the FBI exceeded statutory authority in gathering and searching that information. Because the gathering and searching was not authorized, that would bring the gathering and searching under the protections of the Privacy Act, including the FBI duty to account for the disclosures and to pay at least the statutory minimum $1,500 in damages per incident. Then I would like to have an itemization of all of the commercial software and hardware products that your agency and/or your buddies at NSA built backdoors into. Then your resignation for millions of violations of the Privacy Act would be deeply appreciated. Please feel free to delegate the above-mentioned tasks to your successor.
Paul Merrell

Verizon's New, Encrypted Calling App Plays Nice With the NSA - Businessweek - 0 views

  • Verizon is the latest big company to enter the post-Snowden market for secure communication, and it's doing so with an encryption standard that comes with a way for law enforcement to access ostensibly secure phone conversations. Verizon Voice Cypher, the product introduced on Thursday with the encryption company Cellcrypt, offers business and government customers end-to-end encryption for voice calls on iOS, Android, or BlackBerry devices equipped with a special app. The encryption software provides secure communications for people speaking on devices with the app, regardless of their wireless carrier, and it can also connect to an organization's secure phone system. Cellcrypt and Verizon both say that law enforcement agencies will be able to access communications that take place over Voice Cypher, so long as they're able to prove that there's a legitimate law enforcement reason for doing so. Seth Polansky, Cellcrypt's vice president for North America, disputes the idea that building technology to allow wiretapping is a security risk. "It's only creating a weakness for government agencies," he says. "Just because a government access option exists, it doesn't mean other companies can access it."
  • Phone carriers like Verizon are required by U.S. law to build networks that can be wiretapped. But the legislation known as the Communications Assistance for Law Enforcement Act requires phone carriers to decrypt communications for the government only if they have designed their technology to make it possible to do so. If Verizon and Cellcrypt had structured their encryption so that neither company had the information necessary to decrypt the calls, they would not have been breaking the law.
  • There has been increased interest in encryption from individual consumers, too, largely thanks to the NSA revelations leaked by Edward Snowden. Yahoo and Google began offering end-to-end encrypted e-mail services this year. Silent Circle, a startup catering to consumer and enterprise clients, has been developing end-to-end voice encryption for phone calls. Verizon's service, with a monthly price of $45 per device, isn't targeting individual buyers and won't be offered to average consumers in the near future. But Verizon's partner, Cellcrypt, looks upon selling to large organizations as the first step toward bringing down the price before eventually offering a consumer-level encryption service. "At the end of the day, we'd love to have this be a line item on your Verizon bill," says Polansky.
  • Other companies have designed their encryption in this way, including AT&T, which offers encrypted phone service for business customers. Apple and Android recently began protecting content stored on users' phones in a way that would keep the tech companies from being able to comply with requests from law enforcement. The move drew public criticism from FBI Director James Comey, and some security experts expect that a renewed effort to spur passage of legislation banning such encryption will accompany Silicon Valley's increased interest in developing these services. Verizon believes major demand for its new encryption service will come from governmental agencies conveying sensitive but unclassified information over the phone, says Tim Petsky, a senior product manager for Verizon Wireless. Corporate customers who are concerned about corporate espionage are also itching for answers. "You read about breaches in security almost every week in the press," says Petsky. "Enterprise customers have been asking about ways to secure their communications and up until this point, we didn't have a solution." 
  • Many people in the security industry believe that a designed access point creates a vulnerability for criminals or spies to exploit. Last year reports surfaced that the FBI was pushing legislation that would require many forms of Internet communication to be wiretap-ready. A group of prominent security experts responded strongly: "Requiring software vendors to build intercept functionality into their products is unwise and will be ineffective, with the result being serious consequences (PDF) for the economic well-being and national security of the United States," they wrote in a report issued in May. 
Paul Merrell

The 'Athens Affair' shows why we need encryption without backdoors | Trevor Timm | Comm... - 0 views

  • Just as it seems the White House is close to finally announcing its policy on encryption - the FBI has been pushing for tech companies like Apple and Google to insert backdoors into their phones so the US government can always access users’ data - new Snowden revelations and an investigation by a legendary journalist show exactly why the FBI’s plans are so dangerous. One of the biggest arguments against mandating backdoors in encryption is the fact that, even if you trust the United States government never to abuse that power (and who does?), other criminal hackers and foreign governments will be able to exploit the backdoor to use it themselves. A backdoor is an inherent vulnerability that other actors will attempt to find and use for their own nefarious purposes as soon as they know it exists, putting all of our cybersecurity at risk. In a meticulous investigation, longtime NSA reporter James Bamford reported at the Intercept Tuesday that the NSA was behind the notorious “Athens Affair”. In surveillance circles, the Athens Affair is the stuff of legend: after the 2004 Olympics, the Greek government discovered that an unknown attacker had hacked into Vodafone’s “lawful intercept” system, the phone company’s mechanism of wiretapping phone calls. The attacker spied on phone calls of the president, other Greek politicians and journalists before it was discovered. According to Bamford’s story, all this happened after the US spy agency cooperated with Greek law enforcement to keep an eye on potential terrorist attacks for the Olympics. Instead of packing up their surveillance gear, they covertly pointed it towards the Greek government and its people. But that’s not all: according to Snowden documents that Bamford cited, this is a common tactic of the NSA. They often attack the “lawful intercept” systems in other countries to spy on government and citizens without their knowledge:
  • Exploiting the weaknesses associated with lawful intercept programs was a common trick for NSA. According to a previously unreleased top-secret PowerPoint presentation from 2012, titled “Exploiting Foreign Lawful Intercept Roundtable”, the agency’s “countries of interest” for this work included, at that time, Mexico, Indonesia, Egypt and others. The presentation also notes that NSA had about 60 “Fingerprints” — ways to identify data — from telecom companies and industry groups that develop lawful intercept systems, including Ericsson, as well as Motorola, Nokia and Siemens. It’s the exact nightmare scenario security experts have warned about when it comes to backdoors: they are not only available to those that operate them “legally”, but also to those who can hack into them to spy without anyone’s knowledge. If the NSA can do it, so can China, Russia and a host of other malicious actors.
Paul Merrell

Security Experts Oppose Government Access to Encrypted Communication - The New York Times - 0 views

  • An elite group of security technologists has concluded that the American and British governments cannot demand special access to encrypted communications without putting the world’s most confidential data and critical infrastructure in danger. A new paper from the group, made up of 14 of the world’s pre-eminent cryptographers and computer scientists, is a formidable salvo in a skirmish between intelligence and law enforcement leaders, and technologists and privacy advocates. After Edward J. Snowden’s revelations — with security breaches and awareness of nation-state surveillance at a record high and data moving online at breakneck speeds — encryption has emerged as a major issue in the debate over privacy rights.
  • That has put Silicon Valley at the center of a tug of war. Technology companies including Apple, Microsoft and Google have been moving to encrypt more of their corporate and customer data after learning that the National Security Agency and its counterparts were siphoning off digital communications and hacking into corporate data centers.
  • Yet law enforcement and intelligence agency leaders argue that such efforts thwart their ability to monitor kidnappers, terrorists and other adversaries. In Britain, Prime Minister David Cameron threatened to ban encrypted messages altogether. In the United States, Michael S. Rogers, the director of the N.S.A., proposed that technology companies be required to create a digital key to unlock encrypted data, but to divide the key into pieces and secure it so that no one person or government agency could use it alone. The encryption debate has left both sides bitterly divided and in fighting mode. The group of cryptographers deliberately issued its report a day before James B. Comey Jr., the director of the Federal Bureau of Investigation, and Sally Quillian Yates, the deputy attorney general at the Justice Department, are scheduled to testify before the Senate Judiciary Committee on the concerns that they and other government agencies have that encryption technologies will prevent them from effectively doing their jobs.
  • The new paper is the first in-depth technical analysis of government proposals by leading cryptographers and security thinkers, including Whitfield Diffie, a pioneer of public key cryptography, and Ronald L. Rivest, the “R” in the widely used RSA public cryptography algorithm. In the report, the group said any effort to give the government “exceptional access” to encrypted communications was technically unfeasible and would leave confidential data and critical infrastructure like banks and the power grid at risk. Handing governments a key to encrypted communications would also require an extraordinary degree of trust. With government agency breaches now the norm — most recently at the United States Office of Personnel Management, the State Department and the White House — the security specialists said authorities could not be trusted to keep such keys safe from hackers and criminals. They added that if the United States and Britain mandated backdoor keys to communications, China and other governments in foreign markets would be spurred to do the same.
  • “Such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend,” the report said. “The costs would be substantial, the damage to innovation severe and the consequences to economic growth hard to predict. The costs to the developed countries’ soft power and to our moral authority would also be considerable.”
  •  
    Our system of government does not expect that every criminal will be apprehended and convicted. There are numerous values our society believes are more important. Some examples: [i] a presumption of innocence unless guilt is established beyond any reasonable doubt; [ii] the requirement that government officials convince a neutral magistrate that they have probable cause to believe that a search or seizure will produce evidence of a crime; [iii] many communications cannot be compelled to be disclosed and used in evidence, such as attorney-client communications, spousal communications, and priest-penitent communications; and [iv] etc. Moral of my story: the government needs a much stronger reason to justify interception of communications than saying, "some crooks will escape prosecution if we can't do that." We have a right to whisper to each other, concealing our communications from all others. Why does the right to whisper privately disappear if our whisperings are done electronically? The Supreme Court took its first step on a very slippery slope when it permitted wiretapping in Olmstead v. United States, 277 U.S. 438, 48 S. Ct. 564, 72 L. Ed. 944 (1928). https://goo.gl/LaZGHt It's been a long slide ever since. It's past time to revisit Olmstead and recognize that American citizens have the absolute right to communicate privately. "The President … recognizes that U.S. citizens and institutions should have a reasonable expectation of privacy from foreign or domestic intercept when using the public telephone system." - Brent Scowcroft, U.S. National Security Advisor, National Security Decision Memorandum 338 (1 September 1976) (Nixon administration), http://www.fas.org/irp/offdocs/nsdm-ford/nsdm-338.pdf   
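Rogers' proposal quoted in the excerpt above — create a key but "divide the key into pieces and secure it so that no one person or government agency could use it alone" — can be sketched with a toy n-of-n XOR secret-sharing scheme. This is a minimal illustration of the splitting concept only, not the design of any actual government proposal:

```python
import secrets

def split_key(key: bytes, n: int = 2) -> list[bytes]:
    # n-of-n XOR sharing: each share alone is uniformly random, so it
    # reveals nothing about the key; XORing all n shares recovers it.
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def recombine(shares: list[bytes]) -> bytes:
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

key = secrets.token_bytes(32)
shares = split_key(key, n=3)
assert recombine(shares) == key       # all three custodians together: key recovered
assert recombine(shares[:2]) != key   # a missing share leaves the key fully hidden
```

Even in this toy form, the cryptographers' objection survives: the shares still exist somewhere, and whoever assembles them — lawfully or by stealing them all — holds the key.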
Paul Merrell

"We cannot trust" Intel and Via's chip-based crypto, FreeBSD developers say | Ars Technica - 0 views

  • Developers of the FreeBSD operating system will no longer allow users to trust processors manufactured by Intel and Via Technologies as the sole source of random numbers needed to generate cryptographic keys that can't easily be cracked by government spies and other adversaries. The change, which will be effective in the upcoming FreeBSD version 10.0, comes three months after secret documents leaked by former National Security Agency (NSA) subcontractor Edward Snowden said the US spy agency was able to decode vast swaths of the Internet's encrypted traffic. Among other ways, The New York Times, Pro Publica, and The Guardian reported in September, the NSA and its British counterpart defeat encryption technologies by working with chipmakers to insert backdoors, or cryptographic weaknesses, in their products. The revelations are having a direct effect on the way FreeBSD will use hardware-based random number generators to seed the data used to ensure cryptographic systems can't be easily broken by adversaries. Specifically, "RDRAND" and "Padlock"—RNGs provided by Intel and Via respectively—will no longer be the sources FreeBSD uses to directly feed random numbers into the /dev/random engine used to generate random data in Unix-based operating systems. Instead, it will be possible to use the pseudo random output of RDRAND and Padlock to seed /dev/random only after it has passed through a separate RNG algorithm known as "Yarrow." Yarrow, in turn, will add further entropy to the data to ensure intentional backdoors, or unpatched weaknesses, in the hardware generators can't be used by adversaries to predict their output.
  • "For 10, we are going to backtrack and remove RDRAND and Padlock backends and feed them into Yarrow instead of delivering their output directly to /dev/random," FreeBSD developers said. "It will still be possible to access hardware random number generators, that is, RDRAND, Padlock etc., directly by inline assembly or by using OpenSSL from userland, if required, but we cannot trust them any more." In separate meeting minutes, developers specifically invoked Snowden's name when discussing the change. "Edward Snowdon [sic] -- v. high probability of backdoors in some (HW) RNGs," the notes read, referring to hardware RNGs. Then, alluding to the Dual EC_DRBG RNG forged by the National Institute of Standards and Technology and said to contain an NSA-engineered backdoor, the notes read: "Including elliptic curve generator included in NIST. rdrand in ivbridge not implemented by Intel... Cannot trust HW RNGs to provide good entropy directly. (rdrand implemented in microcode. Intel will add opcode to go directly to HW.) This means partial revert of some work on rdrand and padlock."
  •  
    Hopefully, all Linux distros jump on this bandwagon.
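The FreeBSD change described above can be sketched in miniature: instead of handing hardware RNG output straight to consumers, it is hashed into a pool alongside other entropy sources, so every output depends on all inputs at once. This is a toy hash-pool illustration of the mixing idea, not the actual Yarrow algorithm, and the entropy sources shown are stand-ins:

```python
import hashlib
import os

class PoolRNG:
    """Toy entropy pool: every input is mixed into one hash state, and
    output is derived from the whole pool, never from one source alone."""

    def __init__(self) -> None:
        self._pool = hashlib.sha256()
        self._counter = 0

    def add_entropy(self, data: bytes) -> None:
        # A backdoored source contributes input but cannot control the
        # digest by itself once independent sources are also mixed in.
        self._pool.update(data)

    def read(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            h = self._pool.copy()
            h.update(self._counter.to_bytes(8, "big"))
            out += h.digest()
            self._counter += 1
        return out[:n]

rng = PoolRNG()
rng.add_entropy(os.urandom(32))        # stand-in for RDRAND/Padlock output
rng.add_entropy(b"interrupt timings")  # stand-in for a second, independent source
a, b = rng.read(16), rng.read(16)
assert a != b and len(a) == 16         # stream advances between reads
```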
Paul Merrell

Exclusive: Secret contract tied NSA and security industry pioneer | Reuters - 0 views

  • (Reuters) - As a key part of a campaign to embed encryption software that it could crack into widely used computer products, the U.S. National Security Agency arranged a secret $10 million contract with RSA, one of the most influential firms in the computer security industry, Reuters has learned. Documents leaked by former NSA contractor Edward Snowden show that the NSA created and promulgated a flawed formula for generating random numbers to create a "back door" in encryption products, the New York Times reported in September. Reuters later reported that RSA became the most important distributor of that formula by rolling it into a software tool called BSafe that is used to enhance security in personal computers and many other products. Undisclosed until now was that RSA received $10 million in a deal that set the NSA formula as the preferred, or default, method for number generation in the BSafe software, according to two sources familiar with the contract. Although that sum might seem paltry, it represented more than a third of the revenue that the relevant division at RSA had taken in during the entire previous year, securities filings show.
  • The earlier disclosures of RSA's entanglement with the NSA already had shocked some in the close-knit world of computer security experts. The company had a long history of championing privacy and security, and it played a leading role in blocking a 1990s effort by the NSA to require a special chip to enable spying on a wide range of computer and communications products.
  • The RSA deal shows one way the NSA carried out what Snowden's documents describe as a key strategy for enhancing surveillance: the systematic erosion of security tools. NSA documents released in recent months called for using "commercial relationships" to advance that goal, but did not name any security companies as collaborators.
  • The NSA came under attack this week in a landmark report from a White House panel appointed to review U.S. surveillance policy. The panel noted that "encryption is an essential basis for trust on the Internet," and called for a halt to any NSA efforts to undermine it.
  • From RSA's earliest days, the U.S. intelligence establishment worried it would not be able to crack well-engineered public key cryptography. Martin Hellman, a former Stanford researcher who led the team that first invented the technique, said NSA experts tried to talk him and others into believing that the keys did not have to be as large as they planned.
  •  
    Reuters gives the NSA's history of introducing backdoors in encryption standards a deep look, focusing on RSA's acceptance of a $10 million NSA bribe post-9/11 to implement the NSA-created Dual Elliptic Curve standard for generating "random" numbers, which had what Bruce Schneier described as a "back door." A tip of the hat to Miro for alerting me to this article.
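Why a rigged *default* generator matters can be made concrete with a stdlib toy. This is an analogy only — Python's Mersenne Twister is not Dual_EC_DRBG, and the shared seed below merely stands in for the backdoor holder's suspected ability to recover the generator's internal state from a small amount of its output:

```python
import random

SECRET_STATE = 1234567890  # stands in for what the backdoor holder knows

victim = random.Random(SECRET_STATE)    # application using the rigged default RNG
attacker = random.Random(SECRET_STATE)  # party who can reconstruct its state

victim_keys = [victim.getrandbits(128) for _ in range(3)]
predicted = [attacker.getrandbits(128) for _ in range(3)]
assert victim_keys == predicted  # every "random" key the victim makes is predictable
```

Once the internal state is recoverable, all future output is predictable — which is why making the flawed formula the default in BSafe, rather than one option among several, was worth $10 million.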
Paul Merrell

Silicon Valley spars with Obama over 'backdoor' surveillance | TheHill - 0 views

  • Silicon Valley and a bipartisan group of lawmakers are lining up against the Obama administration, criticizing what they see as a lack of support for total online privacy. The steady rise of sophisticated privacy techniques such as encryption and anonymity software has put the government in a difficult position — trying to support the right to privacy while figuring out how to prevent people from evading law enforcement. “The technologies are evolving in ways that potentially make this trickier,” President Obama said during a January news conference with British Prime Minister David Cameron. The conundrum has led to a heated debate in Washington: Should law enforcement have guaranteed access to data?
  • The Obama administration — from officials with the FBI and the National Security Agency (NSA) to the president himself — has come out in favor of some form of guaranteed access while still endorsing strong encryption. “If we get into a situation in which the technologies do not allow us at all to track somebody that we're confident is a terrorist,” Obama said, “that's a problem.” What shape that access takes, however, is unclear. “The dialogue that we're engaged in is designed to make sure that all of us feel confident that if there is an actual threat out there, our law enforcement and our intelligence officers can identify that threat and track that threat at the same time that our governments are not going around phishing into whatever text you might be sending on your smartphone,” Obama said. “And I think that's something that can be achieved.” Privacy hawks on Capitol Hill aren’t buying it.
  • “I don’t think much of that,” Rep. Joe Barton (R-Texas), co-founder of the Congressional Bipartisan Privacy Caucus, told The Hill. “We have a huge homeland security apparatus with almost unlimited authority to — with some sort of a reasonable suspicion — check almost any type of communication, whether it’s voice, Internet, telephonic, electronic, you name it.” “Those were positions that did not receive rave reviews here in Silicon Valley,” said Rep. Zoe Lofgren (D-Calif.), whose district includes parts of tech-heavy San Jose. Many believe the administration’s stance is inherently at odds with robust digital protection. “In order to fully implement what he's suggesting, you would need one of two things,” Lofgren said. One would be installing so-called “backdoors” in encryption — an access point known only to law enforcement agencies. Security experts find this concept abhorrent, since cyber crooks or foreign intelligence agencies would likely exploit it.
  • The second would be to have a third-party company hold all user data, with some sort of agreement to disclose information to the government, Lofgren said. “I think actually the trend line is in a different direction, which is encryption that is not accessible to the companies that provide it, either,” she added. Major tech companies like Apple have done exactly that, claiming that even they can’t unlock data on newer devices.
Paul Merrell

Obama administration opts not to force firms to decrypt data - for now - The Washington... - 0 views

  • After months of deliberation, the Obama administration has made a long-awaited decision on the thorny issue of how to deal with encrypted communications: It will not — for now — call for legislation requiring companies to decode messages for law enforcement. Rather, the administration will continue trying to persuade companies that have moved to encrypt their customers’ data to create a way for the government to still peer into people’s data when needed for criminal or terrorism investigations. “The administration has decided not to seek a legislative remedy now, but it makes sense to continue the conversations with industry,” FBI Director James B. Comey said at a Senate hearing Thursday of the Homeland Security and Governmental Affairs Committee.
  • The decision, which essentially maintains the status quo, underscores the bind the administration is in — balancing competing pressures to help law enforcement and protect consumer privacy. The FBI says it is facing an increasing challenge posed by the encryption of communications of criminals, terrorists and spies. A growing number of companies have begun to offer encryption in which the only people who can read a message, for instance, are the person who sent it and the person who received it. Or, in the case of a device, only the device owner has access to the data. In such cases, the companies themselves lack “backdoors” or keys to decrypt the data for government investigators, even when served with search warrants or intercept orders.
  • The decision was made at a Cabinet meeting Oct. 1. “As the president has said, the United States will work to ensure that malicious actors can be held to account — without weakening our commitment to strong encryption,” National Security Council spokesman Mark Stroh said. “As part of those efforts, we are actively engaged with private companies to ensure they understand the public safety and national security risks that result from malicious actors’ use of their encrypted products and services.” But privacy advocates are concerned that the administration’s definition of strong encryption also could include a system in which a company holds a decryption key or can retrieve unencrypted communications from its servers for law enforcement. “The government should not erode the security of our devices or applications, pressure companies to keep and allow government access to our data, mandate implementation of vulnerabilities or backdoors into products, or have disproportionate access to the keys to private data,” said Savecrypto.org, a coalition of industry and privacy groups that has launched a campaign to petition the Obama administration.
  • To Amie Stepanovich, the U.S. policy manager for Access, one of the groups signing the petition, the status quo isn’t good enough. “It’s really crucial that even if the government is not pursuing legislation, it’s also not pursuing policies that will weaken security through other methods,” she said. The FBI and Justice Department have been talking with tech companies for months. On Thursday, Comey said the conversations have been “increasingly productive.” He added: “People have stripped out a lot of the venom.” He said the tech executives “are all people who care about the safety of America and also care about privacy and civil liberties.” Comey said the issue afflicts not just federal law enforcement but also state and local agencies investigating child kidnappings and car crashes — “cops and sheriffs . . . [who are] increasingly encountering devices they can’t open with a search warrant.”
  • One senior administration official said the administration thinks it’s making enough progress with companies that seeking legislation now is unnecessary. “We feel optimistic,” said the official, who spoke on the condition of anonymity to describe internal discussions. “We don’t think it’s a lost cause at this point.” Legislation, said Rep. Adam Schiff (D-Calif.), is not a realistic option given the current political climate. He said he made a recent trip to Silicon Valley to talk to Twitter, Facebook and Google. “They quite uniformly are opposed to any mandate or pressure — and more than that, they don’t want to be asked to come up with a solution,” Schiff said. Law enforcement officials know that legislation is a tough sell now. But, one senior official stressed, “it’s still going to be in the mix.” On the other side of the debate, technology, diplomatic and commerce agencies were pressing for an outright statement by Obama to disavow a legislative mandate on companies. But their position did not prevail.
  • Daniel Castro, vice president of the Information Technology & Innovation Foundation, said absent any new laws, either in the United States or abroad, “companies are in the driver’s seat.” He said that if another country tried to require companies to retain an ability to decrypt communications, “I suspect many tech companies would try to pull out.”
Paul Merrell

Surveillance scandal rips through hacker community | Security & Privacy - CNET News - 0 views

  • One security start-up that had an encounter with the FBI was Wickr, a privacy-forward text messaging app for the iPhone with an Android version in private beta. Wickr's co-founder Nico Sell told CNET at Defcon, "Wickr has been approached by the FBI and asked for a backdoor. We said, 'No.'" The mistrust runs deep. "Even if [the NSA] stood up tomorrow and said that [they] have eliminated these programs," said Marlinspike, "How could we believe them? How can we believe that anything they say is true?" Where does security innovation go next? The immediate future of information security innovation most likely lies in software that provides an existing service but with heightened privacy protections, such as webmail that doesn't mine you for personal data.
  • Wickr's Sell thinks that her company has hit upon a privacy innovation that a few others are also doing, but many will soon follow: the company itself doesn't store user data. "[The FBI] would have to force us to build a new app. With the current app there's no way," she said, that they could incorporate backdoor access to Wickr users' texts or metadata. "Even if you trust the NSA 100 percent that they're going to use [your data] correctly," Sell said, "Do you trust that they're going to be able to keep it safe from hackers? What if somebody gets that database and posts it online?" To that end, she said, people will start seeing privacy innovation for services that don't currently provide it. Calling it "social networks 2.0," she said that social network competitors will arise that do a better job of protecting their customer's privacy and predicted that some that succeed will do so because of their emphasis on privacy. Abine's recent MaskMe browser add-on and mobile app for creating disposable e-mail addresses, phone numbers, and credit cards is another example of a service that doesn't have access to its own users' data.
  • Stamos predicted changes in services that companies with cloud storage offer, including offering customers the ability to store their data outside of the U.S. "If they want to stay competitive, they're going to have to," he said. But, he cautioned, "It's impossible to do a cloud-based ad supported service." Soghoian added, "The only way to keep a service running is to pay them money." This, he said, is going to give rise to a new wave of ad-free, privacy protective subscription services.
  • The issue with balancing privacy and surveillance is that the wireless carriers are not interested in privacy, he said. "They've been providing wiretapping for 100 years. Apple may in the next year protect voice calls," he said, and said that the best hope for ending widespread government surveillance will be the makers of mobile operating systems like Apple and Google. Not all upcoming security innovation will be focused on that kind of privacy protection. Security researcher Brandon Wiley showed off at Defcon a protocol he calls Dust that can obfuscate different kinds of network traffic, with the end goal of preventing censorship. "I only make products about letting you say what you want to say anywhere in the world," such as content critical of governments, he said. Encryption can hide the specifics of the traffic, but some governments have figured out that they can simply block all encrypted traffic, he said. The Dust protocol would change that, he said, making it hard to tell the difference between encrypted and unencrypted traffic. It's hard to build encryption into pre-existing products, Wiley said. "I think people are going to make easy-to-use, encrypted apps, and that's going to be the future."
  • Companies could face severe consequences from their security experts, said Stamos, if the in-house experts find out that they've been lied to about providing government access to customer data. You could see "lots of resignations and maybe publicly," he said. "It wouldn't hurt their reputations to go out in a blaze of glory." Perhaps not surprisingly, Marlinspike sounded a hopeful call for non-destructive activism on Defcon's 21st anniversary. "As hackers, we don't have a lot of influence on policy. I hope that's something that we can focus our energy on," he said.
  •  
    NSA as the cause of the next major disruption in the social networking service industry?  Grief ahead for Google? Note the point made that: "It's impossible to do a cloud-based ad supported service" where the encryption/decryption takes place on the client side. 
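The client-side model the comment above describes — the server stores only ciphertext and never holds the key, so there is nothing it can be compelled to decrypt — can be sketched with a toy keystream cipher (SHA-256 in counter mode). Illustration only: this is not a vetted construction and provides no integrity protection; a real client would use an audited AEAD cipher instead:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR the data against a SHA-256 counter-mode keystream (toy cipher).
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

# Client: key is generated and kept locally; it never leaves the device.
key = secrets.token_bytes(32)
nonce = secrets.token_bytes(16)
ciphertext = keystream_xor(key, nonce, b"meet at noon")

# Server: stores only (nonce, ciphertext). It cannot decrypt on demand
# because it never held the key -- there is nothing to hand over.
stored = (nonce, ciphertext)

# Client later: decryption is the same XOR with the same keystream.
assert keystream_xor(key, stored[0], stored[1]) == b"meet at noon"
```

The trade-off noted in the article follows directly: a server that only ever sees ciphertext cannot mine it for advertising, which is why the panelists argued such services must be subscription-funded rather than ad-supported.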
Paul Merrell

iSpy: The CIA Campaign to Steal Apple's Secrets - 0 views

  • Researchers working with the Central Intelligence Agency have conducted a multi-year, sustained effort to break the security of Apple’s iPhones and iPads, according to top-secret documents obtained by The Intercept. The security researchers presented their latest tactics and achievements at a secret annual gathering, called the “Jamboree,” where attendees discussed strategies for exploiting security flaws in household and commercial electronics. The conferences have spanned nearly a decade, with the first CIA-sponsored meeting taking place a year before the first iPhone was released. By targeting essential security keys used to encrypt data stored on Apple’s devices, the researchers have sought to thwart the company’s attempts to provide mobile security to hundreds of millions of Apple customers across the globe. Studying both “physical” and “non-invasive” techniques, U.S. government-sponsored research has been aimed at discovering ways to decrypt and ultimately penetrate Apple’s encrypted firmware. This could enable spies to plant malicious code on Apple devices and seek out potential vulnerabilities in other parts of the iPhone and iPad currently masked by encryption.
  • The CIA declined to comment for this story. The security researchers also claimed they had created a modified version of Apple’s proprietary software development tool, Xcode, which could sneak surveillance backdoors into any apps or programs created using the tool. Xcode, which is distributed by Apple to hundreds of thousands of developers, is used to create apps that are sold through Apple’s App Store. The modified version of Xcode, the researchers claimed, could enable spies to steal passwords and grab messages on infected devices. Researchers also claimed the modified Xcode could “force all iOS applications to send embedded data to a listening post.” It remains unclear how intelligence agencies would get developers to use the poisoned version of Xcode. Researchers also claimed they had successfully modified the OS X updater, a program used to deliver updates to laptop and desktop computers, to install a “keylogger.”
  • Other presentations at the CIA conference have focused on the products of Apple’s competitors, including Microsoft’s BitLocker encryption system, which is used widely on laptop and desktop computers running premium editions of Windows. The revelations that the CIA has waged a secret campaign to defeat the security mechanisms built into Apple’s devices come as Apple and other tech giants are loudly resisting pressure from senior U.S. and U.K. government officials to weaken the security of their products. Law enforcement agencies want the companies to maintain the government’s ability to bypass security tools built into wireless devices. Perhaps more than any other corporate leader, Apple’s CEO, Tim Cook, has taken a stand for privacy as a core value, while sharply criticizing the actions of U.S. law enforcement and intelligence agencies. “If U.S. products are OK to target, that’s news to me,” says Matthew Green, a cryptography expert at Johns Hopkins University’s Information Security Institute. “Tearing apart the products of U.S. manufacturers and potentially putting backdoors in software distributed by unknowing developers all seems to be going a bit beyond ‘targeting bad guys.’ It may be a means to an end, but it’s a hell of a means.”
Paul Merrell

EFF Statement on Passage of Massie-Lofgren Amendment Regarding NSA Backdoors | Electron... - 0 views

  • Today, the US House of Representatives passed an amendment to the Defense Appropriations bill designed to cut funding for NSA backdoors. The amendment passed overwhelmingly with strong bipartisan support: 293 ayes, 123 nays, and 1 present. Currently, the NSA collects emails, browsing and chat history under Section 702 of the FISA Amendments Act, and searches this information without a warrant for the communications of Americans—a practice known as "backdoor searches." The amendment would block the NSA from using any of its funding from this Defense Appropriations Bill to conduct such warrantless searches. In addition, the amendment would prohibit the NSA from using its budget to mandate or request that private companies and organizations add backdoors to the encryption standards that are meant to keep you safe on the web. Mark Rumold, staff attorney for the Electronic Frontier Foundation, stated:
  • Tonight, the House of Representatives took an important first step in reining in the NSA. The House voted overwhelmingly to cut funding for two of the NSA's invasive surveillance practices: the warrantless searching of Americans' international communications, and the practice of requiring companies to install vulnerabilities in communications products or services. We applaud the House for taking this important first step, and we look forward to other elected officials standing up for our right to privacy. Digital rights organizations, including EFF, strongly supported the amendment. We and other organizations—including Free Press, Fight for the Future, Demand Progress, and Taskforce.is—helped to organize a grassroots campaign to promote the amendment. The day before the vote, we urged friends and members to call their members of Congress through the website ShuttheBackDoor.net. Thousands responded to the call to action. We extend our heartfelt thanks to everyone who spoke out on this issue. This is a great day in the fight to rein in NSA surveillance abuses, and we hope Congress will work to ensure this amendment is in the final version of the appropriations bill that is enacted.
  •  
    Big majority in the House and it's in the Defense Spending act. That puts a lot of pressure on the Senate and if sustained in the Senate, makes it all but veto-proof.  
Paul Merrell

Exclusive: U.S. tech industry appeals to Obama to keep hands off encryption | Reuters - 0 views

  • As Washington weighs new cybersecurity steps amid a public backlash over mass surveillance, U.S. tech companies warned President Barack Obama not to weaken increasingly sophisticated encryption systems designed to protect consumers' privacy. In a strongly worded letter to Obama on Monday, two industry associations for major software and hardware companies said, "We are opposed to any policy actions or measures that would undermine encryption as an available and effective tool." The Information Technology Industry Council and the Software and Information Industry Association, representing tech giants including Apple Inc, Google Inc, Facebook Inc, IBM and Microsoft Corp, fired the latest salvo in what is shaping up to be a long fight over government access to smartphones and other digital devices.
Paul Merrell

GCHQ handed new smartphone-hacking legal powers - RT UK - 0 views

  • Spy agencies in Britain will be given the explicit right to hack into smartphones and computers as part of a new law being introduced by the Conservative government. Security services MI5, MI6 and GCHQ can already access electronic devices by exploiting software security vulnerabilities, but the legal foundation for the practice is under scrutiny. New powers laid out in the Investigatory Powers Bill, due to be introduced in Parliament next month, will give spies a solid legal basis for hacking into computer systems, according to the Times. The revelation has sparked criticism from human rights group Liberty, which accuses the government of giving spy agencies “unlimited potential” to act against citizens. The bill, which was announced in the Queen’s Speech following the general election, is likely to include the new Snooper’s Charter, according to privacy campaigners at the Open Rights Group.
  • British spies will be able to hack into a person’s “property” through backdoors in the software. Once inside, intelligence agents can install software that allows them to operate microphones to eavesdrop on conversations and even control the camera to take photographs of targets. The government admitted in February that MI5, MI6 and GCHQ were hacking into computers, servers, routers and mobile phones using the Intelligence Services Act 1994, which does not give explicit authorization for such practices. Independent reviewer of terrorism legislation David Anderson QC recommended in June that new legislation be introduced to give intrusive hacking a firm legal basis. Anderson said that hacking presents a “dizzying array of possibilities to the security and intelligence agencies.” While some methods are appropriate, “many are of the view that there are others which are so intrusive that they would require exceptional safeguards for their use to be legal … A debate is clearly needed,” he said.
  • The Investigatory Powers Bill will give agents explicit powers to interfere with “property” once they have obtained a warrant from the home secretary. Digital evidence expert Peter Sommer said the powers circumvented encryption technology. “Increasingly, [intelligence agents] can’t read communications sent over the internet because of encryption, so their ability to get information from interception is rapidly diminishing. The best way around this is to get inside someone’s computer. This is an increasingly important avenue for them,” he told the Times.
Paul Merrell

UK government is secretly planning to break encryption and spy on people's phones, reve... - 0 views

  • The UK government is secretly planning to force technology companies to build backdoors into their products, to enable intelligence agencies to read people’s private messages. A draft document leaked by the Open Rights Group details extreme new surveillance proposals, which would enable government agencies to spy on one in 10,000 citizens – around 6,500 people – at any one time.  The document, which follows the controversial Investigatory Powers Act, reveals government plans to force mobile operators and internet service providers to provide real-time communications of customers to the government “in an intelligible form”, and within one working day.
  • This would effectively ban encryption, an important security measure used by a wide range of companies, including WhatsApp and major banks, to keep people’s private data private and to protect them from hackers and cyber criminals. 
Paul Merrell

Even the Former Director of the NSA Hates the FBI's New Surveillance Push - The Daily B... - 0 views

  • The head of the FBI has spent the last several months in something of a panic, warning anyone who will listen that terrorists are “going dark”—using encrypted communications to hide from the FBI—and insisting that the bureau needs some kind of electronic back door to get access to those chats. It’s an argument that civil libertarians and technology industry executives have largely rejected. And now, members of the national security establishment—veterans of both the Obama and Bush administrations—are beginning to speak out publicly against FBI Director Jim Comey’s call to give the government a skeleton key to your private talks.
  • The encryption issue was also one of several small, but telling, ways in which Comey seemed out of sync with some of his fellow members of the national security establishment here at the Aspen Security Forum.
  • This isn’t the first intra-government fight over encryption, Chertoff noted. The last time an administration insisted on a technological back door—in the 1990s—Congress shot down the idea. And despite cries of “going dark” back then, the government found all kinds of new ways to spy. “We collected more than ever. We found ways to deal with that issue,” Chertoff told the forum.
Paul Merrell

The All Writs Act, Software Licenses, and Why Judges Should Ask More Questions | Just S... - 0 views

  • Pending before federal magistrate judge James Orenstein is the government’s request for an order obligating Apple, Inc. to unlock an iPhone and thereby assist prosecutors in decrypting data the government has seized and is authorized to search pursuant to a warrant. In an order questioning the government’s purported legal basis for this request, the All Writs Act of 1789 (AWA), Judge Orenstein asked Apple for a brief informing the court whether the request would be technically feasible and/or burdensome. After Apple filed, the court asked it to file a brief discussing whether the government had legal grounds under the AWA to compel Apple’s assistance. Apple filed that brief and the government filed a reply brief last week in the lead-up to a hearing this morning.
  • We’ve long been concerned about whether end users own software under the law. Software owners have rights of adaptation and first sale enshrined in copyright law. But software publishers have claimed that end users are merely licensees, and our rights under copyright law can be waived by mass-market end user license agreements, or EULAs. Over the years, Granick has argued that users should retain their rights even if mass-market licenses purport to take them away. The government’s brief takes advantage of Apple’s EULA for iOS to argue that Apple, the software publisher, is responsible for iPhones around the world. Apple’s EULA states that when you buy an iPhone, you’re not buying the iOS software it runs, you’re just licensing it from Apple. The government argues that having designed a passcode feature into a copy of software which it owns and licenses rather than sells, Apple can be compelled under the All Writs Act to bypass the passcode on a defendant’s iPhone pursuant to a search warrant and thereby access the software owned by Apple. Apple’s supplemental brief argues that in defining its users’ contractual rights vis-à-vis Apple with regard to Apple’s intellectual property, Apple in no way waived its own due process rights vis-à-vis the government with regard to users’ devices. Apple’s brief compares this argument to forcing a car manufacturer to “provide law enforcement with access to the vehicle or to alter its functionality at the government’s request” merely because the car contains licensed software. 
  • This is an interesting twist on the decades-long EULA versus users’ rights fight. As far as we know, this is the first time that the government has piggybacked on EULAs to try to compel software companies to provide assistance to law enforcement. Under the government’s interpretation of the All Writs Act, anyone who makes software could be dragooned into assisting the government in investigating users of the software. If the court adopts this view, it would give investigators immense power. The quotidian aspects of our lives increasingly involve software (from our cars to our TVs to our health to our home appliances), and most of that software is arguably licensed, not bought. Conscripting software makers to collect information on us would afford the government access to the most intimate information about us, on the strength of some words in some license agreements that people never read. (And no wonder: The iPhone’s EULA came to over 300 pages when the government filed it as an exhibit to its brief.)
  • ...1 more annotation...
  • The government’s brief does not acknowledge the sweeping implications of its arguments. It tries to portray its requested unlocking order as narrow and modest, because it “would not require Apple to make any changes to its software or hardware, … [or] to introduce any new ability to access data on its phones. It would simply require Apple to use its existing capability to bypass the passcode on a passcode-locked iOS 7 phone[.]” But that undersells the implications of the legal argument the government is making: that anything a company already can do, it could be compelled to do under the All Writs Act in order to assist law enforcement. Were that the law, the blow to users’ trust in their encrypted devices, services, and products would be little different than if Apple and other companies were legally required to design backdoors into their encryption mechanisms (an idea the government just can’t seem to drop, its assurances in this brief notwithstanding). Entities around the world won’t buy security software if its makers cannot be trusted not to hand over their users’ secrets to the US government. That’s what makes the encryption in iOS 8 and later versions, which Apple has told the court it “would not have the technical ability” to bypass, so powerful — and so despised by the government: Because no matter how broadly the All Writs Act extends, no court can compel Apple to do the impossible.
Paul Merrell

Obama Lets N.S.A. Exploit Some Internet Flaws, Officials Say - NYTimes.com - 0 views

  • Stepping into a heated debate within the nation’s intelligence agencies, President Obama has decided that when the National Security Agency discovers major flaws in Internet security, it should — in most circumstances — reveal them to assure that they will be fixed, rather than keep mum so that the flaws can be used in espionage or cyberattacks, senior administration officials said Saturday. But Mr. Obama carved a broad exception for “a clear national security or law enforcement need,” the officials said, a loophole that is likely to allow the N.S.A. to continue to exploit security flaws both to crack encryption on the Internet and to design cyberweapons.
  • Elements of the decision became evident on Friday, when the White House denied that it had any prior knowledge of the Heartbleed bug, a newly known hole in Internet security that sent Americans scrambling last week to change their online passwords. The White House statement said that when such flaws are discovered, there is now a “bias” in the government to share that knowledge with computer and software manufacturers so a remedy can be created and distributed to industry and consumers. Caitlin Hayden, the spokeswoman for the National Security Council, said the review of the recommendations was now complete, and it had resulted in a “reinvigorated” process to weigh the value of disclosure when a security flaw is discovered, against the value of keeping the discovery secret for later use by the intelligence community. “This process is biased toward responsibly disclosing such vulnerabilities,” she said.
  • One recommendation urged the N.S.A. to get out of the business of weakening commercial encryption systems or trying to build in “back doors” that would make it far easier for the agency to crack the communications of America’s adversaries. Tempting as it was to create easy ways to break codes — the reason the N.S.A. was established by Harry S. Truman 62 years ago — the committee concluded that the practice would undercut trust in American software and hardware products. In recent months, Silicon Valley companies have urged the United States to abandon such practices, while Germany and Brazil, among other nations, have said they were considering shunning American-made equipment and software. Their motives were hardly pure: Foreign companies see the N.S.A. disclosures as a way to bar American competitors. Another recommendation urged the government to make only the most limited, temporary use of what hackers call “zero days,” the coding flaws in software like Microsoft Windows that can give an attacker access to a computer — and to any business, government agency or network connected to it. The flaws get their name from the fact that, when identified, the computer user has “zero days” to fix them before hackers can exploit the accidental vulnerability.
  • ...2 more annotations...
  • The N.S.A. made use of four “zero day” vulnerabilities in its attack on Iran’s nuclear enrichment sites. That operation, code-named “Olympic Games,” managed to damage roughly 1,000 Iranian centrifuges, and by some accounts helped drive the country to the negotiating table. Not surprisingly, officials at the N.S.A. and at its military partner, the United States Cyber Command, warned that giving up the capability to exploit undisclosed vulnerabilities would amount to “unilateral disarmament” — a phrase taken from the battles over whether and how far to cut America’s nuclear arsenal. “We don’t eliminate nuclear weapons until the Russians do,” one senior intelligence official said recently. “You are not going to see the Chinese give up on ‘zero days’ just because we do.” Even a senior White House official who was sympathetic to broad reforms after the N.S.A. disclosures said last month, “I can’t imagine the president — any president — entirely giving up a technology that might enable him some day to take a covert action that could avoid a shooting war.”
  • But documents released by Edward J. Snowden, the former N.S.A. contractor, make it clear that two years before Heartbleed became known, the N.S.A. was looking at ways to accomplish exactly what the flaw did by accident. A program code-named Bullrun, apparently named for the site of two Civil War battles just outside Washington, was part of a decade-long effort to crack or circumvent encryption on the web. The documents do not make clear how well it succeeded, but it may well have been more effective than exploiting Heartbleed would be at enabling access to secret data. The government has become one of the biggest developers and purchasers of information identifying “zero days,” officials acknowledge. Those flaws are big business — Microsoft pays up to $150,000 to those who find them and bring them to the company to fix — and other countries are gathering them so avidly that something of a modern-day arms race has broken out. Chief among the nations seeking them are China and Russia, though Iran and North Korea are in the market as well.
  •  
    Note that this is only an elastic policy, not law. Also notice that NYT is now reporting as *fact* that the NSA did the cyber attack on the Iranian enrichment centrifuges. By any legal measure, if true that was an act of war, a war of aggression.  So why wasn't the American public informed that we were at war with Iran? 