
Future of the Web: Group items tagged "apple security"


Paul Merrell

The All Writs Act, Software Licenses, and Why Judges Should Ask More Questions | Just S... - 0 views

  • Pending before federal magistrate judge James Orenstein is the government’s request for an order obligating Apple, Inc. to unlock an iPhone and thereby assist prosecutors in decrypting data the government has seized and is authorized to search pursuant to a warrant. In an order questioning the government’s purported legal basis for this request, the All Writs Act of 1789 (AWA), Judge Orenstein asked Apple for a brief informing the court whether the request would be technically feasible and/or burdensome. After Apple filed, the court asked it to file a brief discussing whether the government had legal grounds under the AWA to compel Apple’s assistance. Apple filed that brief and the government filed a reply brief last week in the lead-up to a hearing this morning.
  • We’ve long been concerned about whether end users own software under the law. Software owners have rights of adaptation and first sale enshrined in copyright law. But software publishers have claimed that end users are merely licensees, and our rights under copyright law can be waived by mass-market end user license agreements, or EULAs. Over the years, Granick has argued that users should retain their rights even if mass-market licenses purport to take them away. The government’s brief takes advantage of Apple’s EULA for iOS to argue that Apple, the software publisher, is responsible for iPhones around the world. Apple’s EULA states that when you buy an iPhone, you’re not buying the iOS software it runs, you’re just licensing it from Apple. The government argues that having designed a passcode feature into a copy of software which it owns and licenses rather than sells, Apple can be compelled under the All Writs Act to bypass the passcode on a defendant’s iPhone pursuant to a search warrant and thereby access the software owned by Apple. Apple’s supplemental brief argues that in defining its users’ contractual rights vis-à-vis Apple with regard to Apple’s intellectual property, Apple in no way waived its own due process rights vis-à-vis the government with regard to users’ devices. Apple’s brief compares this argument to forcing a car manufacturer to “provide law enforcement with access to the vehicle or to alter its functionality at the government’s request” merely because the car contains licensed software. 
  • This is an interesting twist on the decades-long EULA versus users’ rights fight. As far as we know, this is the first time that the government has piggybacked on EULAs to try to compel software companies to provide assistance to law enforcement. Under the government’s interpretation of the All Writs Act, anyone who makes software could be dragooned into assisting the government in investigating users of the software. If the court adopts this view, it would give investigators immense power. The quotidian aspects of our lives increasingly involve software (from our cars to our TVs to our health to our home appliances), and most of that software is arguably licensed, not bought. Conscripting software makers to collect information on us would afford the government access to the most intimate information about us, on the strength of some words in some license agreements that people never read. (And no wonder: The iPhone’s EULA came to over 300 pages when the government filed it as an exhibit to its brief.)
  • The government’s brief does not acknowledge the sweeping implications of its arguments. It tries to portray its requested unlocking order as narrow and modest, because it “would not require Apple to make any changes to its software or hardware, … [or] to introduce any new ability to access data on its phones. It would simply require Apple to use its existing capability to bypass the passcode on a passcode-locked iOS 7 phone[.]” But that undersells the implications of the legal argument the government is making: that anything a company already can do, it could be compelled to do under the All Writs Act in order to assist law enforcement. Were that the law, the blow to users’ trust in their encrypted devices, services, and products would be little different than if Apple and other companies were legally required to design backdoors into their encryption mechanisms (an idea the government just can’t seem to drop, its assurances in this brief notwithstanding). Entities around the world won’t buy security software if its makers cannot be trusted not to hand over their users’ secrets to the US government. That’s what makes the encryption in iOS 8 and later versions, which Apple has told the court it “would not have the technical ability” to bypass, so powerful — and so despised by the government: Because no matter how broadly the All Writs Act extends, no court can compel Apple to do the impossible.
Gary Edwards

Apple and Facebook Flash Forward to Computer Memory of the Future | Enterprise | WIRED - 1 views

  •  
    Great story that is at the center of a new cloud computing platform. I met David Flynn back when he was first demonstrating the Realmsys flash card. Extraordinary stuff. He was using the technology to open a secure Linux computing window on an operating Windows XP system. The card opened up a secure data socket, connecting to any Internet Server or Data Server, and running applications on that data - while running Windows and Windows apps in the background. Incredible mesh of Linux, streaming data, and legacy Windows apps. Every time I find these tech pieces explaining Fusion-io though, I can't help but think that David Flynn is one of the most decent, kind and truly deserving of success people that I have ever met. excerpt: "Apple is spending mountains of money on a new breed of hardware device from a company called Fusion-io. As a public company, Fusion-io is required to disclose information about customers that account for an unusually large portion of its revenue, and with its latest annual report, the Salt Lake City outfit reveals that in 2012, at least 25 percent of its revenue - $89.8 million - came from Apple. That's just one figure, from just one company. But it serves as a signpost, showing you where the modern data center is headed. 'There's now a blurring between the storage world and the memory world. People have been enlightened by Fusion-io.' - Gary Gentry Inside a data center like the one Apple operates in Maiden, North Carolina, you'll find thousands of computer servers. Fusion-io makes a slim card that slots inside these machines, and it's packed with hundreds of gigabytes of flash memory, the same stuff that holds all the software and the data on your smartphone. You can think of this card as a much-needed replacement for the good old-fashioned hard disk that typically sits inside a server. Much like a hard disk, it stores information. But it doesn't have any moving parts, which means it's generally more reliable. It c
Paul Merrell

Apple's New Challenge: Learning How the U.S. Cracked Its iPhone - The New York Times - 0 views

  • Now that the United States government has cracked open an iPhone that belonged to a gunman in the San Bernardino, Calif., mass shooting without Apple's help, the tech company is under pressure to find and fix the flaw. But unlike other cases where security vulnerabilities have cropped up, Apple may face a higher set of hurdles in ferreting out and repairing the particular iPhone hole that the government hacked. The challenges start with the lack of information about the method that the law enforcement authorities, with the aid of a third party, used to break into the iPhone of Syed Rizwan Farook, an attacker in the San Bernardino rampage last year. Federal officials have refused to identify the person, or organization, who helped crack the device, and have declined to specify the procedure used to open the iPhone. Apple also cannot obtain the device to reverse-engineer the problem, the way it would in other hacking situations.
  •  
    It would make a very interesting Freedom of Information Act case if Apple sued under that Act to force disclosure of the iPhone security defect the FBI exploited. I know of no interpretation of the law enforcement FOIA exemption that would justify FBI withholding of the information. It might be alleged that the information is the trade secret of the company that disclosed the defect and exploit to the FBI, but there's a very strong argument that the fact that the information was shared with the FBI waived the trade secrecy claim. And the notion that government is entitled to collect product security defects and exploit them without informing the exploited product's company of the specific defect is extremely weak. Were I Tim Cook, I would have already told my lawyers to get cracking on filing the FOIA request with the FBI to get the legal ball rolling.
Paul Merrell

What to Do About Lawless Government Hacking and the Weakening of Digital Security | Ele... - 0 views

  • In our society, the rule of law sets limits on what government can and cannot do, no matter how important its goals. To give a simple example, even when chasing a fleeing murder suspect, the police have a duty not to endanger bystanders. The government should pay the same care to our safety in pursuing threats online, but right now we don't have clear, enforceable rules for government activities like hacking and "digital sabotage." And this is no abstract question—these actions increasingly endanger everyone's security.
  • The problem became especially clear this year during the San Bernardino case, involving the FBI’s demand that Apple rewrite its iOS operating system to defeat security features on a locked iPhone. Ultimately the FBI exploited an existing vulnerability in iOS and accessed the contents of the phone with the help of an "outside party." Then, with no public process or discussion of the tradeoffs involved, the government refused to tell Apple about the flaw. Despite the obvious fact that the security of the computers and networks we all use is both collective and interwoven—other iPhones used by millions of innocent people presumably have the same vulnerability—the government chose to withhold information Apple could have used to improve the security of its phones. Other examples include intelligence activities like Stuxnet and Bullrun, and law enforcement investigations like the FBI’s mass use of malware against Tor users engaged in criminal behavior. These activities are often disproportionate to stopping legitimate threats, resulting in unpatched software for millions of innocent users, overbroad surveillance, and other collateral effects.  That’s why we’re working on a positive agenda to confront governmental threats to digital security. Put more directly, we’re calling on lawyers, advocates, technologists, and the public to demand a public discussion of whether, when, and how governments can be empowered to break into our computers, phones, and other devices; sabotage and subvert basic security protocols; and stockpile and exploit software flaws and vulnerabilities.  
  • Smart people in academia and elsewhere have been thinking and writing about these issues for years. But it’s time to take the next step and make clear, public rules that carry the force of law to ensure that the government weighs the tradeoffs and reaches the right decisions. This long post outlines some of the things that can be done. It frames the issue, then describes some of the key areas where EFF is already pursuing this agenda—in particular formalizing the rules for disclosing vulnerabilities and setting out narrow limits for the use of government malware. Finally it lays out where we think the debate should go from here.   
  •  
    "In our society, the rule of law sets limits on what government can and cannot do, no matter how important its goals. "
  •  
    It's not often that I disagree with EFF's positions, but on this one I do. The government should be prohibited from exploiting computer vulnerabilities and should be required to immediately report all vulnerabilities discovered to the relevant developers of hardware or software. It's been one long slippery slope since the Supreme Court first approved wiretapping in Olmstead v. United States, 277 US 438 (1928), https://goo.gl/NJevsr (.) Left undecided to this day is whether we have a right to whisper privately, a right that is undeniable. All communications intercept cases since Olmstead fly directly in the face of that right.
Paul Merrell

EXCLUSIVE: Edward Snowden Explains Why Apple Should Continue To Fight the Government on... - 0 views

  • As the Obama administration's campaign to stop the commercialization of strong encryption heats up, National Security Agency whistleblower Edward Snowden is firing back on behalf of companies like Apple and Google that are finding themselves under attack. “Technologists and companies working to protect ordinary citizens should be applauded, not sued or prosecuted,” Snowden wrote in an email through his lawyer. Snowden was asked by The Intercept to respond to the contentious suggestion — made Thursday on a blog that frequently promotes the interests of the national security establishment — that companies like Apple and Google might in certain cases be found legally liable for providing material aid to a terrorist organization because they provide encryption services to their users.
  • In his email, Snowden explained how law enforcement officials who are demanding that U.S. companies build some sort of window into unbreakable end-to-end encryption — he calls that an “insecurity mandate” — haven’t thought things through. “The central problem with insecurity mandates has never been addressed by its proponents: if one government can demand access to private communications, all governments can,” Snowden wrote. “No matter how good the reason, if the U.S. sets the precedent that Apple has to compromise the security of a customer in response to a piece of government paper, what can they do when the government is China and the customer is the Dalai Lama?”
  • Weakened encryption would only drive people away from the American technology industry, Snowden wrote. “Putting the most important driver of our economy in a position where they have to deal with the devil or lose access to international markets is public policy that makes us less competitive and less safe.”
  • FBI Director James Comey and others have repeatedly stated that law enforcement is “going dark” when it comes to the ability to track bad actors’ communications because of end-to-end encrypted messages, which can only be deciphered by the sender and the receiver. They have never provided evidence for that, however, and have put forth no technologically realistic alternative. Meanwhile, Apple and Google are currently rolling out user-friendly end-to-end encryption for their customers, many of whom have demanded greater privacy protections — especially following Snowden’s disclosures.
Gonzalo San Gil, PhD.

Apple security flaw could be a backdoor for the NSA - RT USA - 1 views

  •  
    "Was the National Security Agency exploiting two just-discovered security flaws to hack into the iPhones and Apple computers of certain targets? Some skeptics are saying there is cause to be concerned about recent coincidences regarding the NSA and Apple"
Paul Merrell

Apple could use Brooklyn case to pursue details about FBI iPhone hack: source | Reuters - 0 views

  • If the U.S. Department of Justice asks a New York court to force Apple Inc to unlock an iPhone, the technology company could push the government to reveal how it accessed the phone which belonged to a shooter in San Bernardino, a source familiar with the situation said. The Justice Department will disclose over the next two weeks whether it will continue with its bid to compel Apple to help access an iPhone in a Brooklyn drug case, according to a court filing on Tuesday. The Justice Department this week withdrew a similar request in California, saying it had succeeded in unlocking an iPhone used by one of the shooters involved in a rampage in San Bernardino in December without Apple's help. The legal dispute between the U.S. government and Apple has been a high-profile test of whether law enforcement should have access to encrypted phone data.
  • Apple, supported by most of the technology industry, says anything that helps authorities bypass security features will undermine security for all users. Government officials say that all kinds of criminal investigations will be crippled without access to phone data. Prosecutors have not said whether the San Bernardino technique would work for other seized iPhones, including the one at issue in Brooklyn. Should the Brooklyn case continue, Apple could pursue legal discovery that would potentially force the FBI to reveal what technique it used on the San Bernardino phone, the source said. A Justice Department representative did not have immediate comment.
Paul Merrell

Cy Vance's Proposal to Backdoor Encrypted Devices Is Riddled With Vulnerabilities | Jus... - 0 views

  • Less than a week after the attacks in Paris — while the public and policymakers were still reeling, and the investigation had barely gotten off the ground — Cy Vance, Manhattan’s District Attorney, released a policy paper calling for legislation requiring companies to provide the government with backdoor access to their smartphones and other mobile devices. This is the first concrete proposal of this type since September 2014, when FBI Director James Comey reignited the “Crypto Wars” in response to Apple’s and Google’s decisions to use default encryption on their smartphones. Though Comey seized on Apple’s and Google’s decisions to encrypt their devices by default, his concerns are primarily related to end-to-end encryption, which protects communications that are in transit. Vance’s proposal, on the other hand, is only concerned with device encryption, which protects data stored on phones. It is still unclear whether encryption played any role in the Paris attacks, though we do know that the attackers were using unencrypted SMS text messages on the night of the attack, and that some of them were even known to intelligence agencies and had previously been under surveillance. But regardless of whether encryption was used at some point during the planning of the attacks, as I lay out below, prohibiting companies from selling encrypted devices would not prevent criminals or terrorists from being able to access unbreakable encryption. Vance’s primary complaint is that Apple’s and Google’s decisions to provide their customers with more secure devices through encryption interferes with criminal investigations. He claims encryption prevents law enforcement from accessing stored data like iMessages, photos and videos, Internet search histories, and third party app data. He makes several arguments to justify his proposal to build backdoors into encrypted smartphones, but none of them hold water.
  • Before addressing the major privacy, security, and implementation concerns that his proposal raises, it is worth noting that while an increase in use of fully encrypted devices could interfere with some law enforcement investigations, it will help prevent far more crimes — especially smartphone theft, and the consequent potential for identity theft. According to Consumer Reports, in 2014 there were more than two million victims of smartphone theft, and nearly two-thirds of all smartphone users either took no steps to secure their phones or their data or failed to implement passcode access for their phones. Default encryption could reduce instances of theft because perpetrators would no longer be able to break into the phone to steal the data.
  • Vance argues that creating a weakness in encryption to allow law enforcement to access data stored on devices does not raise serious concerns for security and privacy, since in order to exploit the vulnerability one would need access to the actual device. He considers this an acceptable risk, claiming it would not be the same as creating a widespread vulnerability in encryption protecting communications in transit (like emails), and that it would be cheap and easy for companies to implement. But Vance seems to be underestimating the risks involved with his plan. It is increasingly important that smartphones and other devices are protected by the strongest encryption possible. Our devices and the apps on them contain astonishing amounts of personal information, so much that an unprecedented level of harm could be caused if a smartphone or device with an exploitable vulnerability is stolen, not least in the forms of identity fraud and credit card theft. We bank on our phones, and have access to credit card payments with services like Apple Pay. Our contact lists are stored on our phones, including phone numbers, emails, social media accounts, and addresses. Passwords are often stored on people's phones. And phones and apps are often full of personal details about their owners' lives, from food diaries to logs of favorite places to personal photographs. Symantec conducted a study in which the company spread 50 “lost” phones in public to see what people who picked up the phones would do with them. The company found that 95 percent of those people tried to access the phone, and while nearly 90 percent tried to access private information stored on the phone or in other private accounts such as banking services and email, only 50 percent attempted contacting the owner. (A minimal sketch of the passcode-derived encryption at issue here appears after these excerpts.)
  • Vance attempts to downplay this serious risk by asserting that anyone can use the “Find My iPhone” or Android Device Manager services that allow owners to delete the data on their phones if stolen. However, this does not stand up to scrutiny. These services are effective only when an owner realizes their phone is missing and can take swift action on another computer or device. This delay ensures some period of vulnerability. Encryption, on the other hand, protects everyone immediately and always. Additionally, Vance argues that it is safer to build backdoors into encrypted devices than it is to do so for encrypted communications in transit. It is true that there is a difference in the threats posed by the two types of encryption backdoors that are being debated. However, some manner of widespread vulnerability will inevitably result from a backdoor to encrypted devices. Indeed, the NSA and GCHQ reportedly hacked into a database to obtain cell phone SIM card encryption keys in order to defeat the security protecting users’ communications and activities and to conduct surveillance. Clearly, the reality is that the threat of such a breach, whether from a hacker or a nation state actor, is very real. Even if companies go the extra mile and create a different means of access for every phone, such as a separate access key for each phone, significant vulnerabilities will be created. It would still be possible for a malicious actor to gain access to the database containing those keys, which would enable them to defeat the encryption on any smartphone they took possession of. Additionally, the cost of implementation and maintenance of such a complex system could be high.
  • Privacy is another concern that Vance dismisses too easily. Despite Vance’s arguments otherwise, building backdoors into device encryption undermines privacy. Our government does not impose a similar requirement in any other context. Police can enter homes with warrants, but there is no requirement that people record their conversations and interactions just in case they someday become useful in an investigation. The conversations that we once had through disposable letters and in-person conversations now happen over the Internet and on phones. Just because the medium has changed does not mean our right to privacy has.
  • In addition to his weak reasoning for why it would be feasible to create backdoors to encrypted devices without creating undue security risks or harming privacy, Vance makes several flawed policy-based arguments in favor of his proposal. He argues that criminals benefit from devices that are protected by strong encryption. That may be true, but strong encryption is also a critical tool used by billions of average people around the world every day to protect their transactions, communications, and private information. Lawyers, doctors, and journalists rely on encryption to protect their clients, patients, and sources. Government officials, from the President to the directors of the NSA and FBI, and members of Congress, depend on strong encryption for cybersecurity and data security. There are far more innocent Americans who benefit from strong encryption than there are criminals who exploit it. Encryption is also essential to our economy. Device manufacturers could suffer major economic losses if they are prohibited from competing with foreign manufacturers who offer more secure devices. Encryption also protects major companies from corporate and nation-state espionage. As more daily business activities are done on smartphones and other devices, they may now hold highly proprietary or sensitive information. Those devices could be targeted even more than they are now if all that has to be done to access that information is to steal an employee’s smartphone and exploit a vulnerability the manufacturer was required to create.
  • Vance also suggests that the US would be justified in creating such a requirement since other Western nations are contemplating requiring encryption backdoors as well. Regardless of whether other countries are debating similar proposals, we cannot afford a race to the bottom on cybersecurity. Heads of the intelligence community regularly warn that cybersecurity is the top threat to our national security. Strong encryption is our best defense against cyber threats, and following in the footsteps of other countries by weakening that critical tool would do incalculable harm. Furthermore, even if the US or other countries did implement such a proposal, criminals could gain access to devices with strong encryption through the black market. Thus, only innocent people would be negatively affected, and some of those innocent people might even become criminals simply by trying to protect their privacy by securing their data and devices. Finally, Vance argues that David Kaye, UN Special Rapporteur for Freedom of Expression and Opinion, supported the idea that court-ordered decryption doesn’t violate human rights, provided certain criteria are met, in his report on the topic. However, in the context of Vance’s proposal, this seems to conflate the concepts of court-ordered decryption and of government-mandated encryption backdoors. The Kaye report was unequivocal about the importance of encryption for free speech and human rights. The report concluded that:
  • States should promote strong encryption and anonymity. National laws should recognize that individuals are free to protect the privacy of their digital communications by using encryption technology and tools that allow anonymity online. … States should not restrict encryption and anonymity, which facilitate and often enable the rights to freedom of opinion and expression. Blanket prohibitions fail to be necessary and proportionate. States should avoid all measures that weaken the security that individuals may enjoy online, such as backdoors, weak encryption standards and key escrows. Additionally, the group of intelligence experts that was hand-picked by the President to issue a report and recommendations on surveillance and technology, concluded that: [R]egarding encryption, the U.S. Government should: (1) fully support and not undermine efforts to create encryption standards; (2) not in any way subvert, undermine, weaken, or make vulnerable generally available commercial software; and (3) increase the use of encryption and urge US companies to do so, in order to better protect data in transit, at rest, in the cloud, and in other storage.
  • The clear consensus among human rights experts and several high-ranking intelligence experts, including the former directors of the NSA, Office of the Director of National Intelligence, and DHS, is that mandating encryption backdoors is dangerous. Unaddressed Concerns: Preventing Encrypted Devices from Entering the US and the Slippery Slope. In addition to the significant faults in Vance’s arguments in favor of his proposal, he fails to address the question of how such a restriction would be effectively implemented. There is no effective mechanism for preventing code from becoming available for download online, even if it is illegal. One critical issue the Vance proposal fails to address is how the government would prevent, or even identify, encrypted smartphones when individuals bring them into the United States. DHS would have to train customs agents to search the contents of every person’s phone in order to identify whether it is encrypted, and then confiscate the phones that are. Legal and policy considerations aside, this kind of policy is, at the very least, impractical. Preventing strong encryption from entering the US is not like preventing guns or drugs from entering the country — encrypted phones aren’t immediately obvious the way contraband is. Millions of people use encrypted devices, and tens of millions more devices are shipped to and sold in the US each year.
  • Finally, there is a real concern that if Vance’s proposal were accepted, it would be the first step down a slippery slope. Right now, his proposal only calls for access to smartphones and devices running mobile operating systems. While this policy in and of itself would cover a number of commonplace devices, it may eventually be expanded to cover laptop and desktop computers, as well as communications in transit. The expansion of this kind of policy is even more worrisome when taking into account the speed at which technology evolves and becomes widely adopted. Ten years ago, the iPhone did not even exist. Who is to say what technology will be commonplace in 10 or 20 years that is not even around today? There is a very real question about how far law enforcement will go to gain access to information. Things that once seemed like mere science fiction, such as wearable technology and artificial intelligence that could be implanted in and work with the human nervous system, are now available. If and when there comes a time when our “smart phone” is not really a device at all, but is rather an implant, surely we would not grant law enforcement access to our minds.
  • Policymakers should dismiss Vance’s proposal to prohibit the use of strong encryption to protect our smartphones and devices in order to ensure law enforcement access. Undermining encryption, regardless of whether it is protecting data in transit or at rest, would take us down a dangerous and harmful path. Instead, law enforcement and the intelligence community should be working to alter their skills and tactics in a fast-evolving technological world so that they are not so dependent on information that will increasingly be protected by encryption.
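The excerpts above hinge on a simple technical point: data encrypted at rest under a key derived from the user's passcode is unreadable to whoever steals the hardware. The sketch below is a minimal illustration in Python using the third-party cryptography library, not Apple's or Google's actual design (real devices use hardware-backed keys and tuned key-derivation parameters); the passcode, salt handling, and sample data are hypothetical.

```python
# Minimal sketch of passcode-derived encryption at rest (illustrative only).
import os
import base64
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.fernet import Fernet

def derive_key(passcode: str, salt: bytes) -> bytes:
    """Stretch a short passcode into a 256-bit key; the high iteration
    count makes off-device brute force of the passcode expensive."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode.encode()))

salt = os.urandom(16)                        # stored on the device; need not be secret
vault = Fernet(derive_key("271828", salt))   # hypothetical user passcode

ciphertext = vault.encrypt(b"contacts, photos, messages ...")
# A thief who copies the flash storage gets only ciphertext; without the
# passcode (or a flaw in the scheme), the data stays unreadable.
print(vault.decrypt(ciphertext))
```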
Paul Merrell

Cameron Calls June 23 EU Referendum as Cabinet Fractures - Bloomberg Business - 0 views

  • In a secret meeting convened by the White House around Thanksgiving, senior national security officials ordered agencies across the U.S. government to find ways to counter encryption software and gain access to the most heavily protected user data on the most secure consumer devices, including Apple Inc.’s iPhone, the marquee product of one of America’s most valuable companies, according to two people familiar with the decision. The approach was formalized in a confidential National Security Council “decision memo,” tasking government agencies with developing encryption workarounds, estimating additional budgets and identifying laws that may need to be changed to counter what FBI Director James Comey calls the “going dark” problem: investigators being unable to access the contents of encrypted data stored on mobile devices or traveling across the Internet. Details of the memo reveal that, in private, the government was honing a sharper edge to its relationship with Silicon Valley alongside more public signs of rapprochement.
  • On Tuesday, the public got its first glimpse of what those efforts may look like when a federal judge ordered Apple to create a special tool for the FBI to bypass security protections on an iPhone 5c belonging to one of the shooters in the Dec. 2 terrorist attack in San Bernardino, California, that killed 14 people. Apple Chief Executive Officer Tim Cook has vowed to fight the order, calling it a “chilling” demand that Apple “hack our own users and undermine decades of security advancements that protect our customers.” The order was not a direct outcome of the memo but is in line with the broader government strategy. White House spokesman Josh Earnest said Wednesday that the Federal Bureau of Investigation and Department of Justice have the Obama administration’s “full” support in the matter. The government is “not asking Apple to redesign its product or to create a new backdoor to their products,” but rather is seeking entry “to this one device,” he said.
Paul Merrell

Hackers Prove Fingerprints Are Not Secure, Now What? | nsnbc international - 0 views

  • The Office of Personnel Management (OPM) recently revealed that an estimated 5.6 million government employees were affected by the hack; and not 1.1 million as previously assumed.
  • Samuel Schumach, spokesman for the OPM, said: “As part of the government’s ongoing work to notify individuals affected by the theft of background investigation records, the Office of Personnel Management and the Department of Defense have been analyzing impacted data to verify its quality and completeness. Of the 21.5 million individuals whose Social Security Numbers and other sensitive information were impacted by the breach, the subset of individuals whose fingerprints have been stolen has increased from a total of approximately 1.1 million to approximately 5.6 million.” This analysis involved the Department of Defense (DoD), the Department of Homeland Security (DHS), the National Security Agency (NSA), and the Pentagon. Schumach added that “if, in the future, new means are developed to misuse the fingerprint data, the government will provide additional information to individuals whose fingerprints may have been stolen in this breach.” However, we do not need to wait for the future for fingerprint data to be misused and coveted by hackers.
  • Look no further than the security flaws in Samsung’s new Galaxy S5 smartphone, as demonstrated by researchers at Security Research Labs (SRL), showing how fingerprints, iris scans and other biometric identifiers could be fabricated and yet authenticated by the Apple Touch ID fingerprint scanner. The shocking part of this demonstration is that this hack was achieved less than 2 days after the technology was released to the public by Apple. Ben Schlabs, a researcher for SRL, explained: “We expected we’d be able to spoof the S5’s Finger Scanner, but I hoped it would at least be a challenge. The S5 Finger Scanner feature offers nothing new except—because of the way it is implemented in this Android device—slightly higher risk than that already posed by previous devices.” Schlabs and other researchers discovered that “the S5 has no mechanism requiring a password when encountering a large number of incorrect finger swipes.” By rebooting the smartphone, Schlabs could force “the handset to accept an unlimited number of incorrect swipes without requiring users to enter a password [and] the S5 fingerprint authenticator [could] be associated with sensitive banking or payment apps such as PayPal.” (A toy sketch of the missing lockout logic appears after these notes.)
  • Schlabs said: “Perhaps most concerning is that Samsung does not seem to have learned from what others have done less poorly. Not only is it possible to spoof the fingerprint authentication even after the device has been turned off, but the implementation also allows for seemingly unlimited authentication attempts without ever requiring a password. Incorporation of fingerprint authentication into highly sensitive apps such as PayPal gives a would-be attacker an even greater incentive to learn the simple skill of fingerprint spoofing.” Last year, hackers from the Chaos Computer Club (CCC) proved Apple wrong when the corporation insisted that their new iPhone 5S fingerprint sensor is “a convenient and highly secure way to access your phone.” CCC stated that it is as easy as stealing a fingerprint from a drinking glass – and anyone can do it.
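The flaw SRL describes above is the absence of any limit on failed fingerprint swipes before the phone falls back to a passcode. The Python toy below sketches the general shape of such a lockout counter; the attempt limit and state names are invented for illustration and are not how Samsung, Apple, or any real handset implements it.

```python
# Toy sketch of the safeguard SRL found missing: after a few failed
# biometric attempts, stop accepting swipes and demand the passcode.
MAX_BIOMETRIC_ATTEMPTS = 5

class LockScreen:
    def __init__(self) -> None:
        self.failed_attempts = 0

    def try_fingerprint(self, scan_matches: bool) -> str:
        if scan_matches:
            self.failed_attempts = 0
            return "unlocked"
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_BIOMETRIC_ATTEMPTS:
            return "passcode_required"   # spoofed swipes stop paying off here
        return "try_again"

screen = LockScreen()
for _ in range(6):                       # six bad swipes in a row
    print(screen.try_fingerprint(scan_matches=False))
```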
Gonzalo San Gil, PhD.

Mobile security: iOS 8 vs. Android 5 vs. BlackBerry vs. Windows Phone [# ! x BoS...]] - 0 views

  •  
    "Apple's iPhone and iPad long ago pushed out the BlackBerry as the corporate standard for mobile devices, in all but the highest-security environments. Google -- whose Android platform reigns outside the corporate world -- is now trying to push out Apple, with a new effort called Android for Work."
Paul Merrell

This Is the Real Reason Apple Is Fighting the FBI | TIME - 0 views

  • The first thing to understand about Apple’s latest fight with the FBI—over a court order to help unlock the deceased San Bernardino shooter’s phone—is that it has very little to do with the San Bernardino shooter’s phone. It’s not even, really, the latest round of the Crypto Wars—the long running debate about how law enforcement and intelligence agencies can adapt to the growing ubiquity of uncrackable encryption tools. Rather, it’s a fight over the future of high-tech surveillance, the trust infrastructure undergirding the global software ecosystem, and how far technology companies and software developers can be conscripted as unwilling suppliers of hacking tools for governments. It’s also the public face of a conflict that will undoubtedly be continued in secret—and is likely already well underway.
  • Considered in isolation, the request seems fairly benign: If it were merely a question of whether to unlock a single device—even one unlikely to contain much essential evidence—there would probably be little enough harm in complying. The reason Apple CEO Tim Cook has pledged to fight a court’s order to assist the bureau is that he understands the danger of the underlying legal precedent the FBI is seeking to establish. Four important pieces of context are necessary to see the trouble with the Apple order.
Paul Merrell

NSA Director Finally Admits Encryption Is Needed to Protect Public's Privacy - 0 views

  • NSA Director Finally Admits Encryption Is Needed to Protect Public’s Privacy. The new stance denotes a growing awareness within the government that Americans are not comfortable with the State’s grip on their data. By Carey Wedler | AntiMedia | January 22, 2016.
  • Rogers cited the recent Office of Personnel Management hack of over 20 million users as a reason to increase encryption rather than scale it back. “What you saw at OPM, you’re going to see a whole lot more of,” he said, referring to the massive hack that compromised the personal data of about 20 million people who obtained background checks. Rogers’ comments, while forward-thinking, signify an about-face in his stance on encryption. In February 2015, he said he “shares [FBI] Director [James] Comey’s concern” about cell phone companies’ decision to add encryption features to their products. Comey has been one of the loudest critics of encryption. However, Rogers’ comments on Thursday now directly conflict with Comey’s stated position. The FBI director has publicly chastised encryption, as well as the companies that provide it. In 2014, he claimed Apple’s then-new encryption feature could lead the world to “a very dark place.” At a Department of Justice hearing in November, Comey testified that “Increasingly, the shadow that is ‘going dark’ is falling across more and more of our work.” Though he claimed, “We support encryption,” he insisted “we have a problem that encryption is crashing into public safety and we have to figure out, as people who care about both, to resolve it. So, I think the conversation’s in a healthier place.”
  • At the same hearing, Comey and Attorney General Loretta Lynch declined to comment on whether they had proof the Paris attackers used encryption. Even so, Comey recently lobbied for tech companies to do away with end-to-end encryption. However, his crusade has fallen on unsympathetic ears, both from the private companies he seeks to control — and from the NSA. Prior to Rogers’ statements in support of encryption Thursday, former NSA chief Michael Hayden said, “I disagree with Jim Comey. I actually think end-to-end encryption is good for America.” Still another former NSA chief has criticized calls for backdoor access to information. In October, Mike McConnell told a panel at an encryption summit that the United States is “better served by stronger encryption, rather than baking in weaker encryption.” Former Department of Homeland Security chief Michael Chertoff has also spoken out against government being able to bypass encryption.
  • Regardless of these individual defenses of encryption, the Intercept explained why these statements may be irrelevant: “Left unsaid is the fact that the FBI and NSA have the ability to circumvent encryption and get to the content too — by hacking. Hacking allows law enforcement to plant malicious code on someone’s computer in order to gain access to the photos, messages, and text before they were ever encrypted in the first place, and after they’ve been decrypted. The NSA has an entire team of advanced hackers, possibly as many as 600, camped out at Fort Meade.”
  • Rogers’ statements, of course, are not a full-fledged endorsement of privacy, nor can the NSA be expected to make it a priority. Even so, his new stance denotes a growing awareness within the government that Americans are not comfortable with the State’s grip on their data. “So spending time arguing about ‘hey, encryption is bad and we ought to do away with it’ … that’s a waste of time to me,” Rogers said Thursday. “So what we’ve got to ask ourselves is, with that foundation, what’s the best way for us to deal with it? And how do we meet those very legitimate concerns from multiple perspectives?”
Paul Merrell

Security Experts Oppose Government Access to Encrypted Communication - The New York Times - 0 views

  • An elite group of security technologists has concluded that the American and British governments cannot demand special access to encrypted communications without putting the world’s most confidential data and critical infrastructure in danger. A new paper from the group, made up of 14 of the world’s pre-eminent cryptographers and computer scientists, is a formidable salvo in a skirmish between intelligence and law enforcement leaders, and technologists and privacy advocates. After Edward J. Snowden’s revelations — with security breaches and awareness of nation-state surveillance at a record high and data moving online at breakneck speeds — encryption has emerged as a major issue in the debate over privacy rights.
  • That has put Silicon Valley at the center of a tug of war. Technology companies including Apple, Microsoft and Google have been moving to encrypt more of their corporate and customer data after learning that the National Security Agency and its counterparts were siphoning off digital communications and hacking into corporate data centers.
  • Yet law enforcement and intelligence agency leaders argue that such efforts thwart their ability to monitor kidnappers, terrorists and other adversaries. In Britain, Prime Minister David Cameron threatened to ban encrypted messages altogether. In the United States, Michael S. Rogers, the director of the N.S.A., proposed that technology companies be required to create a digital key to unlock encrypted data, but to divide the key into pieces and secure it so that no one person or government agency could use it alone. (A toy sketch of that kind of key splitting appears after these notes.) The encryption debate has left both sides bitterly divided and in fighting mode. The group of cryptographers deliberately issued its report a day before James B. Comey Jr., the director of the Federal Bureau of Investigation, and Sally Quillian Yates, the deputy attorney general at the Justice Department, are scheduled to testify before the Senate Judiciary Committee on the concerns that they and other government agencies have that encryption technologies will prevent them from effectively doing their jobs.
  • The new paper is the first in-depth technical analysis of government proposals by leading cryptographers and security thinkers, including Whitfield Diffie, a pioneer of public key cryptography, and Ronald L. Rivest, the “R” in the widely used RSA public cryptography algorithm. In the report, the group said any effort to give the government “exceptional access” to encrypted communications was technically unfeasible and would leave confidential data and critical infrastructure like banks and the power grid at risk. Handing governments a key to encrypted communications would also require an extraordinary degree of trust. With government agency breaches now the norm — most recently at the United States Office of Personnel Management, the State Department and the White House — the security specialists said authorities could not be trusted to keep such keys safe from hackers and criminals. They added that if the United States and Britain mandated backdoor keys to communications, China and other governments in foreign markets would be spurred to do the same.
  • “Such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend,” the report said. “The costs would be substantial, the damage to innovation severe and the consequences to economic growth hard to predict. The costs to the developed countries’ soft power and to our moral authority would also be considerable.”
  •  
    Our system of government does not expect that every criminal will be apprehended and convicted. There are numerous values our society believes are more important. Some examples: [i] a presumption of innocence unless guilt is established beyond any reasonable doubt; [ii] the requirement that government officials convince a neutral magistrate that they have probable cause to believe that a search or seizure will produce evidence of a crime; [iii] many communications cannot be compelled to be disclosed and used in evidence, such as attorney-client communications, spousal communications, and priest-penitent communications; and [iv] etc. Moral of my story: the government needs a much stronger reason to justify interception of communications than saying, "some crooks will escape prosecution if we can't do that." We have a right to whisper to each other, concealing our communications from all others. Why does the right to whisper privately disappear if our whisperings are done electronically? The Supreme Court took its first step on a very slippery slope when it permitted wiretapping in Olmstead v. United States, 277 U.S. 438, 48 S. Ct. 564, 72 L. Ed. 944 (1928). https://goo.gl/LaZGHt It's been a long slide ever since. It's past time to revisit Olmstead and recognize that American citizens have the absolute right to communicate privately. "The President … recognizes that U.S. citizens and institutions should have a reasonable expectation of privacy from foreign or domestic intercept when using the public telephone system." - Brent Scowcroft, U.S. National Security Advisor, National Security Decision Memorandum 338 (1 September 1976) (Ford administration), http://www.fas.org/irp/offdocs/nsdm-ford/nsdm-338.pdf
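Rogers' split-key idea quoted above (divide the key into pieces so that no one person or agency can use it alone) is easiest to see in its simplest n-of-n form, XOR secret splitting, where every share is required to reconstruct the key. The Python sketch below is a hypothetical illustration, not any agency's actual proposal; a real escrow design would more plausibly use a threshold scheme such as Shamir's secret sharing, and the cryptographers' objection stands either way: the shares still have to be stored and combined somewhere that can be breached.

```python
# n-of-n XOR secret splitting: the key is recoverable only when every
# share is combined; any proper subset reveals nothing about the key.
import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list[bytes]:
    """Return n shares whose XOR equals `key`."""
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, key))
    return shares

def combine(shares: list[bytes]) -> bytes:
    return reduce(xor_bytes, shares)

master_key = os.urandom(32)              # e.g. a device or message key
shares = split_key(master_key, 3)        # held by three separate parties
assert combine(shares) == master_key     # all three must cooperate
assert combine(shares[:2]) != master_key # any two alone learn nothing useful
```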
Paul Merrell

FBI's secret method of unlocking iPhone may never reach Apple | Reuters - 0 views

  • The FBI may be allowed to withhold information about how it broke into an iPhone belonging to a gunman in the December San Bernardino shootings, despite a U.S. government policy of disclosing technology security flaws discovered by federal agencies. Under the U.S. vulnerabilities equities process, the government is supposed to err in favor of disclosing security issues so companies can devise fixes to protect data. The policy has exceptions for law enforcement, and there are no hard rules about when and how it must be applied. Apple Inc has said it would like the government to share how it cracked the iPhone security protections. But the Federal Bureau of Investigation, which has been frustrated by its inability to access data on encrypted phones belonging to criminal suspects, might prefer to keep secret the technique it used to gain access to gunman Syed Farook's phone. The referee is likely to be a White House group formed during the Obama administration to review computer security flaws discovered by federal agencies and decide whether they should be disclosed.
  • Stewart Baker, former general counsel of the NSA and now a lawyer with Steptoe & Johnson, said the review process could be complicated if the cracking method is considered proprietary by the third party that assisted the FBI. Several security researchers have pointed to the Israel-based mobile forensics firm Cellebrite as the likely third party that helped the FBI. That company has repeatedly declined comment.
  •  
    The article is wide of the mark, based on analysis of Executive Branch policy rather than the governing law such as the Freedom of Information Act. And I still find it somewhat ludicrous that a third party with knowledge of the defect could succeed in convincing a court that knowledge of a defect in a company's product is trade-secret proprietary information. "Your honor, my client has discovered a way to break into Mr. Tim Cook's house without a key to his house. That is a valuable trade secret that this Court must keep Mr. Cook from learning." Pow! The Computer Fraud and Abuse Act makes it a crime to access a computer that can connect to the Internet by exploiting a software bug. 
Paul Merrell

Forget Apple vs. the FBI: WhatsApp Just Switched on Encryption for a Billion People | W... - 0 views

  • For most of the past six weeks, the biggest story out of Silicon Valley was Apple’s battle with the FBI over a federal order to unlock the iPhone of a mass shooter. The company’s refusal touched off a searing debate over privacy and security in the digital age. But this morning, at a small office in Mountain View, California, three guys made the scope of that enormous debate look kinda small. Mountain View is home to WhatsApp, an online messaging service now owned by tech giant Facebook, that has grown into one of the world’s most important applications. More than a billion people trade messages, make phone calls, send photos, and swap videos using the service. This means that only Facebook itself runs a larger self-contained communications network. And today, the enigmatic founders of WhatsApp, Brian Acton and Jan Koum, together with a high-minded coder and cryptographer who goes by the pseudonym Moxie Marlinspike, revealed that the company has added end-to-end encryption to every form of communication on its service.
  • This means that if any group of people uses the latest version of WhatsApp—whether that group spans two people or ten—the service will encrypt all messages, phone calls, photos, and videos moving among them. And that’s true on any phone that runs the app, from iPhones to Android phones to Windows phones to old school Nokia flip phones. With end-to-end encryption in place, not even WhatsApp’s employees can read the data that’s sent across its network. In other words, WhatsApp has no way of complying with a court order demanding access to the content of any message, phone call, photo, or video traveling through its service. Like Apple, WhatsApp is, in practice, stonewalling the federal government, but it’s doing so on a larger front—one that spans roughly a billion devices. (A minimal sketch of the end-to-end property follows these excerpts.)
  • The FBI and the Justice Department declined to comment for this story. But many inside the government and out are sure to take issue with the company’s move. In late 2014, WhatsApp encrypted a portion of its network. In the months since, its service has apparently been used to facilitate criminal acts, including the terrorist attacks on Paris last year. According to The New York Times, as recently as this month, the Justice Department was considering a court case against the company after a wiretap order (still under seal) ran into WhatsApp’s end-to-end encryption. “The government doesn’t want to stop encryption,” says Joseph DeMarco, a former federal prosecutor who specializes in cybercrime and has represented various law enforcement agencies backing the Justice Department and the FBI in their battle with Apple. “But the question is: what do you do when a company creates an encryption system that makes it impossible for court-authorized search warrants to be executed? What is the reasonable level of assistance you should ask from that company?”
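For readers who want to see why "not even WhatsApp's employees can read the data," here is a minimal sketch of the end-to-end property using the PyNaCl library. WhatsApp's actual deployment is the far more elaborate Signal protocol (key ratcheting, group messaging, verification); the keys and message below are purely illustrative.

```python
# Minimal end-to-end encryption sketch with PyNaCl (libsodium bindings).
# Private keys never leave the two endpoints, so the relaying service
# carries only ciphertext it cannot open.
from nacl.public import PrivateKey, Box

alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Each side needs only the other's *public* key.
alice_box = Box(alice_sk, bob_sk.public_key)
bob_box = Box(bob_sk, alice_sk.public_key)

ciphertext = alice_box.encrypt(b"meet at noon")
# The server stores and forwards `ciphertext`, but a wiretap order served
# on it yields only this opaque blob.
print(bob_box.decrypt(ciphertext))       # b'meet at noon'
```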
Paul Merrell

Chinese State Media Declares iPhone a Threat To National Security - Slashdot - 0 views

  • "When NSA whistleblower Edward Snowden came forth last year with U.S. government spying secrets, it didn't take long to realize that some of the information revealed could bring on serious repercussions — not just for the U.S. government, but also for U.S.-based companies. The latest to feel the hit? None other than Apple, and in a region the company has been working hard to increase market share: China. China, via state media, has today declared that Apple's iPhone is a threat to national security — all because of its thorough tracking capabilities. It has the ability to keep track of user locations, and to the country, this could potentially reveal "state secrets" somehow. It's being noted that the iPhone will continue to track the user to some extent even if the overall feature is disabled. China's iPhone ousting comes hot on the heels of Russia's industry and trade deeming AMD and Intel processors to be untrustworthy. The nation will instead be building its own ARM-based "Baikal" processor.
Gonzalo San Gil, PhD.

Canonical, Microsoft, and Apple Want OS Convergence - Who Will Get There First? - 0 views

  •  
    "- The Ubuntu Family The idea of OS convergence is starting to take a hold in the world and major companies like Microsoft, Apple, and Canonical are working hard to achieve it. There seems to be a race going on and all the players want to reach the finish as soon as possible." [# ! It's #not a #Race... # ! It should be seen -and promoted- more than #mountaineering: # ! T@gether -as a '(Dream) Team'- to Reach The Same Top: # ! An #Open, #Free, #Fast, #Secure & #Reliable #OperatingSystem for #Everyone # ! Leave the #business for #differentiated #hardware & #specialized #services...]
Paul Merrell

Surveillance scandal rips through hacker community | Security & Privacy - CNET News - 0 views

  • One security start-up that had an encounter with the FBI was Wickr, a privacy-forward text messaging app for the iPhone with an Android version in private beta. Wickr's co-founder Nico Sell told CNET at Defcon, "Wickr has been approached by the FBI and asked for a backdoor. We said, 'No.'" The mistrust runs deep. "Even if [the NSA] stood up tomorrow and said that [they] have eliminated these programs," said Marlinspike, "How could we believe them? How can we believe that anything they say is true?" Where does security innovation go next? The immediate future of information security innovation most likely lies in software that provides an existing service but with heightened privacy protections, such as webmail that doesn't mine you for personal data.
  • Wickr's Sell thinks that her company has hit upon a privacy innovation that a few others are also doing, but many will soon follow: the company itself doesn't store user data. "[The FBI] would have to force us to build a new app. With the current app there's no way," she said, that they could incorporate backdoor access to Wickr users' texts or metadata. "Even if you trust the NSA 100 percent that they're going to use [your data] correctly," Sell said, "Do you trust that they're going to be able to keep it safe from hackers? What if somebody gets that database and posts it online?" To that end, she said, people will start seeing privacy innovation for services that don't currently provide it. Calling it "social networks 2.0," she said that social network competitors will arise that do a better job of protecting their customer's privacy and predicted that some that succeed will do so because of their emphasis on privacy. Abine's recent MaskMe browser add-on and mobile app for creating disposable e-mail addresses, phone numbers, and credit cards is another example of a service that doesn't have access to its own users' data.
  • Stamos predicted changes in services that companies with cloud storage offer, including offering customers the ability to store their data outside of the U.S. "If they want to stay competitive, they're going to have to," he said. But, he cautioned, "It's impossible to do a cloud-based ad supported service." Soghoian added, "The only way to keep a service running is to pay them money." This, he said, is going to give rise to a new wave of ad-free, privacy protective subscription services.
  • The issue with balancing privacy and surveillance is that the wireless carriers are not interested in privacy, he said. "They've been providing wiretapping for 100 years. Apple may in the next year protect voice calls," he said, adding that the best hope for ending widespread government surveillance will be the makers of mobile operating systems like Apple and Google. Not all upcoming security innovation will be focused on that kind of privacy protection. Security researcher Brandon Wiley showed off at Defcon a protocol he calls Dust that can obfuscate different kinds of network traffic, with the end goal of preventing censorship. "I only make products about letting you say what you want to say anywhere in the world," such as content critical of governments, he said. Encryption can hide the specifics of the traffic, but some governments have figured out that they can simply block all encrypted traffic, he said. The Dust protocol would change that, he said, making it hard to tell the difference between encrypted and unencrypted traffic. (A toy illustration of that idea appears after these notes.) It's hard to build encryption into pre-existing products, Wiley said. "I think people are going to make easy-to-use, encrypted apps, and that's going to be the future."
  • Companies could face severe consequences from their security experts, said Stamos, if the in-house experts find out that they've been lied to about providing government access to customer data. You could see "lots of resignations and maybe publicly," he said. "It wouldn't hurt their reputations to go out in a blaze of glory." Perhaps not surprisingly, Marlinspike sounded a hopeful call for non-destructive activism on Defcon's 21st anniversary. "As hackers, we don't have a lot of influence on policy. I hope that's something that we can focus our energy on," he said.
  •  
    NSA as the cause of the next major disruption in the social networking service industry?  Grief ahead for Google? Note the point made that: "It's impossible to do a cloud-based ad supported service" where the encryption/decryption takes place on the client side. 
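Wiley's observation above is that a censor can simply block anything that looks encrypted, so obfuscation tools try to make ciphertext resemble innocuous traffic. The Python toy below is not the Dust protocol, only a loose illustration of the general idea: it re-encodes already-encrypted bytes as ordinary-looking words and back.

```python
# Toy traffic disguise: map each 4-bit nibble of ciphertext to a word,
# so the bytes on the wire look like text rather than random noise.
WORDS = ["the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog",
         "and", "runs", "far", "away", "then", "sleeps", "near", "home"]

def disguise(ciphertext: bytes) -> str:
    nibbles = [n for byte in ciphertext for n in (byte >> 4, byte & 0x0F)]
    return " ".join(WORDS[n] for n in nibbles)

def recover(text: str) -> bytes:
    idx = [WORDS.index(w) for w in text.split()]
    return bytes((idx[i] << 4) | idx[i + 1] for i in range(0, len(idx), 2))

blob = b"\x9f\x03\xa7"            # stand-in for already-encrypted bytes
cover = disguise(blob)            # reads like odd but harmless prose
assert recover(cover) == blob
print(cover)
```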
Paul Merrell

WhatsApp Encryption Said to Stymie Wiretap Order - The New York Times - 0 views

  • While the Justice Department wages a public fight with Apple over access to a locked iPhone, government officials are privately debating how to resolve a prolonged standoff with another technology company, WhatsApp, over access to its popular instant messaging application, officials and others involved in the case said. No decision has been made, but a court fight with WhatsApp, the world’s largest mobile messaging service, would open a new front in the Obama administration’s dispute with Silicon Valley over encryption, security and privacy. WhatsApp, which is owned by Facebook, allows customers to send messages and make phone calls over the Internet. In the last year, the company has been adding encryption to those conversations, making it impossible for the Justice Department to read or eavesdrop, even with a judge’s wiretap order.
  • As recently as this past week, officials said, the Justice Department was discussing how to proceed in a continuing criminal investigation in which a federal judge had approved a wiretap, but investigators were stymied by WhatsApp’s encryption. The Justice Department and WhatsApp declined to comment. The government officials and others who discussed the dispute did so on condition of anonymity because the wiretap order and all the information associated with it were under seal. The nature of the case was not clear, except that officials said it was not a terrorism investigation. The location of the investigation was also unclear.
  • To understand the battle lines, consider this imperfect analogy from the predigital world: If the Apple dispute is akin to whether the F.B.I. can unlock your front door and search your house, the issue with WhatsApp is whether it can listen to your phone calls. In the era of encryption, neither question has a clear answer. Some investigators view the WhatsApp issue as even more significant than the one over locked phones because it goes to the heart of the future of wiretapping. They say the Justice Department should ask a judge to force WhatsApp to help the government get information that has been encrypted. Others are reluctant to escalate the dispute, particularly with senators saying they will soon introduce legislation to help the government get data in a format it can read.