
Group items tagged: iPhone, FBI

Paul Merrell

Apple's New Challenge: Learning How the U.S. Cracked Its iPhone - The New York Times

  • Now that the United States government has cracked open an iPhone that belonged to a gunman in the San Bernardino, Calif., mass shooting without Apple’s help, the tech company is under pressure to find and fix the flaw. But unlike other cases where security vulnerabilities have cropped up, Apple may face a higher set of hurdles in ferreting out and repairing the particular iPhone hole that the government hacked. The challenges start with the lack of information about the method that the law enforcement authorities, with the aid of a third party, used to break into the iPhone of Syed Rizwan Farook, an attacker in the San Bernardino rampage last year. Federal officials have refused to identify the person, or organization, who helped crack the device, and have declined to specify the procedure used to open the iPhone. Apple also cannot obtain the device to reverse-engineer the problem, the way it would in other hacking situations.
  •  
    It would make a very interesting Freedom of Information Act case if Apple sued under that Act to force disclosure of the iPhone security hole, the product defect that the FBI exploited. I know of no interpretation of the law enforcement FOIA exemption that would justify FBI withholding of the information. It might be alleged that the information is the trade secret of the company that disclosed the defect and exploit to the FBI, but there's a very strong argument that sharing the information with the FBI waived the trade secrecy claim. And the notion that government is entitled to collect product security defects and exploit them without informing the exploited product's company of the specific defect is extremely weak. Were I Tim Cook, I would have already told my lawyers to get cracking on filing the FOIA request with the FBI to get the legal ball rolling.
Paul Merrell

Forget Apple vs. the FBI: WhatsApp Just Switched on Encryption for a Billion People | W...

  • For most of the past six weeks, the biggest story out of Silicon Valley was Apple’s battle with the FBI over a federal order to unlock the iPhone of a mass shooter. The company’s refusal touched off a searing debate over privacy and security in the digital age. But this morning, at a small office in Mountain View, California, three guys made the scope of that enormous debate look kinda small. Mountain View is home to WhatsApp, an online messaging service now owned by tech giant Facebook, that has grown into one of the world’s most important applications. More than a billion people trade messages, make phone calls, send photos, and swap videos using the service. This means that only Facebook itself runs a larger self-contained communications network. And today, the enigmatic founders of WhatsApp, Brian Acton and Jan Koum, together with a high-minded coder and cryptographer who goes by the pseudonym Moxie Marlinspike, revealed that the company has added end-to-end encryption to every form of communication on its service.
  • This means that if any group of people uses the latest version of WhatsApp—whether that group spans two people or ten—the service will encrypt all messages, phone calls, photos, and videos moving among them. And that’s true on any phone that runs the app, from iPhones to Android phones to Windows phones to old school Nokia flip phones. With end-to-end encryption in place, not even WhatsApp’s employees can read the data that’s sent across its network. In other words, WhatsApp has no way of complying with a court order demanding access to the content of any message, phone call, photo, or video traveling through its service. Like Apple, WhatsApp is, in practice, stonewalling the federal government, but it’s doing so on a larger front—one that spans roughly a billion devices.
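  •  
    To make the end-to-end property described above concrete, here is a minimal sketch in Python using the cryptography library: the two endpoints agree on a session key with an X25519 exchange, and the relaying service only ever handles ciphertext it cannot read. This is illustrative only, not WhatsApp's implementation; WhatsApp uses the far more elaborate Signal protocol (ratcheting, prekeys, group sessions), and every name below is hypothetical.

        # Minimal end-to-end encryption sketch (illustrative; not the Signal protocol).
        import os
        from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
        from cryptography.hazmat.primitives.kdf.hkdf import HKDF
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        # Each endpoint generates its own key pair; only public keys ever travel.
        alice_priv, bob_priv = X25519PrivateKey.generate(), X25519PrivateKey.generate()

        def derive_key(my_priv, peer_pub):
            shared = my_priv.exchange(peer_pub)
            return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                        info=b"demo e2e session").derive(shared)

        alice_key = derive_key(alice_priv, bob_priv.public_key())
        bob_key = derive_key(bob_priv, alice_priv.public_key())
        assert alice_key == bob_key  # both ends now hold the same session key

        # The service in the middle only relays ciphertext; without the key it learns nothing.
        nonce = os.urandom(12)
        ciphertext = AESGCM(alice_key).encrypt(nonce, b"meet at noon", None)
        print(AESGCM(bob_key).decrypt(nonce, ciphertext, None))  # b'meet at noon'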
  • The FBI and the Justice Department declined to comment for this story. But many inside the government and out are sure to take issue with the company’s move. In late 2014, WhatsApp encrypted a portion of its network. In the months since, its service has apparently been used to facilitate criminal acts, including the terrorist attacks on Paris last year. According to The New York Times, as recently as this month, the Justice Department was considering a court case against the company after a wiretap order (still under seal) ran into WhatsApp’s end-to-end encryption. “The government doesn’t want to stop encryption,” says Joseph DeMarco, a former federal prosecutor who specializes in cybercrime and has represented various law enforcement agencies backing the Justice Department and the FBI in their battle with Apple. “But the question is: what do you do when a company creates an encryption system that makes it impossible for court-authorized search warrants to be executed? What is the reasonable level of assistance you should ask from that company?”
Paul Merrell

Upgrade Your iPhone Passcode to Defeat the FBI's Backdoor Strategy

  • It’s true that ordering Apple to develop the backdoor will fundamentally undermine iPhone security, as Cook and other digital security advocates have argued. But it’s possible for individual iPhone users to protect themselves from government snooping by setting strong passcodes on their phones — passcodes the FBI would not be able to unlock even if it gets its iPhone backdoor. The technical details of how the iPhone encrypts data, and how the FBI might circumvent this protection, are complex and convoluted, and are being thoroughly explored elsewhere on the internet. What I’m going to focus on here is how ordinary iPhone users can protect themselves. The short version: If you’re worried about governments trying to access your phone, set your iPhone up with a random, 11-digit numeric passcode. What follows is an explanation of why that will protect you and how to actually do it.
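  •  
    For a rough sense of why the random 11-digit recommendation matters, here is a back-of-the-envelope estimate in Python. The 80 milliseconds per guess is an assumption for illustration (a commonly cited figure for the iPhone's hardware key-derivation delay), not a number taken from this article, and it presumes each guess must run on the device itself.

        # Rough brute-force time estimates for random numeric passcodes.
        # Assumption: each guess must be tried on the phone's own hardware at
        # roughly 80 ms per attempt; treat these figures as illustrative only.
        SECONDS_PER_GUESS = 0.08

        def average_crack_time(digits: int) -> float:
            """Expected seconds to brute-force a random numeric passcode
            of the given length (on average, half the keyspace is searched)."""
            keyspace = 10 ** digits
            return (keyspace / 2) * SECONDS_PER_GUESS

        for digits in (4, 6, 11):
            seconds = average_crack_time(digits)
            years = seconds / (60 * 60 * 24 * 365)
            print(f"{digits:2d} digits: about {seconds:,.0f} s ({years:,.1f} years)")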
Paul Merrell

FBI Got Into San Bernardino Killer's iPhone Without Apple's Help

  • AFTER MORE THAN a month of insisting that Apple weaken its security to help the FBI break into San Bernardino killer Syed Rizwan Farook’s iPhone, the government has dropped its legal fight. “The government has now successfully accessed the data stored on Farook’s iPhone and therefore no longer requires the assistance from Apple,” wrote attorneys for the Department of Justice on Monday evening. It’s not yet known if anything valuable was stored on the phone, however. “The FBI is currently reviewing the information on the phone, consistent with standard investigatory procedures,” said Department of Justice spokesperson Melanie Newman in a statement.
Paul Merrell

Feds Force Suspect To Unlock An Apple iPhone X With Their Face

  • It finally happened. The feds forced an Apple iPhone X owner to unlock their device with their face. A child abuse investigation unearthed by Forbes includes the first known case in which law enforcement used Apple Face ID facial recognition technology to open a suspect's iPhone. That's by any police agency anywhere in the world, not just in America. It happened on August 10, when the FBI searched the house of 28-year-old Grant Michalski, a Columbus, Ohio, resident who would later that month be charged with receiving and possessing child pornography. With a search warrant in hand, a federal investigator told Michalski to put his face in front of the phone, which he duly did. That allowed the agent to pick through the suspect's online chats, photos and whatever else he deemed worthy of investigation. The case marks another significant moment in the ongoing battle between law enforcement and tech providers, with the former trying to break the myriad security protections put in place by the latter. Since the fight between the world's most valuable company and the FBI in San Bernardino over access to an iPhone in 2016, Forbes has been tracking the various ways cops have been trying to break Apple's protections.
Paul Merrell

FBI director: Cover up your webcam | TheHill

  • The head of the FBI on Wednesday defended putting a piece of tape over his personal laptop's webcam, claiming the security step was a common sense one that most should take. “There’s some sensible things you should be doing, and that’s one of them,” Director James Comey said during a conference at the Center for Strategic and International Studies. “You go into any government office and we all have the little camera things that sit on top of the screen,” he added. “They all have a little lid that closes down on them. You do that so that people who don’t have authority don’t look at you. I think that’s a good thing.” Comey was pilloried online earlier this year, after he revealed that he puts a piece of tape over his laptop camera to keep away prying eyes. The precaution is a common one among security advocates, given the relative ease of hacking laptop cameras.
  • But many found it ironic for Comey, who this year launched a high profile battle against Apple to gain access to data locked inside of the iPhone used by one of the San Bernardino, Calif., terrorists. Many viewed that fight as a referendum on digital privacy. Comey was “much mocked for that,” he acknowledged on Wednesday. But he still uses the tape on his laptop. “I hope people lock their cars,” he said. “Lock your doors at night… if you have an alarm system, you should use it.” “It’s not crazy that the FBI director cares about personal security as well,” the FBI director added. “So I think people ought to take responsibility for their own safety and security.”
Paul Merrell

This Is the Real Reason Apple Is Fighting the FBI | TIME

  • The first thing to understand about Apple’s latest fight with the FBI—over a court order to help unlock the deceased San Bernardino shooter’s phone—is that it has very little to do with the San Bernardino shooter’s phone. It’s not even, really, the latest round of the Crypto Wars—the long running debate about how law enforcement and intelligence agencies can adapt to the growing ubiquity of uncrackable encryption tools. Rather, it’s a fight over the future of high-tech surveillance, the trust infrastructure undergirding the global software ecosystem, and how far technology companies and software developers can be conscripted as unwilling suppliers of hacking tools for governments. It’s also the public face of a conflict that will undoubtedly be continued in secret—and is likely already well underway.
  • Considered in isolation, the request seems fairly benign: If it were merely a question of whether to unlock a single device—even one unlikely to contain much essential evidence—there would probably be little enough harm in complying. The reason Apple CEO Tim Cook has pledged to fight a court’s order to assist the bureau is that he understands the danger of the underlying legal precedent the FBI is seeking to establish. Four important pieces of context are necessary to see the trouble with the Apple order.
Paul Merrell

What to Do About Lawless Government Hacking and the Weakening of Digital Security | Ele...

  • In our society, the rule of law sets limits on what government can and cannot do, no matter how important its goals. To give a simple example, even when chasing a fleeing murder suspect, the police have a duty not to endanger bystanders. The government should pay the same care to our safety in pursuing threats online, but right now we don’t have clear, enforceable rules for government activities like hacking and "digital sabotage." And this is no abstract question—these actions increasingly endanger everyone’s security.
  • The problem became especially clear this year during the San Bernardino case, involving the FBI’s demand that Apple rewrite its iOS operating system to defeat security features on a locked iPhone. Ultimately the FBI exploited an existing vulnerability in iOS and accessed the contents of the phone with the help of an "outside party." Then, with no public process or discussion of the tradeoffs involved, the government refused to tell Apple about the flaw. Despite the obvious fact that the security of the computers and networks we all use is both collective and interwoven—other iPhones used by millions of innocent people presumably have the same vulnerability—the government chose to withhold information Apple could have used to improve the security of its phones. Other examples include intelligence activities like Stuxnet and Bullrun, and law enforcement investigations like the FBI’s mass use of malware against Tor users engaged in criminal behavior. These activities are often disproportionate to stopping legitimate threats, resulting in unpatched software for millions of innocent users, overbroad surveillance, and other collateral effects.  That’s why we’re working on a positive agenda to confront governmental threats to digital security. Put more directly, we’re calling on lawyers, advocates, technologists, and the public to demand a public discussion of whether, when, and how governments can be empowered to break into our computers, phones, and other devices; sabotage and subvert basic security protocols; and stockpile and exploit software flaws and vulnerabilities.  
  • Smart people in academia and elsewhere have been thinking and writing about these issues for years. But it’s time to take the next step and make clear, public rules that carry the force of law to ensure that the government weighs the tradeoffs and reaches the right decisions. This long post outlines some of the things that can be done. It frames the issue, then describes some of the key areas where EFF is already pursuing this agenda—in particular formalizing the rules for disclosing vulnerabilities and setting out narrow limits for the use of government malware. Finally it lays out where we think the debate should go from here.   
  •  
    "In our society, the rule of law sets limits on what government can and cannot do, no matter how important its goals. "
  •  
    It's not often that I disagree with EFF's positions, but on this one I do. The government should be prohibited from exploiting computer vulnerabilities and should be required to immediately report all vulnerabilities discovered to the relevant developers of hardware or software. It's been one long slippery slope since the Supreme Court first approved wiretapping in Olmstead v. United States, 277 US 438 (1928), https://goo.gl/NJevsr (.) Left undecided to this day is whether we have a right to whisper privately, a right that is undeniable. All communications intercept cases since Olmstead fly directly in the face of that right.
Paul Merrell

FBI's secret method of unlocking iPhone may never reach Apple | Reuters

  • The FBI may be allowed to withhold information about how it broke into an iPhone belonging to a gunman in the December San Bernardino shootings, despite a U.S. government policy of disclosing technology security flaws discovered by federal agencies. Under the U.S. vulnerabilities equities process, the government is supposed to err in favor of disclosing security issues so companies can devise fixes to protect data. The policy has exceptions for law enforcement, and there are no hard rules about when and how it must be applied. Apple Inc has said it would like the government to share how it cracked the iPhone security protections. But the Federal Bureau of Investigation, which has been frustrated by its inability to access data on encrypted phones belonging to criminal suspects, might prefer to keep secret the technique it used to gain access to gunman Syed Farook's phone. The referee is likely to be a White House group formed during the Obama administration to review computer security flaws discovered by federal agencies and decide whether they should be disclosed.
  • Stewart Baker, former general counsel of the NSA and now a lawyer with Steptoe & Johnson, said the review process could be complicated if the cracking method is considered proprietary by the third party that assisted the FBI. Several security researchers have pointed to the Israel-based mobile forensics firm Cellebrite as the likely third party that helped the FBI. That company has repeatedly declined comment.
  •  
    The article is wide of the mark, based on analysis of Executive Branch policy rather than the governing law such as the Freedom of Information Act. And I still find it somewhat ludicrous that a third party with knowledge of the defect could succeed in convincing a court that knowledge of a defect in a company's product is trade-secret proprietary information. "Your honor, my client has discovered a way to break into Mr. Tim Cook's house without a key to his house. That is a valuable trade secret that this Court must keep Mr. Cook from learning." Pow! The Computer Fraud and Abuse Act makes it a crime to access a computer that can connect to the Internet by exploiting a software bug. 
Paul Merrell

Apple could use Brooklyn case to pursue details about FBI iPhone hack: source | Reuters

  • If the U.S. Department of Justice asks a New York court to force Apple Inc to unlock an iPhone, the technology company could push the government to reveal how it accessed the phone which belonged to a shooter in San Bernardino, a source familiar with the situation said. The Justice Department will disclose over the next two weeks whether it will continue with its bid to compel Apple to help access an iPhone in a Brooklyn drug case, according to a court filing on Tuesday. The Justice Department this week withdrew a similar request in California, saying it had succeeded in unlocking an iPhone used by one of the shooters involved in a rampage in San Bernardino in December without Apple's help. The legal dispute between the U.S. government and Apple has been a high-profile test of whether law enforcement should have access to encrypted phone data.
  • Apple, supported by most of the technology industry, says anything that helps authorities bypass security features will undermine security for all users. Government officials say that all kinds of criminal investigations will be crippled without access to phone data.Prosecutors have not said whether the San Bernardino technique would work for other seized iPhones, including the one at issue in Brooklyn. Should the Brooklyn case continue, Apple could pursue legal discovery that would potentially force the FBI to reveal what technique it used on the San Bernardino phone, the source said. A Justice Department representative did not have immediate comment.
Paul Merrell

Cameron Calls June 23 EU Referendum as Cabinet Fractures - Bloomberg Business

  • In a secret meeting convened by the White House around Thanksgiving, senior national security officials ordered agencies across the U.S. government to find ways to counter encryption software and gain access to the most heavily protected user data on the most secure consumer devices, including Apple Inc.’s iPhone, the marquee product of one of America’s most valuable companies, according to two people familiar with the decision. The approach was formalized in a confidential National Security Council “decision memo,” tasking government agencies with developing encryption workarounds, estimating additional budgets and identifying laws that may need to be changed to counter what FBI Director James Comey calls the “going dark” problem: investigators being unable to access the contents of encrypted data stored on mobile devices or traveling across the Internet. Details of the memo reveal that, in private, the government was honing a sharper edge to its relationship with Silicon Valley alongside more public signs of rapprochement.
  • On Tuesday, the public got its first glimpse of what those efforts may look like when a federal judge ordered Apple to create a special tool for the FBI to bypass security protections on an iPhone 5c belonging to one of the shooters in the Dec. 2 terrorist attack in San Bernardino, California, that killed 14 people. Apple Chief Executive Officer Tim Cook has vowed to fight the order, calling it a “chilling” demand that Apple “hack our own users and undermine decades of security advancements that protect our customers.” The order was not a direct outcome of the memo but is in line with the broader government strategy. White House spokesman Josh Earnest said Wednesday that the Federal Bureau of Investigation and Department of Justice have the Obama administration’s “full” support in the matter. The government is “not asking Apple to redesign its product or to create a new backdoor to their products,” but rather is seeking entry “to this one device,” he said.
Paul Merrell

Cy Vance's Proposal to Backdoor Encrypted Devices Is Riddled With Vulnerabilities | Jus...

  • Less than a week after the attacks in Paris — while the public and policymakers were still reeling, and the investigation had barely gotten off the ground — Cy Vance, Manhattan’s District Attorney, released a policy paper calling for legislation requiring companies to provide the government with backdoor access to their smartphones and other mobile devices. This is the first concrete proposal of this type since September 2014, when FBI Director James Comey reignited the “Crypto Wars” in response to Apple’s and Google’s decisions to use default encryption on their smartphones. Though Comey seized on Apple’s and Google’s decisions to encrypt their devices by default, his concerns are primarily related to end-to-end encryption, which protects communications that are in transit. Vance’s proposal, on the other hand, is only concerned with device encryption, which protects data stored on phones. It is still unclear whether encryption played any role in the Paris attacks, though we do know that the attackers were using unencrypted SMS text messages on the night of the attack, and that some of them were even known to intelligence agencies and had previously been under surveillance. But regardless of whether encryption was used at some point during the planning of the attacks, as I lay out below, prohibiting companies from selling encrypted devices would not prevent criminals or terrorists from being able to access unbreakable encryption. Vance’s primary complaint is that Apple’s and Google’s decisions to provide their customers with more secure devices through encryption interferes with criminal investigations. He claims encryption prevents law enforcement from accessing stored data like iMessages, photos and videos, Internet search histories, and third party app data. He makes several arguments to justify his proposal to build backdoors into encrypted smartphones, but none of them hold water.
  • Before addressing the major privacy, security, and implementation concerns that his proposal raises, it is worth noting that while an increase in use of fully encrypted devices could interfere with some law enforcement investigations, it will help prevent far more crimes — especially smartphone theft, and the consequent potential for identity theft. According to Consumer Reports, in 2014 there were more than two million victims of smartphone theft, and nearly two-thirds of all smartphone users either took no steps to secure their phones or their data or failed to implement passcode access for their phones. Default encryption could reduce instances of theft because perpetrators would no longer be able to break into the phone to steal the data.
  • Vance argues that creating a weakness in encryption to allow law enforcement to access data stored on devices does not raise serious concerns for security and privacy, since in order to exploit the vulnerability one would need access to the actual device. He considers this an acceptable risk, claiming it would not be the same as creating a widespread vulnerability in encryption protecting communications in transit (like emails), and that it would be cheap and easy for companies to implement. But Vance seems to be underestimating the risks involved with his plan. It is increasingly important that smartphones and other devices are protected by the strongest encryption possible. Our devices and the apps on them contain astonishing amounts of personal information, so much that an unprecedented level of harm could be caused if a smartphone or device with an exploitable vulnerability is stolen, not least in the forms of identity fraud and credit card theft. We bank on our phones, and have access to credit card payments with services like Apple Pay. Our contact lists are stored on our phones, including phone numbers, emails, social media accounts, and addresses. Passwords are often stored on people’s phones. And phones and apps are often full of personal details about their lives, from food diaries to logs of favorite places to personal photographs. Symantec conducted a study, where the company spread 50 “lost” phones in public to see what people who picked up the phones would do with them. The company found that 95 percent of those people tried to access the phone, and while nearly 90 percent tried to access private information stored on the phone or in other private accounts such as banking services and email, only 50 percent attempted contacting the owner.
  • ...8 more annotations...
  • Vance attempts to downplay this serious risk by asserting that anyone can use the “Find My Phone” or Android Device Manager services that allow owners to delete the data on their phones if stolen. However, this does not stand up to scrutiny. These services are effective only when an owner realizes their phone is missing and can take swift action on another computer or device. This delay ensures some period of vulnerability. Encryption, on the other hand, protects everyone immediately and always. Additionally, Vance argues that it is safer to build backdoors into encrypted devices than it is to do so for encrypted communications in transit. It is true that there is a difference in the threats posed by the two types of encryption backdoors that are being debated. However, some manner of widespread vulnerability will inevitably result from a backdoor to encrypted devices. Indeed, the NSA and GCHQ reportedly hacked into a database to obtain cell phone SIM card encryption keys in order to defeat the security protecting users’ communications and activities and to conduct surveillance. Clearly, the reality is that the threat of such a breach, whether from a hacker or a nation state actor, is very real. Even if companies go the extra mile and create a different means of access for every phone, such as a separate access key for each phone, significant vulnerabilities will be created. It would still be possible for a malicious actor to gain access to the database containing those keys, which would enable them to defeat the encryption on any smartphone they took possession of. Additionally, the cost of implementation and maintenance of such a complex system could be high.
  • Privacy is another concern that Vance dismisses too easily. Despite Vance’s arguments otherwise, building backdoors into device encryption undermines privacy. Our government does not impose a similar requirement in any other context. Police can enter homes with warrants, but there is no requirement that people record their conversations and interactions just in case they someday become useful in an investigation. The conversations that we once had through disposable letters and in-person conversations now happen over the Internet and on phones. Just because the medium has changed does not mean our right to privacy has.
  • In addition to his weak reasoning for why it would be feasible to create backdoors to encrypted devices without creating undue security risks or harming privacy, Vance makes several flawed policy-based arguments in favor of his proposal. He argues that criminals benefit from devices that are protected by strong encryption. That may be true, but strong encryption is also a critical tool used by billions of average people around the world every day to protect their transactions, communications, and private information. Lawyers, doctors, and journalists rely on encryption to protect their clients, patients, and sources. Government officials, from the President to the directors of the NSA and FBI, and members of Congress, depend on strong encryption for cybersecurity and data security. There are far more innocent Americans who benefit from strong encryption than there are criminals who exploit it. Encryption is also essential to our economy. Device manufacturers could suffer major economic losses if they are prohibited from competing with foreign manufacturers who offer more secure devices. Encryption also protects major companies from corporate and nation-state espionage. As more daily business activities are done on smartphones and other devices, they may now hold highly proprietary or sensitive information. Those devices could be targeted even more than they are now if all that has to be done to access that information is to steal an employee’s smartphone and exploit a vulnerability the manufacturer was required to create.
  • Vance also suggests that the US would be justified in creating such a requirement since other Western nations are contemplating requiring encryption backdoors as well. Regardless of whether other countries are debating similar proposals, we cannot afford a race to the bottom on cybersecurity. Heads of the intelligence community regularly warn that cybersecurity is the top threat to our national security. Strong encryption is our best defense against cyber threats, and following in the footsteps of other countries by weakening that critical tool would do incalculable harm. Furthermore, even if the US or other countries did implement such a proposal, criminals could gain access to devices with strong encryption through the black market. Thus, only innocent people would be negatively affected, and some of those innocent people might even become criminals simply by trying to protect their privacy by securing their data and devices. Finally, Vance argues that David Kaye, UN Special Rapporteur for Freedom of Expression and Opinion, supported the idea that court-ordered decryption doesn’t violate human rights, provided certain criteria are met, in his report on the topic. However, in the context of Vance’s proposal, this seems to conflate the concepts of court-ordered decryption and of government-mandated encryption backdoors. The Kaye report was unequivocal about the importance of encryption for free speech and human rights. The report concluded that:
  • States should promote strong encryption and anonymity. National laws should recognize that individuals are free to protect the privacy of their digital communications by using encryption technology and tools that allow anonymity online. … States should not restrict encryption and anonymity, which facilitate and often enable the rights to freedom of opinion and expression. Blanket prohibitions fail to be necessary and proportionate. States should avoid all measures that weaken the security that individuals may enjoy online, such as backdoors, weak encryption standards and key escrows. Additionally, the group of intelligence experts that was hand-picked by the President to issue a report and recommendations on surveillance and technology, concluded that: [R]egarding encryption, the U.S. Government should: (1) fully support and not undermine efforts to create encryption standards; (2) not in any way subvert, undermine, weaken, or make vulnerable generally available commercial software; and (3) increase the use of encryption and urge US companies to do so, in order to better protect data in transit, at rest, in the cloud, and in other storage.
  • The clear consensus among human rights experts and several high-ranking intelligence experts, including the former directors of the NSA, Office of the Director of National Intelligence, and DHS, is that mandating encryption backdoors is dangerous.

    Unaddressed Concerns: Preventing Encrypted Devices from Entering the US and the Slippery Slope

    In addition to the significant faults in Vance’s arguments in favor of his proposal, he fails to address the question of how such a restriction would be effectively implemented. There is no effective mechanism for preventing code from becoming available for download online, even if it is illegal. One critical issue the Vance proposal fails to address is how the government would prevent, or even identify, encrypted smartphones when individuals bring them into the United States. DHS would have to train customs agents to search the contents of every person’s phone in order to identify whether it is encrypted, and then confiscate the phones that are. Legal and policy considerations aside, this kind of policy is, at the very least, impractical. Preventing strong encryption from entering the US is not like preventing guns or drugs from entering the country — encrypted phones aren’t immediately obvious as is contraband. Millions of people use encrypted devices, and tens of millions more devices are shipped to and sold in the US each year.
  • Finally, there is a real concern that if Vance’s proposal were accepted, it would be the first step down a slippery slope. Right now, his proposal only calls for access to smartphones and devices running mobile operating systems. While this policy in and of itself would cover a number of commonplace devices, it may eventually be expanded to cover laptop and desktop computers, as well as communications in transit. The expansion of this kind of policy is even more worrisome when taking into account the speed at which technology evolves and becomes widely adopted. Ten years ago, the iPhone did not even exist. Who is to say what technology will be commonplace in 10 or 20 years that is not even around today. There is a very real question about how far law enforcement will go to gain access to information. Things that once seemed like merely science fiction, such as wearable technology and artificial intelligence that could be implanted in and work with the human nervous system, are now available. If and when there comes a time when our “smart phone” is not really a device at all, but is rather an implant, surely we would not grant law enforcement access to our minds.
  • Policymakers should dismiss Vance’s proposal to prohibit the use of strong encryption to protect our smartphones and devices in order to ensure law enforcement access. Undermining encryption, regardless of whether it is protecting data in transit or at rest, would take us down a dangerous and harmful path. Instead, law enforcement and the intelligence community should be working to alter their skills and tactics in a fast-evolving technological world so that they are not so dependent on information that will increasingly be protected by encryption.
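  •  
    On the per-device access-key idea discussed above: a minimal sketch of why even a "one key per phone" escrow scheme leaves a single point of failure. The table, names, and cipher choice are hypothetical, used only to show that whoever obtains the escrow database needs nothing else to unlock any enrolled device.

        # Why a per-device key escrow is still a single point of failure (sketch).
        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        escrow_db = {}  # device_id -> unlock key, held by the vendor or government

        def enroll_device(device_id: str) -> AESGCM:
            key = AESGCM.generate_key(bit_length=128)
            escrow_db[device_id] = key          # the mandated backdoor copy
            return AESGCM(key)

        phone = enroll_device("phone-0001")
        nonce = os.urandom(12)
        stored = phone.encrypt(nonce, b"owner's private data", None)

        # An attacker who breaches the escrow database needs nothing else:
        stolen_key = escrow_db["phone-0001"]
        print(AESGCM(stolen_key).decrypt(nonce, stored, None))  # b"owner's private data"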
Paul Merrell

Surveillance scandal rips through hacker community | Security & Privacy - CNET News

  • One security start-up that had an encounter with the FBI was Wickr, a privacy-forward text messaging app for the iPhone with an Android version in private beta. Wickr's co-founder Nico Sell told CNET at Defcon, "Wickr has been approached by the FBI and asked for a backdoor. We said, 'No.'" The mistrust runs deep. "Even if [the NSA] stood up tomorrow and said that [they] have eliminated these programs," said Marlinspike, "How could we believe them? How can we believe that anything they say is true?" Where does security innovation go next? The immediate future of information security innovation most likely lies in software that provides an existing service but with heightened privacy protections, such as webmail that doesn't mine you for personal data.
  • Wickr's Sell thinks that her company has hit upon a privacy innovation that a few others are also doing, but many will soon follow: the company itself doesn't store user data. "[The FBI] would have to force us to build a new app. With the current app there's no way," she said, that they could incorporate backdoor access to Wickr users' texts or metadata. "Even if you trust the NSA 100 percent that they're going to use [your data] correctly," Sell said, "Do you trust that they're going to be able to keep it safe from hackers? What if somebody gets that database and posts it online?" To that end, she said, people will start seeing privacy innovation for services that don't currently provide it. Calling it "social networks 2.0," she said that social network competitors will arise that do a better job of protecting their customer's privacy and predicted that some that succeed will do so because of their emphasis on privacy. Abine's recent MaskMe browser add-on and mobile app for creating disposable e-mail addresses, phone numbers, and credit cards is another example of a service that doesn't have access to its own users' data.
  • Stamos predicted changes in services that companies with cloud storage offer, including offering customers the ability to store their data outside of the U.S. "If they want to stay competitive, they're going to have to," he said. But, he cautioned, "It's impossible to do a cloud-based ad supported service." Soghoian added, "The only way to keep a service running is to pay them money." This, he said, is going to give rise to a new wave of ad-free, privacy protective subscription services.
  • ...2 more annotations...
  • The issue with balancing privacy and surveillance is that the wireless carriers are not interested in privacy, he said. "They've been providing wiretapping for 100 years. Apple may in the next year protect voice calls," he said, and said that the best hope for ending widespread government surveillance will be the makers of mobile operating systems like Apple and Google. Not all upcoming security innovation will be focused on that kind of privacy protection. Security researcher Brandon Wiley showed off at Defcon a protocol he calls Dust that can obfuscate different kinds of network traffic, with the end goal of preventing censorship. "I only make products about letting you say what you want to say anywhere in the world," such as content critical of governments, he said. Encryption can hide the specifics of the traffic, but some governments have figured out that they can simply block all encrypted traffic, he said. The Dust protocol would change that, he said, making it hard to tell the difference between encrypted and unencrypted traffic. It's hard to build encryption into pre-existing products, Wiley said. "I think people are going to make easy-to-use, encrypted apps, and that's going to be the future."
  • Companies could face severe consequences from their security experts, said Stamos, if the in-house experts find out that they've been lied to about providing government access to customer data. You could see "lots of resignations and maybe publicly," he said. "It wouldn't hurt their reputations to go out in a blaze of glory." Perhaps not surprisingly, Marlinspike sounded a hopeful call for non-destructive activism on Defcon's 21st anniversary. "As hackers, we don't have a lot of influence on policy. I hope that's something that we can focus our energy on," he said.
  •  
    NSA as the cause of the next major disruption in the social networking service industry?  Grief ahead for Google? Note the point made that: "It's impossible to do a cloud-based ad supported service" where the encryption/decryption takes place on the client side. 
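  •  
    A minimal sketch of the client-side model the note above alludes to, using the Python cryptography library's Fernet recipe: data is encrypted before it leaves the device, so the provider stores only ciphertext and has nothing useful to mine for ads or to hand over. The storage helpers are hypothetical placeholders, not any real service's API.

        # Client-side ("zero-knowledge") storage sketch: the provider sees only ciphertext.
        from cryptography.fernet import Fernet

        server_blobs = {}  # stand-in for the provider's storage

        def upload(blob_id: str, ciphertext: bytes) -> None:
            server_blobs[blob_id] = ciphertext   # provider stores opaque bytes

        def download(blob_id: str) -> bytes:
            return server_blobs[blob_id]

        # The key is generated and kept on the client; it never reaches the server.
        client_key = Fernet.generate_key()
        f = Fernet(client_key)

        upload("note-1", f.encrypt(b"draft message: call the lawyer at 3 pm"))
        print(download("note-1"))             # ciphertext only; useless to the provider
        print(f.decrypt(download("note-1")))  # only the client can recover the plaintext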
Paul Merrell

F.B.I. Director to Call 'Dark' Devices a Hindrance to Crime Solving in a Policy Speech ...

  • In his first major policy speech as director of the F.B.I., James B. Comey on Thursday plans to wade deeper into the debate between law enforcement agencies and technology companies about new programs intended to protect personal information on communication devices. Mr. Comey will say that encryption technologies used on these devices, like the new iPhone, have become so sophisticated that crimes will go unsolved because law enforcement officers will not be able to get information from them, according to a senior F.B.I. official who provided a preview of the speech. The speech was prompted, in part, by the new encryption technology on the iPhone 6, which was released last month. The phone encrypts emails, photos and contacts, thwarting intelligence and law enforcement agencies, like the National Security Agency and F.B.I., from gaining access to it, even if they have court approval.
  • The F.B.I. has long had concerns about devices “going dark” — when technology becomes so sophisticated that the authorities cannot gain access. But now, Mr. Comey said he believes that the new encryption technology has evolved to the point that it will adversely affect crime solving. He will say in the speech that these new programs will most severely affect state and local law enforcement agencies, because they are the ones who most often investigate crimes like kidnappings and robberies in which getting information from electronic devices in a timely manner is essential to solving the crime.
  • They also do not have the resources that are available to the F.B.I. and other federal intelligence and law enforcement authorities in order to get around the programs. Mr. Comey will cite examples of crimes that the authorities were able to solve because they gained access to a phone. “He is going to call for a discussion on this issue and ask whether this is the path we want to go down,” said the senior F.B.I. official. “He is not going to accuse the companies of designing the technologies to prevent the F.B.I. from accessing them. But, he will say that this is a negative byproduct and we need to work together to fix it.”
  • ...2 more annotations...
  • Mr. Comey is scheduled to give the speech — titled “Going Dark: Are Technology, Privacy and Public Safety on a Collision Course?” — at the Brookings Institution in Washington.
  • In the interview that aired on “60 Minutes” on Sunday, Mr. Comey said that “the notion that we would market devices that would allow someone to place themselves beyond the law troubles me a lot.” He said that it was the equivalent of selling cars with trunks that could never be opened, even with a court order. “The notion that people have devices, again, that with court orders, based on a showing of probable cause in a case involving kidnapping or child exploitation or terrorism, we could never open that phone?” he said. “My sense is that we've gone too far when we've gone there.”
  •  
    I'm informed that Comey will also call for legislation outlawing communication by whispering because of technical difficulties in law enforcement monitoring of such communications. 
Paul Merrell

US State Police Have Spent Millions on Israeli Phone Cracking Tech | Motherboard

  • This is part of a Motherboard mini-series on the proliferation of phone cracking technology, the people behind it, and who is buying it. Follow along here. When cops have a phone to break into, they just might pull a small, laptop-sized device out of a rugged briefcase. After plugging the phone in with a cable, and a few taps of a touch-screen, the cops have now bypassed the phone’s passcode. Almost like magic, they now have access to call logs, text messages, and in some cases even deleted data. State police forces and highway patrols in the US have collectively spent millions of dollars on this sort of technology to break into and extract data from mobile phones, according to documents obtained by Motherboard. Over 2,000 pages of invoices, purchase orders, communications, and other documents lay out in unprecedented detail how one company in particular has cornered the trade in mobile phone forensics equipment across the United States. Cellebrite, an Israel-based firm, sells tools that can pull data from most mobile phones on the market, such as contact lists, emails, and wiped messages. Cellebrite's products can also circumvent the passcode locks or other security protections on many current mobile phones. The gear is typically used to gather evidence from a criminal suspect's device after it has been seized, and although not many public examples of abuse are available, Cellebrite’s tools have been used by non-US authorities to prosecute dissidents. Previous reports have focused on federal agencies' acquisition of Cellebrite tools. But as smartphones have proliferated and increasingly become the digital center of our lives, the demand and supply of mobile forensics tools has trickled down to more local bodies.