Home/ Future of the Web/ Group items tagged surveillance state security

Paul Merrell

Nearly Everyone In The U.S. And Canada Just Had Their Private Cell Phone Location Data ...

  • A company by the name of LocationSmart isn't having a particularly good month. The company recently received all the wrong kind of attention when it was caught up in a privacy scandal involving the nation's wireless carriers and our biggest prison phone monopoly. Like countless other companies and governments, LocationSmart buys your wireless location data from cell carriers. It then resells that data via a portal that provides real-time access to a user's location through a tailored graphical interface, using just the target's phone number.
  • Theoretically, this functionality is sold under the pretense that the tool can be used to track things like drug offenders who have skipped out of rehab. And ideally, all the companies involved were supposed to ensure that data lookup requests were accompanied by something vaguely resembling official documentation. But a recent deep dive by the New York Times noted how the system was open to routine abuse by law enforcement, after a Missouri sheriff used the system to routinely spy on judges and fellow law enforcement officers without much legitimate justification (or pesky warrants): "The service can find the whereabouts of almost any cellphone in the country within seconds. It does this by going through a system typically used by marketers and other companies to get location data from major cellphone carriers, including AT&T, Sprint, T-Mobile and Verizon, documents show. Between 2014 and 2017, the sheriff, Cory Hutcheson, used the service at least 11 times, prosecutors said. His alleged targets included a judge and members of the State Highway Patrol. Mr. Hutcheson, who was dismissed last year in an unrelated matter, has pleaded not guilty in the surveillance cases." It was yet another example of the way nonexistent-to-lax consumer privacy laws in the United States (especially for wireless carriers) routinely come back to bite us. But then things got worse.
  • Driven by curiosity in the wake of the Times report, a PhD student at Carnegie Mellon University by the name of Robert Xiao discovered that the "try before you buy" system used by LocationSmart to advertise the cell location tracking system contained a bug: a bug so bad that it exposed the data of roughly 200 million wireless subscribers across the United States and Canada (read: nearly everybody). As we see all too often, the researcher highlighted how the security standards in place to safeguard this data were virtually nonexistent: "Due to a very elementary bug in the website, you can just skip that consent part and go straight to the location," said Robert Xiao, a PhD student at the Human-Computer Interaction Institute at Carnegie Mellon University, in a phone call. "The implication of this is that LocationSmart never required consent in the first place," he said. "There seems to be no security oversight here."
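The bug class Xiao describes, where consent is enforced on only one code path, can be illustrated with a toy sketch. Every name below is hypothetical and not LocationSmart's actual API; the point is only that authorization gated inside one response format silently vanishes for callers who request another:

```python
# Toy illustration (hypothetical names) of the vulnerability class described
# above: a demo API that checks the consent step only on the interactive
# code path, so a caller who asks for a machine-readable format skips it.

def query_carrier(phone_number: str) -> str:
    """Stand-in for the real carrier location lookup."""
    return f"lat/lon for {phone_number}"

def lookup_location(phone_number: str, fmt: str, consented: bool) -> dict:
    if fmt == "html":
        # The interactive demo path enforces consent...
        if not consented:
            return {"error": "consent required"}
        return {"location": query_carrier(phone_number)}
    # ...but the JSON path forgot the same check, exposing every subscriber.
    return {"location": query_carrier(phone_number)}

# A caller who requests JSON bypasses the consent gate entirely:
print(lookup_location("555-0100", fmt="json", consented=False))
```

The fix is the standard one: perform the consent/authorization check once, before any lookup, regardless of which response format the client requested.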
  • Meanwhile, none of the four major wireless carriers have been willing to confirm any business relationship with LocationSmart, but all claim to be investigating the problem after the week of bad press. Few would be willing to wager that this actually results in substantive changes to the nation's cavalier treatment of private user data.
Paul Merrell

What to Do About Lawless Government Hacking and the Weakening of Digital Security | Ele...

  • In our society, the rule of law sets limits on what government can and cannot do, no matter how important its goals. To give a simple example, even when chasing a fleeing murder suspect, the police have a duty not to endanger bystanders. The government should pay the same care to our safety in pursuing threats online, but right now we don’t have clear, enforceable rules for government activities like hacking and "digital sabotage." And this is no abstract question—these actions increasingly endanger everyone’s security.
  • The problem became especially clear this year during the San Bernardino case, involving the FBI’s demand that Apple rewrite its iOS operating system to defeat security features on a locked iPhone. Ultimately the FBI exploited an existing vulnerability in iOS and accessed the contents of the phone with the help of an "outside party." Then, with no public process or discussion of the tradeoffs involved, the government refused to tell Apple about the flaw. Despite the obvious fact that the security of the computers and networks we all use is both collective and interwoven—other iPhones used by millions of innocent people presumably have the same vulnerability—the government chose to withhold information Apple could have used to improve the security of its phones. Other examples include intelligence activities like Stuxnet and Bullrun, and law enforcement investigations like the FBI’s mass use of malware against Tor users engaged in criminal behavior. These activities are often disproportionate to the legitimate threats they aim to stop, resulting in unpatched software for millions of innocent users, overbroad surveillance, and other collateral effects. That’s why we’re working on a positive agenda to confront governmental threats to digital security. Put more directly, we’re calling on lawyers, advocates, technologists, and the public to demand a public discussion of whether, when, and how governments can be empowered to break into our computers, phones, and other devices; sabotage and subvert basic security protocols; and stockpile and exploit software flaws and vulnerabilities.
  • Smart people in academia and elsewhere have been thinking and writing about these issues for years. But it’s time to take the next step and make clear, public rules that carry the force of law to ensure that the government weighs the tradeoffs and reaches the right decisions. This long post outlines some of the things that can be done. It frames the issue, then describes some of the key areas where EFF is already pursuing this agenda—in particular formalizing the rules for disclosing vulnerabilities and setting out narrow limits for the use of government malware. Finally it lays out where we think the debate should go from here.   
  •  
    "In our society, the rule of law sets limits on what government can and cannot do, no matter how important its goals. "
  •  
    It's not often that I disagree with EFF's positions, but on this one I do. The government should be prohibited from exploiting computer vulnerabilities and should be required to immediately report all vulnerabilities discovered to the relevant developers of hardware or software. It's been one long slippery slope since the Supreme Court first approved wiretapping in Olmstead v. United States, 277 US 438 (1928) (https://goo.gl/NJevsr). Left undecided to this day is whether we have a right to whisper privately, a right that is undeniable. All communications intercept cases since Olmstead fly directly in the face of that right.
Paul Merrell

Apple could use Brooklyn case to pursue details about FBI iPhone hack: source | Reuters

  • If the U.S. Department of Justice asks a New York court to force Apple Inc to unlock an iPhone, the technology company could push the government to reveal how it accessed the phone, which belonged to a shooter in San Bernardino, a source familiar with the situation said. The Justice Department will disclose over the next two weeks whether it will continue with its bid to compel Apple to help access an iPhone in a Brooklyn drug case, according to a court filing on Tuesday. The Justice Department this week withdrew a similar request in California, saying it had succeeded in unlocking an iPhone used by one of the shooters involved in a rampage in San Bernardino in December without Apple's help. The legal dispute between the U.S. government and Apple has been a high-profile test of whether law enforcement should have access to encrypted phone data.
  • Apple, supported by most of the technology industry, says anything that helps authorities bypass security features will undermine security for all users. Government officials say that all kinds of criminal investigations will be crippled without access to phone data. Prosecutors have not said whether the San Bernardino technique would work for other seized iPhones, including the one at issue in Brooklyn. Should the Brooklyn case continue, Apple could pursue legal discovery that would potentially force the FBI to reveal what technique it used on the San Bernardino phone, the source said. A Justice Department representative did not have immediate comment.
Paul Merrell

Apple's New Challenge: Learning How the U.S. Cracked Its iPhone - The New York Times

  • Now that the United States government has cracked open an iPhone that belonged to a gunman in the San Bernardino, Calif., mass shooting without Apple’s help, the tech company is under pressure to find and fix the flaw. But unlike other cases where security vulnerabilities have cropped up, Apple may face a higher set of hurdles in ferreting out and repairing the particular iPhone hole that the government hacked. The challenges start with the lack of information about the method that the law enforcement authorities, with the aid of a third party, used to break into the iPhone of Syed Rizwan Farook, an attacker in the San Bernardino rampage last year. Federal officials have refused to identify the person, or organization, who helped crack the device, and have declined to specify the procedure used to open the iPhone. Apple also cannot obtain the device to reverse-engineer the problem, the way it would in other hacking situations.
  •  
    It would make a very interesting Freedom of Information Act case if Apple sued under that Act to force disclosure of the iPhone security defect the FBI exploited. I know of no interpretation of the law enforcement FOIA exemption that would justify FBI withholding of the information. It might be alleged that the information is the trade secret of the company that disclosed the defect and exploit to the FBI, but there's a very strong argument that the fact that the information was shared with the FBI waived the trade secrecy claim. And the notion that government is entitled to collect product security defects and exploit them without informing the exploited product's company of the specific defect is extremely weak. Were I Tim Cook, I would have already told my lawyers to get cracking on filing the FOIA request with the FBI to get the legal ball rolling.
Paul Merrell

This Is the Real Reason Apple Is Fighting the FBI | TIME

  • The first thing to understand about Apple’s latest fight with the FBI—over a court order to help unlock the deceased San Bernardino shooter’s phone—is that it has very little to do with the San Bernardino shooter’s phone. It’s not even, really, the latest round of the Crypto Wars—the long-running debate about how law enforcement and intelligence agencies can adapt to the growing ubiquity of uncrackable encryption tools. Rather, it’s a fight over the future of high-tech surveillance, the trust infrastructure undergirding the global software ecosystem, and how far technology companies and software developers can be conscripted as unwilling suppliers of hacking tools for governments. It’s also the public face of a conflict that will undoubtedly be continued in secret—and is likely already well underway.
  • Considered in isolation, the request seems fairly benign: If it were merely a question of whether to unlock a single device—even one unlikely to contain much essential evidence—there would probably be little enough harm in complying. The reason Apple CEO Tim Cook has pledged to fight a court’s order to assist the bureau is that he understands the danger of the underlying legal precedent the FBI is seeking to establish. Four important pieces of context are necessary to see the trouble with the Apple order.
Paul Merrell

Upgrade Your iPhone Passcode to Defeat the FBI's Backdoor Strategy

  • It’s true that ordering Apple to develop the backdoor will fundamentally undermine iPhone security, as Cook and other digital security advocates have argued. But it’s possible for individual iPhone users to protect themselves from government snooping by setting strong passcodes on their phones — passcodes the FBI would not be able to unlock even if it gets its iPhone backdoor. The technical details of how the iPhone encrypts data, and how the FBI might circumvent this protection, are complex and convoluted, and are being thoroughly explored elsewhere on the internet. What I’m going to focus on here is how ordinary iPhone users can protect themselves. The short version: If you’re worried about governments trying to access your phone, set your iPhone up with a random, 11-digit numeric passcode. What follows is an explanation of why that will protect you and how to actually do it.
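The arithmetic behind the 11-digit recommendation is simple to sketch. A minimal estimate, assuming roughly 80 milliseconds per guess (a figure widely cited at the time for the iPhone's hardware-enforced delay; the exact rate is an assumption here, not a measured value):

```python
# Rough brute-force estimate for random numeric passcodes, assuming a
# hardware-enforced delay of ~80 ms per guess. The per-guess rate is an
# illustrative assumption; real devices add escalating lockouts on top.

SECONDS_PER_GUESS = 0.08
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def average_crack_time_years(digits: int) -> float:
    """Expected time to hit a random n-digit numeric passcode,
    searching half the keyspace on average."""
    keyspace = 10 ** digits
    return (keyspace / 2) * SECONDS_PER_GUESS / SECONDS_PER_YEAR

for n in (4, 6, 11):
    print(f"{n:2d} digits: ~{average_crack_time_years(n):.4g} years on average")
```

At that rate a 4-digit code falls in minutes and a 6-digit code in hours, while a random 11-digit code takes on the order of a century on average, which is the gap the article is pointing at.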
Paul Merrell

The Democratization of Cyberattack - Schneier on Security

  • We can't choose a world where the US gets to spy but China doesn't, or even a world where governments get to spy and criminals don't. We need to choose, as a matter of policy, communications systems that are secure for all users, or ones that are vulnerable to all attackers. It's security or surveillance.
  •  
    Pithy quote from Bruce Schneier.
Paul Merrell

On the NSA, PRISM, and what it means for your 1Password data | Agile Blog

  •  
    Might be right. Might be wrong. Oh, the joy of having people around who feel entitled to read other people's data, whether it's shared or not. 