
Future of the Web: Group items tagged "optimism"


Paul Merrell

Months After Appeals Argued, NSA Cases Twist in the Wind - US News

  • Three cases that likely lay the groundwork for a major privacy battle at the U.S. Supreme Court are pending before federal appeals courts, whose judges are taking their time announcing whether they believe the dragnet collection of Americans' phone records is legal. It’s been more than five months since the American Civil Liberties Union argued against the National Security Agency program in New York, three months since legal activist Larry Klayman defended his thus far unprecedented preliminary injunction win in Washington, D.C., and two months since Idaho nurse Anna Smith’s case was heard by appeals judges in Seattle. At the district court level, judges handed down decisions about a month after oral arguments in the cases. It’s unclear what accounts for the delay. It’s possible judges are meticulously crafting opinions that are likely to receive wide coverage, or that members of the three-judge panels are clashing on the appropriate decision.
  • Attorneys involved in the cases understandably are reluctant to criticize the courts, but all express hope for speedy resolution of their fights against alleged violations of Americans’ Fourth Amendment rights.
  • Though it’s difficult to accurately predict court decisions based on oral arguments, opponents of the mass surveillance program may have reason for optimism.
  • Two executive branch review panels have found the dragnet phone program has had minimal value for catching terrorists, its stated purpose. After years of presiding over the collection and months of publicly defending it, President Barack Obama pivoted last year and asked Congress to pass legislation ending the program. A measure to do so failed last year.
Paul Merrell

What is Boxcryptor | Easy to use encryption for cloud storage | boxcryptor.com

  • Boxcryptor is an easy-to-use encryption software optimized for the cloud. It allows the secure use of cloud storage services without sacrificing comfort. Boxcryptor supports all major cloud storage providers (such as Dropbox, Google Drive, Microsoft OneDrive, SugarSync) and supports all the clouds that use the WebDAV standard (such as Cubby, Strato HiDrive, and ownCloud). With Boxcryptor your files go protected to your cloud provider and you can enjoy peace of mind knowing that your information cannot fall into the wrong hands. Here is how it works: Boxcryptor creates a virtual drive on your computer that allows you to encrypt your files locally before uploading them to your cloud or clouds of choice. It encrypts individual files - and does not create containers. Any file dropped into an encrypted folder within the Boxcryptor drive will get automatically encrypted before it is synced to the cloud. To protect your files, Boxcryptor uses the AES-256 and RSA encryption algorithms.
  • Free for personal use. I haven't tried this yet, but the need for it has been near the top of my mind since I first tried Dropbox and then realized how insecure it was. I tried a lot of sync services, but am now using Wuala, which features end-to-end encryption baked into the client software. But I also use MEGAsync for remote backup, so I'll probably be trying this out with that service. I hope there's a way to sync the two programs. (A sketch of the AES-plus-RSA hybrid pattern described above follows this item.)
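
The Boxcryptor excerpt describes the classic client-side hybrid scheme: each file is encrypted locally with its own AES-256 key, and that per-file key is then wrapped with the user's RSA key, so only wrapped keys and ciphertext ever reach the cloud provider. The sketch below illustrates that general pattern in Python with the "cryptography" package; it is an illustration under those assumptions, not Boxcryptor's actual file format, and the function names (encrypt_file, decrypt_file) are hypothetical.

```python
# Minimal sketch of the AES-256 + RSA hybrid pattern that client-side cloud
# encryption tools describe: a fresh AES key per file, wrapped with the user's
# RSA public key. Illustrative only -- not Boxcryptor's actual format.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# User key pair (a real client would store this protected by a password).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
public_key = private_key.public_key()

# RSA-OAEP padding used both to wrap and to unwrap per-file keys.
OAEP = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

def encrypt_file(plaintext: bytes) -> tuple[bytes, bytes, bytes]:
    """Encrypt one file locally, before it is synced to the cloud."""
    file_key = AESGCM.generate_key(bit_length=256)  # fresh AES-256 key per file
    nonce = os.urandom(12)                          # 96-bit nonce for AES-GCM
    ciphertext = AESGCM(file_key).encrypt(nonce, plaintext, None)
    wrapped_key = public_key.encrypt(file_key, OAEP)  # RSA-wrap the file key
    return wrapped_key, nonce, ciphertext             # only this goes to the cloud

def decrypt_file(wrapped_key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    """Recover the plaintext on a machine that holds the RSA private key."""
    file_key = private_key.decrypt(wrapped_key, OAEP)
    return AESGCM(file_key).decrypt(nonce, ciphertext, None)

if __name__ == "__main__":
    data = b"contents of a file headed for Dropbox or ownCloud"
    assert decrypt_file(*encrypt_file(data)) == data
```

Because only the wrapped key, nonce, and ciphertext are uploaded, the storage provider never sees the plaintext or the unwrapped AES key; decryption requires the RSA private key held on the client.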
Paul Merrell

Deep Fakes: A Looming Crisis for National Security, Democracy and Privacy? - Lawfare

  • “We are truly fucked.” That was Motherboard’s spot-on reaction to deep fake sex videos (realistic-looking videos that swap a person’s face into sex scenes actually involving other people). And that sleazy application is just the tip of the iceberg. As Julian Sanchez tweeted, “The prospect of any Internet rando being able to swap anyone’s face into porn is incredibly creepy. But my first thought is that we have not even scratched the surface of how bad ‘fake news’ is going to get.” Indeed. Recent events amply demonstrate that false claims—even preposterous ones—can be peddled with unprecedented success today thanks to a combination of social media ubiquity and virality, cognitive biases, filter bubbles, and group polarization. The resulting harms are significant for individuals, businesses, and democracy. Belated recognition of the problem has spurred a variety of efforts to address this most recent illustration of truth decay, and at first blush there seems to be reason for optimism. Alas, the problem may soon take a significant turn for the worse thanks to deep fakes. Get used to hearing that phrase. It refers to digital manipulation of sound, images, or video to impersonate someone or make it appear that a person did something—and to do so in a manner that is increasingly realistic, to the point that the unaided observer cannot detect the fake. Think of it as a destructive variation of the Turing test: imitation designed to mislead and deceive rather than to emulate and iterate.
  • Fueled by artificial intelligence, digital impersonation is on the rise. Machine-learning algorithms (often neural networks) combined with facial-mapping software enable the cheap and easy fabrication of content that hijacks one’s identity—voice, face, body. Deep fake technology inserts individuals’ faces into videos without their permission. The result is “believable videos of people doing and saying things they never did.” Not surprisingly, this concept has been quickly leveraged to sleazy ends. The latest craze is fake sex videos featuring celebrities like Gal Gadot and Emma Watson. Although the sex scenes look realistic, they are not consensual cyber porn. Conscripting individuals (more often women) into fake porn undermines their agency, reduces them to sexual objects, engenders feelings of embarrassment and shame, and inflicts reputational harm that can devastate careers (especially for everyday people). Regrettably, cyber stalkers are sure to use fake sex videos to torment victims. What comes next? We can expect to see deep fakes used in other abusive, individually targeted ways, such as undermining a rival’s relationship with fake evidence of an affair or an enemy’s career with fake evidence of a racist comment.