
Home/ Future of the Web/ Group items tagged THE


Paul Merrell

Privacy Shield Program Overview | Privacy Shield - 0 views

  • EU-U.S. Privacy Shield Program Overview The EU-U.S. Privacy Shield Framework was designed by the U.S. Department of Commerce and European Commission to provide companies on both sides of the Atlantic with a mechanism to comply with EU data protection requirements when transferring personal data from the European Union to the United States in support of transatlantic commerce. On July 12, the European Commission deemed the Privacy Shield Framework adequate to enable data transfers under EU law (see the adequacy determination). The Privacy Shield program, which is administered by the International Trade Administration (ITA) within the U.S. Department of Commerce, enables U.S.-based organizations to join the Privacy Shield Framework in order to benefit from the adequacy determination. To join the Privacy Shield Framework, a U.S.-based organization will be required to self-certify to the Department of Commerce (via this website) and publicly commit to comply with the Framework’s requirements. While joining the Privacy Shield Framework is voluntary, once an eligible organization makes the public commitment to comply with the Framework’s requirements, the commitment will become enforceable under U.S. law. All organizations interested in joining the Privacy Shield Framework should review its requirements in their entirety. To assist in that effort, Commerce’s Privacy Shield Team has compiled resources and addressed frequently asked questions below. Resources: Key New Requirements for Participating Organizations; How to Join the Privacy Shield; Privacy Policy FAQs; Frequently Asked Questions
  •  
    I got a notice from Dropbox tonight that it is now certified under this program. This program is fallout from an E.U. Court of Justice decision following the Snowden disclosures, holding that the then-existing U.S.-E.U. framework for protecting E.U. citizens' data rights was invalid because it did not adequately protect digital privacy rights. This new framework is intended to comply with the court's decision, but one need only look at section 5 of the agreement to see that it does not. Expect follow-on litigation. The agreement is at https://www.privacyshield.gov/servlet/servlet.FileDownload?file=015t00000004qAg Section 5 lets the NSA continue to intercept and read data from E.U. citizens and also allows their data to be disclosed to U.S. law enforcement. And the agreement adds nothing to U.S. citizens' digital privacy rights. In my view, this framework is a stopgap measure that will last only as long as it takes for another case to reach the Court of Justice and be ruled upon. The ox that got gored by the Court of Justice ruling was U.S. companies' ability to store E.U. citizens' data outside the E.U. and to allow internet traffic from the E.U. to pass through the U.S. Microsoft's leadership set up new server farms in Europe under the control of a business entity beyond the jurisdiction of U.S. courts. Other U.S. internet biggies didn't follow suit. This framework is their lifeline until the next ruling by the Court of Justice.
Gary Edwards

Blog | Spritz - 0 views

  • Therein lies one of the biggest problems with traditional RSVP. Each time you see text that is not centered properly on the ORP position, your eyes naturally will look for the ORP to process the word and understand its meaning. This requisite eye movement creates a “saccade”, a physical eye movement caused by your eyes taking a split second to find the proper ORP for a word. Every saccade has a penalty in both time and comprehension, especially when you start to speed up reading. Some saccades are considered by your brain to be “normal” during reading, such as when you move your eye from left to right to go from one ORP position to the next ORP position while reading a book. Other saccades are not normal to your brain during reading, such as when you move your eyes right to left to spot an ORP. This eye movement is akin to trying to read a line of text backwards. In normal reading, you normally won’t saccade right-to-left unless you encounter a word that your brain doesn’t already know and you go back for another look; those saccades will increase based on the difficulty of the text being read and the percentage of words within it that you already know. And the math doesn’t look good, either. If you determined the length of all the words in a given paragraph, you would see that, depending on the language you’re reading, there is a low (less than 15%) probability of two adjacent words being the same length and not requiring a saccade when they are shown to you one at a time. This means you move your eyes on a regular basis with traditional RSVP! In fact, you still move them with almost every word. In general, left-to-right saccades contribute to slower reading due to the increased travel time for the eyeballs, while right-to-left saccades are discombobulating for many people, especially at speed. It’s like reading a lot of text that contains words you don’t understand only you DO understand the words! The experience is frustrating to say the least.
  • In addition to saccading, another issue with RSVP is associated with “foveal vision,” the area in focus when you look at a sentence. This distance defines the number of letters on which your eyes can sharply focus as you read. Its companion is called “parafoveal vision” and refers to the area outside foveal vision that cannot be seen sharply.
  •  
    "To understand Spritz, you must understand Rapid Serial Visual Presentation (RSVP). RSVP is a common speed-reading technique used today. However, RSVP was originally developed for psychological experiments to measure human reactions to content being read. When RSVP was created, there wasn't much digital content and most people didn't have access to it anyway. The internet didn't even exist yet. With traditional RSVP, words are displayed either left-aligned or centered. Figure 1 shows an example of a center-aligned RSVP, with a dashed line on the center axis. When you read a word, your eyes naturally fixate at one point in that word, which visually triggers the brain to recognize the word and process its meaning. In Figure 1, the preferred fixation point (character) is indicated in red. In this figure, the Optimal Recognition Position (ORP) is different for each word. For example, the ORP is only in the middle of a 3-letter word. As the length of a word increases, the percentage that the ORP shifts to the left of center also increases. The longer the word, the farther to the left of center your eyes must move to locate the ORP."
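The left-drifting ORP behavior described above can be sketched with a simple heuristic. Spritz has not published its exact formula, so the breakpoints below are illustrative assumptions; the point is only that aligning each word's ORP letter to a fixed column removes the saccade between successively flashed words:

```python
def orp_index(word: str) -> int:
    """Illustrative Optimal Recognition Position heuristic.

    Mirrors the behavior described in the post: the middle letter for a
    3-letter word, drifting further left of center as words get longer.
    The breakpoints are assumptions, not Spritz's actual algorithm.
    """
    n = len(word)
    if n <= 1:
        return 0
    if n <= 5:
        return 1
    if n <= 9:
        return 2
    if n <= 13:
        return 3
    return 4

def aligned(word: str, column: int = 12) -> str:
    """Pad a word so its ORP letter always lands in the same column."""
    return " " * (column - orp_index(word)) + word
```

With words padded this way, the reader's eyes can stay fixed on one column while words are flashed one at a time, which is the core of the Spritz presentation style.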
Paul Merrell

Security Experts Oppose Government Access to Encrypted Communication - The New York Times - 0 views

  • An elite group of security technologists has concluded that the American and British governments cannot demand special access to encrypted communications without putting the world’s most confidential data and critical infrastructure in danger. A new paper from the group, made up of 14 of the world’s pre-eminent cryptographers and computer scientists, is a formidable salvo in a skirmish between intelligence and law enforcement leaders, and technologists and privacy advocates. After Edward J. Snowden’s revelations — with security breaches and awareness of nation-state surveillance at a record high and data moving online at breakneck speeds — encryption has emerged as a major issue in the debate over privacy rights.
  • That has put Silicon Valley at the center of a tug of war. Technology companies including Apple, Microsoft and Google have been moving to encrypt more of their corporate and customer data after learning that the National Security Agency and its counterparts were siphoning off digital communications and hacking into corporate data centers.
  • Yet law enforcement and intelligence agency leaders argue that such efforts thwart their ability to monitor kidnappers, terrorists and other adversaries. In Britain, Prime Minister David Cameron threatened to ban encrypted messages altogether. In the United States, Michael S. Rogers, the director of the N.S.A., proposed that technology companies be required to create a digital key to unlock encrypted data, but to divide the key into pieces and secure it so that no one person or government agency could use it alone. The encryption debate has left both sides bitterly divided and in fighting mode. The group of cryptographers deliberately issued its report a day before James B. Comey Jr., the director of the Federal Bureau of Investigation, and Sally Quillian Yates, the deputy attorney general at the Justice Department, are scheduled to testify before the Senate Judiciary Committee on the concerns that they and other government agencies have that encryption technologies will prevent them from effectively doing their jobs.
  • The new paper is the first in-depth technical analysis of government proposals by leading cryptographers and security thinkers, including Whitfield Diffie, a pioneer of public key cryptography, and Ronald L. Rivest, the “R” in the widely used RSA public-key cryptography algorithm. In the report, the group said any effort to give the government “exceptional access” to encrypted communications was technically unfeasible and would leave confidential data and critical infrastructure like banks and the power grid at risk. Handing governments a key to encrypted communications would also require an extraordinary degree of trust. With government agency breaches now the norm — most recently at the United States Office of Personnel Management, the State Department and the White House — the security specialists said authorities could not be trusted to keep such keys safe from hackers and criminals. They added that if the United States and Britain mandated backdoor keys to communications, China and other governments in foreign markets would be spurred to do the same.
  • “Such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend,” the report said. “The costs would be substantial, the damage to innovation severe and the consequences to economic growth hard to predict. The costs to the developed countries’ soft power and to our moral authority would also be considerable.”
  •  
    Our system of government does not expect that every criminal will be apprehended and convicted. There are numerous values our society believes are more important. Some examples: [i] a presumption of innocence unless guilt is established beyond any reasonable doubt; [ii] the requirement that government officials convince a neutral magistrate that they have probable cause to believe that a search or seizure will produce evidence of a crime; [iii] many communications cannot be compelled to be disclosed and used in evidence, such as attorney-client communications, spousal communications, and priest-penitent communications; and [iv] etc. Moral of my story: the government needs a much stronger reason to justify interception of communications than saying, "some crooks will escape prosecution if we can't do that." We have a right to whisper to each other, concealing our communications from all others. Why does the right to whisper privately disappear if our whisperings are done electronically? The Supreme Court took its first step on a very slippery slope when it permitted wiretapping in Olmstead v. United States, 277 U.S. 438, 48 S. Ct. 564, 72 L. Ed. 944 (1928). https://goo.gl/LaZGHt It's been a long slide ever since. It's past time to revisit Olmstead and recognize that American citizens have the absolute right to communicate privately. "The President … recognizes that U.S. citizens and institutions should have a reasonable expectation of privacy from foreign or domestic intercept when using the public telephone system." - Brent Scowcroft, U.S. National Security Advisor, National Security Decision Memorandum 338 (1 September 1976) (Nixon administration), http://www.fas.org/irp/offdocs/nsdm-ford/nsdm-338.pdf
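The split-key proposal attributed to Rogers above amounts, in its simplest form, to an n-of-n secret split: no single holder can use the key alone. A minimal sketch using XOR sharing follows; real escrow proposals are more elaborate, and this is an illustration of the splitting idea only, not any agency's actual design:

```python
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split `key` into n shares; all n are required to reconstruct.

    n-of-n XOR secret sharing: n-1 shares are uniformly random, and the
    last share is the key XORed with all of them, so any subset of
    fewer than n shares reveals nothing about the key.
    """
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the key."""
    out = shares[0]
    for s in shares[1:]:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out
```

Note that the cryptographers' objection in the report is not that splitting fails mathematically; it is that the escrowed shares and the machinery for reassembling them become a concentrated, high-value target.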
Paul Merrell

Google Chrome Listening In To Your Room Shows The Importance Of Privacy Defense In Depth - 0 views

  • Yesterday, news broke that Google has been stealth downloading audio listeners onto every computer that runs Chrome, and transmits audio data back to Google. Effectively, this means that Google had taken itself the right to listen to every conversation in every room that runs Chrome somewhere, without any kind of consent from the people eavesdropped on. In official statements, Google shrugged off the practice with what amounts to “we can do that”. It looked like just another bug report. "When I start Chromium, it downloads something." Followed by strange status information that notably included the lines "Microphone: Yes" and "Audio Capture Allowed: Yes".
  • Without consent, Google’s code had downloaded a black box of code that – according to itself – had turned on the microphone and was actively listening to your room. A brief explanation of the Open-source / Free-software philosophy is needed here. When you’re installing a version of GNU/Linux like Debian or Ubuntu onto a fresh computer, thousands of really smart people have analyzed every line of human-readable source code before that operating system was built into computer-executable binary code, to make it common and open knowledge what the machine actually does instead of trusting corporate statements on what it’s supposed to be doing. Therefore, you don’t install black boxes onto a Debian or Ubuntu system; you use software repositories that have gone through this source-code audit-then-build process. Maintainers of operating systems like Debian and Ubuntu use many so-called “upstreams” of source code to build the final product. Chromium, the open-source version of Google Chrome, had abused its position as trusted upstream to insert lines of source code that bypassed this audit-then-build process, and which downloaded and installed a black box of unverifiable executable code directly onto computers, essentially rendering them compromised. We don’t know and can’t know what this black box does. But we see reports that the microphone has been activated, and that Chromium considers audio capture permitted.
  • This was supposedly to enable the “Ok, Google” behavior – that when you say certain words, a search function is activated. Certainly a useful feature. Certainly something that enables eavesdropping of every conversation in the entire room, too. Obviously, your own computer isn’t the one to analyze the actual search command. Google’s servers do. Which means that your computer had been stealth configured to send what was being said in your room to somebody else, to a private company in another country, without your consent or knowledge, an audio transmission triggered by… an unknown and unverifiable set of conditions. Google had two responses to this. The first was to introduce a practically-undocumented switch to opt out of this behavior, which is not a fix: the default install will still wiretap your room without your consent, unless you opt out, and more importantly, know that you need to opt out, which is nowhere a reasonable requirement. But the second was more of an official statement following technical discussions on Hacker News and other places. That official statement amounted to three parts (paraphrased, of course):
  • 1) Yes, we’re downloading and installing a wiretapping black-box to your computer. But we’re not actually activating it. We did take advantage of our position as trusted upstream to stealth-insert code into open-source software that installed this black box onto millions of computers, but we would never abuse the same trust in the same way to insert code that activates the eavesdropping-blackbox we already downloaded and installed onto your computer without your consent or knowledge. You can look at the code as it looks right now to see that the code doesn’t do this right now. 2) Yes, Chromium is bypassing the entire source code auditing process by downloading a pre-built black box onto people’s computers. But that’s not something we care about, really. We’re concerned with building Google Chrome, the product from Google. As part of that, we provide the source code for others to package if they like. Anybody who uses our code for their own purpose takes responsibility for it. When this happens in a Debian installation, it is not Google Chrome’s behavior, this is Debian Chromium’s behavior. It’s Debian’s responsibility entirely. 3) Yes, we deliberately hid this listening module from the users, but that’s because we consider this behavior to be part of the basic Google Chrome experience. We don’t want to show all modules that we install ourselves.
  • If you think this is an excusable and responsible statement, raise your hand now. Now, it should be noted that this was Chromium, the open-source version of Chrome. If somebody downloads the Google product Google Chrome, as in the prepackaged binary, you don’t even get a theoretical choice. You’re already downloading a black box from a vendor. In Google Chrome, this is all included from the start. This episode highlights the need for hard, not soft, switches to all devices – webcams, microphones – that can be used for surveillance. A software on/off switch for a webcam is no longer enough; a hard shield in front of the lens is required. A software on/off switch for a microphone is no longer enough; a physical switch that breaks its electrical connection is required. That’s how you defend against this in depth.
  • Of course, people were quick to downplay the alarm. “It only listens when you say ‘Ok, Google’.” (Ok, so how does it know to start listening just before I’m about to say ‘Ok, Google?’) “It’s no big deal.” (A company stealth installs an audio listener that listens to every room in the world it can, and transmits audio data to the mothership when it encounters an unknown, possibly individually tailored, list of keywords – and it’s no big deal!?) “You can opt out. It’s in the Terms of Service.” (No. Just no. This is not something that is the slightest amount of permissible just because it’s hidden in legalese.) “It’s opt-in. It won’t really listen unless you check that box.” (Perhaps. We don’t know, Google just downloaded a black box onto my computer. And it may not be the same black box as was downloaded onto yours.) Early last decade, privacy activists practically yelled and screamed that the NSA’s taps of various points of the Internet and telecom networks had the technical potential for enormous abuse against privacy. Everybody else dismissed those points as basically tinfoilhattery – until the Snowden files came out, and it was revealed that precisely everybody involved had abused their technical capability for invasion of privacy as far as was possible. Perhaps it would be wise to not repeat that exact mistake. Nobody, and I really mean nobody, is to be trusted with a technical capability to listen to every room in the world, with listening profiles customizable at the identified-individual level, on the mere basis of “trust us”.
  • Privacy remains your own responsibility.
  •  
    And of course, Google would never succumb to a subpoena requiring it to turn over the audio stream to the NSA. The Tor Browser just keeps looking better and better. https://www.torproject.org/projects/torbrowser.html.en
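The parenthetical question in the annotations above — how a detector can react to "Ok, Google" without listening beforehand — is usually answered with an always-on ring buffer: audio is captured continuously and only a short trailing window is retained. A minimal sketch of that structure follows; all names here are illustrative, not Chrome's actual implementation:

```python
from collections import deque

class HotwordBuffer:
    """Sketch of why hotword detection implies always-listening.

    A detector that reacts to a trigger phrase must continuously keep
    the last few seconds of audio so the phrase itself (and speech just
    before it) is still available once the match fires.
    """
    def __init__(self, seconds: float = 2.0, rate: int = 16000):
        # Fixed-size ring buffer: old samples silently fall off the end.
        self.frames = deque(maxlen=int(seconds * rate))

    def feed(self, samples):
        # Called continuously -- every sample passes through here,
        # whether or not a hotword is ever detected.
        self.frames.extend(samples)

    def snapshot(self):
        # The trailing window that would be shipped upstream
        # when the detector fires.
        return list(self.frames)
```

The privacy argument in the article is precisely that `feed` runs on everything said in the room, regardless of whether `snapshot` is ever taken.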
Paul Merrell

Why the Sony hack is unlikely to be the work of North Korea. | Marc's Security Ramblings - 0 views

  • Everyone seems to be eager to pin the blame for the Sony hack on North Korea. However, I think it’s unlikely. Here’s why: 1. The broken English looks deliberately bad and doesn’t exhibit any of the classic comprehension mistakes you actually expect to see in “Konglish”. i.e. it reads to me like an English speaker pretending to be bad at writing English. 2. The fact that the code was written on a PC with Korean locale & language actually makes it less likely to be North Korea. Not least because they don’t speak traditional “Korean” in North Korea, they speak their own dialect and traditional Korean is forbidden. This is one of the key things that has made communication with North Korean refugees difficult. I would find the presence of Chinese far more plausible.
  • 3. It’s clear from the hard-coded paths and passwords in the malware that whoever wrote it had extensive knowledge of Sony’s internal architecture and access to key passwords. While it’s plausible that an attacker could have built up this knowledge over time and then used it to make the malware, Occam’s razor suggests the simpler explanation of an insider. It also fits with the pure revenge tack that this started out as. 4. Whoever did this is in it for revenge. The info and access they had could have easily been used to cash out, yet, instead, they are making every effort to burn Sony down. Just think what they could have done with passwords to all of Sony’s financial accounts? With the competitive intelligence in their business documents? From simple theft, to the sale of intellectual property, or even extortion – the attackers had many ways to become rich. Yet, instead, they chose to dump the data, rendering it useless. Likewise, I find it hard to believe that a “Nation State” which lives by propaganda would be so willing to just throw away such an unprecedented level of access to the beating heart of Hollywood itself.
  • 5. The attackers only latched onto “The Interview” after the media did – the film was never mentioned by GOP right at the start of their campaign. It was only after a few people started speculating in the media that this and the communication from DPRK “might be linked” that suddenly it became linked. I think the attackers both saw this as an opportunity for “lulz” and as a way to misdirect everyone into thinking it was a nation state. After all, if everyone believes it’s a nation state, then the criminal investigation will likely die.
  • 6. Whoever is doing this is VERY net and social media savvy. That, and the sophistication of the operation, do not match the profile of the DPRK up until now. Grugq did an excellent analysis of this aspect; his findings are here – http://0paste.com/6875#md 7. Finally, blaming North Korea is the easy way out for a number of folks, including the security vendors and Sony management who are under the microscope for this. Let’s face it – most of today’s so-called “cutting edge” security defenses are either so specific, or so brittle, that they really don’t offer much meaningful protection against a sophisticated attacker or group of attackers.
  • 8. It probably also suits a number of political agendas to have something that justifies sabre-rattling at North Korea, which is why I’m not that surprised to see politicians starting to point their fingers at the DPRK also. 9. It’s clear from the leaked data that Sony has a culture which doesn’t take security very seriously. From plaintext password files, to using “password” as the password in business critical certificates, through to just the sheer volume of aging unclassified yet highly sensitive data left out in the open. This isn’t a simple slip-up or a “weak link in the chain” – this is a serious organization-wide failure to implement anything like a reasonable security architecture.
  • The reality is, as things stand, Sony has little choice but to burn everything down and start again. Every password, every key, every certificate is tainted now and that’s a terrifying place for an organization to find itself. This hack should be used as the definitive lesson in why security matters and just how bad things can get if you don’t take it seriously. 10. Who do I think is behind this? My money is on a disgruntled (possibly ex) employee of Sony.
  • EDIT: This appears (at least in part) to be substantiated by a conversation the Verge had with one of the alleged hackers – http://www.theverge.com/2014/11/25/7281097/sony-pictures-hackers-say-they-want-equality-worked-with-staff-to-break-in Finally for an EXCELLENT blow by blow analysis of the breach and the events that followed, read the following post by my friends from Risk Based Security – https://www.riskbasedsecurity.com/2014/12/a-breakdown-and-analysis-of-the-december-2014-sony-hack EDIT: Also make sure you read my good friend Krypt3ia’s post on the hack – http://krypt3ia.wordpress.com/2014/12/18/sony-hack-winners-and-losers/
  •  
    Seems that the FBI overlooked a few clues before it told Obama to go ahead and declare war against North Korea. 
Gonzalo San Gil, PhD.

The Universal Declaration of Human Rights - 3 views

  •  
    [PREAMBLE Whereas recognition of the inherent dignity and of the equal and inalienable rights of all members of the human family is the foundation of freedom, justice and peace in the world, Whereas disregard and contempt for human rights have resulted in barbarous acts which have outraged the conscience of mankind, and the advent of a world in which human beings shall enjoy freedom of speech and belief and freedom from fear and want has been proclaimed as the highest aspiration of the common people, Whereas it is essential, if man is not to be compelled to have recourse, as a last resort, to rebellion against tyranny and oppression, that human rights should be protected by the rule of law, Whereas it is essential to promote the development of friendly relations between nations, Whereas the peoples of the United Nations have in the Charter reaffirmed their faith in fundamental human rights, in the dignity and worth of the human person and in the equal rights of men and women and have determined to promote social progress and better standards of life in larger freedom, Whereas Member States have pledged themselves to achieve, in co-operation with the United Nations, the promotion of universal respect for and observance of human rights and fundamental freedoms, Whereas a common understanding of these rights and freedoms is of the greatest importance for the full realization of this pledge, Now, Therefore THE GENERAL ASSEMBLY proclaims THIS UNIVERSAL DECLARATION OF HUMAN RIGHTS as a common standard of achievement for all peoples and all nations, to the end that every individual and every organ of society, keeping this Declaration constantly in mind, shall strive by teaching and education to promote respect for these rights and freedoms and by progressive measures, national and international, to secure their universal and effective recognition and observance, both among the peoples of Member States themselves and among the peoples of territories
  •  
    The Declaration is an important document but only aspirational in nature. It was hamstrung from the beginning by omission of mandated procedures by which an aggrieved person could seek its enforcement or protection.
  •  
    Oh.. of course, Paul. This is Just a Reminder... ... of the other ways to do the things... For Every@ne. Perhaps One Day... :)
Paul Merrell

Testosterone Pit - Home - The Other Reason Why IBM Throws A Billion At Linux ... - 0 views

  • IBM announced today that it would throw another billion at Linux, the open-source operating system, to run its Power System servers. The first time it had thrown a billion at Linux was in 2001, when Linux was a crazy, untested, even ludicrous proposition for the corporate world. So the moolah back then didn’t go to Linux itself, which was free, but to related technologies across hardware, software, and service, including things like sales and advertising – and into IBM’s partnership with Red Hat which was developing its enterprise operating system, Red Hat Enterprise Linux. “It helped start a flurry of innovation that has never slowed,” said Jim Zemlin, executive director of the Linux Foundation. IBM claims that the investment would “help clients capitalize on big data and cloud computing with modern systems built to handle the new wave of applications coming to the data center in the post-PC era.” Some of the moolah will be plowed into the Power Systems Linux Center in Montpellier, France, which opened today. IBM’s first Power Systems Linux Center opened in Beijing in May. IBM may be trying to make hay of the ongoing revelations that have shown that the NSA and other intelligence organizations in the US and elsewhere have roped in American tech companies of all stripes with huge contracts to perfect a seamless spy network. They even include physical aspects of surveillance, such as license plate scanners and cameras, which are everywhere [read.... Surveillance Society: If You Drive, You Get Tracked].
  • It would be an enormous competitive advantage for an IBM salesperson to walk into a government or corporate IT department and sell Big Data servers that don’t run on Windows, but on Linux. With the Windows 8 debacle now in public view, IBM salespeople don’t even have to mention it. In the hope of stemming the pernicious revenue decline their employer has been suffering from, they can politely and professionally hype the security benefits of IBM’s systems and mention in passing the comforting fact that some of it would be developed in the Power Systems Linux Centers in Montpellier and Beijing. Alas, Linux too is tarnished. The backdoors are there, though the code can be inspected, unlike Windows code. And then there is Security-Enhanced Linux (SELinux), which was integrated into the Linux kernel in 2003. It provides a mechanism for supporting “access control” (a backdoor) and “security policies.” Who developed SELinux? Um, the NSA – which helpfully discloses some details on its own website (emphasis mine): The results of several previous research projects in this area have yielded a strong, flexible mandatory access control architecture called Flask. A reference implementation of this architecture was first integrated into a security-enhanced Linux® prototype system in order to demonstrate the value of flexible mandatory access controls and how such controls could be added to an operating system. The architecture has been subsequently mainstreamed into Linux and ported to several other systems, including the Solaris™ operating system, the FreeBSD® operating system, and the Darwin kernel, spawning a wide range of related work.
  • Then another boon for IBM. Experts at the German Federal Office for Information Security (BSI) determined that Windows 8 is dangerous for data security. It allows Microsoft to control the computer remotely through a “special surveillance chip,” the wonderfully named Trusted Platform Module (TPM), and a backdoor in the software – with keys likely accessible to the NSA and possibly other third parties, such as the Chinese. Risks: “Loss of control over the operating system and the hardware” [read.... LEAKED: German Government Warns Key Entities Not To Use Windows 8 – Links The NSA].
  • Among a slew of American companies who contributed to the NSA’s “mainstreaming” efforts: Red Hat. And IBM? Like just about all of our American tech heroes, it looks at the NSA and other agencies in the Intelligence Community as “the Customer” with deep pockets, ever increasing budgets, and a thirst for technology and data. Which brings us back to Windows 8 and TPM. A decade ago, a group was established to develop and promote Trusted Computing that governs how operating systems and the “special surveillance chip” TPM work together. And it too has been cooperating with the NSA. The founding members of this Trusted Computing Group, as it’s called facetiously: AMD, Cisco, Hewlett-Packard, Intel, Microsoft, and Wave Systems. Oh, I almost forgot ... and IBM. And so IBM might not escape, despite its protestations and slick sales presentations, the suspicion by foreign companies and governments alike that its Linux servers too have been compromised – like the cloud products of other American tech companies. And now, they’re going to pay a steep price for their cooperation with the NSA. Read...  NSA Pricked The “Cloud” Bubble For US Tech Companies
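The "flexible mandatory access control" mechanism the NSA excerpt describes can be illustrated with a toy model of Flask-style type enforcement. Real SELinux policy language is far richer; the domain and type names below are illustrative only:

```python
# Toy model of Flask/SELinux type enforcement: every access is denied
# unless an explicit allow rule covers the
# (source domain, target type, object class, permission) tuple.
# The rules below are made-up examples, not a real policy.
ALLOW = {
    ("httpd_t", "httpd_config_t", "file", "read"),
    ("httpd_t", "httpd_log_t", "file", "append"),
}

def check(source: str, target: str, cls: str, perm: str) -> bool:
    """Default-deny: only tuples present in the policy are permitted,
    regardless of ordinary Unix file permissions."""
    return (source, target, cls, perm) in ALLOW
```

The key property is that the policy, not the file owner, decides: even a process running as root is confined to the accesses its domain's rules grant.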
Paul Merrell

Internet Giants Erect Barriers to Spy Agencies - NYTimes.com - 0 views

  • As fast as it can, Google is sealing up cracks in its systems that Edward J. Snowden revealed the N.S.A. had brilliantly exploited. It is encrypting more data as it moves among its servers and helping customers encode their own emails. Facebook, Microsoft and Yahoo are taking similar steps.
  • After years of cooperating with the government, the immediate goal now is to thwart Washington — as well as Beijing and Moscow. The strategy is also intended to preserve business overseas in places like Brazil and Germany that have threatened to entrust data only to local providers. Google, for example, is laying its own fiber optic cable under the world’s oceans, a project that began as an effort to cut costs and extend its influence, but now has an added purpose: to assure that the company will have more control over the movement of its customer data.
  • A year after Mr. Snowden’s revelations, the era of quiet cooperation is over. Telecommunications companies say they are denying requests to volunteer data not covered by existing law. A.T.&T., Verizon and others say that compared with a year ago, they are far more reluctant to cooperate with the United States government in “gray areas” where there is no explicit requirement for a legal warrant.
  • ...8 more annotations...
  • Eric Grosse, Google’s security chief, suggested in an interview that the N.S.A.'s own behavior invited the new arms race. “I am willing to help on the purely defensive side of things,” he said, referring to Washington’s efforts to enlist Silicon Valley in cybersecurity efforts. “But signals intercept is totally off the table,” he said, referring to national intelligence gathering. “No hard feelings, but my job is to make their job hard,” he added.
  • In Washington, officials acknowledge that covert programs are now far harder to execute because American technology companies, fearful of losing international business, are hardening their networks and saying no to requests for the kind of help they once quietly provided. Robert S. Litt, the general counsel of the Office of the Director of National Intelligence, which oversees all 17 American spy agencies, said on Wednesday that it was “an unquestionable loss for our nation that companies are losing the willingness to cooperate legally and voluntarily” with American spy agencies.
  • Many point to an episode in 2012, when Russian security researchers uncovered a state espionage tool, Flame, on Iranian computers. Flame, like the Stuxnet worm, is believed to have been produced at least in part by American intelligence agencies. It was created by exploiting a previously unknown flaw in Microsoft’s operating systems. Companies argue that others could have later taken advantage of this defect. Worried that such an episode undercuts confidence in its wares, Microsoft is moving to fully encrypt all its products, including Hotmail and Outlook.com, by the end of this year with 2,048-bit encryption, a stronger protection that would take a government far longer to crack. The software is protected by encryption both when it is in data centers and when data is being sent over the Internet, said Bradford L. Smith, the company’s general counsel.
  • Mr. Smith also said the company was setting up “transparency centers” abroad so that technical experts of foreign governments could come in and inspect Microsoft’s proprietary source code. That will allow foreign governments to check to make sure there are no “back doors” that would permit snooping by United States intelligence agencies. The first such center is being set up in Brussels. Microsoft has also pushed back harder in court. In a Seattle case, the government issued a “national security letter” to compel Microsoft to turn over data about a customer, along with a gag order to prevent Microsoft from telling the customer it had been compelled to provide its communications to government officials. Microsoft challenged the gag order as violating the First Amendment. The government backed down.
  • Hardware firms like Cisco, which makes routers and switches, have found their products a frequent subject of Mr. Snowden’s disclosures, and their business has declined steadily in places like Asia, Brazil and Europe over the last year. The company is still struggling to convince foreign customers that their networks are safe from hackers — and free of “back doors” installed by the N.S.A. The frustration, companies here say, is that it is nearly impossible to prove that their systems are N.S.A.-proof.
  • In one slide from the disclosures, N.S.A. analysts pointed to a sweet spot inside Google’s data centers, where they could catch traffic in unencrypted form. Next to a quickly drawn smiley face, an N.S.A. analyst, referring to an acronym for a common layer of protection, had noted, “SSL added and removed here!”
  • Facebook and Yahoo have also been encrypting traffic among their internal servers. And Facebook, Google and Microsoft have been moving to more strongly encrypt consumer traffic with so-called Perfect Forward Secrecy, specifically devised to make it more labor intensive for the N.S.A. or anyone to read stored encrypted communications. One of the biggest indirect consequences from the Snowden revelations, technology executives say, has been the surge in demands from foreign governments that saw what kind of access to user information the N.S.A. received — voluntarily or surreptitiously. Now they want the same.
  • The latest move in the war between intelligence agencies and technology companies arrived this week, in the form of a new Google encryption tool. The company released a user-friendly, email encryption method to replace the clunky and often mistake-prone encryption schemes the N.S.A. has readily exploited. But the best part of the tool was buried in Google’s code, which included a jab at the N.S.A.'s smiley-face slide. The code included the phrase: “ssl-added-and-removed-here-; - )”
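The Perfect Forward Secrecy mentioned above rests on ephemeral key exchange: each session derives its secret from throwaway keys, so stealing a server's long-term key later reveals nothing about recorded traffic. A toy Diffie-Hellman sketch of that idea (the group parameters here are deliberately tiny illustrations, nothing like the standardized groups or elliptic curves real TLS uses):

```python
import secrets

# Toy finite-field Diffie-Hellman. Real deployments use standardized
# 2048-bit+ groups (RFC 7919) or elliptic curves; these parameters are
# for illustration only and far too weak for actual use.
P = 2**127 - 1   # a Mersenne prime
G = 3

def dh_keypair() -> tuple[int, int]:
    """Fresh ephemeral (private, public) pair: public = G^private mod P."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

# Each side generates a brand-new ephemeral key for this one session...
a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()

# ...exchanges only the public halves, and derives the same session secret
# without that secret ever crossing the wire.
secret_a = pow(b_pub, a_priv, P)
secret_b = pow(a_pub, b_priv, P)
assert secret_a == secret_b

# The private halves are discarded when the session ends, so later theft of
# a long-term key cannot decrypt traffic recorded earlier -- the "forward
# secrecy" property.
```

The design point is that the article's "more labor intensive" framing understates it: with ephemeral keys, bulk decryption of stored traffic requires breaking each session individually.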
Paul Merrell

The Digital Hunt for Duqu, a Dangerous and Cunning U.S.-Israeli Spy Virus - The Intercept - 1 views

  • “Is this related to what we talked about before?” Bencsáth said, referring to a previous discussion they’d had about testing new services the company planned to offer customers. “No, something else,” Bartos said. “Can you come now? It’s important. But don’t tell anyone where you’re going.” Bencsáth wolfed down the rest of his lunch and told his colleagues in the lab that he had a “red alert” and had to go. “Don’t ask,” he said as he ran out the door. A while later, he was at Bartos’ office, where a triage team had been assembled to address the problem they wanted to discuss. “We think we’ve been hacked,” Bartos said.
  • They found a suspicious file on a developer’s machine that had been created late at night when no one was working. The file was encrypted and compressed so they had no idea what was inside, but they suspected it was data the attackers had copied from the machine and planned to retrieve later. A search of the company’s network found a few more machines that had been infected as well. The triage team felt confident they had contained the attack but wanted Bencsáth’s help determining how the intruders had broken in and what they were after. The company had all the right protections in place—firewalls, antivirus, intrusion-detection and -prevention systems—and still the attackers got in.
  • Bencsáth was a teacher, not a malware hunter, and had never done such forensic work before. At the CrySyS Lab, where he was one of four advisers working with a handful of grad students, he did academic research for the European Union and occasional hands-on consulting work for other clients, but the latter was mostly run-of-the-mill cleanup work—mopping up and restoring systems after random virus infections. He’d never investigated a targeted hack before, let alone one that was still live, and was thrilled to have the chance. The only catch was, he couldn’t tell anyone what he was doing. Bartos’ company depended on the trust of customers, and if word got out that the company had been hacked, they could lose clients. The triage team had taken mirror images of the infected hard drives, so they and Bencsáth spent the rest of the afternoon poring over the copies in search of anything suspicious. By the end of the day, they’d found what they were looking for—an “infostealer” string of code that was designed to record passwords and other keystrokes on infected machines, as well as steal documents and take screenshots. It also catalogued any devices or systems that were connected to the machines so the attackers could build a blueprint of the company’s network architecture. The malware didn’t immediately siphon the stolen data from infected machines but instead stored it in a temporary file, like the one the triage team had found. The file grew fatter each time the infostealer sucked up data, until at some point the attackers would reach out to the machine to retrieve it from a server in India that served as a command-and-control node for the malware.
  • ...1 more annotation...
  • Bencsáth took the mirror images and the company’s system logs with him, after they had been scrubbed of any sensitive customer data, and over the next few days scoured them for more malicious files, all the while being coy to his colleagues back at the lab about what he was doing. The triage team worked in parallel, and after several more days they had uncovered three additional suspicious files. When Bencsáth examined one of them—a kernel-mode driver, a program that helps the computer communicate with devices such as printers—his heart quickened. It was signed with a valid digital certificate from a company in Taiwan (digital certificates are documents ensuring that a piece of software is legitimate). Wait a minute, he thought. Stuxnet—the cyberweapon that was unleashed on Iran’s uranium-enrichment program—also used a driver that was signed with a certificate from a company in Taiwan. That one came from RealTek Semiconductor, but this certificate belonged to a different company, C-Media Electronics. The driver had been signed with the certificate in August 2009, around the same time Stuxnet had been unleashed on machines in Iran.
Paul Merrell

Edward Snowden Explains How To Reclaim Your Privacy - 0 views

  • Micah Lee: What are some operational security practices you think everyone should adopt? Just useful stuff for average people. Edward Snowden: [Opsec] is important even if you’re not worried about the NSA. Because when you think about who the victims of surveillance are, on a day-to-day basis, you’re thinking about people who are in abusive spousal relationships, you’re thinking about people who are concerned about stalkers, you’re thinking about children who are concerned about their parents overhearing things. It’s to reclaim a level of privacy. The first step that anyone could take is to encrypt their phone calls and their text messages. You can do that through the smartphone app Signal, by Open Whisper Systems. It’s free, and you can just download it immediately. And anybody you’re talking to now, their communications, if it’s intercepted, can’t be read by adversaries. [Signal is available for iOS and Android, and, unlike a lot of security tools, is very easy to use.] You should encrypt your hard disk, so that if your computer is stolen the information isn’t obtainable to an adversary — pictures, where you live, where you work, where your kids are, where you go to school. [I’ve written a guide to encrypting your disk on Windows, Mac, and Linux.] Use a password manager. One of the main things that gets people’s private information exposed, not necessarily to the most powerful adversaries, but to the most common ones, are data dumps. Your credentials may be revealed because some service you stopped using in 2007 gets hacked, and your password that you were using for that one site also works for your Gmail account. A password manager allows you to create unique passwords for every site that are unbreakable, but you don’t have the burden of memorizing them. [The password manager KeePassX is free, open source, cross-platform, and never stores anything in the cloud.]
  • The other thing there is two-factor authentication. The value of this is if someone does steal your password, or it’s left or exposed somewhere … [two-factor authentication] allows the provider to send you a secondary means of authentication — a text message or something like that. [If you enable two-factor authentication, an attacker needs both your password as the first factor and a physical device, like your phone, as your second factor, to login to your account. Gmail, Facebook, Twitter, Dropbox, GitHub, Battle.net, and tons of other services all support two-factor authentication.]
  • We should armor ourselves using systems we can rely on every day. This doesn’t need to be an extraordinary lifestyle change. It doesn’t have to be something that is disruptive. It should be invisible, it should be atmospheric, it should be something that happens painlessly, effortlessly. This is why I like apps like Signal, because they’re low friction. It doesn’t require you to re-order your life. It doesn’t require you to change your method of communications. You can use it right now to talk to your friends.
  • ...4 more annotations...
  • Lee: What do you think about Tor? Do you think that everyone should be familiar with it, or do you think that it’s only a use-it-if-you-need-it thing? Snowden: I think Tor is the most important privacy-enhancing technology project being used today. I use Tor personally all the time. We know it works from at least one anecdotal case that’s fairly familiar to most people at this point. That’s not to say that Tor is bulletproof. What Tor does is it provides a measure of security and allows you to disassociate your physical location. … But the basic idea, the concept of Tor that is so valuable, is that it’s run by volunteers. Anyone can create a new node on the network, whether it’s an entry node, a middle router, or an exit point, on the basis of their willingness to accept some risk. The voluntary nature of this network means that it is survivable, it’s resistant, it’s flexible. [Tor Browser is a great way to selectively use Tor to look something up and not leave a trace that you did it. It can also help bypass censorship when you’re on a network where certain sites are blocked. If you want to get more involved, you can volunteer to run your own Tor node, as I do, and support the diversity of the Tor network.]
  • Lee: So that is all stuff that everybody should be doing. What about people who have exceptional threat models, like future intelligence-community whistleblowers, and other people who have nation-state adversaries? Maybe journalists, in some cases, or activists, or people like that? Snowden: So the first answer is that you can’t learn this from a single article. The needs of every individual in a high-risk environment are different. And the capabilities of the adversary are constantly improving. The tooling changes as well. What really matters is to be conscious of the principles of compromise. How can the adversary, in general, gain access to information that is sensitive to you? What kinds of things do you need to protect? Because of course you don’t need to hide everything from the adversary. You don’t need to live a paranoid life, off the grid, in hiding, in the woods in Montana. What we do need to protect are the facts of our activities, our beliefs, and our lives that could be used against us in manners that are contrary to our interests. So when we think about this for whistleblowers, for example, if you witnessed some kind of wrongdoing and you need to reveal this information, and you believe there are people that want to interfere with that, you need to think about how to compartmentalize that.
  • Tell no one who doesn’t need to know. [Lindsay Mills, Snowden’s girlfriend of several years, didn’t know that he had been collecting documents to leak to journalists until she heard about it on the news, like everyone else.] When we talk about whistleblowers and what to do, you want to think about tools for protecting your identity, protecting the existence of the relationship from any type of conventional communication system. You want to use something like SecureDrop, over the Tor network, so there is no connection between the computer that you are using at the time — preferably with a non-persistent operating system like Tails, so you’ve left no forensic trace on the machine you’re using, which hopefully is a disposable machine that you can get rid of afterward, that can’t be found in a raid, that can’t be analyzed or anything like that — so that the only outcome of your operational activities are the stories reported by the journalists. [SecureDrop is a whistleblower submission system. Here is a guide to using The Intercept’s SecureDrop server as safely as possible.]
  • And this is to be sure that whoever has been engaging in this wrongdoing cannot distract from the controversy by pointing to your physical identity. Instead they have to deal with the facts of the controversy rather than the actors that are involved in it. Lee: What about for people who are, like, in a repressive regime and are trying to … Snowden: Use Tor. Lee: Use Tor? Snowden: If you’re not using Tor you’re doing it wrong. Now, there is a counterpoint here where the use of privacy-enhancing technologies in certain areas can actually single you out for additional surveillance through the exercise of repressive measures. This is why it’s so critical for developers who are working on security-enhancing tools to not make their protocols stand out.
  •  
    Lots more in the interview that I didn't highlight. This is a must-read.
Paul Merrell

Court gave NSA broad leeway in surveillance, documents show - The Washington Post - 0 views

  • Virtually no foreign government is off-limits for the National Security Agency, which has been authorized to intercept information “concerning” all but four countries, according to top-secret documents. The United States has long had broad no-spying arrangements with those four countries — Britain, Canada, Australia and New Zealand — in a group known collectively with the United States as the Five Eyes. But a classified 2010 legal certification and other documents indicate the NSA has been given a far more elastic authority than previously known, one that allows it to intercept through U.S. companies not just the communications of its overseas targets but any communications about its targets as well.
  • The certification — approved by the Foreign Intelligence Surveillance Court and included among a set of documents leaked by former NSA contractor Edward Snowden — lists 193 countries that would be of valid interest for U.S. intelligence. The certification also permitted the agency to gather intelligence about entities including the World Bank, the International Monetary Fund, the European Union and the International Atomic Energy Agency. The NSA is not necessarily targeting all the countries or organizations identified in the certification, the affidavits and an accompanying exhibit; it has only been given authority to do so. Still, the privacy implications are far-reaching, civil liberties advocates say, because of the wide spectrum of people who might be engaged in communication about foreign governments and entities and whose communications might be of interest to the United States.
  • That language could allow for surveillance of academics, journalists and human rights researchers. A Swiss academic who has information on the German government’s position in the run-up to an international trade negotiation, for instance, could be targeted if the government has determined there is a foreign-intelligence need for that information. If a U.S. college professor e-mails the Swiss professor’s e-mail address or phone number to a colleague, the American’s e-mail could be collected as well, under the program’s court-approved rules.
  • ...4 more annotations...
  • On Friday, the Office of the Director of National Intelligence released a transparency report stating that in 2013 the government targeted nearly 90,000 foreign individuals or organizations for foreign surveillance under the program. Some tech-industry lawyers say the number is relatively low, considering that several billion people use U.S. e-mail services.
  • Still, some lawmakers are concerned that the potential for intrusions on Americans’ privacy has grown under the 2008 law because the government is intercepting not just communications of its targets but communications about its targets as well. The expansiveness of the foreign-powers certification increases that concern.
  • In a 2011 FISA court opinion, a judge using an NSA-provided sample estimated that the agency could be collecting as many as 46,000 wholly domestic e-mails a year that mentioned a particular target’s e-mail address or phone number, in what is referred to as “about” collection. “When Congress passed Section 702 back in 2008, most members of Congress had no idea that the government was collecting Americans’ communications simply because they contained a particular individual’s contact information,” Sen. Ron Wyden (D-Ore.), who has co-sponsored legislation to narrow “about” collection authority, said in an e-mail to The Washington Post. “If ‘about the target’ collection were limited to genuine national security threats, there would be very little privacy impact. In fact, this collection is much broader than that, and it is scooping up huge amounts of Americans’ wholly domestic communications.”
  • The only reason the court has oversight of the NSA program is that Congress in 2008 gave the government a new authority to gather intelligence from U.S. companies that own the Internet cables running through the United States, former officials noted. Edgar, the former privacy officer at the Office of the Director of National Intelligence, said ultimately he believes the authority should be narrowed. “There are valid privacy concerns with leaving these collection decisions entirely in the executive branch,” he said. “There shouldn’t be broad collection, using this authority, of foreign government information without any meaningful judicial role that defines the limits of what can be collected.”
Gary Edwards

Will Intel let Jen-Hsun Huang spread graphics beyond PCs? » VentureBeat - 0 views

  •  
    Nvidia chief executive Jen-Hsun Huang is on a mission to get graphics chips into everything from handheld computers to smart phones. He expects, for instance, that low-cost Netbooks will become the norm and that gadgets will need to have battery life lasting for days. Holding up an Ion platform, which couples an Intel low-cost Atom processor with an Nvidia integrated graphics chip set, he said his company is looking to determine "what is the soul of the new PC." With Ion, Huang said he is prepared for the future of the computer industry. But first, he has to deal with Intel. Good interview. See interview with Charlie Rose! The Dance of the Sugarplum Documents is about the evolution of the Web document model from a text-typographical/calculation model to one that is visually rich with graphical media streams meshing into traditional text/calc. The thing is, this visual document model is being defined on the edge. The challenge to the traditional desktop document model is coming from the edge, primarily from the WebKit - Chrome - iPhone Community. Jen-Hsun argues on Charlie Rose that desktop computers featured processing power and applications designed to automate typewriter (word processing) and calculator (spreadsheet) functions. The x86 CPU design reflects this orientation. He argues that we are now entering the age of visual computing. A GPU is capable of dramatic increases in processing power because the architecture is geared to the volumes of graphical information being processed. Let the CPU do the traditional stuff, and let the GPU race into the future with the visual processing. That a GPU architecture can scale in parallel is an enormous advantage. But Jen-Hsun does not see the need to try to replicate CPU tasks in a GPU. The best way forward in his opinion is to combine the two!!!
Paul Merrell

American Surveillance Now Threatens American Business - The Atlantic - 0 views

  • What does it look like when a society loses its sense of privacy? In the almost 18 months since the Snowden files first received coverage, writers and critics have had to guess at the answer. Does a certain trend, consumer complaint, or popular product epitomize some larger shift? Is trust in tech companies eroding—or is a subset just especially vocal about it? Polling would make those answers clear, but polling so far has been… confused. A new study, conducted by the Pew Internet Project last January and released last week, helps make the average American’s view of his or her privacy a little clearer. And their confidence in their own privacy is ... low. The study's findings—and the statistics it reports—stagger. Vast majorities of Americans are uncomfortable with how the government uses their data, how private companies use and distribute their data, and what the government does to regulate those companies. No summary can equal a recounting of the findings. Americans are displeased with government surveillance en masse:
  • A new study finds that a vast majority of Americans trust neither the government nor tech companies with their personal data.
  • According to the study, 70 percent of Americans are “at least somewhat concerned” with the government secretly obtaining information they post to social networking sites. Eighty percent of respondents agreed that “Americans should be concerned” with government surveillance of telephones and the web. They are also uncomfortable with how private corporations use their data: Ninety-one percent of Americans believe that “consumers have lost control over how personal information is collected and used by companies,” according to the study. Eighty percent of Americans who use social networks “say they are concerned about third parties like advertisers or businesses accessing the data they share on these sites.” And even though they’re squeamish about the government’s use of data, they want it to regulate tech companies and data brokers more strictly: 64 percent wanted the government to do more to regulate private data collection. Since June 2013, American politicians and corporate leaders have fretted over how much the leaks would cost U.S. businesses abroad.
  • ...3 more annotations...
  • “It’s clear the global community of Internet users doesn’t like to be caught up in the American surveillance dragnet,” Senator Ron Wyden said last month. At the same event, Google chairman Eric Schmidt agreed with him. “What occurred was a loss of trust between America and other countries,” he said, according to the Los Angeles Times. “It's making it very difficult for American firms to do business.” But never mind the world. Americans don’t trust American social networks. More than half of the poll’s respondents said that social networks were “not at all secure.” Only 40 percent of Americans believe email or texting is at least “somewhat” secure. Indeed, Americans most trusted the communication technologies where some protections have been enshrined into the law (though the report didn’t ask about snail mail). That is: Talking on the telephone, whether on a landline or cell phone, is the only kind of communication that a majority of adults believe to be “very secure” or “somewhat secure.”
  • (That may seem a bit incongruous, because making a telephone call is one area where you can be almost sure you are being surveilled: The government has requisitioned mass call records from phone companies since 2001. But Americans appear, when discussing security, to differentiate between the contents of the call and data about it.) Last month, Ramsey Homsany, the general counsel of Dropbox, said that one big thing could take down the California tech scene. “We have built this incredible economic engine in this region of the country,” said Homsany in the Los Angeles Times, “and [mistrust] is the one thing that starts to rot it from the inside out.” According to this poll, the mistrust has already begun corroding—and is already, in fact, well advanced. We’ve always assumed that the great hurt to American business will come globally—that citizens of other nations will stop using tech companies’ services. But the new Pew data shows that Americans suspect American businesses just as much. And while, unlike citizens of other nations, they may not have other places to turn, they may stop putting sensitive or delicate information online.
Paul Merrell

What's Scarier: Terrorism, or Governments Blocking Websites in its Name? - The Intercept - 0 views

  • Forcibly taking down websites deemed to be supportive of terrorism, or criminalizing speech deemed to “advocate” terrorism, is a major trend in both Europe and the West generally. Last month in Brussels, the European Union’s counter-terrorism coordinator issued a memo proclaiming that “Europe is facing an unprecedented, diverse and serious terrorist threat,” and argued that increased state control over the Internet is crucial to combating it. The memo noted that “the EU and its Member States have developed several initiatives related to countering radicalisation and terrorism on the Internet,” yet argued that more must be done. It argued that the focus should be on “working with the main players in the Internet industry [a]s the best way to limit the circulation of terrorist material online.” It specifically hailed the tactics of the U.K. Counter-Terrorism Internet Referral Unit (CTIRU), which has succeeded in causing the removal of large amounts of material it deems “extremist”:
  • In addition to recommending the dissemination of “counter-narratives” by governments, the memo also urged EU member states to “examine the legal and technical possibilities to remove illegal content.” Exploiting terrorism fears to control speech has been a common practice in the West since 9/11, but it is becoming increasingly popular even in countries that have experienced exceedingly few attacks. A new extremist bill advocated by the right-wing Harper government in Canada (also supported by Liberal Party leader Justin Trudeau even as he recognizes its dangers) would create new crimes for “advocating terrorism”; specifically: “every person who, by communicating statements, knowingly advocates or promotes the commission of terrorism offences in general” would be guilty and can be sent to prison for five years for each offense. In justifying the new proposal, the Canadian government admits that “under the current criminal law, it is [already] a crime to counsel or actively encourage others to commit a specific terrorism offence.” This new proposal is about criminalizing ideas and opinions. In the government’s words, it “prohibits the intentional advocacy or promotion of terrorism, knowing or reckless as to whether it would result in terrorism.”
  • If someone argues that continuous Western violence and interference in the Muslim world for decades justifies violence being returned to the West, or even advocates that governments arm various insurgents considered by some to be “terrorists,” such speech could easily be viewed as constituting a crime. To calm concerns, Canadian authorities point out that “the proposed new offence is similar to one recently enacted by Australia, that prohibits advocating a terrorist act or the commission of a terrorism offence, all while being reckless as to whether another person will engage in this kind of activity.” Indeed, Australia enacted a new law late last year that indisputably targets political speech and ideas, as well as criminalizing journalism considered threatening by the government. Punishing people for their speech deemed extremist or dangerous has been a vibrant practice in both the U.K. and U.S. for some time now, as I detailed (coincidentally) just a couple days before free speech marches broke out in the West after the Charlie Hebdo attacks. Those criminalization-of-speech attacks overwhelmingly target Muslims, and have resulted in the punishment of such classic free speech activities as posting anti-war commentary on Facebook, tweeting links to “extremist” videos, translating and posting “radicalizing” videos to the Internet, writing scholarly articles in defense of Palestinian groups and expressing harsh criticism of Israel, and even including a Hezbollah channel in a cable package.
  • Beyond the technical issues, trying to legislate ideas out of existence is a fool’s game: those sufficiently determined will always find ways to make themselves heard. Indeed, as U.S. pop star Barbra Streisand famously learned, attempts to suppress ideas usually result in the greatest publicity possible for their advocates and/or elevate them by turning fringe ideas into martyrs for free speech (I have zero doubt that all five of the targeted sites enjoyed among their highest traffic dates ever today as a result of the French targeting). But the comical futility of these efforts is exceeded by their profound dangers. Who wants governments to be able to unilaterally block websites? Isn’t the exercise of this website-blocking power what has long been cited as reasons we should regard the Bad Countries — such as China and Iran — as tyrannies (which also usually cite “counterterrorism” to justify their censorship efforts)?
  • As those and countless other examples prove, the concepts of “extremism” and “radicalizing” (like “terrorism” itself) are incredibly vague and elastic, and in the hands of those who wield power, almost always expand far beyond what you think it should mean (plotting to blow up innocent people) to mean: anyone who disseminates ideas that are threatening to the exercise of our power. That’s why powers justified in the name of combating “radicalism” or “extremism” are invariably — not often or usually, but invariably — applied to activists, dissidents, protesters and those who challenge prevailing orthodoxies and power centers. My arguments for distrusting governments to exercise powers of censorship are set forth here (in the context of a prior attempt by a different French minister to control the content of Twitter). In sum, far more damage has been inflicted historically by efforts to censor and criminalize political ideas than by the kind of “terrorism” these governments are invoking to justify these censorship powers. And whatever else may be true, few things are more inimical to, or threatening of, Internet freedom than allowing functionaries inside governments to unilaterally block websites from functioning on the ground that the ideas those sites advocate are objectionable or “dangerous.” That’s every bit as true when the censors are in Paris, London, Ottawa, and Washington as when they are in Tehran, Moscow or Beijing.
Paul Merrell

WikiLeaks' Julian Assange warns: Google is not what it seems - 0 views

  • Back in 2011, Julian Assange met up with Eric Schmidt for an interview that he considers the best he’s ever given. That doesn’t change, however, the opinion he now has about Schmidt and the company he represents, Google. In fact, the WikiLeaks leader doesn’t believe in the famous “Don’t Be Evil” mantra that Google has been preaching for years. Assange thinks both Schmidt and Google are at the exact opposite end of the spectrum. “Nobody wants to acknowledge that Google has grown big and bad. But it has. Schmidt’s tenure as CEO saw Google integrate with the shadiest of US power structures as it expanded into a geographically invasive megacorporation. But Google has always been comfortable with this proximity,” Assange writes in an opinion piece for Newsweek.
  • “Long before company founders Larry Page and Sergey Brin hired Schmidt in 2001, their initial research upon which Google was based had been partly funded by the Defense Advanced Research Projects Agency (DARPA). And even as Schmidt’s Google developed an image as the overly friendly giant of global tech, it was building a close relationship with the intelligence community,” Assange continues. Throughout the lengthy article, Assange goes on to explain how the 2011 meeting came to be and talks about the people the Google executive chairman brought along: Lisa Shields, then vice president of the Council on Foreign Relations, Jared Cohen, who would later become the director of Google Ideas, and Scott Malcomson, the book’s editor, who would later become the speechwriter and principal advisor to Susan Rice. “At this point, the delegation was one part Google, three parts US foreign-policy establishment, but I was still none the wiser.” Assange goes on to explain the work Cohen was doing for the government prior to his appointment at Google and just how Schmidt himself plays a bigger role than previously thought. In fact, he says that his original image of Schmidt, as a politically unambitious Silicon Valley engineer, “a relic of the good old days of computer science graduate culture on the West Coast,” was wrong.
  • However, Assange concedes that that is not the sort of person who attends Bilderberg conferences, who regularly visits the White House, and who delivers speeches at the Davos Economic Forum. He claims that Schmidt’s emergence as Google’s “foreign minister” did not come out of nowhere, but it was “presaged by years of assimilation within US establishment networks of reputation and influence.” Assange makes further accusations that, well before Prism had even been dreamed of, the NSA was already systematically violating the Foreign Intelligence Surveillance Act under its director at the time, Michael Hayden. He states, however, that during the same period, namely around 2003, Google was accepting NSA money to provide the agency with search tools for its rapidly growing database of information. Assange continues by saying that in 2008, Google helped launch the NGA spy satellite, the GeoEye-1, into space and that the search giant shares the photographs from the satellite with the US military and intelligence communities. Later, in 2010, after the Chinese government was accused of hacking Google, the company entered into a “formal information-sharing” relationship with the NSA, which would allow the NSA’s experts to evaluate the vulnerabilities in Google’s hardware and software.
  • “Around the same time, Google was becoming involved in a program known as the ‘Enduring Security Framework’ (ESF), which entailed the sharing of information between Silicon Valley tech companies and Pentagon-affiliated agencies at network speed.” “Emails obtained in 2014 under Freedom of Information requests show Schmidt and his fellow Googler Sergey Brin corresponding on first-name terms with NSA chief General Keith Alexander about ESF,” Assange writes. Assange backs his statements with links throughout, which readers can check for themselves.
  •  
    The "opinion piece for Newsweek" is an excerpt from Assange's new book, When Google Met WikiLeaks. The chapter is well worth the read. http://www.newsweek.com/assange-google-not-what-it-seems-279447
Paul Merrell

Obama administration opts not to force firms to decrypt data - for now - The Washington... - 1 views

  • After months of deliberation, the Obama administration has made a long-awaited decision on the thorny issue of how to deal with encrypted communications: It will not — for now — call for legislation requiring companies to decode messages for law enforcement. Rather, the administration will continue trying to persuade companies that have moved to encrypt their customers’ data to create a way for the government to still peer into people’s data when needed for criminal or terrorism investigations. “The administration has decided not to seek a legislative remedy now, but it makes sense to continue the conversations with industry,” FBI Director James B. Comey said at a Senate hearing Thursday of the Homeland Security and Governmental Affairs Committee.
  • The decision, which essentially maintains the status quo, underscores the bind the administration is in — balancing competing pressures to help law enforcement and protect consumer privacy. The FBI says it is facing an increasing challenge posed by the encryption of communications of criminals, terrorists and spies. A growing number of companies have begun to offer encryption in which the only people who can read a message, for instance, are the person who sent it and the person who received it. Or, in the case of a device, only the device owner has access to the data. In such cases, the companies themselves lack “backdoors” or keys to decrypt the data for government investigators, even when served with search warrants or intercept orders.
  • The decision was made at a Cabinet meeting Oct. 1. “As the president has said, the United States will work to ensure that malicious actors can be held to account — without weakening our commitment to strong encryption,” National Security Council spokesman Mark Stroh said. “As part of those efforts, we are actively engaged with private companies to ensure they understand the public safety and national security risks that result from malicious actors’ use of their encrypted products and services.” But privacy advocates are concerned that the administration’s definition of strong encryption also could include a system in which a company holds a decryption key or can retrieve unencrypted communications from its servers for law enforcement. “The government should not erode the security of our devices or applications, pressure companies to keep and allow government access to our data, mandate implementation of vulnerabilities or backdoors into products, or have disproportionate access to the keys to private data,” said Savecrypto.org, a coalition of industry and privacy groups that has launched a campaign to petition the Obama administration.
  • To Amie Stepanovich, the U.S. policy manager for Access, one of the groups signing the petition, the status quo isn’t good enough. “It’s really crucial that even if the government is not pursuing legislation, it’s also not pursuing policies that will weaken security through other methods,” she said. The FBI and Justice Department have been talking with tech companies for months. On Thursday, Comey said the conversations have been “increasingly productive.” He added: “People have stripped out a lot of the venom.” He said the tech executives “are all people who care about the safety of America and also care about privacy and civil liberties.” Comey said the issue afflicts not just federal law enforcement but also state and local agencies investigating child kidnappings and car crashes — “cops and sheriffs . . . [who are] increasingly encountering devices they can’t open with a search warrant.”
  • One senior administration official said the administration thinks it’s making enough progress with companies that seeking legislation now is unnecessary. “We feel optimistic,” said the official, who spoke on the condition of anonymity to describe internal discussions. “We don’t think it’s a lost cause at this point.” Legislation, said Rep. Adam Schiff (D-Calif.), is not a realistic option given the current political climate. He said he made a recent trip to Silicon Valley to talk to Twitter, Facebook and Google. “They quite uniformly are opposed to any mandate or pressure — and more than that, they don’t want to be asked to come up with a solution,” Schiff said. Law enforcement officials know that legislation is a tough sell now. But, one senior official stressed, “it’s still going to be in the mix.” On the other side of the debate, technology, diplomatic and commerce agencies were pressing for an outright statement by Obama to disavow a legislative mandate on companies. But their position did not prevail.
  • Daniel Castro, vice president of the Information Technology & Innovation Foundation, said absent any new laws, either in the United States or abroad, “companies are in the driver’s seat.” He said that if another country tried to require companies to retain an ability to decrypt communications, “I suspect many tech companies would try to pull out.”
  •  
    # ! upcoming Elections...
Gary Edwards

Microsoft's Next Big Thing; Rich MS Client / MS Cloud of Servers - 0 views

  •  
    CIO Magazine has an extensive interview with Craig Mundie, the man responsible for nailing down the next generation of monopolist profits: "You talk about technology waves. What will be the next big wave? What happens in waves is the shift from one generation of computing platform to the next. That platform gets established by a small number of killer apps. We've been through a number of these major platform shifts, from the mainframe to the minicomputer to the personal computer to adding the Internet as an adjunct platform. We're now trending to the next big platform, which I call "the client plus the cloud."

    That's one thing, not two things. Today, we've got a broadening out of what people call the client. My 16 years here was in large measure about that. And then we introduced the network. The Internet was a place where you had Web content and Web publishing, but other than being delivered on some of those clients, the two things were somewhat divorced.

    The next thing that will emerge is an architecture that allows the application developer to think of the cloud plus the client architecturally as a single thing. In a sense, it is like client/sever computing in the enterprise. It was the homogeneity that existed between some of the facilities at the server and the client end that allowed people to build those applications. We've never had that kind of architectural homogeneity in this cloud-plus-client or Internet-plus-smart-devices world, and I'm predicting that will be the next big thing.
Gary Edwards

What Oracle Sees in Sun Microsystems | NewsFactor Network - 0 views

  • Citigroup's Thill estimates Oracle could cut between 40 percent and 70 percent of Sun's roughly 33,000 employees. Excluding restructuring costs, Oracle expects Sun to add $1.5 billion in profit during the first year after the acquisition closes this summer, and another $2 billion the following year. Oracle executives declined to say how many jobs would be eliminated.
  •  
    Good article from Aaron Ricadela. The focus is on Java, Sun's hardware-server business, and Oracle's business objectives. No mention of OpenOffice or ODF though. There is, however, an interesting quote from IBM regarding the battle between Java and Microsoft .NET. Also, no mention of an OpenOffice-Java Foundation that would truly open source these technologies.

    When we were involved with the Massachusetts Pilot Study and ODF Plug-in proposals, IBM and Oracle lead the effort to open source the da Vinci plug-in. They put together a group of vendors known as "the benefactors", with the objective of completing work on da Vinci while forming a patent pool - open source foundation for all OpenOffice and da Vinci source. This idea was based on the Eclipse model.

    One of the more interesting ideas coming out of the IBM-Oracle led "benefactors", was the idea of breaking OpenOffice into components that could then be re-purposed by the Eclipse community of developers. The da Vinci plug-in was to be the integration bridge between Eclipse and the Microsoft Office productivity environment. Very cool. And no doubt IBM and Oracle were in synch on this in 2006. The problem was that they couldn't convince Sun to go along with the plan.

    Sun of course owned both Java and OpenOffice, and thought they could build a better ODF plug-in for OpenOffice (and own that too). A year later, Sun actually did produce an ODF plug-in for MSOffice. It was sent to Massachusetts on July 3rd, 2007, and tested against the same set of 150 critical documents da Vinci had to successfully convert without breaking. The next day, July 4th, Massachusetts announced their decision that they would approve the use of both ODF and OOXML! The much-hoped-for exclusive ODF requirement failed in Massachusetts exactly because Sun insisted on their way or the highway.

    Let's hope Oracle can right the ship and get OpenOffice-ODF-Java back on track.

Paul Merrell

Leaked docs show spyware used to snoop on US computers | Ars Technica - 0 views

  • Software created by the controversial UK-based Gamma Group International was used to spy on computers that appear to be located in the United States, the UK, Germany, Russia, Iran, and Bahrain, according to a leaked trove of documents analyzed by ProPublica. It's not clear whether the surveillance was conducted by governments or private entities. Customer e-mail addresses in the collection appeared to belong to a German surveillance company, an independent consultant in Dubai, the Bosnian and Hungarian Intelligence services, a Dutch law enforcement officer, and the Qatari government.
  • The leaked files—which were posted online by hackers—are the latest in a series of revelations about how state actors including repressive regimes have used Gamma's software to spy on dissidents, journalists, and activist groups. The documents, leaked last Saturday, could not be readily verified, but experts told ProPublica they believed them to be genuine. "I think it's highly unlikely that it's a fake," said Morgan Marquis-Boire, a security researcher who, while at The Citizen Lab at the University of Toronto, had analyzed Gamma Group's software and who authored an article about the leak on Thursday. The documents confirm many details that have already been reported about Gamma, such as that its tools were used to spy on Bahraini activists. Some documents in the trove contain metadata tied to e-mail addresses of several Gamma employees. Bill Marczak, another Gamma Group expert at the Citizen Lab, said that several dates in the documents correspond to publicly known events—such as the day that a particular Bahraini activist was hacked.
  • The leaked files contain more than 40 gigabytes of confidential technical material, including software code, internal memos, strategy reports, and user guides for the Gamma Group software suite called FinFisher. FinFisher enables customers to monitor secure Web traffic, Skype calls, webcams, and personal files. It is installed as malware on targets' computers and cell phones. A price list included in the trove lists a license of the software at almost $4 million. The documents reveal that Gamma uses technology from a French company called Vupen Security that sells so-called computer "exploits." Exploits include techniques called "zero days" for "popular software like Microsoft Office, Internet Explorer, Adobe Acrobat Reader, and many more." Zero days are exploits that have not yet been detected by the software maker and therefore are not blocked.
  • Many of Gamma's product brochures have previously been published by the Wall Street Journal and Wikileaks, but the latest trove shows how the products are getting more sophisticated. In one document, engineers at Gamma tested a product called FinSpy, which inserts malware onto a user's machine, and found that it could not be blocked by most antivirus software. Documents also reveal that Gamma had been working to bypass encryption tools including a mobile phone encryption app, Silent Circle, and were able to bypass the protection given by hard-drive encryption products TrueCrypt and Microsoft's Bitlocker.
  • The documents also describe a "country-wide" surveillance product called FinFly ISP which promises customers the ability to intercept Internet traffic and masquerade as ordinary websites in order to install malware on a target's computer. The most recent date-stamp found in the documents is August 2, coinciding with the first tweet by a parody Twitter account, @GammaGroupPR, which first announced the hack and may be run by the hacker or hackers responsible for the leak. On Reddit, a user called PhineasFisher claimed responsibility for the leak. "Two years ago their software was found being widely used by governments in the middle east, especially Bahrain, to hack and spy on the computers and phones of journalists and dissidents," the user wrote. The name on the @GammaGroupPR Twitter account is also "Phineas Fisher." GammaGroup, the surveillance company whose documents were released, is no stranger to the spotlight. The security firm F-Secure first reported the purchase of FinFisher software by the Egyptian State Security agency in 2011. In 2012, Bloomberg News and The Citizen Lab showed how the company's malware was used to target activists in Bahrain. In 2013, the software company Mozilla sent a cease-and-desist letter to the company after a report by The Citizen Lab showed that a spyware-infected version of the Firefox browser manufactured by Gamma was being used to spy on Malaysian activists.
Paul Merrell

The Newest Reforms on SIGINT Collection Still Leave Loopholes | Just Security - 0 views

  • Director of National Intelligence James Clapper this morning released a report detailing new rules aimed at reforming the way signals intelligence is collected and stored by certain members of the United States Intelligence Community (IC). The long-awaited changes follow up on an order announced by President Obama one year ago that laid out the White House’s principles governing the collection of signals intelligence. That order, commonly known as PPD-28, purports to place limits on the use of data collected in bulk and to increase privacy protections related to the data collected, regardless of nationality. Accordingly, most of the changes presented as “new” by Clapper’s office (ODNI) stem directly from the guidance provided in PPD-28, and so aren’t truly new. And of the biggest changes outlined in the report, there are still large exceptions that appear to allow the government to escape the restrictions with relative ease. Here’s a quick rundown.
  • Retention policy for non-U.S. persons. The new rules say that the IC must now delete information about “non-U.S. persons” that’s been gathered via signals intelligence after five years. However, there is a loophole that will let spies hold onto that information indefinitely whenever the Director of National Intelligence determines (after considering the views of the ODNI’s Civil Liberties Protection Officer) that retaining information is in the interest of national security. The new rules don’t say whether the exceptions will be directed at entire groups of people or individual surveillance targets.  Section 215 metadata. Updates to the rules concerning the use of data collected under Section 215 of the Patriot Act include the requirement that the Foreign Intelligence Surveillance Court (rather than authorized NSA officials) must determine that spies have “reasonable, articulable suspicion” prior to querying Section 215 data, outside of emergency circumstances. What qualifies as an emergency for these purposes? We don’t know. Additionally, the IC is now limited to two “hops” in querying the database. This means that spies can only play two degrees of Kevin Bacon, instead of the previously allowed three degrees, with the contacts of anyone targeted under Section 215. The report doesn’t explain what would prevent the NSA (or other agency using the 215 databases) from getting around this limit by redesignating a phone number found in the first or second hop as a new “target,” thereby allowing the agency to continue the contact chain.
  • National security letters (NSLs). The report also states that the FBI’s gag orders related to NSLs expire three years after the opening of a full-blown investigation or three years after an investigation’s close, whichever is earlier. However, these expiration dates can be easily overridden by an FBI Special Agent in Charge or a Deputy Assistant FBI Director who finds that the statutory standards for secrecy about the NSL continue to be satisfied (which at least one court has said isn’t a very high bar). This exception also doesn’t address concerns that NSL gag orders lack adequate due process protections, lack basic judicial oversight, and may violate the First Amendment.
  • The report also details the ODNI’s and IC’s plans for the future, including: (1) Working with Congress to reauthorize bulk collection under Section 215. (2) Updating agency guidelines under Executive Order 12333 “to protect the privacy and civil liberties of U.S. persons.” (3) Producing another annual report in January 2016 on the IC’s progress in implementing signals intelligence reforms. These plans raise more questions than they answer. Given the considerable doubts about Section 215’s effectiveness, why is the ODNI pushing for its reauthorization? And what will the ODNI consider appropriate privacy protections under Executive Order 12333?
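  •  
    The two-"hop" limit described in the annotations above is, in effect, a bounded breadth-first traversal of a call-record graph: start from a target's number, collect everyone it called, then everyone those numbers called, and stop. A minimal sketch of that idea in Python (hypothetical data and function names, not any agency's actual query logic):

```python
from collections import deque

def contact_chain(call_records, seed, max_hops=2):
    """Return every number reachable from `seed` within `max_hops`
    hops over a set of (caller, callee) call records."""
    # Build an undirected adjacency map: a call links both parties.
    contacts = {}
    for a, b in call_records:
        contacts.setdefault(a, set()).add(b)
        contacts.setdefault(b, set()).add(a)

    reached = {seed: 0}          # number -> hop distance from seed
    queue = deque([seed])
    while queue:
        number = queue.popleft()
        if reached[number] == max_hops:
            continue             # do not expand beyond the hop limit
        for peer in contacts.get(number, ()):
            if peer not in reached:
                reached[peer] = reached[number] + 1
                queue.append(peer)
    return {n for n, hops in reached.items() if hops > 0}

records = [("555-0001", "555-0002"),   # hop 1 from the seed
           ("555-0002", "555-0003"),   # hop 2
           ("555-0003", "555-0004")]   # hop 3 -- excluded at max_hops=2
print(sorted(contact_chain(records, "555-0001")))
# → ['555-0002', '555-0003']
```

    The loophole the report's authors flag is visible here: calling `contact_chain(records, "555-0003")` with a hop-2 number redesignated as a fresh seed restarts the count and pulls in numbers the original query could not reach.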