
Home/ Open Web/ Group items tagged technical


Paul Merrell

'Let's Encrypt' Project Strives To Make Encryption Simple - Slashdot - 0 views

  •  
    The blurb is a bit misleading. This is a project that's been under way since last year; what's new is that it's moving under the Linux Foundation umbrella for various non-technical support purposes. By sometime this summer, encrypting web site data and broadcasting it over https is slated to become a two-click process. Or on the Linux command line:
    $ sudo apt-get install lets-encrypt
    $ lets-encrypt example.com
    This is a project that grew out of public disgust with NSA surveillance, designed to flood the NSA (and other bad actors) with so much encrypted data that they will be able to decrypt only a tiny fraction (decryption without the decryption key takes gobs of computer cycles). The other half of the solution is already available: the HTTPS Everywhere extension for the Chrome, Firefox, and Opera web browsers, from the Electronic Frontier Foundation and the Tor Project, which turns every request for an http address into an attempt to connect to an https address first, establishing an http connection only if https is not available. HTTPS Everywhere is fast and does not noticeably add to your page loading time. If you'd like to effortlessly improve your online security and help burden the NSA, install HTTPS Everywhere. Get it at https://www.eff.org/https-everywhere
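    The upgrade-to-https behavior HTTPS Everywhere performs can be sketched in a few lines. This is a minimal illustration, not EFF's implementation; the function name is mine:

```python
from urllib.parse import urlsplit, urlunsplit

def prefer_https(url: str) -> str:
    """Rewrite an http:// URL to https://, mirroring how HTTPS
    Everywhere tries the encrypted endpoint before any plain-http
    fallback. Non-http schemes are returned unchanged."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(prefer_https("http://example.com/page"))  # https://example.com/page
```

    A real client would then attempt a connection to the rewritten URL and fall back to plain http only if the https connection cannot be established.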
Gary Edwards

IBM Declares War on Standards Process | BNET - 0 views

  •  
    Is this the wrath of Sutor?  Last week he threw the Linux desktop under the bus.  And now this?  No wonder @rcweir  has shut down the slime machine. excerpt:  IBM has essentially declared war on the technical standards process. The words seem supportive enough, but when you read between them, you can see that a systematic round of arm-twisting is likely about to happen. Anyone who knows tech is used to the dance that standard setting is. Experts from various companies serve on the committees, each with a sense of what ought to work "best" and trying to wrangle decisions in their direction. This is competitive dominance in action and the stakes are high. That's what makes Big Blue's more-collegial-than-thou stance so amusing, with the following "tenets":
Gary Edwards

oEmbed: How New Twitter Could Help Combine Content From Different Sites - 0 views

  •  
    transclusion of hypertext documents. Transclusion is technically defined as "when you put that one thing in that other thing". In its current implementation, Twitter has declared that media shown within the Twitter interface comes from selected partners. But actually, the technology to allow embedding of rich media from almost any site already exists, using a system called oEmbed. Geeky stuff, but it's made by nice people who are pretty smart, and it lets any site say, "Hey, if you want to put our thing in your thing, do it like this". It works. Lots of sites do it. Nobody's getting rich off of it, but nobody's getting sued, and in between those two extremes lies most of what makes the Web great.
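    A rough sketch of the consumer side of oEmbed: per the spec, a consumer sends a GET request to the provider's endpoint with the content URL (plus optional constraints like maxwidth) and gets back JSON describing the embeddable resource. The endpoint below is hypothetical; real providers publish their own or advertise them via discovery link tags:

```python
from urllib.parse import urlencode

# Hypothetical provider endpoint, for illustration only.
OEMBED_ENDPOINT = "https://provider.example/oembed"

def oembed_request_url(content_url, maxwidth=None):
    """Build an oEmbed consumer request URL:
    GET endpoint?url=<content>&format=json[&maxwidth=N]."""
    params = {"url": content_url, "format": "json"}
    if maxwidth is not None:
        params["maxwidth"] = maxwidth
    return OEMBED_ENDPOINT + "?" + urlencode(params)

print(oembed_request_url("https://example.com/videos/1", maxwidth=640))
```

    The provider's JSON response carries a `type` field (e.g. "video", "photo", "rich") and, for rich/video types, an `html` field containing the markup to transclude.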
Gary Edwards

GSA picks Google Apps: What it means | ZDNet - 0 views

  •  
    The General Services Administration made a bold decision to move its email and collaboration systems to the cloud. This is a huge win for cloud computing, but perhaps should have been expected, since last week the Feds announced a new requisition and purchase mandate that cloud computing must be the FIRST consideration for federal agency purchases. Note that the General Services Administration oversees requisitions and purchases for all federal agencies! This is huge, estimated to be worth $8 billion to cloud-computing providers. The cloud-computing market is estimated at $30 billion, but Gartner did not anticipate that federal agencies would embrace cloud computing, let alone issue a mandate for it. In the RFP issued last June, it was easy to see their goals in the statement of objectives: This Statement of Objectives (SOO) describes the goals that GSA expects to achieve with regard to the 1. modernization of its e-mail system; 2. provision of an effective collaborative working environment; 3. reduction of the government's in-house system maintenance burden by providing related business, technical, and management functions; and 4. application of appropriate security and privacy safeguards. GSA announced yesterday that it chose Google Apps for email and collaboration and Unisys as the implementation partner. So what does this mean? What it means (WIM) #1: GSA employees will be using a next-generation information workplace. And that means mobile, device-agnostic, and location-agile. Gmail on an iPad? No problem. Email from a home computer? Yep. For GSA and for every other agency and most companies, it's important to give employees the tools to be productive and engage from every location on every device. "Work becomes a thing you do and not a place you go." [Thanks to Earl Newsome of Estee Lauder for that quote.] WIM #2: GSA will save 50% of the cost of email over five years. This is also what our research on the cost of email o
Gary Edwards

Diary Of An x264 Developer » Flash, Google, VP8, and the future of internet v... - 0 views

  •  
    In-depth technical discussion of Flash, HTML5, H.264, and Google's VP8. Excellent. Read the comments. Bottom line - Google has the juice to put Flash and H.264 in the dirt. The YouTube acquisition turns out to be very strategic. excerpt: The internet has been filled for quite some time with an enormous number of blog posts complaining about how Flash sucks - so much that it's sounding as if the entire internet is crying wolf. But, of course, despite the incessant complaining, they're right: Flash has terrible performance on anything other than Windows x86, and Adobe doesn't seem to care at all. But rather than repeat this ad nauseam, let's be a bit more intellectual and try to figure out what happened. Flash became popular because of its power and flexibility. At the time it was the only option for animated vector graphics and interactive content (stuff like VRML hardly counts). Furthermore, before Flash, the primary video options were Windows Media, Real, and QuickTime: all of which were proprietary, had no free software encoders or decoders, and (except for Windows Media) required the user to install a clunky external application, not merely a plugin. Given all this, it's clear why Flash won: it supported open multimedia formats like H.263 and MP3, used an ultra-simple container format that anyone could write (FLV), and worked far more easily and reliably than any alternative. Thus, Adobe (actually, at the time, Macromedia) got their 98% install base. And with that, they began to become complacent. Any suggestion of a competitor was immediately shrugged off; how could anyone possibly compete with Adobe, given their install base? It'd be insane, nobody would be able to do it. They committed the cardinal sin of software development: believing that a competitor being better is excusable. At x264, if we find a competitor that does something better, we immediately look into trying to put ourselves back on top. This is why
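    The post credits part of Flash's win to FLV being "an ultra-simple container format that anyone could write". A sketch of parsing the 9-byte FLV file header shows how little there is to it (field layout per the published FLV spec: a 3-byte "FLV" signature, a version byte, a flags byte for audio/video presence, and a big-endian 32-bit header-size offset):

```python
import struct

def parse_flv_header(data: bytes) -> dict:
    """Parse the 9-byte FLV file header."""
    if len(data) < 9 or data[:3] != b"FLV":
        raise ValueError("not an FLV stream")
    version = data[3]
    flags = data[4]                      # bit 2: audio, bit 0: video
    (data_offset,) = struct.unpack(">I", data[5:9])
    return {
        "version": version,
        "has_audio": bool(flags & 0x04),
        "has_video": bool(flags & 0x01),
        "data_offset": data_offset,      # header size; 9 for FLV 1
    }

# A minimal header: FLV version 1, audio + video, offset 9.
print(parse_flv_header(b"FLV\x01\x05\x00\x00\x00\x09"))
```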
Gary Edwards

Munich administration switches to OpenDocument Format - The H Open Source: News and Fea... - 0 views

  •  
    wow. Six years, and all they have migrated are 2,500 out of 14,000 desktops! The curse of the Microsoft Productivity Environment strikes again, as legacy workgroups, workflows and the mesh of compound documents that drive them prove to be very stubborn. The funny thing is that, as Munich struggles with this 1995-level desktop transition, Microsoft is preparing to move those very same legacy productivity environments to a proprietary Web Productivity Platform. I wonder what Munich's Web plans are? excerpt: Schießl says the transition required enormous background effort which involved eliminating many IT dependencies created by individual vendors over the years. More than 20,000 templates had to be consolidated and converted into new templates, macros or web applications. Most templates and text blocks are now managed via the WollMux program, which was released in 2008. Schießl said that the developers also had to adapt a number of corporate applications such as SAP for use with ODF. According to the review, another achievement in 2009 was the establishment of Linux client pilot areas as a step towards the final aim of migrating all twelve of the city administration's departments to Linux. Schießl says this was the last fundamental step required to enable general client migration in the coming years. Although only 2,500 of around 14,000 workstations have been converted to the custom-built basic LiMux client, the hardest part was to get them all up and running, which required going over inconsistent IT infrastructures that had developed over the years and training the IT staff for the technical switch. As Robert Pogson observes in his blog, six and a half years after the decision was made to switch to free software, the Munich Linux pioneers have completed about 80 per cent of the project's total workload.
Gary Edwards

EU settlement will alter Microsoft's stance on interoperability -- Government Computer ... - 0 views

  •  
    EU settlement will alter Microsoft's stance on interoperability By Kurt Mackie Dec 21, 2009 Microsoft provided more details about its settlement with the European Commission (EC), particularly with regard to interoperability agreements. In a blog post on Thursday, Dave Heiner, Microsoft's vice president and deputy general counsel, claimed that the company has pledged to implement a threefold approach to interoperability that EC Commissioner Neelie Kroes outlined in past speeches. Heiner summarized that approach: companies should disclose technical information, provide a remedy if the information is inadequate and charge equitable royalty rates for associated intellectual property. Kroes had also specifically called for companies to follow open standards as one of the best ways to achieve interoperability. However, Heiner omitted the word, "open," from his comment. He said that "products from different firms can work well together when they implement common, well-designed industry standards." Microsoft's interoperability pledge announced this week appears to continue ideas the company put forth in February 2008. At that time, the company announced broad interoperability principles as well as APIs for software developers working with Microsoft's mainline products, including Windows client and server operating systems, Exchange, Office and SharePoint, among others. Microsoft has been releasing documentation for that purpose, with "hundreds of Microsoft developers" devoted to the effort, according to Heiner. The new elements to Microsoft's interoperability pledge appear to be warranty and patent-sharing templates. Those documents, and more, can be accessed at the end of a statement about the settlement by Brad Smith, Microsoft's senior vice president and general counsel.
Gary Edwards

Is Oracle Quietly Killing OpenOffice? | Revelations From An Unwashed Brain - 1 views

  •  
    Bingo! Took five years, but finally someone gets it: excerpt: Great question. After 10 years, OpenOffice hasn't had much traction in the enterprise - supported by under 10% of firms - and today it's facing more competition from online apps from Google and Zoho. I'm not counting OpenOffice completely out yet, however, since IBM has been making good progress on features with Symphony and Oracle is positioning OpenOffice for the web, desktop and mobile - a first. But barriers to OpenOffice and Web-based tools persist, and not just on a feature/function basis. Common barriers include: Third-party integration requirements. Some applications only work with Office. For example, one financial services firm I spoke with was forced to retain Office because its employees needed to work with Fiserv, a proprietary data center that is very Microsoft-centric. "What was working pretty well was karate chopped." Another firm rolled out OpenOffice.org to 7,000 users and had to revert 5,000 of them when they discovered one of the main apps they work with only supported Microsoft. User acceptance. Many firms say that they can overcome pretty much all of the technical issues but face challenges around user acceptance. One firm I spoke with went so far as to "customize" their OpenOffice solution with a Microsoft logo and told employees it was a version of Office. The implementation went smoothly. Others have said that they have met resistance from business users who didn't want Office taken off their desktop. Other strategies include providing OpenOffice only to new employees and transitioning through attrition. But this can cause compatibility issues. Lack of seamless interoperability with Office. Just as third-party apps may only work with Office, many collaborative activities force use of particular versions of Office. Today's Web-based and OpenOffice solutions do not provide seamless round tripping between Office and their applications. Corel, with its
Gary Edwards

Adeptol Viewing Technology Features - 0 views

  •  
    Adeptol's pitch: no ActiveX, no plug-ins, no software to download - any OS, any browser, any programming language. Document support: more than 300 document types supported out of the box. Not a virtual printer, but a multitenant platform for high-end document viewing, with no additional software to install on the server and no plugins, ActiveX controls, or applets to download on the client side. Fully customizable: an advanced API offers full customization and UI changes. Adeptol can help you retain your customers and streamline your content integration efforts. Leverage Web 2.0 technologies to get a completely scalable content viewer that easily handles any type of content in virtually unlimited volume, with additional capabilities to support high-volume transaction and archive environments. Our enterprise-class infrastructure was built to meet the needs of the world's most demanding global enterprises. Based on AJAX technology, the viewer can be integrated into your application with complete ease. Support for all server platforms: can be installed on Windows (32-bit/64-bit) and Linux (32-bit/64-bit) servers. Integrate with any programming language: whether you work in .NET, C#, PHP, ColdFusion, or JSP, the Adeptol Viewer can be integrated easily using the API set, and it comes with sample code for all languages to get you started. Compatibility with more than 99% of browsers: tested and verified for compatibility with 99% of the various browsers on different platforms. More than 300 Document T
Gary Edwards

The Cloud Rises to Top of 2011 CIO Priorities - 0 views

  •  
    For those who needed just a little more proof, Gartner has confirmed that cloud computing is hot. In fact, cloud technology is so hot that it topped Gartner's 2011 CIO Agenda. Even if you managed to resist the siren of prime-time TV beckoning, "To the cloud!", it's unlikely you'll want to ignore the voices of over 2,000 CIOs. The Cloud: Gartner released its survey of 2,014 CIOs representing more than US$160 billion in spending across 50 countries and 38 industries. Cloud computing, virtualization and mobile led the list of technical priorities.
Gary Edwards

The Fastest Way to a Drupal Site, from Acquia - Business White Papers, Webcasts and Cas... - 2 views

  •  
    This white paper will give you a small planning toolset and a few tips to help get you oriented. I'd like you to get a Drupal 7 website planned and online today, and the fastest way to get there is Drupal Gardens (drupalgardens.com). It saves you from the technical overhead and lets you dive into how Drupal works, exploring the administrative interface and learning how to organize content. Download - 10 pages.  How to set up Drupal 7, May 2011.  Acquia
Gary Edwards

MSOffice Finally Embraces SharePoint in 2010 - 0 views

  •  
    Review of SharePoint-Office 2010 integration. MSOffice access to and integration with SharePoint "content" has been improved and expanded. Templates, forms and reports have been moved to the SharePoint center. Or should I say "the many possible SharePoint centers". There is also an interesting integration of the tagging system, with smart tags becoming Bing-enriched. Good-bye Google. Good-bye RDF - RDFa. excerpt: Many enterprises buy into Microsoft's integrated vision of collaboration because they assume their products work well together. While in the case of SharePoint and Office this is technically true (more so with the 2007 versions), many of these integrations are either not used by or not visible to the average Office user. However, integrations introduced in SharePoint and Office 2010 may change this perception because they are exposed in menus and dialogs used by nearly every Office user. Perhaps supporting SharePoint Online and having SharePoint provide the basis of Office Live Workspaces is encouraging the SharePoint and Office planners to take a more user-centric perspective. In any case, here are some of the new integrations between SharePoint and Office 2010 that you should look for.
Paul Merrell

OASIS Protects Open Source Developers From Software Patents [on Simon Phipps, SunMink] - 0 views

  • OASIS seems to have taken it to heart, because it has today announced what looks to me like the perfect basis for technology standards in an open source world. Their new rules include a new "mode" which standards projects can opt into using. In this new mode, all contributors promise that they will not assert any patents they may own related to the standard the project is defining. Contributors make this covenant: Each Obligated Party in a Non-Assertion Mode TC irrevocably covenants that, subject to Section 10.3.2 and Section 11 of the OASIS IPR Policy, it will not assert any of its Essential Claims covered by its Contribution Obligations or Participation Obligations against any OASIS Party or third party for making, having made, using, marketing, importing, offering to sell, selling, and otherwise distributing Covered Products that implement an OASIS Final Deliverable developed by that TC.
  • The covenant described in Section 10.3.1 may be suspended or revoked by the Obligated Party with respect to any OASIS Party or third party if that OASIS Party or third party asserts an Essential Claim in a suit first brought against, or attempts in writing to assert an Essential Claim against, a Beneficiary with respect to a Covered Product that implements the same OASIS Final Deliverable.
  • There's a redline PDF document showing the changes - the new stuff is mainly in section 10, although other areas had to be changed to match as well, I gather.
  • ...1 more annotation...
  • OASIS Protects Open Source Developers From Software Patents
  •  
    This new technical committee IPR mode may not make much sense to the legally-inclined without reading the new section 2.7 definition of "Covered Product." There we learn that the patent covenant extends only so far as the implementation is conformant with the standard. I count that as a good thing, curing a defect in the Sun Covenant Not to Sue in regard to ODF, which at least arguably extended far enough to confer immunity on those who embrace and extend a standard. But the reciprocity provision allowing contributors to counter-sue for infringement if sued clashes with many definitions of an "open standard" adopted by governmental entities for procurement purposes. So a question remains as to who must bend, government or OASIS members.
Paul Merrell

Microsoft launches Office Web Apps preview - 0 views

  • Microsoft today launched a limited beta test of its Office Web Apps, the company's first public unveiling of its rival for Google's Web applications. Dubbed a "technical preview" by Microsoft to denote that it's by invitation only, Office Web Apps will be available on the company's Windows Live site via a special "Documents" tab, a company spokeswoman said. "Tens of thousands have been invited to participate in the Technical Preview," said the spokeswoman in a reply to questions.
Gary Edwards

AppleInsider | Microsoft takes aim at Google with online Office suite - 0 views

  •  
    Microsoft has announced the next generation of MSOffice, and it turns out to be SharePoint at the center of the deeply connected MSOffice "rich client" desktop productivity environment, plus an online Web version of MSOffice. Who would have guessed that one of the key features of MOSS would be universal accessibility to, and collaboration on, MSOffice documents - without loss of fidelity? No doubt the embedded logic that drives BBPs (Bound Business Processes) is also perfectly preserved. Excerpt: "Office Web Applications, the online companion to Word, Excel, PowerPoint and OneNote applications, allow you to access documents from anywhere. You can even simultaneously share and work on documents with others online," Microsoft says on its Office 2010 Technical Preview site. "View documents across PCs, mobile phones, and the Web without compromising document fidelity. Create new documents and do basic editing using the familiar Office interface."
Gary Edwards

Microsoft Unleashes Stream of Docs in the Name of Interoperability - 0 views

  • Yesterday, Microsoft announced the release of Version 1.0 technical documentation for Microsoft Office 2007, SharePoint 2007 and Exchange 2007 as an effort to drive greater interoperability and foster a stronger open relationship with their developer and partner communities. They also posted over 5000 pages of technical documentation on the Microsoft Office Word, Excel and PowerPoint binary file formats on the MSDN site on a royalty-free basis under Microsoft’s Open Specification Promise (OSP).
  •  
    wikiWORD and SPoint get the go ahead!
Gary Edwards

MobiUs Accelerates Mobile HTML5 Development, Aims to Kill Mobile Flash - 0 views

  •  
    HTML5 development company appMobi is releasing a new browser today called MobiUs that will give mobile Web apps the same type of functionality currently enjoyed only by native apps on platforms like iOS and Android. AppMobi thinks of MobiUs as the replacement for Flash in mobile - it renders mobile websites the way a Flash extension would and gives developers device access in ways previously unavailable in HTML5. MobiUs is technically a mobile browser. That is not the way appMobi thinks it will be used, though. The company expects it to function more like a browser extension. Like Flash, users will be prompted to download it once, and from then on it will just run on the device.
Paul Merrell

F.B.I. Director to Call 'Dark' Devices a Hindrance to Crime Solving in a Policy Speech ... - 0 views

  • In his first major policy speech as director of the F.B.I., James B. Comey on Thursday plans to wade deeper into the debate between law enforcement agencies and technology companies about new programs intended to protect personal information on communication devices.Mr. Comey will say that encryption technologies used on these devices, like the new iPhone, have become so sophisticated that crimes will go unsolved because law enforcement officers will not be able to get information from them, according to a senior F.B.I. official who provided a preview of the speech.The speech was prompted, in part, by the new encryption technology on the iPhone 6, which was released last month. The phone encrypts emails, photos and contacts, thwarting intelligence and law enforcement agencies, like the National Security Agency and F.B.I., from gaining access to it, even if they have court approval.
  • The F.B.I. has long had concerns about devices “going dark” — when technology becomes so sophisticated that the authorities cannot gain access. But now, Mr. Comey said he believes that the new encryption technology has evolved to the point that it will adversely affect crime solving.He will say in the speech that these new programs will most severely affect state and local law enforcement agencies, because they are the ones who most often investigate crimes like kidnappings and robberies in which getting information from electronic devices in a timely manner is essential to solving the crime.
  • They also do not have the resources that are available to the F.B.I. and other federal intelligence and law enforcement authorities in order to get around the programs.Mr. Comey will cite examples of crimes that the authorities were able to solve because they gained access to a phone.“He is going to call for a discussion on this issue and ask whether this is the path we want to go down,” said the senior F.B.I. official. “He is not going to accuse the companies of designing the technologies to prevent the F.B.I. from accessing them. But, he will say that this is a negative byproduct and we need to work together to fix it.”
  • ...2 more annotations...
  • Mr. Comey is scheduled to give the speech — titled “Going Dark: Are Technology, Privacy and Public Safety on a Collision Course?” — at the Brookings Institution in Washington.
  • In the interview that aired on “60 Minutes” on Sunday, Mr. Comey said that “the notion that we would market devices that would allow someone to place themselves beyond the law troubles me a lot.”He said that it was the equivalent of selling cars with trunks that could never be opened, even with a court order.“The notion that people have devices, again, that with court orders, based on a showing of probable cause in a case involving kidnapping or child exploitation or terrorism, we could never open that phone?” he said. “My sense is that we've gone too far when we've gone there.”
  •  
    I'm informed that Comey will also call for legislation outlawing communication by whispering because of technical difficulties in law enforcement monitoring of such communications. 
Paul Merrell

Operation Socialist: How GCHQ Spies Hacked Belgium's Largest Telco - 0 views

  • When the incoming emails stopped arriving, it seemed innocuous at first. But it would eventually become clear that this was no routine technical problem. Inside a row of gray office buildings in Brussels, a major hacking attack was in progress. And the perpetrators were British government spies. It was in the summer of 2012 that the anomalies were initially detected by employees at Belgium’s largest telecommunications provider, Belgacom. But it wasn’t until a year later, in June 2013, that the company’s security experts were able to figure out what was going on. The computer systems of Belgacom had been infected with a highly sophisticated malware, and it was disguising itself as legitimate Microsoft software while quietly stealing data. Last year, documents from National Security Agency whistleblower Edward Snowden confirmed that British surveillance agency Government Communications Headquarters was behind the attack, codenamed Operation Socialist. And in November, The Intercept revealed that the malware found on Belgacom’s systems was one of the most advanced spy tools ever identified by security researchers, who named it “Regin.”
  • The full story about GCHQ’s infiltration of Belgacom, however, has never been told. Key details about the attack have remained shrouded in mystery—and the scope of the attack unclear. Now, in partnership with Dutch and Belgian newspapers NRC Handelsblad and De Standaard, The Intercept has pieced together the first full reconstruction of events that took place before, during, and after the secret GCHQ hacking operation. Based on new documents from the Snowden archive and interviews with sources familiar with the malware investigation at Belgacom, The Intercept and its partners have established that the attack on Belgacom was more aggressive and far-reaching than previously thought. It occurred in stages between 2010 and 2011, each time penetrating deeper into Belgacom’s systems, eventually compromising the very core of the company’s networks.
  • ...7 more annotations...
  • Snowden told The Intercept that the latest revelations amounted to unprecedented “smoking-gun attribution for a governmental cyber attack against critical infrastructure.” The Belgacom hack, he said, is the “first documented example to show one EU member state mounting a cyber attack on another…a breathtaking example of the scale of the state-sponsored hacking problem.”
  • Publicly, Belgacom has played down the extent of the compromise, insisting that only its internal systems were breached and that customers’ data was never found to have been at risk. But secret GCHQ documents show the agency gained access far beyond Belgacom’s internal employee computers and was able to grab encrypted and unencrypted streams of private communications handled by the company. Belgacom invested several million dollars in its efforts to clean up its systems and beef up its security after the attack. However, The Intercept has learned that sources familiar with the malware investigation at the company are uncomfortable with how the clean-up operation was handled—and they believe parts of the GCHQ malware were never fully removed.
  • The revelations about the scope of the hacking operation will likely alarm Belgacom’s customers across the world. The company operates a large number of data links internationally (see interactive map below), and it serves millions of people across Europe as well as officials from top institutions including the European Commission, the European Parliament, and the European Council. The new details will also be closely scrutinized by a federal prosecutor in Belgium, who is currently carrying out a criminal investigation into the attack on the company. Sophia in ’t Veld, a Dutch politician who chaired the European Parliament’s recent inquiry into mass surveillance exposed by Snowden, told The Intercept that she believes the British government should face sanctions if the latest disclosures are proven.
  • What sets the secret British infiltration of Belgacom apart is that it was perpetrated against a close ally—and is backed up by a series of top-secret documents, which The Intercept is now publishing.
  • Between 2009 and 2011, GCHQ worked with its allies to develop sophisticated new tools and technologies it could use to scan global networks for weaknesses and then penetrate them. According to top-secret GCHQ documents, the agency wanted to adopt the aggressive new methods in part to counter the use of privacy-protecting encryption—what it described as the “encryption problem.” When communications are sent across networks in encrypted format, it makes it much harder for the spies to intercept and make sense of emails, phone calls, text messages, internet chats, and browsing sessions. For GCHQ, there was a simple solution. The agency decided that, where possible, it would find ways to hack into communication networks to grab traffic before it’s encrypted.
  • The Snowden documents show that GCHQ wanted to gain access to Belgacom so that it could spy on phones used by surveillance targets travelling in Europe. But the agency also had an ulterior motive. Once it had hacked into Belgacom’s systems, GCHQ planned to break into data links connecting Belgacom and its international partners, monitoring communications transmitted between Europe and the rest of the world. A map in the GCHQ documents, named “Belgacom_connections,” highlights the company’s reach across Europe, the Middle East, and North Africa, illustrating why British spies deemed it of such high value.
  • Documents published with this article: Automated NOC detection; Mobile Networks in My NOC World; Making network sense of the encryption problem; Stargate CNE requirements; NAC review – October to December 2011; GCHQ NAC review – January to March 2011; GCHQ NAC review – April to June 2011; GCHQ NAC review – July to September 2011; GCHQ NAC review – January to March 2012; GCHQ Hopscotch; Belgacom connections
Paul Merrell

The Latest Rules on How Long NSA Can Keep Americans' Encrypted Data Look Too Familiar |... - 0 views

  • Does the National Security Agency (NSA) have the authority to collect and keep all encrypted Internet traffic for as long as is necessary to decrypt that traffic? That was a question first raised in June 2013, after the minimization procedures governing telephone and Internet records collected under Section 702 of the Foreign Intelligence Surveillance Act were disclosed by Edward Snowden. The issue quickly receded into the background, however, as the world struggled to keep up with the deluge of surveillance disclosures. The Intelligence Authorization Act of 2015, which passed Congress this last December, should bring the question back to the fore. It established retention guidelines for communications collected under Executive Order 12333 and included an exception that allows NSA to keep ‘incidentally’ collected encrypted communications for an indefinite period of time. This creates a massive loophole in the guidelines. NSA’s retention of encrypted communications deserves further consideration today, now that these retention guidelines have been written into law. It has become increasingly clear over the last year that surveillance reform will be driven by technological change—specifically by the growing use of encryption technologies. Therefore, any legislation touching on encryption should receive close scrutiny.
  • Section 309 of the intel authorization bill describes “procedures for the retention of incidentally acquired communications.” It establishes retention guidelines for surveillance programs that are “reasonably anticipated to result in the acquisition of [telephone or electronic communications] to or from a United States person.” Communications to or from a United States person are ‘incidentally’ collected because the U.S. person is not the actual target of the collection. Section 309 states that these incidentally collected communications must be deleted after five years unless they meet a number of exceptions. One of these exceptions is that “the communication is enciphered or reasonably believed to have a secret meaning.” This exception appears to be directly lifted from NSA’s minimization procedures for data collected under Section 702 of FISA, which were declassified in 2013. 
  • While Section 309 specifically applies to collection taking place under E.O. 12333, not FISA, several of the exceptions described in Section 309 closely match exceptions in the FISA minimization procedures. That includes the exception for “enciphered” communications. Those minimization procedures almost certainly served as a model for these retention guidelines and will likely shape how this new language is interpreted by the Executive Branch. Section 309 also asks the heads of each relevant member of the intelligence community to develop procedures to ensure compliance with new retention requirements. I expect those procedures to look a lot like the FISA minimization guidelines.
  • This language is broad, circular, and technically incoherent, so it takes some effort to parse appropriately. When the minimization procedures were disclosed in 2013, this language was interpreted by outside commentators to mean that NSA may keep all encrypted data that has been incidentally collected under Section 702 for at least as long as is necessary to decrypt that data. Is this the correct interpretation? I think so. It is important to realize that the language above isn’t just broad. It seems purposefully broad. The part regarding relevance seems to mirror the rationale NSA has used to justify its bulk phone records collection program. Under that program, all phone records were relevant because some of those records could be valuable to terrorism investigations and (allegedly) it isn’t possible to collect only those valuable records. This is the “to find a needle in a haystack, you first have to have the haystack” argument. The same argument could be applied to encrypted data and might be at play here.
  • This exception doesn’t just apply to encrypted data that might be relevant to a current foreign intelligence investigation. It also applies to cases in which the encrypted data is likely to become relevant to a future intelligence requirement. This is some remarkably generous language. It seems one could justify keeping any type of encrypted data under this exception. Upon close reading, it is difficult to avoid the conclusion that these procedures were written carefully to allow NSA to collect and keep a broad category of encrypted data under the rationale that this data might contain the communications of NSA targets and that it might be decrypted in the future. If NSA isn’t doing this today, then whoever wrote these minimization procedures wanted to at least ensure that NSA has the authority to do this tomorrow.
  • There are a few additional observations that are worth making regarding these nominally new retention guidelines and Section 702 collection. First, the concept of incidental collection as it has typically been used makes very little sense when applied to encrypted data. The way that NSA’s Section 702 upstream “about” collection is understood to work is that technology installed on the network does some sort of pattern match on Internet traffic; say that an NSA target uses example@gmail.com to communicate. NSA would then search content of emails for references to example@gmail.com. This could notionally result in a lot of incidental collection of U.S. persons’ communications whenever the email that references example@gmail.com is somehow mixed together with emails that have nothing to do with the target. This type of incidental collection isn’t possible when the data is encrypted because it won’t be possible to search and find example@gmail.com in the body of an email. Instead, example@gmail.com will have been turned into some alternative, indecipherable string of bits on the network. Incidental collection shouldn’t occur because the pattern match can’t occur in the first place. This demonstrates that, when communications are encrypted, it will be much harder for NSA to search Internet traffic for a unique ID associated with a specific target.
  • This lends further credence to the conclusion above: rather than doing targeted collection against specific individuals, NSA is collecting, or plans to collect, a broad class of data that is encrypted. For example, NSA might collect all PGP encrypted emails or all Tor traffic. In those cases, NSA could search Internet traffic for patterns associated with specific types of communications, rather than specific individuals’ communications. This would technically meet the definition of incidental collection because such activity would result in the collection of communications of U.S. persons who aren’t the actual targets of surveillance. Collection of all Tor traffic would entail a lot of this “incidental” collection because the communications of NSA targets would be mixed with the communications of a large number of non-target U.S. persons. However, this “incidental” collection is inconsistent with how the term is typically used, which is to refer to over-collection resulting from targeted surveillance programs. If NSA were collecting all Tor traffic, that activity wouldn’t actually be targeted, and so any resulting over-collection wouldn’t actually be incidental. Moreover, greater use of encryption by the general public would result in an ever-growing amount of this type of incidental collection.
  • This type of collection would also be inconsistent with representations of Section 702 upstream collection that have been made to the public and to Congress. Intelligence officials have repeatedly suggested that search terms used as part of this program have a high degree of specificity. They have also argued that the program is an example of targeted rather than bulk collection. ODNI General Counsel Robert Litt, in a March 2014 meeting before the Privacy and Civil Liberties Oversight Board, stated that “there is either a misconception or a mischaracterization commonly repeated that Section 702 is a form of bulk collection. It is not bulk collection. It is targeted collection based on selectors such as telephone numbers or email addresses where there’s reason to believe that the selector is relevant to a foreign intelligence purpose.” The collection of Internet traffic based on patterns associated with types of communications would be bulk collection; more akin to NSA’s collection of phone records en masse than it is to targeted collection focused on specific individuals. Moreover, this type of collection would certainly fall within the definition of bulk collection provided just last week by the National Academy of Sciences: “collection in which a significant portion of the retained data pertains to identifiers that are not targets at the time of collection.”
  • The Section 702 minimization procedures, which will serve as a template for any new retention guidelines established for E.O. 12333 collection, create a large loophole for encrypted communications. With everything from email to Internet browsing to real-time communications moving to encrypted formats, an ever-growing amount of Internet traffic will fall within this loophole.
    Tucked into a budget authorization act in December without press notice. Section 309 (the Act is linked from the article) appears to be very broad authority for the NSA to intercept any form of telephone or other electronic information in bulk. There are far more exceptions from the five-year retention limitation than the encrypted information exception. When reading this, keep in mind that the U.S. intelligence community plays semantic games to obfuscate what it does. One of its word plays is that communications are not "collected" until an analyst looks at or listens to particular data, even though the data will be searched to find information countless times before it becomes "collected." That searching was the major basis for a decision by the U.S. District Court in Washington, D.C. that bulk collection of telephone communications was unconstitutional: Under the Fourth Amendment, a "search" or "seizure" requiring a judicial warrant occurs no later than when the information is intercepted. That case is on appeal, has been briefed and argued, and a decision could come any time now. Similar cases are pending in two other courts of appeals. Also, an important definition from the new Intelligence Authorization Act: "(a) DEFINITIONS.-In this section: (1) COVERED COMMUNICATION.-The term 'covered communication' means any nonpublic telephone or electronic communication acquired without the consent of a person who is a party to the communication, including communications in electronic storage."
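The Section 309 retention rule discussed in the annotations above (delete after five years unless an exception applies) can be sketched as a simple decision function. This is a minimal, hypothetical illustration of the article's reading of the statute, not the statute's actual text: the names are invented, and the real provision lists more exceptions than the two shown here.

```python
from dataclasses import dataclass

@dataclass
class Communication:
    """An 'incidentally acquired' communication, per the article's framing."""
    encrypted: bool
    relevant_to_requirement: bool = False

def may_retain(comm: Communication, age_years: float) -> bool:
    """Sketch of the Section 309 logic as the article describes it:
    incidentally acquired communications are deleted after five years
    unless an exception applies."""
    if age_years < 5:
        return True
    if comm.relevant_to_requirement:
        return True
    # The exception at issue: communications that are "enciphered or
    # reasonably believed to have a secret meaning" may be kept
    # indefinitely -- the loophole the article highlights.
    if comm.encrypted:
        return True
    return False

# Plaintext chatter ages out after five years; encrypted traffic never does.
assert may_retain(Communication(encrypted=False), age_years=3)
assert not may_retain(Communication(encrypted=False), age_years=6)
assert may_retain(Communication(encrypted=True), age_years=20)
```

As the sketch makes plain, the encrypted branch swallows an ever-growing share of traffic as more of the web moves to encrypted formats.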
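The article's point about upstream "about" collection failing on encrypted traffic can also be shown concretely. The toy sketch below is illustrative only: the XOR stream cipher is a stand-in for real encryption (it is not secure), and the selector matching is a simplification of whatever pattern matching NSA equipment actually performs.

```python
import os

SELECTOR = b"example@gmail.com"  # the selector from the article's example

def matches_selector(packet: bytes) -> bool:
    # Upstream "about" collection as described: scan the payload
    # for the target selector.
    return SELECTOR in packet

def toy_stream_cipher(data: bytes, key: bytes) -> bytes:
    # Illustration-only cipher: XOR with a repeating key.
    # XOR is its own inverse, so the same call decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

email = b"To: example@gmail.com\r\nSubject: lunch\r\nMeet at noon."
key = os.urandom(32)
ciphertext = toy_stream_cipher(email, key)

assert matches_selector(email)           # plaintext: the pattern match fires
assert not matches_selector(ciphertext)  # encrypted: no match is possible
assert matches_selector(toy_stream_cipher(ciphertext, key))  # only decryption restores it
```

This is why, as the annotations argue, selector-based "incidental" collection cannot occur on encrypted links: the agency must either obtain the traffic before encryption or collect the entire encrypted class in bulk.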