
Home/ Future of the Web/ Group items tagged operating systems specific


Gary Edwards

XML Production Workflows? Start with the Web and XHTML - 0 views

  • Challenges: Some Ugly Truths The challenges of building—and living with—an XML workflow are clear enough. The return on investment is a long-term proposition. Regardless of the benefits XML may provide, the starting reality is that it represents a very different way of doing things than the one we are familiar with. The Word Processing and Desktop Publishing paradigm, based on the promise of onscreen, WYSIWYG layout, is so dominant as to be practically inescapable. It has proven really hard to get from here to there, no matter how attractive XML might be on paper. A considerable amount of organizational effort and labour must be expended up front in order to realize the benefits. This is why XML is often referred to as an “investment”: you sink a bunch of time and money up front, and realize the benefits—greater flexibility, multiple output options, searching and indexing, and general futureproofing—later, over the long haul. It is not a short-term return proposition. And, of course, the returns you are able to realize from your XML investment are commensurate with what you put in up front: fine-grained, semantically rich tagging is going to give you more potential for searchability and recombination than a looser, more general-purpose approach, but it sure costs more. For instance, the Text Encoding Initiative (TEI) is the grand example of pouring enormous amounts of energy into the up-front tagging, with a very open-ended set of possibilities down the line. TEI helpfully defines a level to which most of us do not have to aspire.[5] But understanding this on a theoretical level is only part of the challenge. There are many practical issues that must be addressed. Software and labour are two of the most critical. How do you get the content into XML in the first place? Unfortunately, despite two decades of people doing SGML and XML, this remains an ugly question.
  • Practical Challenges In 2009, there is still no truly likeable—let alone standard—editing and authoring software for XML. For many (myself included), the high-water mark here was Adobe’s FrameMaker, substantially developed by the late 1990s. With no substantial market for it, it is relegated today mostly to the tech writing industry, unavailable for the Mac, and just far enough afield from the kinds of tools we use today that its adoption represents a significant hurdle. And FrameMaker was the best of the breed; most of the other software in decent circulation is programmers’ tools—the sort of things that, as Michael Tamblyn pointed out, encourage editors to drink at their desks. The labour question represents a stumbling block as well. The skill-sets and mind-sets that effective XML editors need have limited overlap with those needed by literary and more traditional production editors. The need to think of documents as machine-readable databases is not something that comes naturally to folks steeped in literary culture. In combination with the sheer time and effort that rich tagging requires, many publishers simply outsource the tagging to India, drawing a division of labour that spans oceans, to put it mildly. Once you have XML content, then what do you do with it? How do you produce books from it? Presumably, you need to be able to produce print output as well as digital formats. But while the latter are new enough to be generally XML-friendly (e-book formats being largely XML-based, for instance), there aren’t any straightforward, standard ways of moving XML content into the kind of print production environments we are used to seeing. This isn’t to say that there aren’t ways of getting print—even very high-quality print—output from XML, just that most of them involve replacing your prepress staff with Java programmers.
  • Why does this have to be so hard? It’s not that XML is new, or immature, or untested. Remember that the basics have been around, and in production, since the early 1980s at least. But we have to take account of a substantial and long-running cultural disconnect between traditional editorial and production processes (the ones most of us know intimately) and the ways computing people have approached things. Interestingly, this cultural divide looked rather different in the 1970s, when publishers were looking at how to move to digital typesetting. Back then, printers and software developers could speak the same language. But that was before the ascendancy of the Desktop Publishing paradigm, which computerized the publishing industry while at the same time isolating it culturally. Those of us who learned how to do things the Quark way or the Adobe way had little in common with people who programmed databases or document-management systems. Desktop publishing technology isolated us in a smooth, self-contained universe of toolbars, grid lines, and laser proofs. So, now that the reasons to get with this program, XML, loom large, how can we bridge this long-standing divide?
  • Using the Web as a Production Platform The answer, I think, is right in front of you. The bridge is the Web, a technology and platform that is fundamentally based on XML, and which many publishers are by now comfortably familiar with. Perhaps not entirely comfortably, but at least most publishers are already working with the Web; they already either know or have on staff people who understand it and can work with it. The foundation of our argument is this: rather than looking at jumping to XML in its full, industrial complexity, which seems to be what the O'Reilly-backed StartWithXML initiative[6] is suggesting, publishers instead leverage existing tools and technologies—starting with the Web—as a means of getting XML workflows in place. This means making small investments and working with known tools rather than spending tens of thousands of dollars on XML software and rarefied consultants. It means re-thinking how the existing pieces of the production toolchain fit together; re-thinking the existing roles of software components already in use. It means, fundamentally, taking the Web seriously as a content platform, rather than thinking of it as something you need to get content out to, somehow. If nothing else, the Web represents an opportunity to think about editorial and production from outside the shrink-wrapped Desktop Publishing paradigm.
  • Is the Web made of Real XML? At this point some predictable objections can be heard: wait a moment, the Web isn’t really made out of XML; the HTML that makes up most of the Web is at best the bastard child of SGML, and it is far too flaky/unstructured/underpowered to be taken seriously. We counter by arguing that although HTML on the Web exists in a staggering array of different incarnations, and that the majority of it is indeed an unstructured mess, this does not undermine the general principle that basic, ubiquitous Web technologies can make a solid platform for content management, editorial process, and production workflow.
  • With the advent of a published XML standard in the late 1990s came the W3C’s adoption of XHTML: the realization of the Web’s native content markup as a proper XML document type. Today, its acceptance is almost ubiquitous, even while the majority of actual content out there may not be strictly conforming. The more important point is that most contemporary Web software, from browsers to authoring tools to content management systems (from blogs to enterprise systems), is capable of working with clean, valid XHTML. Or, to put the argument the other way around, clean, valid XHTML content plays absolutely seamlessly with everything else on the Web.[7]
  • The objection which follows, then, will be that even if we grant that XHTML is a real XML document type, it is underpowered for “serious” content because it is almost entirely presentation (formatting) oriented; it lacks any semantic depth. In XHTML, a paragraph is a paragraph is a paragraph, as opposed to a section or an epigraph or a summary.
  • In contrast, more “serious” XML document types like DocBook[8] or DITA-derived schemas[9] are capable of making semantic distinctions about content chunks at a fine level of granularity and with a high degree of specificity.
  • So there is an argument for recalling the 80:20 rule here. If XHTML can provide 80% of the value with just 20% of the investment, then what exactly is the business case for spending the other 80% to achieve that last 20% of value? We suspect the ratio is actually quite a bit steeper than 80:20 for most publishers.
  • Furthermore, just to get technical for a moment, XHTML is extensible in a fairly straightforward way, through the common “class” attribute on each element. Web developers have long leveraged this kind of extensibility in the elaboration of “microformats” for semantic-web applications.[10] There is no reason why publishers shouldn’t think to use XHTML’s simple extensibility in a similar way for their own ends.
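The class-attribute approach is easy to demonstrate. In the sketch below, the class names ("epigraph", "pull-quote") are hypothetical publisher conventions, not part of any standard; the point is that such tagging remains plain XHTML while still being machine-extractable, here with Python's standard library:

```python
# Sketch: overloading XHTML's "class" attribute for publisher semantics,
# then pulling the tagged chunks back out. The class names are
# hypothetical examples of a house microformat, not a standard.
import xml.etree.ElementTree as ET

xhtml = """<div xmlns="http://www.w3.org/1999/xhtml">
  <p class="epigraph">Books are a load of crap.</p>
  <p>An ordinary paragraph.</p>
  <p class="pull-quote">An ePub book is a Web page in a wrapper.</p>
</div>"""

NS = {"x": "http://www.w3.org/1999/xhtml"}
root = ET.fromstring(xhtml)

# Collect paragraphs by their semantic role, exactly as a microformat
# consumer would: the markup itself stays ordinary XHTML throughout.
by_role = {}
for p in root.findall("x:p", NS):
    role = p.get("class", "body")
    by_role.setdefault(role, []).append(p.text)

print(by_role["epigraph"])   # paragraphs tagged as epigraphs
print(by_role["body"])       # untagged, plain body paragraphs
```

Any downstream tool that ignores the class attribute still sees valid, renderable XHTML, which is precisely the appeal.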
  • XHTML, on the other hand, is supported by a vast array of quotidian software, starting with the ubiquitous Web browser. For this very reason, XHTML is in fact employed as a component part of several more specialized document types (ONIX and ePub among them).
  • Why re-invent a general-purpose prose representation when XHTML already does the job?
  • It is worth pausing for a moment to consider the role of XHTML in the ePub standard for ebook content. An ePub file is, anatomically, a simply disguised zip archive. Inside the zip archive are a few standard component parts: there are specialized files that declare metadata about the book, and about the format of the book. And then there is the book’s content, represented in XHTML. An ePub book is a Web page in a wrapper.
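That anatomy can be made concrete in a few lines. The sketch below builds a deliberately incomplete toy ePub (a real one also needs an OPF package document and a table-of-contents file), just to show the zip-of-XHTML structure:

```python
# Sketch of an ePub's anatomy: a zip archive whose payload is XHTML.
# Deliberately incomplete: a spec-conforming ePub also needs an OPF
# package document and a navigation/TOC file.
import io
import zipfile

chapter = """<html xmlns="http://www.w3.org/1999/xhtml">
<head><title>Chapter 1</title></head>
<body><p>An ePub book is a Web page in a wrapper.</p></body>
</html>"""

container = """<?xml version="1.0"?>
<container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
  <rootfiles>
    <rootfile full-path="OEBPS/content.opf" media-type="application/oebps-package+xml"/>
  </rootfiles>
</container>"""

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    # The spec requires the mimetype entry first, stored uncompressed.
    z.writestr("mimetype", "application/epub+zip",
               compress_type=zipfile.ZIP_STORED)
    z.writestr("META-INF/container.xml", container)
    z.writestr("OEBPS/chapter1.xhtml", chapter)

# Unpacking it again is plain zip handling; no ebook software required.
with zipfile.ZipFile(buf) as z:
    print(z.namelist())
```

The content member really is just a Web page, which is why ordinary Web tooling can produce and consume it.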
  • To sum up the general argument: the Web as it already exists presents incredible value to publishers, as a platform for doing XML content management with existing (and often free) tools, and without having to go blindly into the unknown. At this point, we can offer a few design guidelines: prefer existing and/or ubiquitous tools over specialized ones wherever possible; prefer free software over proprietary systems where possible; prefer simple tools controlled and coordinated by human beings over fully automated (and therefore complex) systems; play to our strengths: use Web software for storing and managing content, use layout software for layout, and keep editors and production people in charge of their own domains.
  • Putting the Pieces Together: A Prototype
  • At the SFU Master of Publishing Program, we have been chipping away at this general line of thinking for a few years. Over that time, Web content management systems have been getting more and more sophisticated, all the while getting more streamlined and easier to use. (NB: if you have a blog, you have a Web content management system.) The Web is beginning to be recognized as a writing and editing environment used by millions of people. And the ways in which content is represented, stored, and exchanged online have become increasingly robust and standardized.
  • The missing piece of the puzzle has been print production: how can we move content from its malleable, fluid form on line into the kind of high-quality print production environments we’ve come to expect after two decades of Desktop Publishing?
  • Anyone who has tried to print Web content knows that the existing methods leave much to be desired (hyphenation and justification, for starters). In the absence of decent tools for this, most publishers quite naturally think of producing the print content first, and then think about how to get material onto the Web for various purposes. So we tend to export from Word, or from Adobe, as something of an afterthought.
  • While this sort of works, it isn’t elegant, and it completely ignores the considerable advantages of Web-based content management.
  • Content managed online is stored in one central location, accessible simultaneously to everyone in your firm, available anywhere you have an Internet connection, and usually exists in a much more fluid format than Word files. If only we could manage the editorial flow online, and then go to print formats at the end, instead of the other way around. At SFU, we made several attempts to make this work by way of the supposed “XML import” capabilities of various Desktop Publishing tools, without much success.[12]
  • In the winter of 2009, Adobe solved this part of the problem for us with the introduction of its Creative Suite 4. What CS4 offers is the option of a complete XML representation of an InDesign document: what Adobe calls IDML (InDesign Markup Language).
  • The IDML file format is—like ePub—a simply disguised zip archive that, when unpacked, reveals a cluster of XML files that represent all the different facets of an InDesign document: layout spreads, master pages, defined styles, colours, and of course, the content.
  • IDML is a well thought-out XML standard that achieves two very different goals simultaneously: it preserves all of the information that InDesign needs to do what it does; and it is broken up in a way that makes it possible for mere mortals (or at least our Master of Publishing students) to work with it.
  • What this represented to us in concrete terms was the ability to take Web-based content and move it into InDesign in a straightforward way, thus bridging Web and print production environments using existing tools and skillsets, with a little added help from free software.
  • We would take clean XHTML content, transform it to IDML-marked content, and merge that with nicely designed templates in InDesign.
  • The result is an almost push-button publication workflow, which results in a nice, familiar InDesign document that fits straight into the way publishers actually do production.
  • Tracing the steps
    To begin with, we worked backwards, moving the book content back to clean XHTML.
  • The simplest method for this conversion—and if you want to create Web content, this is an excellent route—was to use Adobe’s “Export to Digital Editions” option, which creates an ePub file.
  • Recall that ePub is just XHTML in a wrapper, so within the ePub file was a relatively clean XHTML document. It was somewhat cleaner (that is, the XHTML tagging was simpler and less cluttered) than InDesign’s other Web-oriented exports, possibly because Digital Editions is a well understood target, compared with somebody’s website.
  • In order to achieve our target of clean XHTML, we needed to do some editing; the XHTML produced by InDesign’s “Digital Editions” export was presentation-oriented. For instance, bulleted list items were tagged as paragraphs, with a class attribute identifying them as list items. Using the search-and-replace function, we converted such structures to proper XHTML list and list-item elements. Our guiding principle was to make the XHTML as straightforward as possible, not dependent on any particular software to interpret it.
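That kind of clean-up is mechanical enough to script. A minimal sketch, assuming the export tagged bullets as `<p class="bullet">` (the actual class name varies with export settings), which rebuilds consecutive runs of such paragraphs into proper list markup:

```python
# Sketch of the clean-up step: a Digital Editions-style export might tag
# bullets as <p class="bullet">; proper XHTML wants <ul><li>. The class
# name and input structure here are illustrative, not what every export emits.
import xml.etree.ElementTree as ET

body = ET.fromstring(
    "<body>"
    "<p>Intro paragraph.</p>"
    "<p class='bullet'>first item</p>"
    "<p class='bullet'>second item</p>"
    "<p>Closing paragraph.</p>"
    "</body>"
)

rebuilt = ET.Element("body")
current_list = None
for child in body:
    if child.tag == "p" and child.get("class") == "bullet":
        if current_list is None:          # open a <ul> for this run of bullets
            current_list = ET.SubElement(rebuilt, "ul")
        li = ET.SubElement(current_list, "li")
        li.text = child.text
    else:
        current_list = None               # the run of bullets has ended
        rebuilt.append(child)

print(ET.tostring(rebuilt, encoding="unicode"))
```

The same pattern (match a presentational class, emit a structural element) covers headings, block quotes, and the rest of the clean-up.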
  • We broke the book’s content into individual chapter files; each chapter could then carry its own basic metadata, and the pages conveniently fit our Web content management system (which is actually just a wiki). We assembled a dynamically generated table of contents for the 12 chapters, and created a cover page. Essentially, the book was entirely Web-based at this point.
  • When the book chapters are viewed online, they are formatted via a CSS2 stylesheet that defines a main column for content as well as dedicating screen real estate for navigational elements. We then created a second template to render the content for exporting; this was essentially a bare-bones version of the book with no navigation and minimal styling. Pages (or even the entire book) can be exported (via the “Save As...” function in a Web browser) for use in either print production or ebook conversion. At this point, we required no skills beyond those of any decent Web designer.
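The two templates amount to very little stylesheet work. This fragment is illustrative only; the selector names are placeholders rather than our wiki's actual markup:

```css
/* Screen template: a main content column plus reserved space for navigation */
#content { margin-left: 16em; max-width: 40em; }
#nav     { position: absolute; left: 0; width: 14em; }

/* Export template: swap in a bare-bones stylesheet instead, with no
   navigation block and only minimal styling, so that "Save As..." yields
   clean XHTML ready for print production or ebook conversion */
```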
  • Integrating with CS4 for Print
    Adobe’s IDML language defines elements specific to InDesign; there is nothing in the language that looks remotely like XHTML. So a mechanical transformation step is needed to convert the XHTML content into something InDesign can use. This is not as hard as it might seem.
  • Both XHTML and IDML are composed of straightforward, well-documented structures, and so transformation from one to the other is, as they say, “trivial.” We chose to use XSLT (Extensible Stylesheet Language Transformations) to do the work. XSLT is a W3C standard in the XML family of specifications, and thus is very well supported in a wide variety of tools. Our prototype used a command-line XSLT processor called xsltproc, a nearly ubiquitous piece of software that we found already installed as part of Mac OS X (contemporary Linux distributions also have this as a standard tool), though any XSLT processor would work.
  • In other words, we don’t need to buy InCopy, because we just replaced it with the Web. Our wiki is now plugged directly into our InDesign layout. It even automatically updates the InDesign document when the content changes. Credit is due at this point to Adobe: this integration is possible because of the open file format in the Creative Suite 4.
  • We wrote an XSLT transformation script[18] that converted the XHTML content from the Web into an InCopy ICML file. The script itself is less than 500 lines long, and was written and debugged over a period of about a week by amateurs (again, the people named at the start of this article). The script runs in a couple of seconds, and the resulting .icml file can then be “placed” directly into an InDesign template. The ICML file references an InDesign stylesheet, so the template file can be set up with a house-styled layout, master pages, and stylesheet definitions for paragraphs and character ranges.
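The flavour of that transformation can be sketched in a couple of templates. The fragment below is illustrative only, not the SFU script itself: the element names follow the pattern of real ICML (ParagraphStyleRange, CharacterStyleRange, Content), but the applied style names are placeholders that would have to match the InDesign template's own stylesheet:

```xml
<!-- Illustrative fragment only: maps XHTML paragraphs and headings onto
     ICML-style ranges. Style names are placeholders to be matched against
     the InDesign template's defined styles. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:x="http://www.w3.org/1999/xhtml">

  <xsl:template match="x:p">
    <ParagraphStyleRange AppliedParagraphStyle="ParagraphStyle/Body">
      <CharacterStyleRange>
        <Content><xsl:value-of select="."/></Content>
      </CharacterStyleRange>
      <Br/>
    </ParagraphStyleRange>
  </xsl:template>

  <xsl:template match="x:h1">
    <ParagraphStyleRange AppliedParagraphStyle="ParagraphStyle/ChapterTitle">
      <CharacterStyleRange>
        <Content><xsl:value-of select="."/></Content>
      </CharacterStyleRange>
      <Br/>
    </ParagraphStyleRange>
  </xsl:template>
</xsl:stylesheet>
```

Run through any XSLT 1.0 processor (the prototype used xsltproc), templates like these produce the .icml file that gets “placed” into the InDesign layout.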
  • The result is very simple and easy to use. Our demonstration requires that a production editor run the XSLT transformation script manually, but there is no reason why this couldn’t be built directly into the Web content management system so that exporting the content to print ran the transformation automatically. The resulting file would then be “placed” in InDesign and proofed.
  • It should be noted that the Book Publishing 1 proof-of-concept was artificially complex; we began with a book laid out in InDesign and ended up with a look-alike book laid out in InDesign. But next time—for instance, when we publish Book Publishing 2—we can begin the process with the content on the Web, and keep it there throughout the editorial process. The book’s content could potentially be written and edited entirely online, as Web content, and then automatically poured into an InDesign template at proof time. “Just in time,” as they say. This represents an entirely new way of thinking of book production. With a Web-first orientation, it makes little sense to think of the book as “in print” or “out of print”—the book is simply available, in the first place online; in the second place in derivative digital formats; and third, but really not much more difficult, in print-ready format, via the usual InDesign CS print production system publishers are already familiar with.
  • Creating Ebook Files
    Creating electronic versions from XHTML source is vastly simpler than trying to generate these out of the existing print process. The ePub version is extremely easy to generate; so is online marketing copy or excerpts for the Web, since the content begins life Web-native.
  • Since an ePub file is essentially XHTML content in a special wrapper, all that is required is that we properly “wrap” our XHTML content. Ideally, the content in an ePub file is broken into chapters (as ours was) and a table of contents file is generated in order to allow easy navigation within an ebook reader. We used Julian Smart’s free tool eCub[19] to simply and automatically generate the ePub wrapper and the table of contents. The only custom development we did was to create a CSS stylesheet for the ebook so that headings and paragraph indents looked the way we wanted. Starting with XHTML content, creating ePub is almost too easy.
  • Such a workflow—beginning with the Web and exporting to print—is surely more in line with the way we will do business in the 21st century, where the Web is the default platform for reaching audiences, developing content, and putting the pieces together. It is time, we suggest, for publishers to re-orient their operations and start with the Web.
  • Our project demonstrates that Web technologies are indeed good enough to use in an XML-oriented workflow; more specialized and expensive options are not necessarily required. For massive-scale enterprise publishing, this approach may not offer enough flexibility, and the challenge of adding and extracting extra semantic richness may prove more trouble than it's worth.
  • But for smaller firms who are looking at the straightforward benefits of XML-based processes—single source publishing, online content and workflow management, open and accessible archive formats, greater online discoverability—here is a way forward.
  • Rather than a public-facing website, our system relies on the Web as a content management platform—of course a public face could easily be added.
  • The final piece of our puzzle, the ability to integrate print production, was made possible by Adobe's release of InDesign with an open XML file format. Since the Web's XHTML is also XML, it can be easily and confidently transformed to the InDesign format.
  • Today, we are able to put the process together using nothing but standard, relatively ubiquitous Web tools: the Web itself as an editing and content management environment, standard Web scripting tools for the conversion process, and the well-documented IDML file format to integrate the layout tool.
  •  
    I was looking for an answer to a problem Marbux had presented, and found this interesting article.  The issue was that of the upcoming conversion of the Note Case Pro (NCP) layout engine to the WebKit layout engine, and what to do about the NCP document format. My initial reaction was to encode the legacy NCP document format in XML, and run an XSLT to a universal pivot format like TEI-XML.  From there, the TEI-XML community would provide all the XSLT transformation routines for conversion to ODF, OOXML, XHTML, ePUB and HTML/CSS. Researching the problems one might encounter with this approach, I found this article.  Fascinating stuff. My takeaway is that TEI-XML would not be as effective a "universal pivot point" as XHTML.  Or perhaps, if NCP really wants to get aggressive, IDML - InDesign Markup Language. The important point though is that XHTML is the browser-native application of XML, and compatible with the WebKit layout engine Miro wants to move NCP to. The concept of encoding an existing application-specific format in XML has been around since 1998, when XML was first introduced as a W3C standard, a "structured" subset of SGML. (HTML is likewise an SGML application.) The multiplatform StarOffice productivity suite became "OpenOffice" after Sun purchased Star Division in 1999 and open-sourced the code base. The OpenOffice developer team came out with an XML encoding of their existing document formats in 2000. That application-specific encoding became an OASIS document format standard proposal in 2002 - also known as ODF. Microsoft followed OpenOffice with an XML encoding of its application-specific binary document formats, known as OOXML. Encoding the existing NCP format in XML, specifically targeting XHTML as a "universal pivot point", would put the NCP Outliner in the Web editor category, without breaking backwards compatibility. The trick is in the XSLT conversion process. But I think that is something much easier to handle than trying to
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 1 views

  • There was a simple aim at the heart of the top-secret program: record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant.
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying between three and 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explain in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
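The metadata/content split for web addresses described earlier (the host alone counts as metadata, while the full address including the path counts as content) can be sketched in a few lines of Python. This is only an illustrative reconstruction of the distinction the documents describe, not any actual GCHQ tooling, and the field names are assumptions:

```python
from urllib.parse import urlparse

def split_visit_record(url: str) -> dict:
    """Illustrative split of a visited URL into a 'metadata' part
    (scheme and host) and a 'content' part (the rest of the address),
    per the distinction described in the GCHQ documents."""
    parts = urlparse(url)
    return {
        "metadata": f"{parts.scheme}://{parts.netloc}",
        "content": parts.path + (f"?{parts.query}" if parts.query else ""),
    }

record = split_visit_record("https://www.gchq.gov.uk/what_we_do")
# Under the described rules, only the first field would be treated as
# metadata; the path portion would be handled as content.
print(record["metadata"])  # https://www.gchq.gov.uk
print(record["content"])   # /what_we_do
```

As Zuckerman's point suggests, even the "metadata" half of such records becomes highly revealing once accumulated over weeks or months.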
Gonzalo San Gil, PhD.

Making UEFI Secure Boot Work With Open Platforms | The Linux Foundation - 0 views

  •  
    ["Secure boot" is a technology described by recent revisions of the UEFI specification; it offers the prospect of a hardware-verified, malware-free operating system bootstrap process that can improve the security of many system deployments. Linux and other open operating systems will be able to take advantage of secure boot if it is implemented properly in the hardware. This document is intended to describe how the UEFI secure boot specification can be implemented to interoperate well with open systems and to avoid adversely affecting the rights of the owners of those systems while providing compliance with proprietary software vendors' requirements. To learn more about the recommendations please download the .PDF below. ]
Gonzalo San Gil, PhD.

Do you need a container-specific Linux distribution? | ITworld - 0 views

  •  
    [... As Docker in particular and containers in general explode in popularity, operating system companies are taking a different tack. They're now arguing that to make the most of containers you need a skinny operating system to go with them. ...]
Gary Edwards

Skynet rising: Google acquires 512-qubit quantum computer; NSA surveillance to be turne... - 0 views

  •  
    "The ultimate code breakers" If you know anything about encryption, you probably also realize that quantum computers are the secret KEY to unlocking all encrypted files. As I wrote about last year here on Natural News, once quantum computers go into widespread use by the NSA, the CIA, Google, etc., there will be no more secrets kept from the government. All your files - even encrypted files - will be easily opened and read. Until now, most people believed this day was far away. Quantum computing is an "impractical pipe dream," we've been told by scowling scientists and "flat Earth" computer engineers. "It's not possible to build a 512-qubit quantum computer that actually works," they insisted. Don't tell that to Eric Ladizinsky, co-founder and chief scientist of a company called D-Wave. Because Ladizinsky's team has already built a 512-qubit quantum computer. And they're already selling them to wealthy corporations, too. DARPA, Northrop Grumman and Goldman Sachs In case you're wondering where Ladizinsky came from, he's a former employee of Northrop Grumman Space Technology (yes, a weapons manufacturer) where he ran a multi-million-dollar quantum computing research project for none other than DARPA - the same group working on AI-driven armed assault vehicles and battlefield robots to replace human soldiers. .... When groundbreaking new technology is developed by smart people, it almost immediately gets turned into a weapon. Quantum computing will be no different. This technology grants God-like powers to police state governments that seek to dominate and oppress the People.  ..... Google acquires "Skynet" quantum computers from D-Wave According to an article published in Scientific American, Google and NASA have now teamed up to purchase a 512-qubit quantum computer from D-Wave. The computer is called "D-Wave Two" because it's the second generation of the system. The first system was a 128-qubit computer. Gen two
  •  
    Normally, I'd be suspicious of anything published by Infowars because its editors are willing to publish really over the top stuff, but: [i] this is subject matter I've maintained an interest in over the years and I was aware that working quantum computers were imminent; and [ii] the pedigree on this particular information does not trace to Scientific American, as stated in the article. I've known Scientific American to publish at least one soothing and lengthy article on the subject of chlorinated dioxin hazard -- my specialty as a lawyer was litigating against chemical companies that generated dioxin pollution -- that was generated by known closet chemical industry advocates long since discredited and was totally lacking in scientific validity and contrary to established scientific knowledge. So publication in Scientific American doesn't pack a lot of weight with me. But checking the Scientific American linked article, I see that it was reprinted by permission from Nature, a peer-reviewed scientific journal and news organization that I trust much more. That said, the InfoWars version is a rewrite that contains lots of information not in the Nature/Scientific American version of a sensationalist nature, so heightened caution is still in order. Check the reprinted Nature version before getting too excited: "The D-Wave computer is not a 'universal' computer that can be programmed to tackle any kind of problem. But scientists have found they can usefully frame questions in machine-learning research as optimisation problems. "D-Wave has battled to prove that its computer really operates on a quantum level, and that it is better or faster than a conventional computer. Before striking the latest deal, the prospective customers set a series of tests for the quantum computer. D-Wave hired an outside expert in algorithm-racing, who concluded that the speed of the D-Wave Two was above average overall, and that it was 3,600 times faster than a leading conventional computer."
Paul Merrell

Rapid - Press Releases - EUROPA - 0 views

  • MEMO/09/15 Brussels, 17th January 2009
  • The European Commission can confirm that it has sent a Statement of Objections (SO) to Microsoft on 15th January 2009. The SO outlines the Commission’s preliminary view that Microsoft’s tying of its web browser Internet Explorer to its dominant client PC operating system Windows infringes the EC Treaty rules on abuse of a dominant position (Article 82).
  • In the SO, the Commission sets out evidence and outlines its preliminary conclusion that Microsoft’s tying of Internet Explorer to the Windows operating system harms competition between web browsers, undermines product innovation and ultimately reduces consumer choice. The SO is based on the legal and economic principles established in the judgment of the Court of First Instance of 17 September 2007 (case T-201/04), in which the Court of First Instance upheld the Commission's decision of March 2004 (see IP/04/382), finding that Microsoft had abused its dominant position in the PC operating system market by tying Windows Media Player to its Windows PC operating system (see MEMO/07/359).
  • ...3 more annotations...
  • The evidence gathered during the investigation leads the Commission to believe that the tying of Internet Explorer with Windows, which makes Internet Explorer available on 90% of the world's PCs, distorts competition on the merits between competing web browsers insofar as it provides Internet Explorer with an artificial distribution advantage which other web browsers are unable to match. The Commission is concerned that through the tying, Microsoft shields Internet Explorer from head to head competition with other browsers which is detrimental to the pace of product innovation and to the quality of products which consumers ultimately obtain. In addition, the Commission is concerned that the ubiquity of Internet Explorer creates artificial incentives for content providers and software developers to design websites or software primarily for Internet Explorer which ultimately risks undermining competition and innovation in the provision of services to consumers.
  • Microsoft has 8 weeks to reply to the SO, and will then have the right to be heard in an Oral Hearing should it wish to do so. If the preliminary views expressed in the SO are confirmed, the Commission may impose a fine on Microsoft, require Microsoft to cease the abuse and impose a remedy that would restore genuine consumer choice and enable competition on the merits.
  • A Statement of Objections is a formal step in Commission antitrust investigations in which the Commission informs the parties concerned in writing of the objections raised against them. The addressee of a Statement of Objections can reply in writing to the Statement of Objections, setting out all facts known to it which are relevant to its defence against the objections raised by the Commission. The party may also request an oral hearing to present its comments on the case. The Commission may then take a decision on whether conduct addressed in the Statement of Objections is compatible or not with the EC Treaty’s antitrust rules. Sending a Statement of Objections does not prejudge the final outcome of the procedure. In the March 2004 Decision the Commission ordered Microsoft to offer to PC manufacturers a version of its Windows client PC operating system without Windows Media Player. Microsoft, however, retained the right to also offer a version with Windows Media Player (see IP/04/382).
  •  
    It's official, hot off the presses (wasn't there a few minutes ago). We're now into a process where DG Competition will revisit its previous order requiring Microsoft to market two versions of Windows, one with Media Player and one without. DG Competition staff were considerably outraged that Microsoft took advantage of a bit of under-specification in the previous order and sold the two versions at the same price. That detail will not be neglected this time around. Moreover, given the ineffectiveness of the previous order in restoring competition among media players, don't be surprised if this results in an outright ban on bundling MSIE with Windows.
Paul Merrell

BitTorrent Sync creates private, peer-to-peer Dropbox, no cloud required | Ars Technica - 6 views

  • BitTorrent today released folder syncing software that replicates files across multiple computers using the same peer-to-peer file sharing technology that powers BitTorrent clients. The free BitTorrent Sync application is labeled as being in the alpha stage, so it's not necessarily ready for prime-time, but it is publicly available for download and working as advertised on my home network. BitTorrent, Inc. (yes, there is a legitimate company behind BitTorrent) took to its blog to announce the move from a pre-alpha, private program to the publicly available alpha. Additions since the private alpha include one-way synchronization, one-time secrets for sharing files with a friend or colleague, and the ability to exclude specific files and directories.
  • BitTorrent Sync provides "unlimited, secure file-syncing," the company said. "You can use it for remote backup. Or, you can use it to transfer large folders of personal media between users and machines; editors and collaborators. It’s simple. It’s free. It’s the awesome power of P2P, applied to file-syncing." File transfers are encrypted, with private information never being stored on an external server or in the "cloud." "Since Sync is based on P2P and doesn’t require a pit-stop in the cloud, you can transfer files at the maximum speed supported by your network," BitTorrent said. "BitTorrent Sync is specifically designed to handle large files, so you can sync original, high quality, uncompressed files."
  •  
    Direct P2P encrypted file syncing, no cloud intermediate, which should translate to far more secure exchange of files, with less opportunity for snooping by governments or others, than with cloud-based services. 
  • ...5 more comments...
  •  
    Hey Paul, is there an open source document management system that I could hook the BitTorrent Sync to?
  •  
    More detail please. What do you want to do with the doc management system? Platform? Server-side or stand-alone? Industrial strength and highly configurable or lightweight and simple? What do you mean by "hook?" Not that I would be able to answer anyway. I really know very little about BitTorrent Sync. In fact, as far as I'd gone before your question was to look at the FAQ. It's linked from . But there's a link to a forum on the same page. Giving the first page a quick scan confirms that this really is alpha-state software. But that would probably be a better place to ask. (Just give them more specific information of what you'd like to do.) There are other projects out there working on getting around the surveillance problem. I2P is one that is a farther along than BitTorrent Sync and quite a bit more flexible. See . (But I haven't used it, so caveat emptor.)
  •  
    There is a great list of PRISM Proof software at http://prism-break.org/. Includes a link to I2P. I want to replace gmail though, but would like another Web based system since I need multi device access. Of course, I need to replace my Google Apps / Google Docs system. That's why I asked about a PRISM Proof sync-share-store DMS. My guess is that there are many users similarly seeking a PRISM Proof platform of communications, content and collaborative computing systems. BusinessInsider.com is crushed with articles about Google struggling to squirm out from under the NSA PRISM boot-on-the-back-of-their-neck situation. As if blaming the NSA makes up for the dragnet that they consented/allowed/conceded to cover their entire platform. Perhaps we should be watching Germany? There must be tons of startup operations underway, all seeking to replace Google, Amazon, FaceBook, Microsoft, Skype and so many others. It's a great day for Libertyware :)
  •  
    Is the NSA involvement the "Kiss of Death"? Google seems to think so. I'm wondering what the impact would be if ZOHO were to announce a PRISM Proof productivity platform?
  •  
    It is indeed. The E.U. has far more protective digital privacy rights than we do (none). If you're looking for a Dropbox replacement (you should be), for a cloud-based solution take a look at . Unlike Dropbox, all of the encryption/decryption happens on your local machine; Wuala never sees your files unencrypted. Dropbox folks have admitted that there's no technical barrier to them looking at your files. Their encrypt/decrypt operations are done in the cloud (if they actually bother) and they have the key. Which makes it more chilling that the PRISM docs Snowden leaked make reference to Dropbox being the next cloud service NSA plans to add to their collection. Wuala also is located (as are its servers) in Switzerland, which also has far stronger digital data privacy laws than the U.S. Plus the Swiss are well along the path to E.U. membership; they've ratified many of the E.U. treaties including the treaty on Human Rights, which as I recall is where the digital privacy sections are. I've begun to migrate from Dropbox to Wuala. It seems to be neck and neck with Dropbox on features and supported platforms, with the advantage of a far more secure approach and 5 GB free. But I'd also love to see more approaches akin to I2P and BitTorrent Sync that provide the means to bypass the cloud. Don't depend on government to ensure digital privacy, route around the government voyeurs. Hmmm ... I wonder if the NSA has the computer capacity to handle millions of people switching to encrypted communication? :-) Thanks for the link to the software list.
  •  
    Re: Google. I don't know if it's the "kiss of death" but they're definitely going to take a hit, particularly outside the U.S. BTW, I'm remembering from a few years back when the ODF Foundation was still kicking. I did a fair bit of research on the bureaucratic forces in the E.U. that were pushing for the Open Document Exchange Formats. That grew out of a then-ongoing push to get all of the E.U. nations connected via a network that is not dependent on the Internet. It was fairly complete at the time down to the national level and was branching out to the local level, and the plan from there was to push connections to business and then to Joe Sixpack and wife. Interop was key, hence ODEF. The E.U. might not be that far away from an ability to sever the digital connections with the U.S. Say a bunch of daisy-chained proxy anonymizers for communications with the U.S. Of course they'd have to block the UK from the network and treat it like it is the U.S. There's a formal signals intelligence service collaboration/integration dating back to WW 2, as I recall, among the U.S., the U.K., Canada, Australia, and New Zealand. Don't remember its name. But it's the same group of nations that were collaborating on Echelon. So the E.U. wouldn't want to let the UK fox inside their new chicken coop. Ah, it's just a fantasy. The U.S. and the E.U. are too interdependent. I have no idea how hard it would be for the Zoho folk to come up with desktop-side encryption/decryption. And I don't know whether their servers are located outside the reach of a U.S. court's search warrant. But I think Google is going to have to move in that direction fast if it wants to minimize the damage. Or get way out in front of the hounds chomping at the NSA's ankles and reduce the NSA to compost. OTOH, Google might be a government covert op. for all I know. :-) I'm really enjoying watching the NSA show. Who knows what facet of their Big Brother operation gets revealed next?
  •  
    ZOHO is an Indian company with USA marketing offices. No idea where the server farm is located, but they were not on the NSA list. I've known Raju Vegesna for years, mostly from the old Web 2.0 and Office 2.0 Conferences. Raju runs the USA offices in Santa Clara. I'll try to catch up with him on Thursday. How he could miss this once-in-a-lifetime moment to clean out Google, Microsoft and SalesForce.com is something I'd like to find out about. Thanks for the Wuala tip. You sent me that years ago, when I was working on research and design for the SurDocs project. Incredible that all our notes, research, designs and correspondence were left to rot in Google Wave! Too too funny. I recall telling Alex from SurDocs that he had to use a USA host, like Amazon, that could be trusted by USA customers to keep their docs safe and secure. Now look what I've done! I've tossed his entire company information set into the laps of the NSA and their cabal of connected corporatists :)
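The design difference discussed in this thread (encrypting and decrypting on the user's own machine, so the storage provider only ever holds ciphertext and never the key) can be sketched in a few lines of Python. This is a toy construction for illustration only, not Wuala's or anyone's actual scheme; real client-side encryption should use a vetted library such as NaCl/libsodium:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a keystream by hashing key + nonce + a block counter.
    # Toy construction for illustration; use a vetted cipher in practice.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_locally(plaintext: bytes, key: bytes) -> bytes:
    # Encryption happens on the user's machine; only the result is uploaded.
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt_locally(blob: bytes, key: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

key = secrets.token_bytes(32)   # the key never leaves the user's machine
blob = encrypt_locally(b"my private file", key)
# Only `blob` goes to the provider; without `key` it cannot recover the file.
assert decrypt_locally(blob, key) == b"my private file"
```

The contrast with the Dropbox model described above is that there the encrypt/decrypt step, and the key, live on the provider's servers rather than with the user.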
Gonzalo San Gil, PhD.

European Commision vs Microsoft: chronology of the case - 1 views

  •  
    "1998 Sun complains to the EC on Microsoft's dominant position as a supplier of operating systems for personal computers. 02/2000 EC launches investigation on Microsoft's anti-competitive conduct ("Case No. COMP/C-3/37.792"). Two main issues are under investigation: (1) lack of interoperability information, and (2) incorporation of Windows Media Player with the Windows operating system."
  •  
    EC v. Microsoft was a landmark advance in the law governing software interoperability. In the preceding case in the U.S., the courts refused to set a standard for the degree of "compatibility" that Microsoft would have to provide competitors. Forewarned, the DG Competition prosecutors had done their homework. Commission v. Microsoft, No. T-167/08, European Community Court of First Instance (Grand Chamber Judgment of 17 September, 2007), para. 230, 374, 421, http://preview.tinyurl.com/chsdb4w (rejecting Microsoft's argument that "interoperability" has a 1-way rather than 2-way meaning; information technology specifications must be disclosed with sufficient specificity to place competitors on an "equal footing" with Microsoft's own software in regard to interoperability; "the 12th recital to Directive 91/250 defines interoperability as 'the ability to exchange information and mutually to use the information which has been exchanged'").
Gary Edwards

With faster Chrome browser, Google offers an Android alternative - CNET - 0 views

  •  
    "On mobile devices, the Web hasn't lived up to its promise of a universal programming foundation. Google is trying to change that." Android hogged the spotlight at Google I/O, but performance improvements in Google's Chrome browser show that the company hasn't given up on trying to advance its other programming foundation -- the Web. The mobile version of Chrome has become much more responsive since 2013, said Paul Irish, a developer advocate on the Chrome team, speaking at the San Francisco conference. "We've improved the speed of animation by 75 percent and of scrolling 35 percent," Irish told developers Thursday. "We're committed to getting you 60 frames per second on the mobile Web." That performance is crucial for persuading people to use Web sites rather than native apps for things like posting on social networks, reading news, and playing games. It's also key to getting programmers to take the Web path when so many today focus on native apps written directly for Google's Android operating system and Apple's iOS competitor. The 60 frames-per-second rate refers to how fast the screen redraws when elements are in motion, either during games or when people are doing things like swiping among pages and dragging icons. The 60fps threshold is the minimum that game developers strive for, and to achieve it with no distracting stutters, a device must calculate how to update its entire screen every 16.7 milliseconds. Google, whose Android operating system initially lagged Apple's rival iOS significantly in this domain of responsiveness, has made great strides in improving its OS and its apps. But the mobile Web hasn't kept pace, and that means programmers have been more likely to aim for native apps rather than Web-based apps that can run on any device. ............................ Good review focused on the growing threat that native "platform-specific" apps are replacing Web apps as the developer's best choice. Florian thinks that native apps will win
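The 16.7 millisecond figure quoted above follows directly from the 60 frames-per-second target: the per-frame budget is just 1000 ms divided by the frame rate, as a quick sketch shows:

```python
def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to compute and draw one frame at a given rate."""
    return 1000 / fps

# At the 60fps threshold game developers strive for, each frame must be
# produced in roughly 16.7 ms; dropping to 30fps doubles the budget.
print(round(frame_budget_ms(60), 1))  # 16.7
print(round(frame_budget_ms(30), 1))  # 33.3
```

Miss that budget on even a single frame and the result is the visible stutter ("jank") the Chrome team is trying to eliminate on the mobile Web.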
Paul Merrell

Canada Casts Global Surveillance Dragnet Over File Downloads - The Intercept - 0 views

  • Canada’s leading surveillance agency is monitoring millions of Internet users’ file downloads in a dragnet search to identify extremists, according to top-secret documents. The covert operation, revealed Wednesday by CBC News in collaboration with The Intercept, taps into Internet cables and analyzes records of up to 15 million downloads daily from popular websites commonly used to share videos, photographs, music, and other files. The revelations about the spying initiative, codenamed LEVITATION, are the first from the trove of files provided by National Security Agency whistleblower Edward Snowden to show that the Canadian government has launched its own globe-spanning Internet mass surveillance system. According to the documents, the LEVITATION program can monitor downloads in several countries across Europe, the Middle East, North Africa, and North America. It is led by the Communications Security Establishment, or CSE, Canada’s equivalent of the NSA. (The Canadian agency was formerly known as “CSEC” until a recent name change.)
  • The latest disclosure sheds light on Canada’s broad existing surveillance capabilities at a time when the country’s government is pushing for a further expansion of security powers following attacks in Ottawa and Quebec last year. Ron Deibert, director of University of Toronto-based Internet security think tank Citizen Lab, said LEVITATION illustrates the “giant X-ray machine over all our digital lives.” “Every single thing that you do – in this case uploading/downloading files to these sites – that act is being archived, collected and analyzed,” Deibert said, after reviewing documents about the online spying operation for CBC News. David Christopher, a spokesman for Vancouver-based open Internet advocacy group OpenMedia.ca, said the surveillance showed “robust action” was needed to rein in the Canadian agency’s operations.
  • In a top-secret PowerPoint presentation, dated from mid-2012, an analyst from the agency jokes about how, while hunting for extremists, the LEVITATION system gets clogged with information on innocuous downloads of the musical TV series Glee. CSE finds some 350 “interesting” downloads each month, the presentation notes, a number that amounts to less than 0.0001 per cent of the total collected data. The agency stores details about downloads and uploads to and from 102 different popular file-sharing websites, according to the 2012 document, which describes the collected records as “free file upload,” or FFU, “events.” Only three of the websites are named: RapidShare, SendSpace, and the now defunct MegaUpload.
  • ...3 more annotations...
  • “The specific uses that they talk about in this [counter-terrorism] context may not be the problem, but it’s what else they can do,” said Tamir Israel, a lawyer with the University of Ottawa’s Canadian Internet Policy and Public Interest Clinic. Picking which downloads to monitor is essentially “completely at the discretion of CSE,” Israel added. The file-sharing surveillance also raises questions about the number of Canadians whose downloading habits could have been swept up as part of LEVITATION’s dragnet. By law, CSE isn’t allowed to target Canadians. In the LEVITATION presentation, however, two Canadian IP addresses that trace back to a web server in Montreal appear on a list of suspicious downloads found across the world. The same list includes downloads that CSE monitored in closely allied countries, including the United Kingdom, United States, Spain, Brazil, Germany and Portugal. It is unclear from the document whether LEVITATION has ever prevented any terrorist attacks. The agency cites only two successes of the program in the 2012 presentation: the discovery of a hostage video through a previously unknown target, and an uploaded document that contained the hostage strategy of a terrorist organization. The hostage in the discovered video was ultimately killed, according to public reports.
  • LEVITATION does not rely on cooperation from any of the file-sharing companies. A separate secret CSE operation codenamed ATOMIC BANJO obtains the data directly from internet cables that it has tapped into, and the agency then sifts out the unique IP address of each computer that downloaded files from the targeted websites. The IP addresses are valuable pieces of information to CSE’s analysts, helping to identify people whose downloads have been flagged as suspicious. The analysts use the IP addresses as a kind of search term, entering them into other surveillance databases that they have access to, such as the vast repositories of intercepted Internet data shared with the Canadian agency by the NSA and its British counterpart Government Communications Headquarters. If successful, the searches will return a list of results showing other websites visited by the people downloading the files – in some cases revealing associations with Facebook or Google accounts. In turn, these accounts may reveal the names and the locations of individual downloaders, opening the door for further surveillance of their activities.
Gonzalo San Gil, PhD.

How to use the NMAP Security Scanner on Linux - 1 views

  •  
    "Nmap is a free and open source network discovery and security auditing utility that is widely used in the Linux users community as it is simple to use yet very powerful. Nmap works by sending data packets on a specific target (by IP) and by interpreting the incoming packets to determine what ports are open/closed, what services are running on the scanned system, whether firewalls or filters are set up and enabled, and finally what operating system is running. Those abilities are used for a wide variety of reasons and howtoforge.com is not encouraging nor suggesting the use of nmap for malicious purposes"
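The core idea in that quote can be illustrated with a toy TCP "connect" scan: try to open a connection to each port and classify it by the result. This is only a conceptual sketch; nmap itself works at the raw-packet level with many more probe types (SYN scans, service and OS fingerprinting, firewall detection), and scanning hosts you do not own is generally prohibited.

```python
# Toy illustration of a TCP "connect" scan -- the simplest of the
# probe techniques nmap automates. Only scan hosts you are
# authorized to test.
import socket

def scan_ports(host: str, ports, timeout: float = 0.5):
    """Return the subset of `ports` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means connected
                open_ports.append(port)
    return open_ports
```

For example, `scan_ports("127.0.0.1", range(8000, 8010))` reports which of those local ports have a listening service, roughly what `nmap -sT -p 8000-8009 127.0.0.1` would show.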
Paul Merrell

Prepare to Hang Up the Phone, Forever - WSJ.com - 0 views

  • At decade's end, the trusty landline telephone could be nothing more than a memory. Telecom giants AT&T Inc. and Verizon Communications Inc. …
  • The two providers want to lay the crumbling POTS to rest and replace it with Internet Protocol-based systems that use the same wired and wireless broadband networks that bring Web access, cable programming and, yes, even your telephone service, into your homes. You may think you have a traditional landline because your home phone plugs into a jack, but if you have bundled your phone with Internet and cable services, you're making calls over an IP network, not twisted copper wires. California, Florida, Texas, Georgia, North Carolina, Wisconsin and Ohio are among states that agree telecom resources would be better redirected into modern telephone technologies and innovations, and will kill copper-based technologies in the next three years or so. Kentucky and Colorado are weighing similar laws, which force people to go wireless whether they want to or not. In Mantoloking, N.J., Verizon wants to replace the landline system, which Hurricane Sandy wiped out, with its wireless Voice Link. That would make it the first entire town to go landline-less, a move that isn't sitting well with all residents.
  • New Jersey's legislature, worried about losing data applications such as credit-card processing and alarm systems that wireless systems can't handle, wants a one-year moratorium to block that switch. It will vote on the measure this month. (Verizon tried a similar change in Fire Island, N.Y., when its copper lines were destroyed, but public opposition persuaded Verizon to install fiber-optic cable.) It's no surprise that landlines are unfashionable, considering many of us already have or are preparing to ditch them. More than 38% of adults and 45.5% of children live in households without a landline telephone, says the Centers for Disease Control and Prevention. That means two in every five U.S. homes, or 39%, are wireless, up from 26.6% three years ago. Moreover, a scant 8.5% of households relied only on a landline, while 2% were phoneless in 2013. Metropolitan residents have few worries about the end of landlines. High-speed wire and wireless services are abundant and work well, despite occasional dropped calls. Those living in rural areas, where cell towers are few and 4G capability limited, face different issues.
  • ...2 more annotations...
  • Safety is one of them. Call 911 from a landline and the emergency operator pinpoints your exact address, down to the apartment number. Wireless phones lack those specifics, and even with GPS navigation aren't as precise. Matters are worse in rural and even suburban areas that signals don't reach, sometimes because they're blocked by buildings or the landscape. That's of concern to the Federal Communications Commission, which oversees all forms of U.S. communications services. Universal access is a tenet of its mission, and, despite the state-by-state degradation of the mandate, it's unwilling to let telecom companies simply drop geographically undesirable customers. Telecom firms need FCC approval to ax services completely, and can't do so unless there is a viable competitor to pick up the slack. Last year AT&T asked to turn off its legacy network, which could create gaps in universal coverage and will force people off the grid to get a wireless provider.
  • AT&T and the FCC will soon begin trials to explore life without copper-wired landlines. Consumers will voluntarily test IP-connected networks and their impact on towns like Carbon Hills, Ala., population 2,071. They want to know how households will reach 911, how small businesses will connect to customers, how people with medical-monitoring devices or home alarms know they will always be connected to a reliable network, and what the costs are. "We cannot be a nation of opportunity without networks of opportunity," said FCC Chairman Tom Wheeler in unveiling the plan. "This pilot program will help us learn how fiber might be deployed where it is not now deployed…and how new forms of wireless can reach deep into the interior of rural America."
Paul Merrell

European Lawmakers Demand Answers on Phone Key Theft - The Intercept - 0 views

  • European officials are demanding answers and investigations into a joint U.S. and U.K. hack of the world’s largest manufacturer of mobile SIM cards, following a report published by The Intercept Thursday. The report, based on leaked documents provided by NSA whistleblower Edward Snowden, revealed the U.S. spy agency and its British counterpart Government Communications Headquarters, GCHQ, hacked the Franco-Dutch digital security giant Gemalto in a sophisticated heist of encrypted cell-phone keys. The European Parliament’s chief negotiator on the European Union’s data protection law, Jan Philipp Albrecht, said the hack was “obviously based on some illegal activities.” “Member states like the U.K. are frankly not respecting the [law of the] Netherlands and partner states,” Albrecht told the Wall Street Journal. Sophie in ’t Veld, an EU parliamentarian with D66, the Netherlands’ largest opposition party, added, “Year after year we have heard about cowboy practices of secret services, but governments did nothing and kept quiet […] In fact, those very same governments push for ever-more surveillance capabilities, while it remains unclear how effective these practices are.”
  • “If the average IT whizzkid breaks into a company system, he’ll end up behind bars,” In ’t Veld added in a tweet Friday. The EU itself is barred from undertaking such investigations, leaving individual countries responsible for looking into cases that impact their national security matters. “We even get letters from the U.K. government saying we shouldn’t deal with these issues because it’s their own issue of national security,” Albrecht said. Still, lawmakers in the Netherlands are seeking investigations. Gerard Schouw, a Dutch member of parliament, also with the D66 party, has called on Ronald Plasterk, the Dutch minister of the interior, to answer questions before parliament. On Tuesday, the Dutch parliament will debate Schouw’s request. Additionally, European legal experts tell The Intercept, public prosecutors in EU member states that are both party to the Cybercrime Convention, which prohibits computer hacking, and home to Gemalto subsidiaries could pursue investigations into the breach of the company’s systems.
  • According to secret documents from 2010 and 2011, a joint NSA-GCHQ unit penetrated Gemalto’s internal networks and infiltrated the private communications of its employees in order to steal encryption keys, embedded on tiny SIM cards, which are used to protect the privacy of cellphone communications across the world. Gemalto produces some 2 billion SIM cards a year. The company’s clients include AT&T, T-Mobile, Verizon, Sprint and some 450 wireless network providers. “[We] believe we have their entire network,” GCHQ boasted in a leaked slide, referring to the Gemalto heist.
  • ...4 more annotations...
  • While Gemalto was indeed another casualty in Western governments’ sweeping effort to gather as much global intelligence advantage as possible, the leaked documents make clear that the company was specifically targeted. According to the materials published Thursday, GCHQ used a specific codename — DAPINO GAMMA — to refer to the operations against Gemalto. The spies also actively penetrated the email and social media accounts of Gemalto employees across the world in an effort to steal the company’s encryption keys. Evidence of the Gemalto breach rattled the digital security community. “Almost everyone in the world carries cell phones and this is an unprecedented mass attack on the privacy of citizens worldwide,” said Greg Nojeim, senior counsel at the Center for Democracy & Technology, a non-profit that advocates for digital privacy and free online expression. “While there is certainly value in targeted surveillance of cell phone communications, this coordinated subversion of the trusted technical security infrastructure of cell phones means the US and British governments now have easy access to our mobile communications.”
  • For Gemalto, evidence that their vaunted security systems and the privacy of customers had been compromised by the world’s top spy agencies made an immediate financial impact. The company’s shares took a dive on the Paris bourse Friday, falling $500 million. In the U.S., Gemalto’s shares fell as much as 10 percent Friday morning. They had recovered somewhat — down 4 percent — by the close of trading on the Euronext stock exchange. Analysts at Dutch financial services company Rabobank speculated in a research note that Gemalto could be forced to recall “a large number” of SIM cards. The French daily L’Express noted today that Gemalto board member Alex Mandl was a founding trustee of the CIA-funded venture capital firm In-Q-Tel. Mandl resigned from In-Q-Tel’s board in 2002, when he was appointed CEO of Gemplus, which later merged with another company to become Gemalto. But the CIA connection still dogged Mandl, with the French press regularly insinuating that American spies could infiltrate the company. In 2003, a group of French lawmakers tried unsuccessfully to create a commission to investigate Gemplus’s ties to the CIA and its implications for the security of SIM cards. Mandl, an Austrian-American businessman who was once a top executive at AT&T, has denied that he had any relationship with the CIA beyond In-Q-Tel. In 2002, he said he did not even have a security clearance.
  • AT&T, T-Mobile and Verizon could not be reached for comment Friday. Sprint declined to comment. Vodafone, the world’s second largest telecom provider by subscribers and a customer of Gemalto, said in a statement, “[W]e have no further details of these allegations which are industrywide in nature and are not focused on any one mobile operator. We will support industry bodies and Gemalto in their investigations.” Deutsche Telekom AG, a German company, said it has changed encryption algorithms in its Gemalto SIM cards. “We currently have no knowledge that this additional protection mechanism has been compromised,” the company said in a statement. “However, we cannot rule out this completely.”
  • Update: Asked about the SIM card heist, White House press secretary Josh Earnest said he did not expect the news would hurt relations with the tech industry: “It’s hard for me to imagine that there are a lot of technology executives that are out there that are in a position of saying that they hope that people who wish harm to this country will be able to use their technology to do so. So, I do think in fact that there are opportunities for the private sector and the federal government to coordinate and to cooperate on these efforts, both to keep the country safe, but also to protect our civil liberties.”
  •  
    Watch for massive class action product defect litigation to be filed against the phone companies and mobile device manufacturers. In most U.S. jurisdictions, proof that the vendors/manufacturers knew of the product defect is not required, only proof of the defect. Also, this is a golden opportunity for anyone who wants to get out of a pricey cellphone contract, since providing a compromised cellphone is a material breach of warranty, whether explicit or implied.
Paul Merrell

Fight over 'forms' clouds future of Net applications | Pagalz.com - Blog - 0 views

  • As Net heavyweights vie to define the next generation of Web applications, the Web’s main standards body is facing a revolt within its own ranks over electronic forms, a cornerstone of interactive documents.
  • “The W3C is saying the answer is XForms. Microsoft is saying it’s XAML. Macromedia is saying its Flash MX. And Mozilla is saying it’s XUL.
  • Though the success of one method or another might not seem to make much difference to the person filling out an order form, the fate of open standards in the process could determine whether that form can relay the data it collects to any standards-compliant database or banking system, or whether it can only operate within certain proprietary systems. The fate of a standard could also determine whether the order form could be accessed in any standards-compliant Web browser, or if it would be available only to users of a particular operating system–an outcome that has browser makers and others worried about the role of Microsoft.
  • ...5 more annotations...
  • browser makers still want a standards-based forms technology to help the Web steer clear of proprietary application platforms. They’re particularly concerned about Microsoft’s sprawling vision for Windows “Longhorn” applications built in the XML-based XAML markup language using Longhorn’s Avalon graphics system. Browsers like Mozilla Firefox, Opera and Apple’s Safari will be useless to access these Internet-based Windows applications.
  • “The WHAT approach works OK for small examples,” Pemberton said. “But actors like the Department of Defense say ‘no scripting.’”
  • The evolution versus revolution debate over forms centers on the use of scripting–specifically JavaScript–to perform important tasks in forms-based applications.
  • “I understand where WHAT is coming from, but they are browser makers, not forms experts,” Pemberton said. “It is important to build something that is future-proof and not a Band-Aid solution. Forms (technology) is the basis of the e-commerce revolution and so it is important to do it right.”
Paul Merrell

Secret Malware in European Union Attack Linked to U.S. and British Intelligence - The I... - 0 views

  • Complex malware known as Regin is the suspected technology behind sophisticated cyberattacks conducted by U.S. and British intelligence agencies on the European Union and a Belgian telecommunications company, according to security industry sources and technical analysis conducted by The Intercept. Regin was found on infected internal computer systems and email servers at Belgacom, a partly state-owned Belgian phone and internet provider, following reports last year that the company was targeted in a top-secret surveillance operation carried out by British spy agency Government Communications Headquarters, industry sources told The Intercept. The malware, which steals data from infected systems and disguises itself as legitimate Microsoft software, has also been identified on the same European Union computer systems that were targeted for surveillance by the National Security Agency.
  • The hacking operations against Belgacom and the European Union were first revealed last year through documents leaked by NSA whistleblower Edward Snowden. The specific malware used in the attacks has never been disclosed, however.
Paul Merrell

Office Business Applications for Store Operations - 0 views

  • Service orientation addresses these challenges by centering on rapidly evolving XML and Web services standards that are revolutionizing how developers compose systems and integrate them over distributed networks. No longer are developers forced to make do with rigid and proprietary languages and object models that used to be the norm before service orientation came into play. The emergence of this new methodology is helping to develop new approaches specifically for Web-based distributed computing. This revolution is transforming the business by integrating disparate systems to establish a real-time enterprise. Making information available where it is needed to simplify merchandising processes requires a methodology that is based on loosely coupled integration between various in-store and back-end applications. This demand makes a service-oriented architecture critical for integrating disparate applications. In addition, surfacing information at the right place requires the ability to compose dynamic applications using an array of underlying services. The Office Business Applications platform provides this ability to create composite applications, such as dashboards for the store, regional, and corporate managers.
  •  
    Summary: Changing market conditions require agility in business applications. Service orientation answers the challenge by centering on XML and Web services standards that revolutionize how developers compose systems and integrate them over distributed networks. Once integrated, how is the information presented to the decision makers? (36 printed pages)
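The loose-coupling idea described above can be sketched as two independent back-end systems that each expose data as plain XML, with a composite "dashboard" assembled by consuming those documents rather than linking against either system's proprietary object model. The element names and payloads below are hypothetical, not the actual Office Business Applications APIs.

```python
# Sketch of service-oriented composition: join two independent XML
# feeds on a shared key (SKU) with no shared object model between
# the producing systems. Payloads and element names are made up.
import xml.etree.ElementTree as ET

INVENTORY_XML = "<inventory><item sku='A1' qty='12'/><item sku='B2' qty='0'/></inventory>"
SALES_XML = "<sales><item sku='A1' sold='30'/><item sku='B2' sold='7'/></sales>"

def compose_dashboard(inventory_xml: str, sales_xml: str) -> dict:
    """Build a store-manager view by joining the two feeds on SKU."""
    qty = {i.get("sku"): int(i.get("qty")) for i in ET.fromstring(inventory_xml)}
    sold = {i.get("sku"): int(i.get("sold")) for i in ET.fromstring(sales_xml)}
    return {sku: {"on_hand": qty[sku], "sold": sold.get(sku, 0)} for sku in qty}
```

Because each feed is self-describing XML, either back-end system can be replaced without touching the dashboard code, which is the agility argument the summary makes.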
Paul Merrell

NZ Prime Minister John Key Retracts Vow to Resign if Mass Surveillance Is Shown - 0 views

  • In August 2013, as evidence emerged of the active participation by New Zealand in the “Five Eyes” mass surveillance program exposed by Edward Snowden, the country’s conservative Prime Minister, John Key, vehemently denied that his government engages in such spying. He went beyond mere denials, expressly vowing to resign if it were ever proven that his government engages in mass surveillance of New Zealanders. He issued that denial, and the accompanying resignation vow, in order to reassure the country over fears provoked by a new bill he advocated to increase the surveillance powers of that country’s spying agency, Government Communications Security Bureau (GCSB) — a bill that passed by one vote thanks to the Prime Minister’s guarantees that the new law would not permit mass surveillance.
  • Since then, a mountain of evidence has been presented that indisputably proves that New Zealand does exactly that which Prime Minister Key vehemently denied — exactly that which he said he would resign if it were proven was done. Last September, we reported on a secret program of mass surveillance at least partially implemented by the Key government that was designed to exploit the very law that Key was publicly insisting did not permit mass surveillance. At the time, Snowden, citing that report as well as his own personal knowledge of GCSB’s participation in the mass surveillance tool XKEYSCORE, wrote in an article for The Intercept: Let me be clear: any statement that mass surveillance is not performed in New Zealand, or that the internet communications are not comprehensively intercepted and monitored, or that this is not intentionally and actively abetted by the GCSB, is categorically false. . . . The prime minister’s claim to the public, that “there is no and there never has been any mass surveillance” is false. The GCSB, whose operations he is responsible for, is directly involved in the untargeted, bulk interception and algorithmic analysis of private communications sent via internet, satellite, radio, and phone networks.
  • A series of new reports last week by New Zealand journalist Nicky Hager, working with my Intercept colleague Ryan Gallagher, has added substantial proof demonstrating GCSB’s widespread use of mass surveillance. An article last week in The New Zealand Herald demonstrated that “New Zealand’s electronic surveillance agency, the GCSB, has dramatically expanded its spying operations during the years of John Key’s National Government and is automatically funnelling vast amounts of intelligence to the US National Security Agency.” Specifically, its “intelligence base at Waihopai has moved to ‘full-take collection,’ indiscriminately intercepting Asia-Pacific communications and providing them en masse to the NSA through the controversial NSA intelligence system XKeyscore, which is used to monitor emails and internet browsing habits.” Moreover, the documents “reveal that most of the targets are not security threats to New Zealand, as has been suggested by the Government,” but “instead, the GCSB directs its spying against a surprising array of New Zealand’s friends, trading partners and close Pacific neighbours.” A second report late last week published jointly by Hager and The Intercept detailed the role played by GCSB’s Waihopai base in aiding NSA’s mass surveillance activities in the Pacific (as Hager was working with The Intercept on these stories, his house was raided by New Zealand police for 10 hours, ostensibly to find Hager’s source for a story he published that was politically damaging to Key).
  • ...6 more annotations...
  • That the New Zealand government engages in precisely the mass surveillance activities Key vehemently denied is now barely in dispute. Indeed, a former director of GCSB under Key, Sir Bruce Ferguson, while denying any abuse of New Zealander’s communications, now admits that the agency engages in mass surveillance.
  • Meanwhile, Russel Norman, the head of the country’s Green Party, said in response to these stories that New Zealand is “committing crimes” against its neighbors in the Pacific by subjecting them to mass surveillance, and insists that the Key government broke the law because that dragnet necessarily includes the communications of New Zealand citizens when they travel in the region.
  • So now that it’s proven that New Zealand does exactly that which Prime Minister Key vowed would cause him to resign if it were proven, is he preparing his resignation speech? No: that’s something a political official with a minimal amount of integrity would do. Instead — even as he now refuses to say what he has repeatedly said before: that GCSB does not engage in mass surveillance — he’s simply retracting his pledge as though it were a minor irritant, something to be casually tossed aside:
  • When asked late last week whether New Zealanders have a right to know what their government is doing in the realm of digital surveillance, the Prime Minister said: “as a general rule, no.” And he expressly refuses to say whether New Zealand is doing that which he swore repeatedly it was not doing, as this excellent interview from Radio New Zealand sets forth: Interviewer: “Nicky Hager’s revelations late last week . . . have stoked fears that New Zealanders’ communications are being indiscriminately caught in that net. . . . The Prime Minister, John Key, has in the past promised to resign if it were found to be mass surveillance of New Zealanders . . . Earlier, Mr. Key was unable to give me an assurance that mass collection of communications from New Zealanders in the Pacific was not taking place.” PM Key: “No, I can’t. I read the transcript [of former GCSB Director Bruce Ferguson’s interview] – I didn’t hear the interview – but I read the transcript, and you know, look, there’s a variety of interpretations – I’m not going to critique–”
  • Interviewer: “OK, I’m not asking for a critique. Let’s listen to what Bruce Ferguson did tell us on Friday:” Ferguson: “The whole method of surveillance these days, is sort of a mass collection situation – individualized: that is mission impossible.” Interviewer: “And he repeated that several times, using the analogy of a net which scoops up all the information. . . . I’m not asking for a critique with respect to him. Can you confirm whether he is right or wrong?” Key: “Uh, well I’m not going to go and critique the guy. And I’m not going to give a view of whether he’s right or wrong” . . . . Interviewer: “So is there mass collection of personal data of New Zealand citizens in the Pacific or not?” Key: “I’m just not going to comment on where we have particular targets, except to say that where we go and collect particular information, there is always a good reason for that.”
  • From “I will resign if it’s shown we engage in mass surveillance of New Zealanders” to “I won’t say if we’re doing it” and “I won’t quit either way despite my prior pledges.” Listen to the whole interview: both to see the type of adversarial questioning to which U.S. political leaders are so rarely subjected, but also to see just how obfuscating Key’s answers are. The history of reporting from the Snowden archive has been one of serial dishonesty from numerous governments: such as the way European officials at first pretended to be outraged victims of NSA only for it to be revealed that, in many ways, they are active collaborators in the very system they were denouncing. But, outside of the U.S. and U.K. itself, the Key government has easily been the most dishonest over the last 20 months: one of the most shocking stories I’ve seen during this time was how the Prime Minister simultaneously plotted in secret to exploit the 2013 proposed law to implement mass surveillance at exactly the same time that he persuaded the public to support it by explicitly insisting that it would not allow mass surveillance. But overtly reneging on a public pledge to resign is a new level of political scandal. Key was just re-elected for his third term, and like any political official who stays in power too long, he has the despot’s mentality that he’s beyond all ethical norms and constraints. But by the admission of his own former GCSB chief, he has now been caught red-handed doing exactly that which he swore to the public would cause him to resign if it were proven. If nothing else, the New Zealand media ought to treat that public deception from its highest political official with the level of seriousness it deserves.
  • It seems the U.S. is not the only nation that has liars for heads of state.
Paul Merrell

Internet Giants Erect Barriers to Spy Agencies - NYTimes.com - 0 views

  • As fast as it can, Google is sealing up cracks in its systems that Edward J. Snowden revealed the N.S.A. had brilliantly exploited. It is encrypting more data as it moves among its servers and helping customers encode their own emails. Facebook, Microsoft and Yahoo are taking similar steps.
  • After years of cooperating with the government, the immediate goal now is to thwart Washington — as well as Beijing and Moscow. The strategy is also intended to preserve business overseas in places like Brazil and Germany that have threatened to entrust data only to local providers. Google, for example, is laying its own fiber optic cable under the world’s oceans, a project that began as an effort to cut costs and extend its influence, but now has an added purpose: to assure that the company will have more control over the movement of its customer data.
  • A year after Mr. Snowden’s revelations, the era of quiet cooperation is over. Telecommunications companies say they are denying requests to volunteer data not covered by existing law. A.T.&T., Verizon and others say that compared with a year ago, they are far more reluctant to cooperate with the United States government in “gray areas” where there is no explicit requirement for a legal warrant.
  • Eric Grosse, Google’s security chief, suggested in an interview that the N.S.A.'s own behavior invited the new arms race. “I am willing to help on the purely defensive side of things,” he said, referring to Washington’s efforts to enlist Silicon Valley in cybersecurity efforts. “But signals intercept is totally off the table,” he said, referring to national intelligence gathering. “No hard feelings, but my job is to make their job hard,” he added.
  • In Washington, officials acknowledge that covert programs are now far harder to execute because American technology companies, fearful of losing international business, are hardening their networks and saying no to requests for the kind of help they once quietly provided. Robert S. Litt, the general counsel of the Office of the Director of National Intelligence, which oversees all 17 American spy agencies, said on Wednesday that it was “an unquestionable loss for our nation that companies are losing the willingness to cooperate legally and voluntarily” with American spy agencies.
  • Many point to an episode in 2012, when Russian security researchers uncovered a state espionage tool, Flame, on Iranian computers. Flame, like the Stuxnet worm, is believed to have been produced at least in part by American intelligence agencies. It was created by exploiting a previously unknown flaw in Microsoft’s operating systems. Companies argue that others could have later taken advantage of this defect. Worried that such an episode undercuts confidence in its wares, Microsoft is now fully encrypting all its products, including Hotmail and Outlook.com, by the end of this year with 2,048-bit encryption, a stronger protection that would take a government far longer to crack. The software is protected by encryption both when it is in data centers and when data is being sent over the Internet, said Bradford L. Smith, the company’s general counsel.
  • Mr. Smith also said the company was setting up “transparency centers” abroad so that technical experts of foreign governments could come in and inspect Microsoft’s proprietary source code. That will allow foreign governments to check to make sure there are no “back doors” that would permit snooping by United States intelligence agencies. The first such center is being set up in Brussels. Microsoft has also pushed back harder in court. In a Seattle case, the government issued a “national security letter” to compel Microsoft to turn over data about a customer, along with a gag order to prevent Microsoft from telling the customer it had been compelled to provide its communications to government officials. Microsoft challenged the gag order as violating the First Amendment. The government backed down.
  • Hardware firms like Cisco, which makes routers and switches, have found their products a frequent subject of Mr. Snowden’s disclosures, and their business has declined steadily in places like Asia, Brazil and Europe over the last year. The company is still struggling to convince foreign customers that their networks are safe from hackers — and free of “back doors” installed by the N.S.A. The frustration, companies here say, is that it is nearly impossible to prove that their systems are N.S.A.-proof.
  • In one slide from the disclosures, N.S.A. analysts pointed to a sweet spot inside Google’s data centers, where they could catch traffic in unencrypted form. Next to a quickly drawn smiley face, an N.S.A. analyst, referring to an acronym for a common layer of protection, had noted, “SSL added and removed here!”
  • Facebook and Yahoo have also been encrypting traffic among their internal servers. And Facebook, Google and Microsoft have been moving to more strongly encrypt consumer traffic with so-called Perfect Forward Secrecy, specifically devised to make it more labor intensive for the N.S.A. or anyone to read stored encrypted communications. One of the biggest indirect consequences from the Snowden revelations, technology executives say, has been the surge in demands from foreign governments that saw what kind of access to user information the N.S.A. received — voluntarily or surreptitiously. Now they want the same.
  • The latest move in the war between intelligence agencies and technology companies arrived this week, in the form of a new Google encryption tool. The company released a user-friendly, email encryption method to replace the clunky and often mistake-prone encryption schemes the N.S.A. has readily exploited. But the best part of the tool was buried in Google’s code, which included a jab at the N.S.A.'s smiley-face slide. The code included the phrase: “ssl-added-and-removed-here-;-)”
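The claim above that 2,048-bit encryption "would take a government far longer to crack" than shorter keys can be made concrete with the standard heuristic cost of the general number field sieve, the best known classical factoring algorithm. The sketch below is illustrative textbook arithmetic only, not a statement about any agency's actual capabilities:

```python
import math

def gnfs_work(bits):
    # Heuristic GNFS cost L[1/3, (64/9)^(1/3)] of factoring a `bits`-bit modulus.
    ln_n = bits * math.log(2)
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

# Relative effort: factoring a 2048-bit RSA modulus vs. a 1024-bit one.
ratio = gnfs_work(2048) / gnfs_work(1024)
print(f"~{ratio:.1e}x more work")
```

The ratio comes out on the order of a billion, which is why moving from 1,024-bit to 2,048-bit keys is treated as a qualitative, not incremental, hardening step.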
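Perfect Forward Secrecy, mentioned in the annotations above, works by negotiating a fresh throwaway key pair for every session, so a recorded ciphertext cannot be decrypted later even if a server's long-term key is seized. A minimal sketch with deliberately tiny toy Diffie-Hellman parameters (the prime and generator here are for illustration only and far too small to be secure; real TLS uses 2048-bit-plus groups or elliptic curves):

```python
import hashlib
import secrets

# Toy parameters -- NOT cryptographically sized.
P = 2**64 - 59   # the largest prime below 2**64
G = 2            # toy generator choice

def ephemeral_session():
    # One TLS-style ephemeral DH handshake; secrets are discarded after use.
    a = secrets.randbelow(P - 2) + 1   # client's ephemeral secret
    b = secrets.randbelow(P - 2) + 1   # server's ephemeral secret
    A = pow(G, a, P)                   # client -> server
    B = pow(G, b, P)                   # server -> client
    k_client = pow(B, a, P)
    k_server = pow(A, b, P)
    assert k_client == k_server        # both sides derive the same secret
    # Derive a session key, after which a and b can be thrown away.
    return hashlib.sha256(str(k_client).encode()).hexdigest()

k1 = ephemeral_session()
k2 = ephemeral_session()
print(k1 != k2)  # independent sessions yield independent keys
```

Because each session key depends only on exponents that are erased after the handshake, an eavesdropper who stockpiles ciphertext gains nothing from later compromising the server.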
Paul Merrell

Use Tor or 'EXTREMIST' Tails Linux? Congrats, you're on the NSA's list * The Register - 0 views

  • Alleged leaked documents about the NSA's XKeyscore snooping software appear to show the paranoid agency is targeting Tor and Tails users, Linux Journal readers – and anyone else interested in online privacy. Apparently, this configuration file for XKeyscore is in the divulged data, which was obtained and studied by members of the Tor project and security specialists for German broadcasters NDR and WDR. In their analysis of the alleged top-secret documents, they claim the NSA is, among other things: specifically targeting Tor directory servers; reading email contents for mentions of Tor bridges; logging IP addresses used to search for privacy-focused websites and software; and possibly breaking international law in doing so. We already know from leaked Snowden documents that Western intelligence agents hate Tor for its anonymizing abilities. But what the aforementioned leaked source code, written in a rather strange custom language, shows is that not only is the NSA targeting the anonymizing network Tor specifically, it is also taking digital fingerprints of any netizens who are remotely interested in privacy.
  • These include readers of the Linux Journal site, anyone visiting the website for the Tor-powered Linux operating system Tails – described by the NSA as "a comsec mechanism advocated by extremists on extremist forums" – and anyone looking into combining Tails with the encryption tool Truecrypt. If something as innocuous as Linux Journal is on the NSA's hit list, it's a distinct possibility that El Reg is too, particularly in light of our recent exclusive report on GCHQ – which led to a Ministry of Defence advisor coming round our London office for a chat.
  • If you take even the slightest interest in online privacy or have Googled a Linux Journal article about a broken package, you are earmarked in an NSA database for further surveillance, according to these latest leaks. This is assuming the leaked file is genuine, of course. Other monitored sites, we're told, include HotSpotShield, FreeNet, Centurian, FreeProxies.org, MegaProxy, privacy.li and an anonymous email service called MixMinion. The IP address of computer users even looking at these sites is recorded and stored on the NSA's servers for further analysis, and it's up to the agency how long it keeps that data. The XKeyscore code, we're told, includes microplugins that target Tor servers in Germany, at MIT in the United States, in Sweden, in Austria, and in the Netherlands. In doing so it may not only fall foul of German law but also the US's Fourth Amendment.
  • The nine Tor directory servers receive especially close monitoring from the NSA's spying software, which states the "goal is to find potential Tor clients connecting to the Tor directory servers." Tor clients linking into the directory servers are also logged. "This shows that Tor is working well enough that Tor has become a target for the intelligence services," said Sebastian Hahn, who runs one of the key Tor servers. "For me this means that I will definitely go ahead with the project."
  • While the German reporting team has published part of the XKeyscore scripting code, it doesn't say where it comes from. NSA whistleblower Edward Snowden would be a logical pick, but security experts are not so sure. "I do not believe that this came from the Snowden documents," said security guru Bruce Schneier. "I also don't believe the TAO catalog came from the Snowden documents. I think there's a second leaker out there." If so, the NSA is in for much more scrutiny than it ever expected.
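The published XKeyscore rules are written in a custom language, but their reported effect — flag any request whose URL matches a watch list of privacy-related sites and log the requesting IP — can be sketched in a few lines. Everything below is a hypothetical reconstruction: the labels, patterns, and function names are invented for illustration and are not taken from the leaked configuration:

```python
import re

# Hypothetical watch list, loosely modeled on the sites named in the report
# (labels and regexes are invented, not the actual leaked rules).
FINGERPRINTS = {
    "anonymizer/tor/bridge": re.compile(r"bridges\.torproject\.org"),
    "anonymizer/tails": re.compile(r"tails\.boum\.org|linuxjournal\.com"),
}

def classify(request_url):
    # Return every fingerprint label the URL matches; an empty list means
    # the request is not flagged for retention.
    return [label for label, pat in FINGERPRINTS.items() if pat.search(request_url)]

print(classify("https://bridges.torproject.org/options"))  # ['anonymizer/tor/bridge']
```

The point of the sketch is how coarse the mechanism is: a single matching substring in a URL is enough to earmark the visitor's IP address, regardless of intent.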
Paul Merrell

A Secret Catalogue of Government Gear for Spying on Your Cellphone - 0 views

  • THE INTERCEPT HAS OBTAINED a secret, internal U.S. government catalogue of dozens of cellphone surveillance devices used by the military and by intelligence agencies. The document, thick with previously undisclosed information, also offers rare insight into the spying capabilities of federal law enforcement and local police inside the United States. The catalogue includes details on the Stingray, a well-known brand of surveillance gear, as well as Boeing “dirt boxes” and dozens of more obscure devices that can be mounted on vehicles, drones, and piloted aircraft. Some are designed to be used at static locations, while others can be discreetly carried by an individual. They have names like Cyberhawk, Yellowstone, Blackfin, Maximus, Cyclone, and Spartacus. Within the catalogue, the NSA is listed as the vendor of one device, while another was developed for use by the CIA, and another was developed for a special forces requirement. Nearly a third of the entries focus on equipment that seems to have never been described in public before.
  • The Intercept obtained the catalogue from a source within the intelligence community concerned about the militarization of domestic law enforcement. (The original is here.) A few of the devices can house a “target list” of as many as 10,000 unique phone identifiers. Most can be used to geolocate people, but the documents indicate that some have more advanced capabilities, like eavesdropping on calls and spying on SMS messages. Two systems, apparently designed for use on captured phones, are touted as having the ability to extract media files, address books, and notes, and one can retrieve deleted text messages. Above all, the catalogue represents a trove of details on surveillance devices developed for military and intelligence purposes but increasingly used by law enforcement agencies to spy on people and convict them of crimes. The mass shooting earlier this month in San Bernardino, California, which President Barack Obama has called “an act of terrorism,” prompted calls for state and local police forces to beef up their counterterrorism capabilities, a process that has historically involved adapting military technologies to civilian use. Meanwhile, civil liberties advocates and others are increasingly alarmed about how cellphone surveillance devices are used domestically and have called for a more open and informed debate about the trade-off between security and privacy — despite a virtual blackout by the federal government on any information about the specific capabilities of the gear.
  • “We’ve seen a trend in the years since 9/11 to bring sophisticated surveillance technologies that were originally designed for military use — like Stingrays or drones or biometrics — back home to the United States,” said Jennifer Lynch, a senior staff attorney at the Electronic Frontier Foundation, which has waged a legal battle challenging the use of cellphone surveillance devices domestically. “But using these technologies for domestic law enforcement purposes raises a host of issues that are different from a military context.”
  • MANY OF THE DEVICES in the catalogue, including the Stingrays and dirt boxes, are cell-site simulators, which operate by mimicking the towers of major telecom companies like Verizon, AT&T, and T-Mobile. When someone’s phone connects to the spoofed network, it transmits a unique identification code and, through the characteristics of its radio signals when they reach the receiver, information about the phone’s location. There are also indications that cell-site simulators may be able to monitor calls and text messages. In the catalogue, each device is listed with guidelines about how its use must be approved; the answer is usually via the “Ground Force Commander” or under one of two titles in the U.S. code governing military and intelligence operations, including covert action.
  • But domestically the devices have been used in a way that violates the constitutional rights of citizens, including the Fourth Amendment prohibition on illegal search and seizure, critics like Lynch say. They have regularly been used without warrants, or with warrants that critics call overly broad. Judges and civil liberties groups alike have complained that the devices are used without full disclosure of how they work, even within court proceedings.
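The basic matching step described above — a spoofed tower compares the unique identifier a phone transmits on attach against a stored target list — can be sketched simply. The IMSIs below are made up, and the fixed 3-digit mobile network code is a North American simplification (per 3GPP numbering, network codes may be 2 or 3 digits):

```python
# Made-up IMSIs standing in for a device's "target list" of phone identifiers.
TARGET_LIST = {"310150123456789", "310260987654321"}

def parse_imsi(imsi):
    # IMSI layout per 3GPP numbering: country code + network code + subscriber.
    mcc = imsi[:3]    # mobile country code (310 = United States)
    mnc = imsi[3:6]   # mobile network code (3 digits assumed here)
    msin = imsi[6:]   # mobile subscriber identification number
    return mcc, mnc, msin

def on_phone_attach(imsi):
    # Hypothetical handler invoked when a handset registers with the spoofed tower.
    return imsi in TARGET_LIST

print(on_phone_attach("310150123456789"))  # True
```

The simplicity is the civil-liberties problem the article describes: every phone in range volunteers its identifier just by attaching, whether or not it is on the list.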