Future of the Web: Group items tagged "document"

Paul Merrell

XHTML Modularization 1.1 Released as W3C Recommendation - 0 views

  • The modularization of XHTML refers to the task of specifying well-defined sets of XHTML elements that can be combined and extended by document authors, document type architects, other XML standards specifications, and application and product designers to make it economically feasible for content developers to deliver content on a greater number and diversity of platforms. Over the last couple of years, many specialized markets have begun looking to HTML as a content language. There is a great movement toward using HTML across increasingly diverse computing platforms. Currently there is activity to move HTML onto mobile devices (hand held computers, portable phones, etc.), television devices (digital televisions, TV-based Web browsers, etc.), and appliances (fixed function devices). Each of these devices has different requirements and constraints.
  • XHTML Modularization is a decomposition of XHTML 1.0, and by reference HTML 4, into a collection of abstract modules that provide specific types of functionality. These abstract modules are implemented in this specification using the XML Schema and XML Document Type Definition languages. The rules for defining the abstract modules, and for implementing them using XML Schemas and XML DTDs, are also defined in this document. These modules may be combined with each other and with other modules to create XHTML subset and extension document types that qualify as members of the XHTML-family of document types.
  • Modularizing XHTML provides a means for product designers to specify which elements are supported by a device using standard building blocks and standard methods for specifying which building blocks are used. These modules serve as "points of conformance" for the content community. The content community can now target the installed base that supports a certain collection of modules, rather than worry about the installed base that supports this or that permutation of XHTML elements. The use of standards is critical for modularized XHTML to be successful on a large scale. It is not economically feasible for content developers to tailor content to each and every permutation of XHTML elements. By specifying a standard, either software processes can autonomously tailor content to a device, or the device can automatically load the software required to process a module. Modularization also allows for the extension of XHTML's layout and presentation capabilities, using the extensibility of XML, without breaking the XHTML standard. This development path provides a stable, useful, and implementable framework for content developers and publishers to manage the rapid pace of technological change on the Web.
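A toy illustration of the "points of conformance" idea above: a device profile is just a document type assembled from a chosen set of modules, and content either validates against it or it doesn't. The sketch below uses Python's lxml to validate a document against a drastically reduced stand-in DTD; the real W3C module DTDs are far larger, so treat the element list as hypothetical.

```python
from io import StringIO

from lxml import etree

# Hypothetical stand-in for an XHTML subset document type: a driver DTD
# admitting only a few core structure and text elements.
SUBSET_DTD = StringIO("""
<!ELEMENT html (head, body)>
<!ELEMENT head (title)>
<!ELEMENT title (#PCDATA)>
<!ELEMENT body (p)*>
<!ELEMENT p (#PCDATA)>
""")

doc = etree.fromstring(
    b"<html><head><title>Hi</title></head><body><p>Text only.</p></body></html>"
)

dtd = etree.DTD(SUBSET_DTD)
print(dtd.validate(doc))   # True: the content stays within the chosen modules
```

Content that uses an element outside the profile (say, a table on a device whose profile omits the Tables module) would fail dtd.validate(), which is exactly the "target the installed base that supports a certain collection of modules" contract.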
Gary Edwards

Brendan's Roadmap Updates: Open letter to Microsoft's Chris Wilson and their fight to s... - 0 views

  • In my opinion, the notion that we need to add features so that Ajax programming would be easier is plain wrong. Ajax is a hack, and so is the notion of a webapp. The web was created with a document-centric view, and all W3C standards are based on that same document notion. The heart of the web, the HTTP protocol, is designed to support a web of documents and as such is stateless. The proper solution, IMO, is not to evolve ES for the benefit of Ajax and webapps, but rather to generalize the notion of a document browser that connects to a web of documents into a general-purpose client engine that connects to a network of internet applications. The current web (document) browser then becomes just one such internet application.
  • The obvious conflict of interest between the standards-based web and proprietary platforms advanced by Microsoft, and the rationales for keeping the web's client-side programming language small while the proprietary platforms rapidly evolve support for large languages, do not help maintain the fiction that only clashing high-level philosophies are involved here. Readers may not know that Ecma has no provision for "minor releases" of its standards, so any ES3.1 that was approved by TG1 would inevitably be given a whole edition number, presumably becoming the 4th Edition of ECMAScript. This is obviously contentious given all the years that the majority of TG1, sometimes even apparently including Microsoft representatives, has worked on ES4, and the developer expectations set by this long-standing effort. A history of Microsoft's post-ES3 involvement in the ECMAScript standard group, leading up to the overt split in TG1 in March, is summarized here. The history of ECMAScript since its beginnings in November 1996 shows that when Microsoft was behind in the market (against Netscape in 1996-1997), it moved aggressively in the standards body to evolve standards starting with ES1 through ES3. Once Microsoft dominated the market, the last edition of the standard was left to rot -- ES3 was finished in 1999 -- and even easy-to-fix standards conformance bugs in IE JScript went unfixed for eight years (so three years to go from Edition 1 to 3, then over eight to approach Edition 4). Now that the proposed 4th edition looks like a competitive threat, the world suddenly hears in detail about all those bugs, spun as differences afflicting "JavaScript" that should inform a new standard.
Gary Edwards

Developing a Universal Markup Solution For Web Content - 0 views

  • KODAXIL To Replace XML?

    File this one under the Universal Interoperability label. Very interesting. Especially since XML document formats have proven to fall short on the two primary expectations of users: interoperability and Web readiness. Like HTML+ :) Maybe KODAXIL will work?

    The recent Web 2.0 Conference was filled with new web services, portals, and wiki efforts trying their best to mash data into document objects. iCloud, MindTouch, AppLogic, 3Tera, Caspio, and Gazoodle all deserve attention, although each took a rather different approach to solving the problem. MindTouch in particular was excellent.

    "A Montreal-based software and research development company has developed a markup solution and language-neutral asset-descriptor that when fully developed, could result in a universal computer language for representing information in databases, web and document contents and business objects."

    "While still at a seminal stage of development, the company Gnoesis, aims to address the problem of data fragmentation caused by semantic differences between developers and users from different linguistic backgrounds."

    Gnoesis, the company that developed the language KODAXIL (Knowledge, Object, Data, Action, and eXtensible Interoperable Language), a data and information representation language, says the new language will replace the XML function of consolidating semantically identical data streams from different languages by creating a common language to do this.

    The extensible semantic markup associated with this language is intended to be understood worldwide, and the company claims it is roughly one-third the length of the equivalent XML.
Paul Merrell

For sale: Systems that can secretly track where cellphone users go around the globe - T... - 0 views

  • Makers of surveillance systems are offering governments across the world the ability to track the movements of almost anybody who carries a cellphone, whether they are blocks away or on another continent. The technology works by exploiting an essential fact of all cellular networks: They must keep detailed, up-to-the-minute records on the locations of their customers to deliver calls and other services to them. Surveillance systems are secretly collecting these records to map people’s travels over days, weeks or longer, according to company marketing documents and experts in surveillance technology.
  • The world’s most powerful intelligence services, such as the National Security Agency and Britain’s GCHQ, long have used cellphone data to track targets around the globe. But experts say these new systems allow less technically advanced governments to track people in any nation — including the United States — with relative ease and precision.
  • It is unclear which governments have acquired these tracking systems, but one industry official, speaking on the condition of anonymity to share sensitive trade information, said that dozens of countries have bought or leased such technology in recent years. This rapid spread underscores how the burgeoning, multibillion-dollar surveillance industry makes advanced spying technology available worldwide. “Any tin-pot dictator with enough money to buy the system could spy on people anywhere in the world,” said Eric King, deputy director of Privacy International, a London-based activist group that warns about the abuse of surveillance technology. “This is a huge problem.”
  • Security experts say hackers, sophisticated criminal gangs and nations under sanctions also could use this tracking technology, which operates in a legal gray area. It is illegal in many countries to track people without their consent or a court order, but there is no clear international legal standard for secretly tracking people in other countries, nor is there a global entity with the authority to police potential abuses.
  • Tracking systems that access carrier location databases are unusual in their ability to allow virtually any government to track people across borders, with any type of cellular phone, across a wide range of carriers — without the carriers even knowing. These systems also can be used in tandem with other technologies that, when the general location of a person is already known, can intercept calls and Internet traffic, activate microphones, and access contact lists, photos and other documents. Companies that make and sell surveillance technology seek to limit public information about their systems’ capabilities and client lists, typically marketing their technology directly to law enforcement and intelligence services through international conferences that are closed to journalists and other members of the public.
  • Yet marketing documents obtained by The Washington Post show that companies are offering powerful systems that are designed to evade detection while plotting movements of surveillance targets on computerized maps. The documents claim system success rates of more than 70 percent. A 24-page marketing brochure for SkyLock, a cellular tracking system sold by Verint, a maker of analytics systems based in Melville, N.Y., carries the subtitle “Locate. Track. Manipulate.” The document, dated January 2013 and labeled “Commercially Confidential,” says the system offers government agencies “a cost-effective, new approach to obtaining global location information concerning known targets.”
  • (Privacy International has collected several marketing brochures on cellular surveillance systems, including one that refers briefly to SkyLock, and posted them on its Web site. The 24-page SkyLock brochure and other material was independently provided to The Post by people concerned that such systems are being abused.)
  • Verint, which also has substantial operations in Israel, declined to comment for this story. It says in the marketing brochure that it does not use SkyLock against U.S. or Israeli phones, which could violate national laws. But several similar systems, marketed in recent years by companies based in Switzerland, Ukraine and elsewhere, likely are free of such limitations.
  • The tracking technology takes advantage of the lax security of SS7, a global network that cellular carriers use to communicate with one another when directing calls, texts and Internet data. The system was built decades ago, when only a few large carriers controlled the bulk of global phone traffic. Now thousands of companies use SS7 to provide services to billions of phones and other mobile devices, security experts say. All of these companies have access to the network and can send queries to other companies on the SS7 system, making the entire network more vulnerable to exploitation. Any one of these companies could share its access with others, including makers of surveillance systems.
  • Companies that market SS7 tracking systems recommend using them in tandem with “IMSI catchers,” increasingly common surveillance devices that use cellular signals collected directly from the air to intercept calls and Internet traffic, send fake texts, install spyware on a phone, and determine precise locations. IMSI catchers — also known by one popular trade name, StingRay — can home in on somebody a mile or two away but are useless if a target’s general location is not known. SS7 tracking systems solve that problem by locating the general area of a target so that IMSI catchers can be deployed effectively. (The term “IMSI” refers to a unique identifying code on a cellular phone.)
  • Verint can install SkyLock on the networks of cellular carriers if they are cooperative — something that telecommunications experts say is common in countries where carriers have close relationships with their national governments. Verint also has its own “worldwide SS7 hubs” that “are spread in various locations around the world,” says the brochure. It does not list prices for the services, though it says that Verint charges more for the ability to track targets in many far-flung countries, as opposed to only a few nearby ones. Among the most appealing features of the system, the brochure says, is its ability to sidestep the cellular operators that sometimes protect their users’ personal information by refusing government requests or insisting on formal court orders before releasing information.
  • Another company, Defentek, markets a similar system called Infiltrator Global Real-Time Tracking System on its Web site, claiming to “locate and track any phone number in the world.” The site adds: “It is a strategic solution that infiltrates and is undetected and unknown by the network, carrier, or the target.”
  • The Verint company has very close ties to the Israeli government. Its former parent company, Comverse, was heavily subsidized by Israel, and the bulk of its manufacturing and code development was done in Israel. See https://en.wikipedia.org/wiki/Comverse_Technology "In December 2001, a Fox News report raised the concern that wiretapping equipment provided by Comverse Infosys to the U.S. government for electronic eavesdropping may have been vulnerable, as these systems allegedly had a back door through which the wiretaps could be intercepted by unauthorized parties.[55] Fox News reporter Carl Cameron said there was no reason to believe the Israeli government was implicated, but that "a classified top-secret investigation is underway".[55] A March 2002 story by Le Monde recapped the Fox report and concluded: "Comverse is suspected of having introduced into its systems 'catch gates' in order to 'intercept, record and store' these wire-taps. This hardware would render the 'listener' himself 'listened to'."[56] Fox News did not pursue the allegations, and in the years since, there have been no legal or commercial actions of any type taken against Comverse by the FBI or any other branch of the US Government related to data access and security issues. While no real evidence has been presented against Comverse or Verint, the allegations have become a favorite topic of conspiracy theorists.[57] By 2005, the company had $959 million in sales and employed over 5,000 people, of whom about half were located in Israel.[16]" Verint is also the company that got the Dept. of Homeland Security contract to provide and install an electronic and video surveillance system across the entire U.S. border with Mexico. One need not be much of a conspiracy theorist to have concerns about Verint's likely interactions and data sharing with the NSA and its Israeli equivalent, Unit 8200.
Gonzalo San Gil, PhD.

Open Document Format: Using Officeshots and ODFAutoTesting for Sustainable Documents | ... - 1 views

  • "One of the many benefits of open source software is that it offers some protection from having programs disappear or stop working."
Paul Merrell

The People and Tech Behind the Panama Papers - Features - Source: An OpenNews project - 0 views

  • Then we put the data up, but the problem with Solr was it didn’t have a user interface, so we used Project Blacklight, which is open source software normally used by librarians. We used it for the journalists. It’s simple because it allows you to do faceted search—so, for example, you can facet by the folder structure of the leak, by years, by type of file. There were more complex things—it supports queries in regular expressions, so the more advanced users were able to search for documents with a certain pattern of numbers that, for example, passports use. You could also preview and download the documents. ICIJ open-sourced the code of our document processing chain, created by our web developer Matthew Caruana Galizia. We also developed a batch-searching feature. So say you were looking for politicians in your country—you just run it through the system, and you upload your list to Blacklight and you would get a CSV back saying yes, there are matches for these names—not only exact matches, but also matches based on proximity. So you would say “I want Mar Cabra proximity 2” and that would give you “Mar Cabra,” “Mar whatever Cabra,” “Cabra, Mar,”—so that was good, because very quickly journalists were able to see… I have this list of politicians and they are in the data!
  • Last Sunday, April 3, the first stories emerging from the leaked dataset known as the Panama Papers were published by a global partnership of news organizations working in coordination with the International Consortium of Investigative Journalists, or ICIJ. As we begin the second week of reporting on the leak, Iceland’s Prime Minister has been forced to resign, Germany has announced plans to end anonymous corporate ownership, governments around the world launched investigations into wealthy citizens’ participation in tax havens, the Russian government announced that the investigation was an anti-Putin propaganda operation, and the Chinese government banned mentions of the leak in Chinese media. As the ICIJ-led consortium prepares for its second major wave of reporting on the Panama Papers, we spoke with Mar Cabra, editor of ICIJ’s Data & Research unit and lead coordinator of the data analysis and infrastructure work behind the leak. In our conversation, Cabra reveals ICIJ’s years-long effort to build a series of secure communication and analysis platforms in support of genuinely global investigative reporting collaborations.
  • For communication, we have the Global I-Hub, which is a platform based on open source software called Oxwall. Oxwall is a social network, like Facebook, which has a wall when you log in with the latest in your network—it has forum topics, links, you can share files, and you can chat with people in real time.
  • We had the data in a relational database format in SQL, and thanks to ETL (Extract, Transform, and Load) software Talend, we were able to easily transform the data from SQL to Neo4j (the graph-database format we used). Once the data was transformed, it was just a matter of plugging it into Linkurious, and in a couple of minutes, you have it visualized—in a networked way, so anyone can log in from anywhere in the world. That was another reason we really liked Linkurious and Neo4j—they’re very quick when representing graph data, and the visualizations were easy to understand for everybody. The not-very-tech-savvy reporter could expand the dots like magic, and more technically expert reporters and programmers could use the Neo4j query language, Cypher, to do more complex queries, like show me everybody within two degrees of separation of this person, or show me all the connected dots…
  • We believe in open source technology and try to use it as much as possible. We used Apache Solr for the indexing and Apache Tika for document processing, and it’s great because it processes dozens of different formats and it’s very powerful. Tika interacts with Tesseract, so we did the OCRing on Tesseract. To OCR the images, we created an army of 30–40 temporary servers in Amazon that allowed us to process the documents in parallel and do parallel OCR-ing. If it was very slow, we’d increase the number of servers—if it was going fine, we would decrease because of course those servers have a cost.
  • For the visualization of the Mossack Fonseca internal database, we worked with another tool called Linkurious. It’s not open source, it’s licensed software, but we have an agreement with them, and they allowed us to work with it. It allows you to represent data in graphs. We had a version of Linkurious on our servers, so no one else had the data. It was pretty intuitive—journalists had to click on dots that expanded, basically, and could search the names.
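The proximity matching described in the first annotation above maps directly onto Lucene/Solr query syntax, where "Mar Cabra"~2 matches the two terms within two positions of each other. Here is a minimal sketch of the batch-search idea against a Solr core; the core name, URL, and list of names are hypothetical, not ICIJ's actual setup.

```python
import requests

SOLR_URL = "http://localhost:8983/solr/leak/select"  # hypothetical core

def proximity_hits(name: str, distance: int = 2) -> int:
    """Count documents containing the name's terms within `distance` words."""
    params = {"q": f'"{name}"~{distance}', "rows": 0, "wt": "json"}
    resp = requests.get(SOLR_URL, params=params)
    resp.raise_for_status()
    return resp.json()["response"]["numFound"]

# Batch search: emit a CSV of match counts per name, as in the interview.
politicians = ["Mar Cabra", "John Doe"]
for name in politicians:
    print(f"{name},{proximity_hits(name)}")
```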
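Likewise, the "everybody within two degrees of separation of this person" query mentioned above corresponds to a variable-length relationship pattern in Cypher. A sketch using the official Neo4j Python driver, with hypothetical connection details and a hypothetical Person label standing in for the real schema:

```python
from neo4j import GraphDatabase

# Hypothetical connection details; ICIJ's instance was private.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

# [*1..2] walks one or two relationships of any type out from the start node.
QUERY = """
MATCH (p:Person {name: $name})-[*1..2]-(other)
RETURN DISTINCT other.name AS name
"""

with driver.session() as session:
    for record in session.run(QUERY, name="Mar Cabra"):
        print(record["name"])

driver.close()
```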
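And the parallel OCR pass can be miniaturized on a single machine, with a process pool standing in for the fleet of Amazon servers. This sketch assumes the pytesseract wrapper around Tesseract and hypothetical file paths; the real pipeline drove Tesseract through Apache Tika.

```python
import concurrent.futures
import glob

import pytesseract
from PIL import Image

def ocr_one(path: str) -> tuple[str, str]:
    """Run Tesseract OCR on a single scanned page and return (path, text)."""
    return path, pytesseract.image_to_string(Image.open(path))

if __name__ == "__main__":
    # Each worker process OCRs pages independently, mirroring the parallel
    # servers described above.
    with concurrent.futures.ProcessPoolExecutor(max_workers=8) as pool:
        for path, text in pool.map(ocr_one, glob.glob("scans/*.png")):
            print(path, len(text), "characters recognized")
```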
Paul Merrell

Wikileaks Releases "NightSkies 1.2": Proof CIA Bugs "Factory Fresh" iPhones | Zero Hedge - 0 views

  • The latest leak from WikiLeaks' Vault 7 is titled “Dark Matter” and claims that the CIA has been bugging “factory fresh” iPhones since at least 2008 through suppliers.
  • And here is the full press release from WikiLeaks: Today, March 23rd 2017, WikiLeaks releases Vault 7 "Dark Matter", which contains documentation for several CIA projects that infect Apple Mac computer firmware (meaning the infection persists even if the operating system is re-installed) developed by the CIA's Embedded Development Branch (EDB). These documents explain the techniques used by CIA to gain 'persistence' on Apple Mac devices, including Macs and iPhones, and demonstrate their use of EFI/UEFI and firmware malware.

    Among others, these documents reveal the "Sonic Screwdriver" project which, as explained by the CIA, is a "mechanism for executing code on peripheral devices while a Mac laptop or desktop is booting" allowing an attacker to boot its attack software for example from a USB stick "even when a firmware password is enabled". The CIA's "Sonic Screwdriver" infector is stored on the modified firmware of an Apple Thunderbolt-to-Ethernet adapter.

    "DarkSeaSkies" is "an implant that persists in the EFI firmware of an Apple MacBook Air computer" and consists of "DarkMatter", "SeaPea" and "NightSkies", respectively EFI, kernel-space and user-space implants.

    Documents on the "Triton" MacOSX malware, its infector "Dark Mallet" and its EFI-persistent version "DerStarke" are also included in this release. While the DerStarke1.4 manual released today dates to 2013, other Vault 7 documents show that as of 2016 the CIA continues to rely on and update these systems and is working on the production of DerStarke2.0.

    Also included in this release is the manual for the CIA's "NightSkies 1.2", a "beacon/loader/implant tool" for the Apple iPhone. Noteworthy is that NightSkies had reached 1.2 by 2008, and is expressly designed to be physically installed onto factory fresh iPhones; i.e., the CIA has been infecting the iPhone supply chain of its targets since at least 2008.

    While CIA assets are sometimes used to physically infect systems in the custody of a target, it is likely that many CIA physical access attacks have infected the targeted organization's supply chain, including by interdicting mail orders and other shipments (opening, infecting, and resending) leaving the United States or otherwise.
Gonzalo San Gil, PhD.

***¡Copiad, malditos! derechos de autor en la era digital - elegant mob films... - 0 views

  • [¡Copiad, malditos! is a documentary project for TV + web about intellectual property. More information: www.copiadmalditos.net. There are several options for downloading the documentary: 1 - RTVE.es offers two versions, FLV and MP4, for download from...]
Gonzalo San Gil, PhD.

Translating good documentation beats starting over from scratch | Opensource.com - 0 views

  • "Writing documentation can have a way of getting into your blood, so that you think about it quite a bit, play with some ideas, start various new ideas that may not come to much, and it seems that what you're looking for as much as anything is a task that takes hold of you and develops its own energy to keep you going until you finish."
Gary Edwards

Wolfram Alpha is Coming -- and It Could be as Important as Google | Twine - 0 views

  • The first question was: could (or even should) Wolfram Alpha be built using the Semantic Web in some manner, rather than (or as well as) the Mathematica engine it is currently built on? Is anything missed by not building it with the Semantic Web's languages (RDF, OWL, SPARQL, etc.)? The answer is that there is no reason that one MUST use the Semantic Web stack to build something like Wolfram Alpha. In fact, in my opinion it would be far too difficult to try to explicitly represent everything Wolfram Alpha knows and can compute using OWL ontologies. It is too wide a range of human knowledge and giant OWL ontologies are just too difficult to build and curate.
  • However for the internal knowledge representation and reasoning that takes places in the system, it appears Wolfram has found a pragmatic and efficient representation of his own, and I don't think he needs the Semantic Web at that level. It seems to be doing just fine without it. Wolfram Alpha is built on hand-curated knowledge and expertise. Wolfram and his team have somehow figured out a way to make that practical where all others who have tried this have failed to achieve their goals. The task is gargantuan -- there is just so much diverse knowledge in the world. Representing even a small segment of it formally turns out to be extremely difficult and time-consuming.
  • It has generally not been considered feasible for any one group to hand-curate all knowledge about every subject. This is why the Semantic Web was invented -- by enabling everyone to curate their own knowledge about their own documents and topics in parallel, in principle at least, more knowledge could be represented and shared in less time by more people -- in an interoperable manner. At least that is the vision of the Semantic Web.
  • Where Google is a system for FINDING things that we as a civilization collectively publish, Wolfram Alpha is for ANSWERING questions about what we as a civilization collectively know. It's the next step in the distribution of knowledge and intelligence around the world -- a new leap in the intelligence of our collective "Global Brain." And like any big next-step, Wolfram Alpha works in a new way -- it computes answers instead of just looking them up.
  • A Computational Knowledge Engine for the Web: In a nutshell, Wolfram and his team have built what he calls a "computational knowledge engine" for the Web. OK, so what does that really mean? Basically it means that you can ask it factual questions and it computes answers for you. It doesn't simply return documents that (might) contain the answers, like Google does, and it isn't just a giant database of knowledge, like the Wikipedia. It doesn't simply parse natural language and then use that to retrieve documents, like Powerset, for example. Instead, Wolfram Alpha actually computes the answers to a wide range of questions -- like questions that have factual answers such as "What country is Timbuktu in?" or "How many protons are in a hydrogen atom?" or "What is the average rainfall in Seattle this month?," "What is the 300th digit of Pi?," "where is the ISS?" or "When was GOOG worth more than $300?" Think about that for a minute. It computes the answers. Wolfram Alpha doesn't simply contain huge amounts of manually entered pairs of questions and answers, nor does it search for answers in a database of facts. Instead, it understands and then computes answers to certain kinds of questions.
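The lookup-versus-compute contrast is easy to see in Semantic Web terms: a triple store answers "What country is Timbuktu in?" only if someone has already asserted that fact, whereas Wolfram Alpha derives answers. A small sketch with rdflib and a hypothetical vocabulary shows the lookup side:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/")  # hypothetical vocabulary

g = Graph()
g.add((EX.Timbuktu, RDF.type, EX.City))
g.add((EX.Timbuktu, EX.locatedIn, EX.Mali))
g.add((EX.Timbuktu, RDFS.label, Literal("Timbuktu")))

# "What country is Timbuktu in?" as retrieval of an asserted triple:
results = g.query("""
    PREFIX ex: <http://example.org/>
    SELECT ?country WHERE { ex:Timbuktu ex:locatedIn ?country . }
""")
for row in results:
    print(row.country)   # http://example.org/Mali
```

Every fact here had to be hand-asserted (or crowd-curated), which is the scaling problem the annotations above describe; Wolfram's bet was on hand-curated data plus computation instead.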
Gary Edwards

Why Kindle Should Be An Open Book - Tim O'Reilly at Forbes.com - 0 views

  • Like someone finding out that the rapture has happened and they've been left behind, Tim O'Reilly shakes his fists and shouts to the heavens that Amazon must support open standards. He argues that in spite of incredible market success, the Amazon Kindle will fail because the document format is not open. He even argues that Apple, with the iPod and iPhone, has figured out how to blend Open Web formats and application development with proprietary hardware initiatives....

    "The Amazon Kindle has sparked huge media interest in e-books and has seemingly jump-started the market. Its instant wireless access to hundreds of thousands of e-books and seamless one-click purchasing process would seem to give it an enormous edge over other dedicated e-book platforms. Yet I have a bold prediction: Unless Amazon embraces open e-book standards like epub, which allow readers to read books on a variety of devices, the Kindle will be gone within two or three years."

    O'Reilly points to ePub as an open format, apparently not realizing the format falls far short of Open Web advances designed to enable a complete publication-typesetter model. The WebKit and Mozilla open source communities are pushing the envelope of Open Web development with an extremely advanced document model based on HTML5, CSS3, SVG/Canvas, and JavaScript4+. ePub, on the other hand, is stuck in 1998, supporting the aging HTML4 and CSS 2.1 specs. Very sad.

Paul Merrell

Use Tor or 'EXTREMIST' Tails Linux? Congrats, you're on the NSA's list - The Register - 0 views

  • Alleged leaked documents about the NSA's XKeyscore snooping software appear to show the paranoid agency is targeting Tor and Tails users, Linux Journal readers – and anyone else interested in online privacy. Apparently, this configuration file for XKeyscore is in the divulged data, which was obtained and studied by members of the Tor project and security specialists for German broadcasters NDR and WDR. In their analysis of the alleged top-secret documents, they claim the NSA is, among other things: specifically targeting Tor directory servers; reading email contents for mentions of Tor bridges; logging IP addresses used to search for privacy-focused websites and software; and possibly breaking international law in doing so. We already know from leaked Snowden documents that Western intelligence agents hate Tor for its anonymizing abilities. But what the aforementioned leaked source code, written in a rather strange custom language, shows is that not only is the NSA targeting the anonymizing network Tor specifically, it is also taking digital fingerprints of any netizens who are remotely interested in privacy.
  • These include readers of the Linux Journal site, anyone visiting the website for the Tor-powered Linux operating system Tails – described by the NSA as "a comsec mechanism advocated by extremists on extremist forums" – and anyone looking into combining Tails with the encryption tool Truecrypt. If something as innocuous as Linux Journal is on the NSA's hit list, it's a distinct possibility that El Reg is too, particularly in light of our recent exclusive report on GCHQ – which led to a Ministry of Defence advisor coming round our London office for a chat.
  • If you take even the slightest interest in online privacy or have Googled a Linux Journal article about a broken package, you are earmarked in an NSA database for further surveillance, according to these latest leaks. This is assuming the leaked file is genuine, of course. Other monitored sites, we're told, include HotSpotShield, FreeNet, Centurian, FreeProxies.org, MegaProxy, privacy.li and an anonymous email service called MixMinion. The IP address of computer users even looking at these sites is recorded and stored on the NSA's servers for further analysis, and it's up to the agency how long it keeps that data. The XKeyscore code, we're told, includes microplugins that target Tor servers in Germany, at MIT in the United States, in Sweden, in Austria, and in the Netherlands. In doing so it may not only fall foul of German law but also the US's Fourth Amendment.
  • The nine Tor directory servers receive especially close monitoring from the NSA's spying software, which states the "goal is to find potential Tor clients connecting to the Tor directory servers." Tor clients linking into the directory servers are also logged. "This shows that Tor is working well enough that Tor has become a target for the intelligence services," said Sebastian Hahn, who runs one of the key Tor servers. "For me this means that I will definitely go ahead with the project.”
  • While the German reporting team has published part of the XKeyscore scripting code, it doesn't say where it comes from. NSA whistleblower Edward Snowden would be a logical pick, but security experts are not so sure. "I do not believe that this came from the Snowden documents," said security guru Bruce Schneier. "I also don't believe the TAO catalog came from the Snowden documents. I think there's a second leaker out there." If so, the NSA is in for much more scrutiny than it ever expected.
Paul Merrell

Bulk Collection Under Section 215 Has Ended… What's Next? | Just Security - 0 views

  • The first (and thus far only) roll-back of post-9/11 surveillance authorities was implemented over the weekend: The National Security Agency shuttered its program for collecting and holding the metadata of Americans’ phone calls under Section 215 of the Patriot Act. While bulk collection under Section 215 has ended, the government can obtain access to this information under the procedures specified in the USA Freedom Act. Indeed, some experts have argued that the Agency likely has access to more metadata because its earlier dragnet didn’t cover cell phones or Internet calling. In addition, the metadata of calls made by an individual in the United States to someone overseas and vice versa can still be collected in bulk — this takes place abroad under Executive Order 12333. No doubt the NSA wishes that this was the end of the surveillance reform story and the Paris attacks initially gave them an opening. John Brennan, the Director of the CIA, implied that the attacks were somehow related to “hand wringing” about spying and Sen. Tom Cotton (R-Ark.) introduced a bill to delay the shut down of the 215 program. Opponents of encryption were quick to say: “I told you so.”
  • But the facts that have emerged thus far tell a different story. It appears that much of the planning took place IRL (that’s “in real life” for those of you who don’t have teenagers). The attackers, several of whom were on law enforcement’s radar, communicated openly over the Internet. If France ever has a 9/11 Commission-type inquiry, it could well conclude that the Paris attacks were a failure of the intelligence agencies rather than a failure of intelligence authorities. Despite the passage of the USA Freedom Act, US surveillance authorities have remained largely intact. Section 702 of the FISA Amendments Act — which is the basis of programs like PRISM and the NSA’s Upstream collection of information from Internet cables — sunsets in the summer of 2017. While it’s difficult to predict the political environment that far out, meaningful reform of Section 702 faces significant obstacles. Unlike the Section 215 program, which was clearly aimed at Americans, Section 702 is supposedly targeted at foreigners and only picks up information about Americans “incidentally.” The NSA has refused to provide an estimate of how many Americans’ information it collects under Section 702, despite repeated requests from lawmakers and most recently a large cohort of advocates. The Section 215 program was held illegal by two federal courts (here and here), but civil attempts to challenge Section 702 have run into standing barriers. Finally, while two review panels concluded that the Section 215 program provided little counterterrorism benefit (here and here), they found that the Section 702 program had been useful.
  • There is, nonetheless, some pressure to narrow the reach of Section 702. The recent decision by the European Court of Justice in the safe harbor case suggests that data flows between Europe and the US may be restricted unless the PRISM program is modified to protect the information of Europeans (see here, here, and here for discussion of the decision and reform options). Pressure from Internet companies whose business is suffering — estimates run to the tune of $35 billion to $180 billion — as a result of disclosures about NSA spying may also nudge lawmakers towards reform. One of the courts currently considering criminal cases which rely on evidence derived from Section 702 surveillance may hold the program unconstitutional either on the basis of the Fourth Amendment or Article III for the reasons set out in this Brennan Center report. A federal district court in Colorado recently rejected such a challenge, although as explained in Steve’s post, the decision did not seriously explore the issues. Further litigation in the European courts too could have an impact on the debate.
  • The US intelligence community’s broadest surveillance authorities are enshrined in Executive Order 12333, which primarily covers the interception of electronic communications overseas. The Order authorizes the collection, retention, and dissemination of “foreign intelligence” information, which includes information “relating to the capabilities, intentions or activities of foreign powers, organizations or persons.” In other words, so long as they are operating outside the US, intelligence agencies are authorized to collect information about any foreign person — and, of course, any Americans with whom they communicate. The NSA has conceded that EO 12333 is the basis of most of its surveillance. While public information about these programs is limited, a few highlights give a sense of the breadth of EO 12333 operations: The NSA gathers information about every cell phone call made to, from, and within the Bahamas, Mexico, Kenya, the Philippines, and Afghanistan, and possibly other countries. A joint US-UK program tapped into the cables connecting internal Yahoo and Google networks to gather e-mail address books and contact lists from their customers. Another US-UK collaboration collected images from video chats among Yahoo users and possibly other webcam services. The NSA collects both the content and metadata of hundreds of millions of text messages from around the world. By tapping into the cables that connect global networks, the NSA has created a database of the location of hundreds of millions of mobile phones outside the US.
  • Given its scope, EO 12333 is clearly critical to those seeking serious surveillance reform. The path to reform is, however, less clear. There is no sunset provision that requires action by Congress and creates an opportunity for exposing privacy risks. Even in the unlikely event that Congress was inclined to intervene, it would have to address questions about the extent of its constitutional authority to regulate overseas surveillance. To the best of my knowledge, there is no litigation challenging EO 12333 and the government doesn’t give notice to criminal defendants when it uses evidence derived from surveillance under the order, so the likelihood of a court ruling is slim. The Privacy and Civil Liberties Oversight Board is currently reviewing two programs under EO 12333, but it is anticipated that much of its report will be classified (although it has promised a less detailed unclassified version as well). While the short-term outlook for additional surveillance reform is challenging, from a longer-term perspective, the distinctions that our law makes between Americans and non-Americans and between domestic and foreign collection cannot stand indefinitely. If the Fourth Amendment is to meaningfully protect Americans’ privacy, the courts and Congress must come to grips with this reality.
Paul Merrell

EFF Pries More Information on Zero Days from the Government's Grasp | Electronic Fronti... - 0 views

  • Until just last week, the U.S. government kept up the charade that its use of a stockpile of security vulnerabilities for hacking was a closely held secret. In fact, in response to EFF’s FOIA suit to get access to the official U.S. policy on zero days, the government redacted every single reference to “offensive” use of vulnerabilities. To add insult to injury, the government’s claim was that even admitting to offensive use would cause damage to national security. Now, in the face of EFF’s brief marshaling overwhelming evidence to the contrary, the charade is over. In response to EFF’s motion for summary judgment, the government has disclosed a new version of the Vulnerabilities Equities Process, minus many of the worst redactions. First and foremost, it now admits that the “discovery of vulnerabilities in commercial information technology may present competing ‘equities’ for the [government’s] offensive and defensive mission.” That might seem painfully obvious—a flaw or backdoor in a Juniper router is dangerous for anyone running a network, whether that network is in the U.S. or Iran. But the government’s failure to adequately weigh these “competing equities” was so severe that in 2013 a group of experts appointed by President Obama recommended that the policy favor disclosure “in almost all instances for widely used code.” [.pdf]
  • The newly disclosed version of the Vulnerabilities Equities Process (VEP) also officially confirms what everyone already knew: the use of zero days isn’t confined to the spies. Rather, the policy states that the “law enforcement community may want to use information pertaining to a vulnerability for similar offensive or defensive purposes but for the ultimate end of law enforcement.” Similarly it explains that “counterintelligence equities can be defensive, offensive, and/or law enforcement-related” and may “also have prosecutorial responsibilities.” Given that the government is currently prosecuting users for committing crimes over Tor hidden services, and that it identified these individuals using vulnerabilities called a “Network Investigative Technique”, this too doesn’t exactly come as a shocker. Just a few weeks ago, the government swore that even acknowledging the mere fact that it uses vulnerabilities offensively “could be expected to cause serious damage to the national security.” That’s a standard move in FOIA cases involving classified information, even though the government unnecessarily classifies documents at an astounding rate. In this case, the government relented only after nearly a year and a half of litigation by EFF. The government would be well advised to stop relying on such weak secrecy claims—it only risks undermining its own credibility.
  • The new version of the VEP also reveals significantly more information about the general process the government follows when a vulnerability is identified. In a nutshell, an agency that discovers a zero day is responsible for invoking the VEP, which then provides for centralized coordination and weighing of equities among all affected agencies. Along with a declaration from an official at the Office of the Director of National Intelligence, this new information provides more background on the reasons why the government decided to develop an overarching zero day policy in the first place: it “recognized that not all organizations see the entire picture of vulnerabilities, and each organization may have its own equities and concerns regarding the prioritization of patches and fixes, as well as its own distinct mission obligations.” We now know the VEP was finalized in February 2010, but the government apparently failed to implement it in any substantial way, prompting the presidential review group’s recommendation to prioritize disclosure over offensive hacking. We’re glad to have forced a little more transparency on this important issue, but the government is still foolishly holding on to a few last redactions, including refusing to name which agencies participate in the VEP. That’s just not supportable, and we’ll be in court next month to argue that the names of these agencies must be disclosed. 
Paul Merrell

New Documents Reveal FBI's "Cozy" Relationship with Geek Squad - 1 views

  • Throughout the past ten years, the FBI has at varying points in time maintained a particularly close relationship with Best Buy officials and used the company’s Geek Squad employees as informants. But the FBI refuses to confirm or deny key information about how the agency may potentially circumvent computer owners’ Fourth Amendment rights. The Electronic Frontier Foundation (EFF) obtained a handful of documents in response to a Freedom of Information Act (FOIA) lawsuit filed in February of last year. EFF says they show the relationship between the FBI and Geek Squad employees is much “cozier” than they thought.
  • In court filings, the defense mentioned there were “eight FBI informants at Geek Squad City” from 2007 to 2012. Multiple employees received payments ranging from $500 to $1,000 for work as informants.
  • There is no evidence that FBI obtained warrants before the Geek Squad informants searched computers they were repairing. It is believed Geek Squad employees routinely search unallocated space for any illegal content that may be on a device and then alert the FBI after conducting “fishing expeditions” for criminal activity, and this is what the FBI trains them to do. EFF sought “records about the extent to which [the FBI] directs and trains Best Buy employees to conduct warrantless searches of people’s devices.” As is clear, the government stonewalled EFF and only released documents that were already referred to by news media. The FBI neither confirmed nor denied whether the agency has “similar relationships with other computer repair facilities or businesses.” The FBI also would not produce any documents that detailed procedures or “training materials” for cultivating informants at computer repair facilities.
Paul Merrell

The New Snowden? NSA Contractor Arrested Over Alleged Theft Of Classified Data - 0 views

  • A contractor working for the National Security Agency (NSA) was arrested by the FBI following his alleged theft of “state secrets.” More specifically, the contractor, Harold Thomas Martin, is charged with stealing highly classified source codes developed to covertly hack the networks of foreign governments, according to several senior law enforcement and intelligence officials. The Justice Department has said that these stolen materials were “critical to national security.” Martin was employed by Booz Allen Hamilton, the company responsible for most of the NSA’s most sensitive cyber-operations. Edward Snowden, the most well-known NSA whistleblower, also worked for Booz Allen Hamilton until he fled to Hong Kong in 2013 where he revealed a trove of documents exposing the massive scope of the NSA dragnet surveillance. That surveillance system was shown to have targeted untold numbers of innocent Americans. According to the New York Times, the theft “raises the embarrassing prospect” that an NSA insider managed to steal highly damaging secret information from the NSA for the second time in three years, not to mention the “Shadow Broker” hack this past August, which made classified NSA hacking tools available to the public.
  • Snowden himself took to Twitter to comment on the arrest. In a tweet, he said the news of Martin’s arrest “is huge” and asked, “Did the FBI secretly arrest the person behind the reports [that the] NSA sat on huge flaws in US products?” It is currently unknown if Martin was connected to those reports as well.
  • It also remains to be seen what Martin’s motivations were in removing classified data from the NSA. Though many suspect that he planned to follow in Snowden’s footsteps, the government will more likely argue that he had planned to commit espionage by selling state secrets to “adversaries.” According to the New York Times article on the arrest, Russia, China, Iran, and North Korea are named as examples of the “adversaries” who would have been targeted by the NSA codes that Martin is accused of stealing. However, Snowden revealed widespread US spying on foreign governments including several US allies such as France and Germany. This suggests that the stolen “source codes” were likely utilized on a much broader scale.
Paul Merrell

HART: Homeland Security's Massive New Database Will Include Face Recognition, DNA, and ... - 0 views

  • The U.S. Department of Homeland Security (DHS) is quietly building what will likely become the largest database of biometric and biographic data on citizens and foreigners in the United States. The agency’s new Homeland Advanced Recognition Technology (HART) database will include multiple forms of biometrics—from face recognition to DNA, data from questionable sources, and highly personal data on innocent people. It will be shared with federal agencies outside of DHS as well as state and local law enforcement and foreign governments. And yet, we still know very little about it. The records DHS plans to include in HART will chill and deter people from exercising their First Amendment protected rights to speak, assemble, and associate. Data like face recognition makes it possible to identify and track people in real time, including at lawful political protests and other gatherings. Other data DHS is planning to collect—including information about people’s “relationship patterns” and from officer “encounters” with the public—can be used to identify political affiliations, religious activities, and familial and friendly relationships. These data points are also frequently colored by conjecture and bias.
  • DHS currently collects a lot of data. Its legacy IDENT fingerprint database contains information on 220-million unique individuals and processes 350,000 fingerprint transactions every day. This is an exponential increase from 20 years ago when IDENT only contained information on 1.8-million people. Between IDENT and other DHS-managed databases, the agency manages over 10-billion biographic records and adds 10-15 million more each week.
  • DHS’s new HART database will allow the agency to vastly expand the types of records it can collect and store. HART will support at least seven types of biometric identifiers, including face and voice data, DNA, scars and tattoos, and a blanket category for “other modalities.” It will also include biographic information, like name, date of birth, physical descriptors, country of origin, and government ID numbers. And it will include data we know to be highly subjective, including information collected from officer “encounters” with the public and information about people’s “relationship patterns.”
  • DHS’s face recognition roll-out is especially concerning. The agency uses mobile biometric devices that can identify faces and capture face data in the field, allowing its ICE (immigration) and CBP (customs) officers to scan everyone with whom they come into contact, whether or not those people are suspected of any criminal activity or an immigration violation. DHS is also partnering with airlines and other third parties to collect face images from travelers entering and leaving the U.S. When combined with data from other government agencies, these troubling collection practices will allow DHS to build a database large enough to identify and track all people in public places, without their knowledge—not just in places the agency oversees, like airports, but anywhere there are cameras. Police abuse of facial recognition technology is not a theoretical issue: it’s happening today. Law enforcement has already used face recognition on public streets and at political protests. During the protests surrounding the death of Freddie Gray in 2015, Baltimore Police ran social media photos against a face recognition database to identify protesters and arrest them. Recent Amazon promotional videos encourage police agencies to acquire that company’s face “Rekognition” capabilities and use them with body cameras and smart cameras to track people throughout cities. At least two U.S. cities are already using Rekognition. DHS compounds face recognition’s threat to anonymity and free speech by planning to include “records related to the analysis of relationship patterns among individuals.” We don’t know where DHS or its external partners will be getting these “relationship pattern” records, but they could come from social media profiles and posts, which the government plans to track by collecting social media user names from all foreign travelers entering the country.
Paul Merrell

Time to 'Break Facebook Up,' Sanders Says After Leaked Docs Show Social Media Giant 'Tr... - 0 views

  • After NBC News on Wednesday published a trove of leaked documents that show how Facebook "treated user data as a bargaining chip with external app developers," White House hopeful Sen. Bernie Sanders declared that it is time "to break Facebook up."
  • When British investigative journalist Duncan Campbell first shared the trove of documents with a handful of media outlets including NBC News in April, journalists Olivia Solon and Cyrus Farivar reported that "Facebook CEO Mark Zuckerberg oversaw plans to consolidate the social network's power and control competitors by treating its users' data as a bargaining chip, while publicly proclaiming to be protecting that data." With the publication Wednesday of nearly 7,000 pages of records—which include internal Facebook emails, web chats, notes, presentations, and spreadsheets—journalists and the public can now have a closer look at exactly how the company was using the vast amount of data it collects when it came to bargaining with third parties.
  • According to Solon and Farivar of NBC: Taken together, they show how Zuckerberg, along with his board and management team, found ways to tap Facebook users' data—including information about friends, relationships, and photos—as leverage over the companies it partnered with. In some cases, Facebook would reward partners by giving them preferential access to certain types of user data while denying the same access to rival companies. For example, Facebook gave Amazon special access to user data because it was spending money on Facebook advertising. In another case the messaging app MessageMe was cut off from access to data because it had grown too popular and could compete with Facebook.
  • The document dump comes as Facebook and Zuckerberg are facing widespread criticism over the company's political advertising policy, which allows candidates for elected office to lie in the ads they pay to circulate on the platform. It also comes as 47 state attorneys general, led by Letitia James of New York, are investigating the social media giant for antitrust violations.
  • The call from Sanders (I-Vt.) Wednesday to break up Facebook follows similar but less definitive statements from the senator. One of Sanders' rivals in the 2020 Democratic presidential primary race, Sen. Elizabeth Warren (D-Mass.), released her plan to "Break Up Big Tech" in March. Zuckerberg is among the opponents of Warren's proposal, which also targets other major technology companies like Amazon and Google.
Paul Merrell

WikiLeaks just dropped the CIA's secret how-to for infecting Windows | Ars Technica - 0 views

  • WikiLeaks has published what it says is another batch of secret hacking manuals belonging to the US Central Intelligence Agency as part of its Vault7 series of leaks. The site is billing Vault7 as the largest publication of intelligence documents ever. Friday's installment includes 27 documents related to "Grasshopper," the codename for a set of software tools used to build customized malware for Windows-based computers. The Grasshopper framework provides building blocks that can be combined in unique ways to suit the requirements of a given surveillance or intelligence operation. The documents are likely to be of interest to potential CIA targets looking for signatures and other signs indicating their Windows systems were hacked. The leak will also prove useful to competing malware developers who want to learn new techniques and best practices. "Grasshopper is a software tool used to build custom installers for target computers running Microsoft Windows operating system," one user guide explained. "An operator uses the Grasshopper builder to construct a custom installation executable."
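The Grasshopper description reduces to a familiar composition pattern: a builder assembles independently written building blocks into one custom artifact according to an operator's configuration. The sketch below is a generic, hypothetical illustration of that shape in Python, not a reconstruction of the leaked tool.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Block:
    """One reusable building block; emit() renders its bytes for a config."""
    name: str
    emit: Callable[[Dict[str, str]], bytes]

def build(blocks: List[Block], config: Dict[str, str]) -> bytes:
    """Compose the selected blocks, in order, into a single custom artifact."""
    return b"".join(block.emit(config) for block in blocks)

# Hypothetical blocks an operator might select for one build:
persist = Block("persist", lambda cfg: f"[persist mode={cfg['mode']}]".encode())
payload = Block("payload", lambda cfg: b"[payload]")

print(build([persist, payload], {"mode": "service"}))
```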
Paul Merrell

Google Engineer Leaks Nearly 1,000 Pages of Internal Documents, Alleging Bias, Censorship - 0 views

  • A former Google engineer has released nearly 1,000 pages of documents that he says prove that the company, at least in some of its products, secretly boosts or demotes content based on what it deems to be true or false, while publicly claiming to be a neutral platform. The software engineer, Zach Vorhies, first provided the documents to Project Veritas, a right-leaning investigative journalism nonprofit, as well as the Justice Department’s antitrust division, which has been investigating Google for potentially anti-competitive behavior.
  • When he returned to work, however, Google sent him a letter demanding, among other things, that he turn over his employee badge and work laptop, which he did, and “cease and desist” from disclosing “any non-public Google files.” Afraid for his safety, he posted on Twitter that if something were to happen to him, all the documents he took would be released to the public. Google then did a “wellness check” on him, he said. The San Francisco police received a call that Vorhies may be mentally ill. A group of officers waited for him outside his house and put him in handcuffs. “This is a large way in which they intimidate their employees that go rogue on the company,” he said. Vorhies then decided that it would be safer for him to go public.
  • One of the goals of the effort was a “clean & regularly sanitized news corpus,” it reads.
  • Robert Epstein, a psychologist who has spent years researching Google’s influence on its users, has published research showing that just by deciding the sequence of top search results, the company can sway undecided voters. Epstein determined that this has led to 2.6 million votes shifting in the 2016 presidential election to Trump’s opponent, former Secretary of State Hillary Clinton. He warned that in 2020, if companies such as Google and Facebook all support the same candidate, they will be able to shift 15 million votes—well beyond the margin most presidents have won by.