
Future of the Web: Group items tagged 2009


Paul Merrell

Microsoft Statement on European Commission Statement of Objections: Statement of Object... - 0 views

  • REDMOND – Jan. 16, 2009 – “Yesterday Microsoft received a Statement of Objections from the Directorate General for Competition of the European Commission. The Statement of Objections expresses the Commission’s preliminary view that the inclusion of Internet Explorer in Windows since 1996 has violated European competition law. According to the Statement of Objections, other browsers are foreclosed from competing because Windows includes Internet Explorer.
  • The Statement of Objections states that the remedies put in place by the U.S. courts in 2002 following antitrust proceedings in Washington, D.C. do not make the inclusion of Internet Explorer in Windows lawful under European Union law.
  •  
    Microsoft's version of events, notable for the statement that DG Competition's Statement of Objections specifically says it is not bound by the U.S. v. Microsoft decision. That only states the obvious, but it is perhaps intended to forestall Microsoft arguments that the legality of its bundling was conclusively determined in the U.S. case. If so, it may have worked; Microsoft makes no such claim in this press release.
Paul Merrell

Rapid - Press Releases - EUROPA - 0 views

  • MEMO/09/15 Brussels, 17th January 2009
  • The European Commission can confirm that it has sent a Statement of Objections (SO) to Microsoft on 15th January 2009. The SO outlines the Commission’s preliminary view that Microsoft’s tying of its web browser Internet Explorer to its dominant client PC operating system Windows infringes the EC Treaty rules on abuse of a dominant position (Article 82).
  • In the SO, the Commission sets out evidence and outlines its preliminary conclusion that Microsoft’s tying of Internet Explorer to the Windows operating system harms competition between web browsers, undermines product innovation and ultimately reduces consumer choice. The SO is based on the legal and economic principles established in the judgment of the Court of First Instance of 17 September 2007 (case T-201/04), in which the Court of First Instance upheld the Commission's decision of March 2004 (see IP/04/382), finding that Microsoft had abused its dominant position in the PC operating system market by tying Windows Media Player to its Windows PC operating system (see MEMO/07/359).
  • The evidence gathered during the investigation leads the Commission to believe that the tying of Internet Explorer with Windows, which makes Internet Explorer available on 90% of the world's PCs, distorts competition on the merits between competing web browsers insofar as it provides Internet Explorer with an artificial distribution advantage which other web browsers are unable to match. The Commission is concerned that through the tying, Microsoft shields Internet Explorer from head to head competition with other browsers which is detrimental to the pace of product innovation and to the quality of products which consumers ultimately obtain. In addition, the Commission is concerned that the ubiquity of Internet Explorer creates artificial incentives for content providers and software developers to design websites or software primarily for Internet Explorer which ultimately risks undermining competition and innovation in the provision of services to consumers.
  • Microsoft has 8 weeks to reply to the SO, and will then have the right to be heard in an Oral Hearing should it wish to do so. If the preliminary views expressed in the SO are confirmed, the Commission may impose a fine on Microsoft, require Microsoft to cease the abuse and impose a remedy that would restore genuine consumer choice and enable competition on the merits.
  • A Statement of Objections is a formal step in Commission antitrust investigations in which the Commission informs the parties concerned in writing of the objections raised against them. The addressee of a Statement of Objections can reply in writing to the Statement of Objections, setting out all facts known to it which are relevant to its defence against the objections raised by the Commission. The party may also request an oral hearing to present its comments on the case. The Commission may then take a decision on whether conduct addressed in the Statement of Objections is compatible or not with the EC Treaty’s antitrust rules. Sending a Statement of Objections does not prejudge the final outcome of the procedure. In the March 2004 Decision the Commission ordered Microsoft to offer to PC manufacturers a version of its Windows client PC operating system without Windows Media Player. Microsoft, however, retained the right to also offer a version with Windows Media Player (see IP/04/382).
  •  
    It's official, hot off the presses (wasn't there a few minutes ago). We're now into a process where DG Competition will revisit its previous order requiring Microsoft to market two versions of Windows, one with Media Player and one without. DG Competition staff were considerably outraged that Microsoft took advantage of a bit of under-specification in the previous order and sold the two versions at the same price. That detail will not be neglected this time around. Moreover, given the ineffectiveness of the previous order in restoring competition among media players, don't be surprised if this results in an outright ban on bundling MSIE with Windows.
Paul Merrell

Update: EU hits Microsoft with new antitrust charges - 0 views

  • January 16, 2009 (Computerworld) Microsoft Corp. confirmed today that European Union regulators have formally accused the company of breaking antitrust laws by including the company's Internet Explorer (IE) browser with the Windows operating system. "Yesterday, Microsoft received a Statement of Objections from the Directorate General for Competition of the European Commission," the company said in a statement on Friday. "The Statement of Objections expresses the Commission's preliminary view that the inclusion of Internet Explorer in Windows since 1996 has violated European competition law." According to Microsoft, the EU claimed that "other browsers are foreclosed from competing because Windows includes Internet Explorer."
Gary Edwards

The Google Apps Revenue Myth: $10mm In 2009 (GOOG) - 0 views

  •  
    There are two theories about Google Apps (Spreadsheet, Word-processor, GMail, etc.): (1) Google Apps will rapidly become a multi-billion dollar business that will diversify Google's dependence on search; (2) Google Apps will kill Microsoft. The first of these theories, a source outside Google familiar with Apps tells us, is laughable.
  •  
    The reason Google Docs is failing to crack the iron grip Microsoft has on business enterprises is the same reason that Linux desktops running OpenOffice failed: it's the business processes bound to the Microsoft Office productivity environment that block the shift to Open Web computing. See, it's the Business Process!
Paul Merrell

HTML 5 differences from HTML 4 - 0 views

  • W3C Working Draft 12 February 2009
  • HTML 5 defines the fifth major revision of the core language of the World Wide Web, HTML. "HTML 5 differences from HTML 4" describes the differences between HTML 4 and HTML 5 and provides some of the rationale for the changes. This document may not provide accurate information as the HTML 5 specification is still actively in development. When in doubt, always check the HTML 5 specification itself. [HTML5]
Paul Merrell

Safe Plurality: Can it be done using OOXML's Markup Compatibility and Extensions mechan... - 0 views

  • During the OOXML standardization proceedings, the ISO participants felt that one particular sub-technology, Markup Compatibility and Extensibility (MCE), was potentially of such usefulness to other standards that it was brought out into its own part. It is now IS 29500:2009 Part 3; you can download it in its ECMA form here, and it has only about 15 pages of substantive text. The particular issue that MCE addresses is this: what is an application supposed to do when it finds some markup it wasn't programmed to accept? This could be extension elements in some foreign namespace, but it could also be elements from a known namespace: the case where a document was made against a newer version of the standard than the application supports.
  •  
    Rick Jelliffe posts a frank view of the OOXML compatibility framework, a document I've studied myself in the past. There is much that is laudable about the framework, but there are also aspects that are troublesome. Jelliffe identifies one red-flag item, the freedom for a vendor to "proprietize" OOXML using the MustUnderstand attribute, and offers some suggestions for lessening that danger through redrafting of the spec. One issue he does not touch, however, is the Microsoft Open Specification Promise covenant not to sue, a deeply flawed document for anyone implementing OOXML other than Microsoft. Still, there is so much prior art for the OOXML compatibility framework that I doubt any patent reading on it would survive judicial review. E.g., a highly similar framework has been implemented in WordPerfect since version 6.0, and the OOXML framework is remarkably similar to the compatibility framework specified by OASIS OpenDocument 1.0 but subsequently gutted at ISO. The Jelliffe article offers a good overview of factors that must be considered in designing a standard's compatibility framework. For those who go on to read the compatibility framework's specification, keep in mind that in several places the document falsely claims to be an interoperability framework. It is not. It is a framework designed for one-way transfer of data, not for interoperability, which requires round-trip, two-way exchange of data without data loss.
Paul Merrell

MICROSOFT CORP (Form: 10-Q, Received: 01/22/2009 09:02:43) - 0 views

  • In January 2008 the Commission opened a competition law investigation related to the inclusion of various capabilities in our Windows operating system software, including Web browsing software. The investigation was precipitated by a complaint filed with the Commission by Opera Software ASA, a firm that offers Web browsing software. On January 15, 2009, the European Commission issued a statement of objections expressing the Commission’s preliminary view that the inclusion of Internet Explorer in Windows since 1996 has violated European competition law. According to the statement of objections, other browsers are foreclosed from competing because Windows includes Internet Explorer. We will have an opportunity to respond in writing to the statement of objections within about two months. We may also request a hearing, which would take place after the submission of this response. Under European Union procedure, the European Commission will not make a final determination until after it receives and assesses our response and conducts the hearing, should we request one. The statement of objections seeks to impose a remedy that is different than the remedy imposed in the earlier proceeding concerning Windows Media Player.
  • While computer users and OEMs are already free to run any Web browsing software on Windows, the Commission is considering ordering Microsoft and OEMs to obligate users to choose a particular browser when setting up a new PC. Such a remedy might include a requirement that OEMs distribute multiple browsers on new Windows-based PCs. We may also be required to disable certain unspecified Internet Explorer software code if a user chooses a competing browser. The statement of objections also seeks to impose a significant fine based on sales of Windows operating systems in the European Union. In January 2008, the Commission opened an additional competition law investigation that relates primarily to interoperability with respect to our Microsoft Office family of products. This investigation resulted from complaints filed with the Commission by a trade association of Microsoft’s competitors.
Paul Merrell

VMware to Acquire Zimbra - 1 views

  •  
    PALO ALTO, Calif., January 12, 2010 - VMware, Inc. (NYSE: VMW), the global leader in virtualization solutions from the desktop through the datacenter and to the cloud, today announced that it has entered into a definitive agreement to acquire Zimbra, a leading vendor of email and collaboration software, from Yahoo! Inc. Zimbra is a leading open source email and collaboration solution with over 55 million mailboxes.  As an independent Yahoo! product division, Zimbra achieved 2009 mailbox growth of 86% overall and 165% among small and medium business customers.
Gonzalo San Gil, PhD.

Internet Archive Needs Your Help - 1 views

  •  
    [ This year the Internet Archive needs your help. In 2009-2010 we were able to employ hundreds of low income, out of work parents using a stimulus wage subsidy. Most of these parents worked to scan over 150,000 recent books for the blind and dyslexic. We appealed to legislators, but this program was defunded. ... ]
Paul Merrell

Antitrust Week Continues: EU Slams Intel With $1.45b Fine - Law Blog - WSJ - 0 views

  • Most likely, we grant you, it was coincidence. But we couldn’t help notice the timing: Two days after the DOJ’s new antitrust head, Christine Varney, publicly repudiates her predecessors by pledging to ramp up enforcement on so-called “single-firm” monopolistic behavior, the European Union takes a sledgehammer to Intel Corp., fining it $1.45 billion for alleged monopolistic activity. The fine is the largest ever assessed for monopoly abuse. Click here for the WSJ story, from Charles Forelle; here for the NYT story; here for the FT story; here for the Commission’s statement; here for Intel’s response.
    • Paul Merrell
       
      See my earlier Diigo bookmark quoting the DG Competition statement that it had coordinated with the U.S. Justice Dept. in its simultaneous and ongoing investigation of Intel.
  • John Pheasant, an antitrust practitioner at Hogan & Hartson in London and Brussels, told the Law Blog that some of the evidence does “not look very good for Intel,” adding that “if the facts are there, this type of conduct is more likely to be regarded as abusive if practiced by a dominant company. . . .”
  • On Varney’s statement from earlier this week, Kroes said the Justice Department’s stance gave her a “huge positive feeling. The more competition authorities joining us in our competition philosophy, the better it is.”
David Corking

UK National Portrait Gallery threatens Wikipedia over scans of its public domain art - ... - 0 views

  • If you take public money to buy art, you should make that art available to the public using the best, most efficient means possible. If you believe the public wants to subsidize the creation of commercial art-books, then get out of the art-gallery business, start a publisher and hit the government up for some free tax-money.
    • David Corking
       
      Hear, hear.
  •  
    This is how I would like my taxes used.
  •  
    Analysis from the "open source" novelist
Gary Edwards

The Education of Gary Edwards - Rick Jelliffe on O'Reilly Broadcast - 0 views

  •  
    I wonder how I missed this? Incredibly, I have my own biographer and I didn't know it! The dateline is September 2008; I had turned off all my ODF-OOXML-OASIS searches and blog feeds back in October of 2007, when we moved the da Vinci plug-in to HTML+ using the W3C CDF model. Is it appropriate to send flowers to your secret biographer? Maybe I'll find some time and update his work. The gap between October 2007 and April of 2009 is filled with adventure and wonder. And WebKit!

    "....One of the more interesting characters in the recent standards battles has been Gary Edwards: he was a member of the original ODF TC in 2002 which oversaw the creation of ODF 1.0 in 2005, but gradually became more concerned about large vendor dominance of the ODF TC frustrating what he saw as critical improvements in the area of interoperability. This compromised the ability of ODF to act as a universal format."

    "....Edwards increasingly came to believe that the battleground had shifted, with the SharePoint threat increasingly needing to be the focus of open standards and FOSS attention, not just the standalone desktop applications: I think Edwards tends to see Office Open XML as a stalking horse for Microsoft to get its foot back in the door for back-end systems....."

    "....Edwards and some colleagues split with some acrimony from the ODF effort in 2007, and subsequently see W3C's Compound Document Formats (CDF) as holding the best promise for interoperability. Edwards' public comments are an interesting reflection of an person evolving their opinion in the light of experience, events and changing opportunities...."

    ".... I have put together some interesting quotes from him which, I hope, fairly bring out some of the themes I see. As always, read the source to get more info: ..... "

Gary Edwards

Sun pitches new cloud as 'Open Platform' * - 0 views

  •  
    Sun takes on the problem of interoperability and portability of applications in a world where there will be many, many clouds. At the rollout of the Sun Cloud, key executives explain Sun's implementation of Open Cloud APIs and what they see as a pressing need for management tools that will allow some standardization across clouds.

    Sun's Open Cloud API plan is a clean reuse of existing Open Web APIs.

    "..... The underpinning of the Open Cloud Platform that Sun will be pitching to developers is a set of cloud APIs, the creation of which is focused under Project Kenai and which has been released under a Community Commons open source license. Sun wants lots of feedback on the APIs and wants these APIs to become a standard too, hence the open license. These APIs describes how virtual elements in a cloud are created, started, stopped, and hibernated using HTTP commands such as GET, PUT, and POST...."

    "...... The upshot is that these APIs will allow programmatic access to virtual infrastructure from Java, PHP, Python, and Ruby and that means system admins can script how virtual resources are deployed. The APIs, as co-creator Tim Bray explains in his blog, are written in JavaScript Object Notation (JSON), not XML. The Q-Layer software is a graphical representation of what is going on down in the APIs, and you can moving virtual resources into the cloud with a click of a mouse using the dashboard or programmatically using the APIs from those four programming languages listed above. (PHP support is not yet available, but will be)....."
  •  
    I can see why Sun picked those four languages first. Can I assume that with a bit of work, this API will be usable from any language with a C "foreign function interface", such as Perl, Common Lisp, Bourne shell, Squeak Smalltalk, and others that your server application might be written in?
  •  
    I read this comment, which largely answers my question, at http://www.tbray.org/ongoing/When/200x/2009/03/16/Sun-Cloud: "So right now JSON out of a shell tool is not so good. More things like this will create pressure for development of tools to change that, but years of widespread XML/HTML deployment have only produced a few oddly maintained tools. Perhaps that's because you can scrape quite a bit of the web with a couple sed passes, and if I were to have to deal with the mentioned tools, that's probably the route I'd take." (seth w. klein) In other words, with a bit of work, _anything_ that can talk text over HTTP can do this, though an object-oriented language is likely to be more at home with JSON (JavaScript Object Notation). A minimal sketch of such a call appears below.
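    To make the "anything that can talk text over HTTP" point concrete, here is a minimal sketch of driving a Sun-Cloud-style REST API from Python, using only the standard library. The base URL, resource paths, and JSON fields below are hypothetical placeholders for illustration, not the actual Project Kenai API.

        # Minimal sketch: GET a virtual element's JSON description, then POST
        # a JSON command to it. Endpoint and field names are hypothetical.
        import json
        import urllib.request

        BASE = "https://cloud.example.com/api"  # hypothetical API root

        def get_json(path):
            """GET a resource and decode its JSON body."""
            with urllib.request.urlopen(BASE + path) as resp:
                return json.load(resp)

        def post_json(path, payload):
            """POST a JSON payload, e.g. to start or stop a virtual machine."""
            req = urllib.request.Request(
                BASE + path,
                data=json.dumps(payload).encode("utf-8"),
                headers={"Content-Type": "application/json"},
                method="POST",
            )
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)

        vm = get_json("/vms/webserver-1")                            # inspect a virtual element
        post_json("/vms/webserver-1/actions", {"action": "start"})  # start it

    Any language that can issue HTTP requests and parse JSON could do the same, shell scripts included (e.g., curl piped through a JSON-aware filter); object-oriented bindings simply make the mapping onto virtual resources feel more natural.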
Matteo Spreafico

Google Redefines Disruption: The "Less Than Free" Business Model - 0 views

  • In the summer of 2007, excitement regarding the criticality of map data (specifically turn-by-turn navigation data) reached a fever pitch.  On July 23, 2007, TomTom, the leading portable GPS device maker, agreed to buy Tele Atlas for US$2.7 billion. Shortly thereafter, on October 1, Nokia agreed to buy NavTeq for a cool US$8.1 billion. Meanwhile Google was still evolving its strategy and no longer wanted to be limited by the terms of its two contracts. As such, they informed Tele Atlas and NavTeq that they wanted to modify their license terms to allow more liberty with respect to syndication and proliferation. NavTeq balked, and in September of 2008 Google quietly dropped NavTeq, moving to just one partner for its core mapping data. Tele Atlas eventually agreed to the term modifications, but perhaps they should have sensed something bigger at play.
  • Rumors abound about just how many cars Google has on the roads building its own turn-by-turn mapping data as well as its unique “Google Streetview” database. Whatever it is, it must be huge. This October 13th, just over one year after dropping NavTeq, the other shoe dropped as well. Google disconnected from Tele Atlas and began to offer maps that were free and clear of either license. These maps are based on a combination of their own data as well as freely available data. Two weeks after this, Google announces free turn-by-turn directions for all Android phones. This couldn’t have been a great day for the deal teams that worked on the respective Tele Atlas and NavTeq acquisitions.
  • Google’s free navigation feature announcement dealt a crushing blow to the GPS stocks. Garmin fell 16%. TomTom fell 21%. Imagine trying to maintain high royalty rates against this strategic move by Google. Android is not only a phone OS, it’s a CE OS. If Ford or BMW want to build an in-dash Android GPS, guess what? Google will give it to them for free.
  • I then asked my friend, “So why would they ever use the Google (non open source) license version?”  (EDIT: One of the commenters below pointed out that all Android is open source, and the Google apps pack, including the GPS, is licensed on top.  Doesn’t change the argument, but wanted the correct data included here.)  Here was the big punch line: because Google will give you ad splits on search if you use that version!  That’s right; Google will pay you to use their mobile OS. I like to call this the “less than free” business model.
  • “Less than free” may not stop with the mobile phone. Google’s CEO Eric Schmidt has been quite outspoken about his support for the Google Chrome OS. And there is no reason to believe that the “less than free” business model will not be used here as well. If Sony or HP or Dell builds a netbook based on Chrome OS, they will make money on every search each user initiates. Google, eager to protect its search share and market volume, will gladly pay the ad splits. Microsoft, who was already forced to lower Windows netbook pricing to fend off Linux, will be dancing with a business model inversion of epic proportion – from “you pay me” to “I pay you.”  It’s really hard to build a compensation package for your sales team on those economics.
Paul Merrell

Visit the Wrong Website, and the FBI Could End Up in Your Computer | Threat Level | WIRED - 0 views

  • Security experts call it a “drive-by download”: a hacker infiltrates a high-traffic website and then subverts it to deliver malware to every single visitor. It’s one of the most powerful tools in the black hat arsenal, capable of delivering thousands of fresh victims into a hacker’s clutches within minutes. Now the technique is being adopted by a different kind of a hacker—the kind with a badge. For the last two years, the FBI has been quietly experimenting with drive-by hacks as a solution to one of law enforcement’s knottiest Internet problems: how to identify and prosecute users of criminal websites hiding behind the powerful Tor anonymity system. The approach has borne fruit—over a dozen alleged users of Tor-based child porn sites are now headed for trial as a result. But it’s also engendering controversy, with charges that the Justice Department has glossed over the bulk-hacking technique when describing it to judges, while concealing its use from defendants. Critics also worry about mission creep, the weakening of a technology relied on by human rights workers and activists, and the potential for innocent parties to wind up infected with government malware because they visited the wrong website. “This is such a big leap, there should have been congressional hearings about this,” says ACLU technologist Chris Soghoian, an expert on law enforcement’s use of hacking tools. “If Congress decides this is a technique that’s perfectly appropriate, maybe that’s OK. But let’s have an informed debate about it.”
  • The FBI’s use of malware is not new. The bureau calls the method an NIT, for “network investigative technique,” and the FBI has been using it since at least 2002 in cases ranging from computer hacking to bomb threats, child porn to extortion. Depending on the deployment, an NIT can be a bulky full-featured backdoor program that gives the government access to your files, location, web history and webcam for a month at a time, or a slim, fleeting wisp of code that sends the FBI your computer’s name and address, and then evaporates. What’s changed is the way the FBI uses its malware capability, deploying it as a driftnet instead of a fishing line. And the shift is a direct response to Tor, the powerful anonymity system endorsed by Edward Snowden and the State Department alike.
Gonzalo San Gil, PhD.

Unprecedented Music Piracy Collapse Fails to Boost Revenues | TorrentFreak - 0 views

  •  
    " Andy on January 26, 2015 C: 94 Breaking A survey carried out by music industry group IFPI has revealed that just 4% of Norwegians under 30 are now using illegal file-sharing platforms to obtain music, down from 70% in 2009. But while that achievement is unprecedented, overall music industry revenues have remained static."
Gonzalo San Gil, PhD.

Nina Paley Argues Why Copyright Is Brain Damage | Techdirt - 0 views

  •  
    "from the sovereignty-of-your-own-mind dept We first wrote about Nina Paley in 2009, upon hearing about the ridiculous copyright mess she found herself in concerning her wonderful movie Sita Sings the Blues. While she eventually was able to sort out that mess and release the film, she also discovered that the more she shared the film, the more money she made, and she began to question copyright entirely."
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 1 views

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying between three and 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated from January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata (a short parsing sketch of this split appears after these annotations). In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explain in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
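    Returning to the content/metadata distinction described a few annotations above (full address versus site address), here is a minimal, purely illustrative Python sketch of how that split falls out of ordinary URL parsing. The classification rule is the one the article attributes to GCHQ; the function and its output are illustrative only.

        # Illustrative only: split a visited address into the part the article says
        # is treated as metadata (the site) and the part treated as content
        # (the full address including the path).
        from urllib.parse import urlsplit

        def classify(url):
            parts = urlsplit(url)
            site = parts.netloc or parts.path.split("/")[0]  # handles bare "host/path" strings
            return {"metadata": site, "content": url}

        print(classify("www.gchq.gov.uk/what_we_do"))
        # -> {'metadata': 'www.gchq.gov.uk', 'content': 'www.gchq.gov.uk/what_we_do'}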
Paul Merrell

American and British Spy Agencies Targeted In-Flight Mobile Phone Use - 0 views

  • In the trove of documents provided by former National Security Agency contractor Edward Snowden is a treasure. It begins with a riddle: “What do the President of Pakistan, a cigar smuggler, an arms dealer, a counterterrorism target, and a combatting proliferation target have in common? They all used their everyday GSM phone during a flight.” This riddle appeared in 2010 in SIDtoday, the internal newsletter of the NSA’s Signals Intelligence Directorate, or SID, and it was classified “top secret.” It announced the emergence of a new field of espionage that had not yet been explored: the interception of data from phone calls made on board civil aircraft. In a separate internal document from a year earlier, the NSA reported that 50,000 people had already used their mobile phones in flight as of December 2008, a figure that rose to 100,000 by February 2009. The NSA attributed the increase to “more planes equipped with in-flight GSM capability, less fear that a plane will crash due to making/receiving a call, not as expensive as people thought.” The sky seemed to belong to the agency.