Future of the Web / Group items matching "$50" in title, tags, annotations or url
Paul Merrell

Google Wins Patent For Data Center In A Box; Trouble For Sun, Rackable, IBM? -- Data Center - 0 views

  • Google (NSDQ: GOOG) has obtained a broad patent for a data center in a container, which might put a kink in product plans for companies like Sun Microsystems (NSDQ: JAVA), Rackable Systems, and IBM (NYSE: IBM). The patent, granted Tuesday, covers "modular data centers with modular components that can be implemented in numerous ways, including as a process, an apparatus, a system, a device, or a method."
  • The U.S. Patent and Trademark Office site reveals patent number 7,278,273 as describing modules in intermodal shipping containers, or those that can be shipped by multiple carriers and systems. It also covers computing systems mounted within temperature-controlled containers, configured so they can ship easily, be factory built and deployed at data center sites.
  • If this sounds familiar, it might just be. Google's patent description resembles Sun Microsystems' data center in a box, called Project Blackbox. During its debut last year, Sun installed a Blackbox -- essentially a cargo container for 18-wheelers -- outside of Grand Central Station in New York City to show how easily one of its data centers could be installed. Google's patent description also has similarities to Rackable Systems' ICE Cube as well as IBM's Scalable Modular Data Center.
Paul Merrell

Carakan - By Opera Core Concerns - 0 views

  • Over the past few months, a small team of developers and testers has been working on implementing a new ECMAScript/JavaScript engine for Opera. When Opera's current ECMAScript engine, called Futhark, was first released in a public version, it was the fastest engine on the market. That engine was developed to minimize code footprint and memory usage, rather than to achieve maximum execution speed. This has traditionally been the correct trade-off on many of the platforms Opera runs on. The Web is a changing environment, however, and tomorrow's advanced web applications will require faster ECMAScript execution, so we have now taken on the challenge to once again develop the fastest ECMAScript engine on the market.
  • So how fast is Carakan? Using a regular cross-platform switch dispatch mechanism (without any generated native code) Carakan is currently about two and a half times faster at the SunSpider benchmark than the ECMAScript engine in Presto 2.2 (Opera 10 Alpha). Since Opera is ported to many different hardware architectures, this cross-platform improvement is on its own very important. The native code generation in Carakan is not yet ready for full-scale testing, but the few individual benchmark tests that it is already compatible with run between 5 and 50 times faster, so it is looking promising so far.
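The "switch dispatch" mechanism the post contrasts with native code generation can be sketched with a toy bytecode interpreter. This is a minimal Python stand-in with an invented opcode set, not Opera's actual engine (which does this in C++); it only illustrates the control flow of dispatching on each opcode in a loop:

```python
# Toy stack-machine interpreter using switch-style dispatch: a loop
# that branches on each opcode. The opcode set here is invented for
# illustration; real engines like Futhark/Carakan use a much larger
# instruction set and a literal switch statement in C++.

PUSH, ADD, MUL, HALT = range(4)  # hypothetical opcodes

def run(bytecode):
    stack, pc = [], 0
    while True:
        op = bytecode[pc]
        pc += 1
        if op == PUSH:                  # operand follows the opcode
            stack.append(bytecode[pc])
            pc += 1
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == HALT:
            return stack.pop()

# Program computing (2 + 3) * 4
program = [PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, HALT]
print(run(program))  # → 20
```

Native code generation, which Carakan was adding, replaces this per-opcode branch with machine code emitted ahead of execution, which is where the 5x-50x speedups on individual benchmarks come from.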
Paul Merrell

Web Hypertext Application Technology Working Group Demos from September 2008 - 0 views

  • HTML 5 demos from September 2008
  • The demos and segments of this talk are: <video> (00:35), postMessage() (05:40), localStorage (15:20), sessionStorage (21:00), Drag and Drop API (29:05), onhashchange (37:30), Form Controls (40:50), <canvas> (56:55), Validation (1:07:20), Questions and Answers (1:09:35)
Alessandro P

LearningReviews.com - Software and Webware - 4 views

  • Web 2.0 in Education
    • Alessandro P
       
      A well-structured wiki. A possible reference.
  • Websites: Freewebs
    • Alessandro P
       
      Free site and domain. The link is worth a look.
  • Websites: Weebly
    • Alessandro P
       
      For building a site yourself. Worth a look!
  •  
    Software and Webware What FREE software and webware packages are most helpful to kids and teachers?
Paul Merrell

LEAKED: Secret Negotiations to Let Big Brother Go Global | Wolf Street - 0 views

  • Much has been written, at least in the alternative media, about the Trans Pacific Partnership (TPP) and the Transatlantic Trade and Investment Partnership (TTIP), two multilateral trade treaties being negotiated between the representatives of dozens of national governments and armies of corporate lawyers and lobbyists (on which you can read more here, here and here). However, much less is known about the decidedly more secretive Trade in Services Agreement (TiSA), which involves more countries than either of the other two. At least until now, that is. Thanks to a leaked document jointly published by the Associated Whistleblowing Press and Filtrala, the potential ramifications of the treaty being hashed out behind hermetically sealed doors in Geneva are finally seeping out into the public arena.
  • If signed, the treaty would affect all services, ranging from electronic transactions and data flow to veterinary and architecture services. It would almost certainly open the floodgates to the final wave of privatization of public services, including the provision of healthcare, education and water. Meanwhile, already privatized services would be barred from re-transfer to the public sector by a so-called "ratchet clause" – even if the privatization failed. More worrisome still, the proposal stipulates that no participating state can stop the use, storage and exchange of personal data relating to their territorial base. Here’s more from Rosa Pavanelli, general secretary of Public Services International (PSI):
  • The leaked documents confirm our worst fears that TiSA is being used to further the interests of some of the largest corporations on earth (…) Negotiation of unrestricted data movement, internet neutrality and how electronic signatures can be used strike at the heart of individuals’ rights. Governments must come clean about what they are negotiating in these secret trade deals. Fat chance of that, especially in light of the fact that the text is designed to be almost impossible to repeal, and is to be “considered confidential” for five years after being signed. What that effectively means is that the U.S. approach to data protection (read: virtually non-existent) could very soon become the norm across 50 countries spanning the breadth and depth of the industrial world.
  • ...1 more annotation...
  • The main players in the top-secret negotiations are the United States and all 28 members of the European Union. However, the broad scope of the treaty also includes Australia, Canada, Chile, Colombia, Costa Rica, Hong Kong, Iceland, Israel, Japan, Liechtenstein, Mexico, New Zealand, Norway, Pakistan, Panama, Paraguay, Peru, South Korea, Switzerland, Taiwan and Turkey. Combined they represent almost 70 percent of all trade in services worldwide. An explicit goal of the TiSA negotiations is to overcome the exceptions in GATS that protect certain non-tariff trade barriers, such as data protection. For example, the draft Financial Services Annex of TiSA, published by Wikileaks in June 2014, would allow financial institutions, such as banks, the free transfer of data, including personal data, from one country to another. As Ralf Bendrath, a senior policy advisor to the MEP Jan Philipp Albrecht, writes in State Watch, this would constitute a radical carve-out from current European data protection rules:
Paul Merrell

An Important Kindle request - 0 views

  • A Message from the Amazon Books Team Dear Readers, Just ahead of World War II, there was a radical invention that shook the foundations of book publishing. It was the paperback book. This was a time when movie tickets cost 10 or 20 cents, and books cost $2.50. The new paperback cost 25 cents — it was ten times cheaper. Readers loved the paperback and millions of copies were sold in just the first year. With it being so inexpensive and with so many more people able to afford to buy and read books, you would think the literary establishment of the day would have celebrated the invention of the paperback, yes? Nope. Instead, they dug in and circled the wagons. They believed low cost paperbacks would destroy literary culture and harm the industry (not to mention their own bank accounts). Many bookstores refused to stock them, and the early paperback publishers had to use unconventional methods of distribution — places like newsstands and drugstores. The famous author George Orwell came out publicly and said about the new paperback format, if "publishers had any sense, they would combine against them and suppress them." Yes, George Orwell was suggesting collusion. Well… history doesn't repeat itself, but it does rhyme.
  • Fast forward to today, and it's the e-book's turn to be opposed by the literary establishment. Amazon and Hachette — a big US publisher and part of a $10 billion media conglomerate — are in the middle of a business dispute about e-books. We want lower e-book prices. Hachette does not. Many e-books are being released at $14.99 and even $19.99. That is unjustifiably high for an e-book. With an e-book, there's no printing, no over-printing, no need to forecast, no returns, no lost sales due to out of stock, no warehousing costs, no transportation costs, and there is no secondary market — e-books cannot be resold as used books. E-books can and should be less expensive. Perhaps channeling Orwell's decades old suggestion, Hachette has already been caught illegally colluding with its competitors to raise e-book prices. So far those parties have paid $166 million in penalties and restitution. Colluding with its competitors to raise prices wasn't only illegal, it was also highly disrespectful to Hachette's readers. The fact is many established incumbents in the industry have taken the position that lower e-book prices will "devalue books" and hurt "Arts and Letters." They're wrong. Just as paperbacks did not destroy book culture despite being ten times cheaper, neither will e-books. On the contrary, paperbacks ended up rejuvenating the book industry and making it stronger. The same will happen with e-books.
Gary Edwards

Should you buy enterprise applications from a startup? - 0 views

  • The biggest advantage of startups, in Mueller's opinion? "They have no technical historical burden, and they don't care about many technical dependencies. They deliver easy-to-use technology with relatively simple but powerful integration options."
  • "The model we've used to buy on-premises software for 20-plus years is shifting," insists Laping. "There are new ways of selecting and vetting partners."
  • Part of that shift is simple: The business side sees what technology can do, and it's banging on IT's door, demanding ... what? Not new drop-down menus in the same-old ERP application, but rather state-of-the-art, cutting-edge, ain't-that-cool innovation. The landscape is wide open: Innovation can come in the form of new technologies, such as the Internet of Things, or from mobility, the cloud, virtualization -- in fact, from anywhere an enterprise vendor isn't filling a need. The easiest place to find that? Startups.
  • ...5 more annotations...
  • "The number one reason to consider a startup is that the current landscape of Magic Quadrant vendors is not serving a critical need. That's a problem."
  • Ravi Belani is managing partner at Alchemist Accelerator, a Palo Alto, Calif.-based venture-backed initiative focused on accelerating startups whose revenue comes from enterprises rather than consumers. He says, "The innovation that used to come out of big software houses isn't there anymore, while the pace of innovation in technology is accelerating."
  • He acknowledges that there has been a longtime concern with startups about the ability of their applications to scale, but given startups' ability to build their software on robust infrastructure platforms using IaaS or PaaS, and then deploy them via SaaS, "scalability isn't as big a deal as it used to be. It costs $50,000 today to do what you needed $50 million to do ten years ago. That means it takes less capital today to create the same innovation. Ten years ago, that was a moat, a barrier to entry, but software vendors don't own that moat anymore."
  • The confluence of offshore programming, open source technologies and cloud-based infrastructures has significantly lowered the barriers to entry of launching a new venture -- not to mention all those newly minted tech millionaires willing to be angel investors.
  • "In the new paradigm, [most software] implementations are so much shorter, you don't have to think about that risk. You're not talking about three years and $20 million. You're talking about 75 days and $50,000. You implement little modules and get big wins along the way."
  •  
    "The idea of buying an enterprise application from a startup company might sound like anathema to a CIO. But Chris Laping, CIO of restaurant chain Red Robin, based in Greenwood Village, Colo., disagrees. He believes we're in the middle of a significant shift that favors startups -- moving from huge applications with extensive features to task-based activities, inspired by the apps running on mobile devices. Mirco Mueller concurs. He is an IT architect for St. Gallen, Switzerland-based Helvetia Swiss Life Insurance Co., which -- having been founded in 1858 -- is about as far from a startup as possible. He recently chose a SaaS tool from an unnamed startup over what he calls "a much more powerful but much more complex alternative. Its list of features is shorter than the feature list of the big companies, but in terms of agility, flexibility, ease of use and adjustable business model, it beat" all of its competitors. The biggest advantage of startups, in Mueller's opinion? "They have no technical historical burden, and they don't care about many technical dependencies. They deliver easy-to-use technology with relatively simple but powerful integration options." There's certainly no lack of applications available from new players. At a recent conference focusing on innovation, Microsoft Ventures principal Daniel Sumner noted that every month for the last 88 months there's been a $1 billion valuation for one startup or another. That's seven years and counting. But as Silicon Valley skeptics like to point out, those are the ones you hear about. For every successful startup, there are at least three that fail, according to 2012 research by Harvard Business School professor Shikhar Ghosh. So why, then, would CIOs in their right mind take the risk of buying enterprise applications…
Paul Merrell

Obama wants to help make your Internet faster and cheaper. This is his plan. - The Washington Post - 0 views

  • Frustrated over the number of Internet providers that are available to you? If so, you're like many who are limited to just a handful of broadband companies. But now President Obama wants to change that, arguing that choice and competition are lacking in the U.S. broadband market. On Wednesday, Obama will unveil a series of measures aimed at making high-speed Web connections cheaper and more widely available to millions of Americans. The announcement will focus chiefly on efforts by cities to build their own alternatives to major Internet providers such as Comcast, Verizon or AT&T — a public option for Internet access, you could say. He'll write to the Federal Communications Commission urging the agency to help neutralize laws, erected by states, that effectively protect large established Internet providers against the threat represented by cities that want to build and offer their own, municipal Internet service. He'll direct federal agencies to expand grants and loans for these projects and for smaller, rural Internet providers. And he'll draw attention to a new coalition of mayors from 50 cities who've committed to spurring choice in the broadband industry.
  • "When more companies compete for your broadband business, it means lower prices," Jeff Zients, director of Obama's National Economic Council, told reporters Tuesday. "Broadband is no longer a luxury. It's a necessity." The announcement highlights a growing chorus of small and mid-sized cities that say they've been left behind by some of the country's biggest Internet providers. In many of these places, incumbent companies have delayed network upgrades or offer what customers say is unsatisfactory service because it isn't cost-effective to build new infrastructure. Many cities, such as Cedar Falls, Iowa, have responded by building their own, publicly operated competitors. Obama will travel to Cedar Falls on Wednesday to roll out his initiative.
Paul Merrell

WikiLeaks - Secret Trans-Pacific Partnership Agreement (TPP) - Investment Chapter - 0 views

  • WikiLeaks releases today the "Investment Chapter" from the secret negotiations of the TPP (Trans-Pacific Partnership) agreement. The document adds to the previous WikiLeaks publications of the chapters for Intellectual Property Rights (November 2013) and the Environment (January 2014). The TPP Investment Chapter, published today, is dated 20 January 2015. The document is classified and supposed to be kept secret for four years after the entry into force of the TPP agreement or, if no agreement is reached, for four years from the close of the negotiations. Julian Assange, WikiLeaks editor, said: "The TPP has developed in secret an unaccountable supranational court for multinationals to sue states. This system is a challenge to parliamentary and judicial sovereignty. Similar tribunals have already been shown to chill the adoption of sane environmental protection, public health and public transport policies." Current TPP negotiation member states are the United States, Japan, Mexico, Canada, Australia, Malaysia, Chile, Singapore, Peru, Vietnam, New Zealand and Brunei. The TPP is the largest economic treaty in history, including countries that represent more than 40 per cent of the world's GDP.
  • The Investment Chapter highlights the intent of the TPP negotiating parties, led by the United States, to increase the power of global corporations by creating a supra-national court, or tribunal, where foreign firms can "sue" states and obtain taxpayer compensation for "expected future profits". These investor-state dispute settlement (ISDS) tribunals are designed to overrule the national court systems. ISDS tribunals introduce a mechanism by which multinational corporations can force governments to pay compensation if the tribunal states that a country's laws or policies affect the company's claimed future profits. In return, states hope that multinationals will invest more. Similar mechanisms have already been used. For example, US tobacco company Philip Morris used one such tribunal to sue Australia (June 2011 – ongoing) for mandating plain packaging of tobacco products on public health grounds; the oil giant Chevron used another against Ecuador in an attempt to evade a multi-billion-dollar compensation ruling for polluting the environment. The threat of future lawsuits chilled environmental and other legislation in Canada after it was sued by pesticide companies in 2008/9. ISDS tribunals are often held in secret, have no appeal mechanism, do not subordinate themselves to human rights laws or the public interest, and have few means by which other affected parties can make representations. The TPP negotiations have been ongoing in secrecy for five years and are now in their final stages. In the United States the Obama administration plans to "fast-track" the treaty through Congress without the ability of elected officials to discuss or vote on individual measures. This has met growing opposition as a result of increased public scrutiny following WikiLeaks' earlier releases of documents from the negotiations.
  • The TPP is set to be the forerunner to an equally secret agreement between the US and EU, the TTIP (Transatlantic Trade and Investment Partnership). Negotiations for the TTIP were initiated by the Obama administration in January 2013. Combined, the TPP and TTIP will cover more than 60 per cent of global GDP. The third treaty of the same kind, also negotiated in secrecy, is TiSA, on trade in services, including the financial and health sectors. It covers 50 countries, including the US and all EU countries. WikiLeaks released the secret draft text of TiSA's financial annex in June 2014. All these agreements on so-called “free trade” are negotiated outside the World Trade Organization's (WTO) framework. Conspicuously absent from the countries involved in these agreements are the BRICS countries of Brazil, Russia, India and China. Read the Secret Trans-Pacific Partnership Agreement (TPP) - Investment chapter
  •  
    The previously leaked chapter on copyrights makes clear that the TPP would be a disaster for a knowledge society. This chapter makes clear that only corporations may compel arbitration; there is no corresponding right for human beings to do so.
Paul Merrell

ISPs say the "massive cost" of Snooper's Charter will push up UK broadband bills | Ars Technica UK - 0 views

  • How much extra will you have to pay for the privilege of being spied on?
  • UK ISPs have warned MPs that the costs of implementing the Investigatory Powers Bill (aka the Snooper's Charter) will be much greater than the £175 million the UK government has allotted for the task, and that broadband bills will need to rise as a result. Representatives from ISPs and software companies told the House of Commons Science and Technology Committee that the legislation greatly underestimates the "sheer quantity" of data generated by Internet users these days. They also pointed out that distinguishing content from metadata is a far harder task than the government seems to assume. Matthew Hare, the chief executive of ISP Gigaclear, said with "a typical 1 gigabit connection to someone's home, over 50 terabytes of data per year [are] passing over it. If you say that a proportion of that is going to be the communications data—the record of who you communicate with, when you communicate or what you communicate—there would be the most massive and enormous amount of data that in future an access provider would be expected to keep. The indiscriminate collection of mass data across effectively every user of the Internet in this country is going to have a massive cost."
  • Moreover, the larger the cache of stored data, the more worthwhile it will be for criminals and state-backed actors to gain access and download that highly-revealing personal information for fraud and blackmail. John Shaw, the vice president of product management at British security firm Sophos, told the MPs: "There would be a huge amount of very sensitive personal data that could be used by bad guys."
  • ...2 more annotations...
  • The ISPs also challenged the government's breezy assumption that separating the content from the (equally revealing) metadata would be simple, not least because an Internet connection is typically being used for multiple services simultaneously, with data packets mixed together in a completely contingent way. Hare described a typical usage scenario for a teenager on their computer at home, where they are playing a game communicating with their friends using Steam; they are broadcasting the game using Twitch; and they may also be making a voice call at the same time too. "All those applications are running simultaneously," Hare said. "They are different applications using different servers with different services and different protocols. They are all running concurrently on that one machine." Even accessing a Web page is much more complicated than the government seems to believe, Hare pointed out. "As a webpage is loading, you will see that that webpage is made up of tens, or many tens, of individual sessions that have been created across the Internet just to load a single webpage. Bluntly, if you want to find out what someone is doing you need to be tracking all of that data all the time."
  • Hare raised another major issue. "If I was a software business ... I would be very worried that my customers would not buy my software any more if it had anything to do with security at all. I would be worried that a backdoor was built into the software by the [Investigatory Powers] Bill that would allow the UK government to find out what information was on that system at any point they wanted in the future." As Ars reported last week, the ability to demand that backdoors are added to systems, and a legal requirement not to reveal that fact under any circumstances, are two of the most contentious aspects of the new Investigatory Powers Bill. The latest comments from industry experts add to concerns that the latest version of the Snooper's Charter would inflict great harm on civil liberties in the UK, and also make security research well-nigh impossible here. To those fears can now be added undermining the UK software industry, as well as forcing the UK public to pay for the privilege of having their ISP carry out suspicionless surveillance.
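Hare's "over 50 terabytes of data per year" figure for a gigabit home connection is easy to sanity-check: a fully saturated 1 Gbit/s link tops out near 4 petabytes per year, so 50 TB corresponds to a little over 1% average utilisation, which is plausible for a heavy household. A quick back-of-the-envelope in Python (the utilisation figure is our own arithmetic, not from the article):

```python
# Sanity-check Hare's "over 50 terabytes per year" claim for a
# 1 Gbit/s home connection.
SECONDS_PER_YEAR = 365 * 24 * 3600          # 31,536,000 s

link_bps = 1e9                               # 1 gigabit per second
max_bytes_year = link_bps / 8 * SECONDS_PER_YEAR
print(max_bytes_year / 1e15)                 # ≈ 3.94 petabytes if saturated 24/7

claimed = 50e12                              # 50 TB actually transferred
utilisation = claimed / max_bytes_year
print(f"{utilisation:.1%}")                  # ≈ 1.3% average utilisation
```

The "massive cost" argument follows directly: even a small fraction of that traffic, multiplied across every subscriber, dwarfs the government's £175 million allotment.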
Paul Merrell

Announcing STARTTLS Everywhere: Securing Hop-to-Hop Email Delivery | Electronic Frontier Foundation - 0 views

  • Today we’re announcing the launch of STARTTLS Everywhere, EFF’s initiative to improve the security of the email ecosystem. Thanks to previous EFF efforts like Let's Encrypt and Certbot, as well as help from the major web browsers, we've seen significant wins in encrypting the web. Now we want to do for email what we’ve done for web browsing: make it simple and easy for everyone to help ensure their communications aren’t vulnerable to mass surveillance.
  • It’s important to note that STARTTLS Everywhere is designed to be run by mailserver admins, not regular users. No matter your role, you can join in the STARTTLS fun and find out how secure your current email provider is at: https://www.starttls-everywhere.org/ Enter your email domain (the part of your email address after the “@” symbol), and we’ll check if your email provider has configured their server to use STARTTLS, whether or not they use a valid certificate, and whether or not they’re on the STARTTLS Preload List—all different indications of how secure (or vulnerable) your email provider is to mass surveillance.
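A mailserver admin can perform the first of those checks locally with Python's standard-library smtplib. This is a sketch only: the hostname below is a placeholder, a real check would first resolve the domain's MX records, and it covers only STARTTLS advertisement, not the certificate validation or preload-list lookup the STARTTLS Everywhere checker also does:

```python
# Minimal local check of whether a mail server advertises the
# STARTTLS extension in its EHLO response.
import smtplib

def domain_of(address: str) -> str:
    """Return the part of an email address after the '@'."""
    return address.rsplit("@", 1)[-1]

def advertises_starttls(mailhost: str, timeout: float = 10.0) -> bool:
    """Connect on port 25 and report whether EHLO lists STARTTLS."""
    with smtplib.SMTP(mailhost, 25, timeout=timeout) as smtp:
        smtp.ehlo()
        return smtp.has_extn("starttls")

if __name__ == "__main__":
    # Hypothetical mail host; in practice, look up the MX record
    # for domain_of(your_address) first.
    print(advertises_starttls("mail.example.org"))
```

Note that advertising STARTTLS only protects the hop into that server; downgrade attacks and unencrypted onward hops are exactly the gaps the preload list is meant to close.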
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 1 views

  • There was a simple aim at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant.
  • ...17 more annotations...
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • [Image: A map from a classified GCHQ presentation about intercepting communications from undersea cables.]
  • GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying between three and 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
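  • The content/metadata split for web addresses described above can be sketched in a few lines of Python. This is purely an illustrative aside based on the distinction the documents draw — the host alone counts as metadata, while the full address including the path counts as content — not a reconstruction of any actual agency tooling; the function name is hypothetical.

```python
from urllib.parse import urlsplit

def split_visit_record(url: str) -> dict:
    """Hypothetical illustration: classify the parts of a visited URL
    the way the documents describe — host as metadata, full address
    (host plus path) as content."""
    # urlsplit needs a scheme or a leading "//" to recognize the host.
    parts = urlsplit(url if "://" in url else "//" + url)
    return {
        "metadata": parts.netloc,              # e.g. www.gchq.gov.uk
        "content": parts.netloc + parts.path,  # e.g. www.gchq.gov.uk/what_we_do
    }

record = split_visit_record("www.gchq.gov.uk/what_we_do")
```

Under this reading, two people visiting different pages on the same site produce identical "metadata" records, which is part of why metadata-only rules cover more than the label suggests once records are aggregated.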
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explains in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
Paul Merrell

Senate and House Democrats Introduce Resolution to Reinstate Net Neutrality - U.S. Senator Ed Markey of Massachusetts - 0 views

  • On the Net Neutrality National Day of Action, Senate and House Democrats introduced a Congressional Review Act (CRA) resolution to overturn the Federal Communications Commission’s (FCC) partisan decision on net neutrality. At a press conference today, Senators Edward J. Markey (D-Mass.), Congressman Mike Doyle (PA-14), Senate Democratic Leader Chuck Schumer (D-N.Y.), and House Democratic Leader Nancy Pelosi (CA-12) announced introduction of House and Senate resolutions to fully restore the 2015 Open Internet Order. The Senate CRA resolution of disapproval stands at 50 supporters, including Republican Senator Susan Collins (R-Maine). Rep. Doyle’s resolution in the House of Representatives currently has 150 co-sponsors. The FCC’s Open Internet Order prohibited internet service providers from blocking, slowing down, or discriminating against content online. Repealing these net neutrality rules could lead to higher prices for consumers, slower internet traffic, and even blocked websites. A recent poll showed that 83 percent of Americans do not approve of the FCC’s action to repeal net neutrality rules.
  • A copy of the CRA resolution can be found HERE. Last week, the FCC’s rule repealing net neutrality was published in the Federal Register, leaving 60 legislative days to seek a vote on the Senate floor on the CRA resolutions. In order to force a vote on the Senate resolution, Senator Markey will submit a discharge petition, which requires a minimum of 30 Senators’ signatures. Once the discharge petition is filed, Senator Markey and Senate Democrats will demand a vote on the resolution.
Paul Merrell

Evidence of Google blacklisting of left and progressive sites continues to mount - World Socialist Web Site - 0 views

  • A growing number of leading left-wing websites have confirmed that their search traffic from Google has plunged in recent months, adding to evidence that Google, under the cover of a fraudulent campaign against fake news, is implementing a program of systematic and widespread censorship. Truthout, a not-for-profit news website that focuses on political, social, and ecological developments from a left progressive standpoint, had its readership plunge by 35 percent since April. The Real News, a nonprofit video news and documentary service, has had its search traffic fall by 37 percent. Another site, Common Dreams, last week told the WSWS that its search traffic had fallen by up to 50 percent. As extreme as these sudden drops in search traffic are, they do not equal the nearly 70 percent drop in traffic from Google seen by the WSWS. “This is political censorship of the worst sort; it’s just an excuse to suppress political viewpoints,” said Robert Epstein, a former editor in chief of Psychology Today and noted expert on Google. Epstein said that at this point, the question was whether the WSWS had been flagged specifically by human evaluators employed by the search giant, or whether those evaluators had influenced the Google Search engine to demote left-wing sites. “What you don’t know is whether this was the human evaluators who are demoting you, or whether it was the new algorithm they are training,” Epstein said.
  • Richard Stallman, the world-renowned technology pioneer and a leader of the free software movement, said he had read the WSWS’s coverage on Google’s censorship of left-wing sites. He warned about the immense control exercised by Google over the Internet, saying, “For people’s main way of finding articles about a topic to be run by a giant corporation creates an obvious potential for abuse.” According to data from the search optimization tool SEMRush, search traffic to Mr. Stallman’s personal website, Stallman.org, fell by 24 percent, while traffic to gnu.org, operated by the Free Software Foundation, fell 19 percent. Eric Maas, a search engine optimization consultant working in the San Francisco Bay Area, said his team has surveyed a wide range of alternative news sites affected by changes in Google’s algorithms since April. “While the update may be targeting specific site functions, there is evidence that this update is promoting only large mainstream news organizations. What I find problematic with this is that it appears that some sites have been targeted and others have not.” The massive drop in search traffic to the WSWS and other left-wing sites followed the implementation of changes in Google’s search evaluation protocols. In a statement issued on April 25, Ben Gomes, the company’s vice president for engineering, stated that Google’s update of its search engine would block access to “offensive” sites, while working to surface more “authoritative content.” In a set of guidelines issued to Google evaluators in March, the company instructed its search evaluators to flag pages returning “conspiracy theories” or “upsetting” content unless “the query clearly indicates the user is seeking an alternative viewpoint.”
Paul Merrell

Google Censors Block Access to CounterPunch and Other Progressive Sites - 0 views

  • Now Google, at the behest of its friends in Washington, is actively censoring – essentially blocking access to – any websites which seek to warn American workers of the ongoing effort to further attack their incomes, social services, and life conditions by the U.S. central government, and which seek to warn against the impending warfare between U.S.-led Nato and other forces against countries like Iran, Russia, and China, which have in no way threatened the U.S. state or its people
  • Under its new so-called anti-fake-news program, Google algorithms have in the past few months moved socialist, anti-war, and progressive websites from previously prominent positions in Google searches to positions up to 50 search result pages from the first page, essentially removing them from the search results any searcher will see. CounterPunch, World Socialist Website, Democracy Now, the American Civil Liberties Union, and WikiLeaks are just a few of the websites which have experienced severe reductions in their returns from Google searches. World Socialist Website, to cite just one example, has experienced a 67% drop in its returns from Google since the new policy was announced. This conversion of Google into a censorship engine is not a trivial development. Google searches are currently a primary means by which workers and other members of the public seek information about their lives and their world. Every effort must be made to combat this serious infringement on the basic rights of freedom of speech and freedom of press.
Paul Merrell

It's Time to Nationalize the Internet - 0 views

  • Such profiteering tactics have disproportionately affected low-income and rural communities. ISPs have long redlined these demographic groups, creating what’s commonly known as the “digital divide.” Thirty-nine percent of Americans lack access to service fast enough to meet the federal definition of broadband. More than 50 percent of adults with household incomes below $30,000 have home broadband—a problem plaguing users of color most acutely. In contrast, internet access is near-universal for households with an annual income of $100,000 or more. The reason for such chasms is simple: Private network providers prioritize only those they expect to provide a return on investment, thus excluding poor and sparsely populated areas.
  • Chattanooga, Tennessee, has seen more success in addressing redlining. Since 2010, the city has offered public broadband via its municipal power organization, Electric Power Board (EPB). The project has become a rousing success: At half the price, its service is approximately 85 percent faster than that of Comcast, the region’s primary ISP prior to EPB’s inception. Coupled with a discounted program for low-income residents, Chattanooga’s publicly run broadband reaches about 82,000 residents—more than half of the area’s Internet users—and is only expected to grow. Chattanooga’s achievements have radiated to other locales. More than 450 communities have introduced publicly owned broadband. And more than 110 communities in 24 states have access to publicly owned networks with one gigabit-per-second (Gbps) service. (AT&T, for example, has yet to introduce speeds this high.) Seattle City Councilmember Kshama Sawant proposed a pilot project in 2015 and has recently urged her city to invest in municipal broadband. Hawaii congressperson Kaniela Ing is drafting a bill for publicly owned Internet for the state legislature to consider next year. In November, residents of Fort Collins, Colo., voted to authorize the city to build municipal broadband infrastructure.
Paul Merrell

U.S. looking at ways to hold Zuckerberg accountable for Facebook's problems - 0 views

  • Federal regulators are discussing whether and how to hold Facebook Chief Executive Mark Zuckerberg personally accountable for the company's history of mismanaging users' private data, two sources familiar with the discussions told NBC News on Thursday. The sources wouldn't elaborate on what measures are specifically under consideration. The Washington Post, which first reported the development, reported that regulators were exploring increased oversight of Zuckerberg's leadership. While Facebook has come under scrutiny for its privacy practices for years, both of the Democratic members of the FTC have said the agency should target individual executives when appropriate. Justin Brookman, a former policy director for technology research at the Federal Trade Commission, or FTC, said Thursday night that while the FTC can name individual company leaders if they directed, controlled and knew about any wrongdoing, "they typically only use that authority in fraud-like cases, so far as I can tell."
Paul Merrell

Federal Trade Commission calls for breakup of Facebook - 0 views

  • The Federal Trade Commission sued to break up Facebook on Wednesday, asking a federal court to force the sell-off of assets such as Instagram and WhatsApp as independent businesses. “Facebook has maintained its monopoly position by buying up companies that present competitive threats and by imposing restrictive policies that unjustifiably hinder actual or potential rivals that Facebook does not or cannot acquire,” the commission said in the lawsuit filed in federal court in Washington, D.C. The lawsuit asks the court to order the “divestiture of assets, divestiture or reconstruction of businesses (including, but not limited to, Instagram and/or WhatsApp),” as well as other possible relief the court might want to add.
  • Attorneys general from 48 states and territories said they were filing their own lawsuit against Facebook, reflecting the broad and bipartisan concern about how much power Facebook and its CEO, Mark Zuckerberg, have accumulated on the internet.