Future of the Web / Group items tagged "considered"

Gary Edwards

Is the W3C to Blame for the Breaking of the Web? | Continuing Intermittent In... - 0 views

  • Consider the recent CSS features added by WebKit: transformations, animations, gradients, masks, et cetera. They’ve very nearly _run out_ of standards to implement, so they’re starting to implement the wouldn’t-it-be-cool-if stuff. If I’m not mistaken, this is the exact sort of thing you’re wishing for.
  • Changing the renderer (which is what we’re talking about when we talk about upgrading “the web”) goes hand-in-hand today with upgrading the *rest* of the browser as well, which requires the user to care…and users (to a one) don’t give a flying leap about CSS 2.1 support.
    • Gary Edwards
       
      Note to marbux: the browser is the layout/rendering engine for web applications and services. Nothing happens on the web unless and until the browser, or a browser RIA alternative, implements a compliant end user interface. Focus on the browser layout engines, and Web applications will follow.
  •  
    Another article taking up the issue of "Blame the W3C" for what increasingly looks like a proprietary Web future. The author is an Ajax-DOJO supporter, and he tries to defend the W3C by saying it's not their job, they don't have the "power" or the "authority" to push the Web forward. About the best they can do is, at the end of the day, try to corral big vendors into agreement. Meanwhile, the Web has become the wild wild west with browser vendors innovating into their corporate web stacks where vast profits and future monopolies rest. For me, WebKit represents the best effort insisting that the Web remain Open. It's OSS with excellent big vendor support. And they are pushing the envelope. Finally!
Gary Edwards

Siding with HTML over XHTML, My Decision to Switch - Monday By Noon - 0 views

  • Publishing content on the Web is in no way limited to professional developers or designers, much of the reason the net is so active is because anyone can make a website. Sure, we (as knowledgeable professionals or hobbyists) all hope to make the Web a better place by doing our part in publishing documents with semantically rich, valid markup, but the reality is that those documents are rare. It’s important to keep in mind the true nature of the Internet; an open platform for information sharing.
  • XHTML2 has some very good ideas that I hope can become part of the web. However, it’s unrealistic to think that all web authors will switch to an XML-based syntax which demands that browsers stop processing the document on the first error. XML’s draconian policy was an attempt to clean up the web. This was done around 1996 when lots of invalid content entered the web. CSS took a different approach: instead of demanding that content isn’t processed, we defined rules for how to handle the undefined. It’s called “forward-compatible parsing” and means we can add new constructs without breaking the old. So, I don’t think XHTML is a realistic option for the masses. HTML 5 is it.
    • Gary Edwards
       
      Great quote from CSS expert Håkon Wium Lie. (See the short CSS sketch at the end of this entry for what his “forward-compatible parsing” looks like in practice.)
  • @marbux: Of course i disagree with your interop assessment, but I wondered how it is that you’re missing the point. I think you confuse web applications with the legacy desktop client/server application model. And that confusion leads to the mistake of trying to transfer the desktop document model onto one that must adequately service advancing web applications.
  •  
    A CMS expert argues for HTML over XHTML, explaining his reasons for switching. Excellent read! He nails the basics. For similar reasons, we moved from ODF to ePUB, then to CDF, and finally to the advanced WebKit document model, where wikiWORD will make its stand.
  •  
    See also my comment on the same web page explaining why HTML 5 is NOT it for document exchange between web editing applications.
  •  
    Response to marbux supporting the WebKit layout/document model. Marbux argues that HTML5 is not interoperable and CSS2 near useless; HTML5 fails to provide the interop web applications need. I respond by arguing that the only way to look at web applications is to consider that the browser layout engine is the web application layout engine! Web applications are actually written to the browser layout/document model, OR to take advantage of browser plug-in capabilities. The interoperability marbux seeks is tied directly to the browser layout engine. In this context, the web format is simply a reflection of that layout engine. If there's an interop problem, it comes from differentials between browser engines. The good news is that there are all kinds of efforts to close the browser gap, including WHATWG HTML5, CSS3, the W3C DOM, JavaScript libraries, Google GWT (Java to JavaScript), Yahoo's YUI, and my favorite, WebKit. The bad news is that the clock is ticking. Microsoft has pulled the trigger, and the great migration of MSOffice client/server systems to the MS WebStack-Mesh architecture has begun. Key to this transition are the WPF-.NET proprietary formats, protocols and interfaces such as XAML, Silverlight, LINQ, and Smart Tags. New business processes are being written, and old legacy desktop-bound processes are being transitioned to this emerging platform. The fight for the Open Web is on, with Microsoft threatening to transition their entire business desktop monopoly to a Web platform they own. ~ge~
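  •  
    A minimal sketch of the “forward-compatible parsing” Lie describes above, assuming nothing beyond standard CSS error handling: a browser must drop any declaration it does not recognize and keep going, rather than halt on the first error the way an XML parser must, so new properties can ship without breaking old engines.

      h2 {
        color: #336699;                   /* understood everywhere */
        border-radius: 4px;               /* unknown to older engines: silently ignored */
        -webkit-transform: rotate(2deg);  /* vendor-prefixed: ignored outside WebKit */
      }

    An older browser renders the heading in blue with square corners and no rotation; the rest of the stylesheet is unaffected.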
Gary Edwards

Wolfram Alpha is Coming -- and It Could be as Important as Google | Twine - 0 views

  • The first question was could (or even should) Wolfram Alpha be built using the Semantic Web in some manner, rather than (or as well as) the Mathematica engine it is currently built on. Is anything missed by not building it with the Semantic Web's languages (RDF, OWL, SPARQL, etc.)? The answer is that there is no reason that one MUST use the Semantic Web stack to build something like Wolfram Alpha. In fact, in my opinion it would be far too difficult to try to explicitly represent everything Wolfram Alpha knows and can compute using OWL ontologies. It is too wide a range of human knowledge and giant OWL ontologies are just too difficult to build and curate.
  • However, for the internal knowledge representation and reasoning that takes place in the system, it appears Wolfram has found a pragmatic and efficient representation of his own, and I don't think he needs the Semantic Web at that level. It seems to be doing just fine without it. Wolfram Alpha is built on hand-curated knowledge and expertise. Wolfram and his team have somehow figured out a way to make that practical where all others who have tried this have failed to achieve their goals. The task is gargantuan -- there is just so much diverse knowledge in the world. Representing even a small segment of it formally turns out to be extremely difficult and time-consuming.
  • It has generally not been considered feasible for any one group to hand-curate all knowledge about every subject. This is why the Semantic Web was invented -- by enabling everyone to curate their own knowledge about their own documents and topics in parallel, in principle at least, more knowledge could be represented and shared in less time by more people -- in an interoperable manner. At least that is the vision of the Semantic Web.
  • ...1 more annotation...
  • Where Google is a system for FINDING things that we as a civilization collectively publish, Wolfram Alpha is for ANSWERING questions about what we as a civilization collectively know. It's the next step in the distribution of knowledge and intelligence around the world -- a new leap in the intelligence of our collective "Global Brain." And like any big next-step, Wolfram Alpha works in a new way -- it computes answers instead of just looking them up.
  •  
    A Computational Knowledge Engine for the Web: In a nutshell, Wolfram and his team have built what he calls a "computational knowledge engine" for the Web. OK, so what does that really mean? Basically it means that you can ask it factual questions and it computes answers for you. It doesn't simply return documents that (might) contain the answers, like Google does, and it isn't just a giant database of knowledge, like Wikipedia. It doesn't simply parse natural language and then use that to retrieve documents, like Powerset, for example. Instead, Wolfram Alpha actually computes the answers to a wide range of questions -- like questions that have factual answers such as "What country is Timbuktu in?" or "How many protons are in a hydrogen atom?" or "What is the average rainfall in Seattle this month?", "What is the 300th digit of Pi?", "Where is the ISS?" or "When was GOOG worth more than $300?" Think about that for a minute. It computes the answers. Wolfram Alpha doesn't simply contain huge amounts of manually entered pairs of questions and answers, nor does it search for answers in a database of facts. Instead, it understands and then computes answers to certain kinds of questions.
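  •  
    To make concrete what explicitly representing such knowledge in Semantic Web terms would involve, here is a hedged sketch of a single hand-curated fact -- the article's own example, "What country is Timbuktu in?" -- written as RDF in Turtle syntax. The example.org vocabulary URIs are illustrative placeholders, not a real ontology:

      @prefix ex:  <http://example.org/ontology#> .
      @prefix geo: <http://example.org/places#> .

      geo:Timbuktu  a  ex:City ;
          ex:locatedInCountry  geo:Mali .

      # With the same prefixes declared, a SPARQL query answering the question:
      # SELECT ?country WHERE { geo:Timbuktu ex:locatedInCountry ?country }

    Multiply this by every fact an engine should answer and the curation burden described above becomes obvious -- which is what makes Wolfram's computed approach notable.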
Paul Merrell

HTML presentation markup deprecated - 0 views

  • Prior to CSS, nearly all of the presentational attributes of HTML documents were contained within the HTML markup; all font colors, background styles, element alignments, borders and sizes had to be explicitly described, often repeatedly, within the HTML. CSS allows authors to move much of that information to a separate stylesheet, resulting in considerably simpler HTML markup.

    Headings (h1 elements), sub-headings (h2), sub-sub-headings (h3), etc., are defined structurally using HTML. In print and on the screen, choice of font, size, color and emphasis for these elements is presentational. Prior to CSS, document authors who wanted to assign such typographic characteristics to, say, all h2 headings had to use the HTML font and other presentational elements for each occurrence of that heading type. The additional presentational markup in the HTML made documents more complex and generally more difficult to maintain.

    In CSS, presentation is separated from structure. In print, CSS can define color, font, text alignment, size, borders, spacing, layout and many other typographic characteristics. It can do so independently for on-screen and printed views. CSS also defines non-visual styles, such as the speed and emphasis with which text is read out by aural text readers. The W3C now considers the advantages of CSS for defining all aspects of the presentation of HTML pages to be superior to other methods. It has therefore deprecated the use of all the original presentational HTML markup.
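  •  
    A before-and-after sketch of the separation described above; the markup is standard legacy HTML and CSS, with fonts and colors chosen arbitrarily:

      <!-- Pre-CSS: presentation repeated inside every heading -->
      <h2><font face="Georgia" color="#663300">Chapter One</font></h2>
      <h2><font face="Georgia" color="#663300">Chapter Two</font></h2>

      <!-- With CSS: structure stays in the HTML, presentation is defined once -->
      <style>
        h2 { font-family: Georgia, serif; color: #663300; }
        @media print { h2 { color: black; } }  /* print styled independently of screen */
      </style>
      <h2>Chapter One</h2>
      <h2>Chapter Two</h2>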
Paul Merrell

MICROSOFT CORP (Form: 10-Q, Received: 01/22/2009 09:02:43) - 0 views

  • In January 2008 the Commission opened a competition law investigation related to the inclusion of various capabilities in our Windows operating system software, including Web browsing software. The investigation was precipitated by a complaint filed with the Commission by Opera Software ASA, a firm that offers Web browsing software. On January 15, 2009, the European Commission issued a statement of objections expressing the Commission’s preliminary view that the inclusion of Internet Explorer in Windows since 1996 has violated European competition law. According to the statement of objections, other browsers are foreclosed from competing because Windows includes Internet Explorer. We will have an opportunity to respond in writing to the statement of objections within about two months. We may also request a hearing, which would take place after the submission of this response. Under European Union procedure, the European Commission will not make a final determination until after it receives and assesses our response and conducts the hearing, should we request one. The statement of objections seeks to impose a remedy that is different than the remedy imposed in the earlier proceeding concerning Windows Media Player.
  • While computer users and OEMs are already free to run any Web browsing software on Windows, the Commission is considering ordering Microsoft and OEMs to obligate users to choose a particular browser when setting up a new PC. Such a remedy might include a requirement that OEMs distribute multiple browsers on new Windows-based PCs. We may also be required to disable certain unspecified Internet Explorer software code if a user chooses a competing browser. The statement of objections also seeks to impose a significant fine based on sales of Windows operating systems in the European Union. In January 2008, the Commission opened an additional competition law investigation that relates primarily to interoperability with respect to our Microsoft Office family of products. This investigation resulted from complaints filed with the Commission by a trade association of Microsoft’s competitors.
Paul Merrell

Dare Obasanjo aka Carnage4Life - Not Turtles, AtomPub All the Way Down - 0 views

  • I don't think the Atom publishing protocol can be considered the universal protocol for talking to remote databases given that cloud storage vendors like Amazon and database vendors like Oracle don't support it yet. That said, this is definitely a positive trend. Back in the RSS vs. Atom days I used to get frustrated that people were spending so much time reinventing the wheel with an RSS clone when the real gaping hole in the infrastructure was a standard editing protocol. It took a little longer than I expected (Sam Ruby started talking about it in 2003) but the effort has succeeded way beyond my wildest dreams. All I wanted was a standard editing protocol for blogs and content management systems and we've gotten so much more.
  • Microsoft is using AtomPub as the interface to a wide breadth of services and products, as George Moore points out in his post "A Unified Standards-Based Protocols and Tooling Platform for Storage from Microsoft."
  • And a few weeks after George's post even more was revealed in posts such as this one about FeedSync and Live Mesh, where we find out: Congratulations to the Live Mesh team, who announced their Live Mesh Technology Preview release earlier this evening! Amit Mital gives a detailed overview in this post on http://dev.live.com. You can read all about it in the usual places...so why do I mention it here? FeedSync is one of the core parts of the Live Mesh platform. One of the key values of Live Mesh is that your data flows to all of your devices. And rather than being hidden away in a single service, any properly authenticated user has full bidirectional sync capability. As I discussed in the Introduction to FeedSync, this really makes "your stuff yours". Okay, FeedSync isn't really AtomPub but it does use the Atom syndication format so I count that as a win for Atom+APP as well. As time goes on, I hope we'll see even more products and services that support Atom and AtomPub from Microsoft. Standardization at the protocol layer means we can move innovation up the stack.
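  •  
    A sketch of what that standard editing protocol looks like on the wire: under AtomPub (RFC 5023), creating an entry is a plain HTTP POST of an Atom document to a collection URI. The host and path below are hypothetical; the media type and the Created/Location response behavior come from the spec:

      POST /blog/entries HTTP/1.1
      Host: example.org
      Content-Type: application/atom+xml;type=entry

      <?xml version="1.0"?>
      <entry xmlns="http://www.w3.org/2005/Atom">
        <title>Not Turtles, AtomPub All the Way Down</title>
        <author><name>Example Author</name></author>
        <content type="text">Standardize the protocol; innovate up the stack.</content>
      </entry>

    The server answers "201 Created" with a Location header naming the new member resource, which the client can later GET, PUT or DELETE -- the same verbs whether the back end is a blog engine, Live Mesh, or a cloud store.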
Paul Merrell

Bankrolled by broadband donors, lawmakers lobby FCC on net neutrality | Ars Technica - 1 views

  • The 28 House members who lobbied the Federal Communications Commission to drop net neutrality this week have received more than twice as much in campaign contributions from the broadband sector as the average for all House members. These lawmakers, including the top House leadership, warned the FCC that regulating broadband like a public utility "harms" providers, would be "fatal to the Internet," and could "limit economic freedom." According to research provided Friday by Maplight, the 28 House members received, on average, $26,832 from the "cable & satellite TV production & distribution" sector over a two-year period ending in December. According to the data, that's 2.3 times more than the House average of $11,651. What's more, one of the lawmakers who told the FCC that he had "grave concern" (PDF) about the proposed regulation took more money from that sector than any other member of the House. Rep. Greg Walden (R-OR) was the top sector recipient, netting more than $109,000 over the two-year period, the Maplight data shows.
  • Dan Newman, cofounder and president of Maplight, the California research group that reveals money in politics, said the figures show that "it's hard to take seriously politicians' claims that they are acting in the public interest when their campaigns are funded by companies seeking huge financial benefits for themselves." Signing a letter to the FCC along with Walden, who chairs the House Committee on Energy and Commerce, were three other key members of the same committee: Reps. Fred Upton (R-MI), Robert Latta (R-OH), and Marsha Blackburn (R-TN). Over the two-year period, Upton took in $65,000, Latta took $51,000, and Blackburn took $32,500. In a letter (PDF) those representatives sent to the FCC two days before Thursday's raucous FCC net neutrality hearing, the four wrote that they had "grave concern" over the FCC's consideration of "reclassifying Internet broadband service as an old-fashioned 'Title II common carrier service.'" The letter added that a switchover "harms broadband providers, the American economy, and ultimately broadband consumers, actually doing so would be fatal to the Internet as we know it."
  • Not every one of the 28 members who publicly lobbied the FCC against net neutrality in advance of Thursday's FCC public hearing received campaign financing from the industry. One representative took no money: Rep. Nick Rahall (D-WV). In all, the FCC received at least three letters from House lawmakers with 28 signatures urging caution on classifying broadband as a telecommunications service, which would open up the sector to stricter "common carrier" rules, according to letters the members made publicly available. The US has long applied common carrier status to the telephone network, providing justification for universal service obligations that guarantee affordable phone service to all Americans and other rules that promote competition and consumer choice. Some consumer advocates say that common carrier status is needed for the FCC to impose strong network neutrality rules that would force ISPs to treat all traffic equally, not degrading competing services or speeding up Web services in exchange for payment. ISPs have argued that common carrier rules would saddle them with too much regulation and would force them to spend less on network upgrades and be less innovative.
  • ...2 more annotations...
  • Of the 28 House members signing on to the three letters, Republicans received, on average, $59,812 from the industry over the two-year period compared to $13,640 for Democrats, according to the Maplight data. Another letter (PDF) sent to the FCC this week from four top members of the House, including Speaker John Boehner (R-OH), Majority Leader Eric Cantor (R-VA), Majority Whip Kevin McCarthy (R-CA), and Republican Conference Chair Cathy McMorris Rodgers (R-WA), argued in favor of cable companies: "We are writing to respectfully urge you to halt your consideration of any plan to impose antiquated regulation on the Internet, and to warn that implementation of such a plan will needlessly inhibit the creation of American private sector jobs, limit economic freedom and innovation, and threaten to derail one of our economy's most vibrant sectors," they wrote. Over the two-year period, Boehner received $75,450; Cantor got $80,800; McCarthy got $33,000; and McMorris Rodgers got $31,500.
  • The third letter (PDF) forwarded to the FCC this week was signed by 20 House members. "We respectfully urge you to consider the effect that regressing to a Title II approach might have on private companies' ability to attract capital and their continued incentives to invest and innovate, as well as the potentially negative impact on job creation that might result from any reduction in funding or investment," the letter said. Here are the 28 lawmakers who lobbied the FCC this week and their reported campaign contributions:
Paul Merrell

UN Report Finds Mass Surveillance Violates International Treaties and Privacy Rights - ... - 0 views

  • The United Nations’ top official for counter-terrorism and human rights (known as the “Special Rapporteur”) issued a formal report to the U.N. General Assembly today that condemns mass electronic surveillance as a clear violation of core privacy rights guaranteed by multiple treaties and conventions. “The hard truth is that the use of mass surveillance technology effectively does away with the right to privacy of communications on the Internet altogether,” the report concluded. Central to the Rapporteur’s findings is the distinction between “targeted surveillance” — which “depend[s] upon the existence of prior suspicion of the targeted individual or organization” — and “mass surveillance,” whereby “states with high levels of Internet penetration can [] gain access to the telephone and e-mail content of an effectively unlimited number of users and maintain an overview of Internet activity associated with particular websites.” In a system of “mass surveillance,” the report explained, “all of this is possible without any prior suspicion related to a specific individual or organization. The communications of literally every Internet user are potentially open for inspection by intelligence and law enforcement agencies in the States concerned.”
  • Mass surveillance thus “amounts to a systematic interference with the right to respect for the privacy of communications,” it declared. As a result, “it is incompatible with existing concepts of privacy for States to collect all communications or metadata all the time indiscriminately.” In concluding that mass surveillance impinges core privacy rights, the report was primarily focused on the International Covenant on Civil and Political Rights, a treaty enacted by the General Assembly in 1966, to which all of the members of the “Five Eyes” alliance are signatories. The U.S. ratified the treaty in 1992, albeit with various reservations that allowed for the continuation of the death penalty and which rendered its domestic law supreme. With the exception of the U.S.’s Persian Gulf allies (Saudi Arabia, UAE and Qatar), virtually every major country has signed the treaty. Article 17 of the Covenant guarantees the right of privacy, the defining protection of which, the report explained, is “that individuals have the right to share information and ideas with one another without interference by the State, secure in the knowledge that their communication will reach and be read by the intended recipients alone.”
  • The report’s key conclusion is that this core right is impinged by mass surveillance programs: “Bulk access technology is indiscriminately corrosive of online privacy and impinges on the very essence of the right guaranteed by article 17. In the absence of a formal derogation from States’ obligations under the Covenant, these programs pose a direct and ongoing challenge to an established norm of international law.” The report recognized that protecting citizens from terrorism attacks is a vital duty of every state, and that the right of privacy is not absolute, as it can be compromised when doing so is “necessary” to serve “compelling” purposes. It noted: “There may be a compelling counter-terrorism justification for the radical re-evaluation of Internet privacy rights that these practices necessitate. ” But the report was adamant that no such justifications have ever been demonstrated by any member state using mass surveillance: “The States engaging in mass surveillance have so far failed to provide a detailed and evidence-based public justification for its necessity, and almost no States have enacted explicit domestic legislation to authorize its use.”
  • ...5 more annotations...
  • Instead, explained the Rapporteur, states have relied on vague claims whose validity cannot be assessed because of the secrecy behind which these programs are hidden: “The arguments in favor of a complete abrogation of the right to privacy on the Internet have not been made publicly by the States concerned or subjected to informed scrutiny and debate.” About the ongoing secrecy surrounding the programs, the report explained that “states deploying this technology retain a monopoly of information about its impact,” which is “a form of conceptual censorship … that precludes informed debate.” A June report from the High Commissioner for Human Rights similarly noted “the disturbing lack of governmental transparency associated with surveillance policies, laws and practices, which hinders any effort to assess their coherence with international human rights law and to ensure accountability.” The rejection of the “terrorism” justification for mass surveillance as devoid of evidence echoes virtually every other formal investigation into these programs. A federal judge last December found that the U.S. Government was unable to “cite a single case in which analysis of the NSA’s bulk metadata collection actually stopped an imminent terrorist attack.” Later that month, President Obama’s own Review Group on Intelligence and Communications Technologies concluded that mass surveillance “was not essential to preventing attacks” and information used to detect plots “could readily have been obtained in a timely manner using conventional [court] orders.”
  • That principle — that the right of internet privacy belongs to all individuals, not just Americans — was invoked by NSA whistleblower Edward Snowden when he explained in a June, 2013 interview at The Guardian why he disclosed documents showing global surveillance rather than just the surveillance of Americans: “More fundamentally, the ‘US Persons’ protection in general is a distraction from the power and danger of this system. Suspicionless surveillance does not become okay simply because it’s only victimizing 95% of the world instead of 100%.” The U.N. Rapporteur was clear that these systematic privacy violations are the result of a union between governments and tech corporations: “States increasingly rely on the private sector to facilitate digital surveillance. This is not confined to the enactment of mandatory data retention legislation. Corporates [sic] have also been directly complicit in operationalizing bulk access technology through the design of communications infrastructure that facilitates mass surveillance. ”
  • The report was most scathing in its rejection of a key argument often made by American defenders of the NSA: that mass surveillance is justified because Americans are given special protections (the requirement of a FISA court order for targeted surveillance) which non-Americans (95% of the world) do not enjoy. Not only does this scheme fail to render mass surveillance legal, but it itself constitutes a separate violation of international treaties (emphasis added): The Special Rapporteur concurs with the High Commissioner for Human Rights that where States penetrate infrastructure located outside their territorial jurisdiction, they remain bound by their obligations under the Covenant. Moreover, article 26 of the Covenant prohibits discrimination on grounds of, inter alia, nationality and citizenship. The Special Rapporteur thus considers that States are legally obliged to afford the same privacy protection for nationals and non-nationals and for those within and outside their jurisdiction. Asymmetrical privacy protection regimes are a clear violation of the requirements of the Covenant.
  • Three Democratic Senators on the Senate Intelligence Committee wrote in The New York Times that “the usefulness of the bulk collection program has been greatly exaggerated” and “we have yet to see any proof that it provides real, unique value in protecting national security.” A study by the centrist New America Foundation found that mass metadata collection “has had no discernible impact on preventing acts of terrorism” and, where plots were disrupted, “traditional law enforcement and investigative methods provided the tip or evidence to initiate the case.” It labeled the NSA’s claims to the contrary as “overblown and even misleading.” While worthless in counter-terrorism policies, the UN report warned that allowing mass surveillance to persist with no transparency creates “an ever present danger of ‘purpose creep,’ by which measures justified on counter-terrorism grounds are made available for use by public authorities for much less weighty public interest purposes.” Citing the UK as one example, the report warned that, already, “a wide range of public bodies have access to communications data, for a wide variety of purposes, often without judicial authorization or meaningful independent oversight.”
  • The latest finding adds to the growing number of international formal rulings that the mass surveillance programs of the U.S. and its partners are illegal. In January, the European parliament’s civil liberties committee condemned such programs in “the strongest possible terms.” In April, the European Court of Justice ruled that European legislation on data retention contravened EU privacy rights. A top secret memo from the GCHQ, published last year by The Guardian, explicitly stated that one key reason for concealing these programs was fear of a “damaging public debate” and specifically “legal challenges against the current regime.” The report ended with a call for far greater transparency along with new protections for privacy in the digital age. Continuation of the status quo, it warned, imposes “a risk that systematic interference with the security of digital communications will continue to proliferate without any serious consideration being given to the implications of the wholesale abandonment of the right to online privacy.” The urgency of these reforms is underscored, explained the Rapporteur, by a conclusion of the United States Privacy and Civil Liberties Oversight Board that “permitting the government to routinely collect the calling records of the entire nation fundamentally shifts the balance of power between the state and its citizens.”
Paul Merrell

Utah lawmaker questions city water going to NSA - 0 views

  • SALT LAKE CITY – A Utah lawmaker concerned about government spying on its citizens is questioning whether city water service should be cut off to a massive National Security Agency data storage facility outside Salt Lake City. Republican Rep. Marc Roberts, of Santaquin, said there are serious questions about privacy and surveillance surrounding the center, and several Utah residents who spoke at a legislative committee hearing Wednesday agreed. During the last legislative session, lawmakers opted to hold off on Roberts' bill to shut off the facility's water and decided to study it during the interim. "This is not a bill just about a data center. This is a bill about civil rights," web developer Joe Levi said. "This is a bill that needs to be taken up and needs to be taken seriously." Pete Ashdown, founder of Salt Lake City-based Internet provider XMission, called the center a stain upon the state and its technology industry. "I do encourage you to stand up and do something about it," he said. Lawmakers said they aren't considering shutting down the $1.7 billion facility, but the committee chair acknowledged the concerns and said there might be another way to get the point across. "We may look at some type of a strong message to give our representatives to take back to Congress," said Republican Sen. David Hinkins, of Orangeville.
  • The NSA's largest data storage center in the U.S. was built in Utah over 37 other locations because of open land and cheap electricity. The center sits on a National Guard base about 25 miles south of Salt Lake City in the town of Bluffdale. NSA officials said the center is key to protecting national security networks and allowing U.S. authorities to watch for cyber threats. Beyond that, the agency has offered few details. The center attracted much discussion and concern after revelations last year that the NSA has been collecting millions of U.S. phone records and digital communications stored by major Internet providers.
  • Cybersecurity experts say the nondescript Utah facility is a giant storehouse for phone calls, emails and online records that have been secretly collected. Outside the computer storehouses are large coolers that keep the machines from overheating. The coolers use large amounts of water, which the nearby city of Bluffdale sells to the center at a discounted rate. City records released earlier this year showed monthly water use was much less than the 1 million gallons a day that the U.S. Army Corps of Engineers predicted the center would need, causing some to wonder if the center was fully operational. NSA officials have refused to say if the center is up and running after its scheduled opening in October 2013 was stalled by electrical problems. City utility records showed the NSA has been making monthly minimum payments of about $30,000 to Bluffdale. The city manager said that pays for more water than the center used. The state of Nevada shut off water to the site of the proposed Yucca Mountain nuclear waste dump 90 miles northwest of Las Vegas in 2002, after months of threats. The project didn't run dry because the Energy Department built a 1-million-gallon tank and a small well for the site. Department officials said the stored water, plus 400,000 gallons stored in other tanks at the Nevada Test Site, provided time for scientists to continue experiments and design work at the site.
  •  
    Hey, go for their electricity too! But what do we do with the Bluffdale facility after we abolish the NSA? Turn it over to Internet Archives, with a $1 billion endowment for maintenance? Free and permanent web sites for everyone?  
Paul Merrell

LEAKED: Secret Negotiations to Let Big Brother Go Global | Wolf Street - 0 views

  • Much has been written, at least in the alternative media, about the Trans Pacific Partnership (TPP) and the Transatlantic Trade and Investment Partnership (TTIP), two multilateral trade treaties being negotiated between the representatives of dozens of national governments and armies of corporate lawyers and lobbyists (on which you can read more here, here and here). However, much less is known about the decidedly more secretive Trade in Services Agreement (TiSA), which involves more countries than either of the other two. At least until now, that is. Thanks to a leaked document jointly published by the Associated Whistleblowing Press and Filtrala, the potential ramifications of the treaty being hashed out behind hermetically sealed doors in Geneva are finally seeping out into the public arena.
  • If signed, the treaty would affect all services ranging from electronic transactions and data flow, to veterinary and architecture services. It would almost certainly open the floodgates to the final wave of privatization of public services, including the provision of healthcare, education and water. Meanwhile, already privatized companies would be prevented from re-transfer to the public sector by a so-called “ratchet clause” – even if the privatization failed. More worrisome still, the proposal stipulates that no participating state can stop the use, storage and exchange of personal data relating to their territorial base. Here’s more from Rosa Pavanelli, general secretary of Public Services International (PSI):
  • The leaked documents confirm our worst fears that TiSA is being used to further the interests of some of the largest corporations on earth (…) Negotiation of unrestricted data movement, internet neutrality and how electronic signatures can be used strike at the heart of individuals’ rights. Governments must come clean about what they are negotiating in these secret trade deals. Fat chance of that, especially in light of the fact that the text is designed to be almost impossible to repeal, and is to be “considered confidential” for five years after being signed. What that effectively means is that the U.S. approach to data protection (read: virtually non-existent) could very soon become the norm across 50 countries spanning the breadth and depth of the industrial world.
  • ...1 more annotation...
  • The main players in the top-secret negotiations are the United States and all 28 members of the European Union. However, the broad scope of the treaty also includes Australia, Canada, Chile, Colombia, Costa Rica, Hong Kong, Iceland, Israel, Japan, Liechtenstein, Mexico, New Zealand, Norway, Pakistan, Panama, Paraguay, Peru, South Korea, Switzerland, Taiwan and Turkey. Combined they represent almost 70 percent of all trade in services worldwide. An explicit goal of the TiSA negotiations is to overcome the exceptions in GATS that protect certain non-tariff trade barriers, such as data protection. For example, the draft Financial Services Annex of TiSA, published by Wikileaks in June 2014, would allow financial institutions, such as banks, the free transfer of data, including personal data, from one country to another. As Ralf Bendrath, a senior policy advisor to the MEP Jan Philipp Albrecht, writes in State Watch, this would constitute a radical carve-out from current European data protection rules:
Paul Merrell

Google Says Website Encryption Will Now Influence Search Rankings - 0 views

  • Google will begin using website encryption, or HTTPS, as a ranking signal – a move which should prompt website developers who have dragged their heels on increased security measures, or who debated whether their website was “important” enough to require encryption, to make a change. Initially, HTTPS will only be a lightweight signal, affecting fewer than 1% of global queries, says Google. That means that the new signal won’t carry as much weight as other factors, including the quality of the content, the search giant noted, as Google means to give webmasters time to make the switch to HTTPS. Over time, however, encryption’s effect on search ranking may strengthen, as the company places more importance on website security. Google also promises to publish a series of best practices around TLS (HTTPS is also known as HTTP over TLS, or Transport Layer Security) so website developers can better understand what they need to do in order to implement the technology and what mistakes they should avoid. These tips will include things like what certificate type is needed, how to use relative URLs for resources on the same secure domain, best practices around allowing for site indexing, and more.
  • In addition, website developers can test their current HTTPS-enabled website using the Qualys Lab tool, says Google, and can direct further questions to Google’s Webmaster Help Forums where the company is already in active discussions with the broader community. The announcement has drawn a lot of feedback from website developers and those in the SEO industry – for instance, Google’s own blog post on the matter, shared in the early morning hours on Thursday, is already nearing 1,000 comments. For the most part, the community seems to support the change, or at least acknowledge that they felt that something like this was in the works and are not surprised. Google itself has been making moves to better secure its own traffic in recent months, which have included encrypting traffic between its own servers. Gmail now always uses an encrypted HTTPS connection which keeps mail from being snooped on as it moves from a consumer’s machine to Google’s data centers.
  • While HTTPS and site encryption have been a best practice in the security community for years, the revelation that the NSA has been tapping the cables, so to speak, to mine user information directly has prompted many technology companies to consider increasing their own security measures, too. Yahoo, for example, also announced in November its plans to encrypt its data center traffic. Now Google is helping to push the rest of the web to do the same.
  •  
    The Internet continues to harden in the wake of the NSA revelations. This is a nice nudge by Google.
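  •  
    One of the promised best practices quoted above -- using relative URLs for resources on the same secure domain -- is about avoiding mixed-content breakage after the switch. A small sketch, with illustrative example.com paths and hosts:

      <!-- Brittle: a hard-coded scheme triggers mixed-content warnings
           once the page itself is served over HTTPS -->
      <script src="http://www.example.com/js/app.js"></script>

      <!-- Root-relative: the resource inherits the page's scheme -->
      <script src="/js/app.js"></script>

      <!-- Protocol-relative: same idea for a separate host that supports both schemes -->
      <script src="//cdn.example.com/js/lib.js"></script>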
Gary Edwards

Should you buy enterprise applications from a startup? - 0 views

  • The biggest advantage of startups, in Mueller's opinion? "They have no technical historical burden, and they don't care about many technical dependencies. They deliver easy-to-use technology with relatively simple but powerful integration options."
  • "The model we've used to buy on-premises software for 20-plus years is shifting," insists Laping. "There are new ways of selecting and vetting partners."
  • Part of that shift is simple: The business side sees what technology can do, and it's banging on IT's door, demanding ... what? Not new drop-down menus in the same-old ERP application, but rather state-of-the-art, cutting-edge, ain't-that-cool innovation. The landscape is wide open: Innovation can come in the form of new technologies, such as the Internet of Things, or from mobility, the cloud, virtualization -- in fact, from anywhere an enterprise vendor isn't filling a need. The easiest place to find that? Startups.
  • ...5 more annotations...
  • "The number one reason to consider a startup is that the current landscape of Magic Quadrant vendors is not serving a critical need. That's a problem."
  • Ravi Belani is managing partner at Alchemist Accelerator, a Palo Alto, Calif.-based venture-backed initiative focused on accelerating startups whose revenue comes from enterprises rather than consumers. He says, "The innovation that used to come out of big software houses isn't there anymore, while the pace of innovation in technology is accelerating."
  • He acknowledges that there has been a longtime concern with startups about the ability of their applications to scale, but given startups' ability to build their software on robust infrastructure platforms using IaaS or PaaS, and then deploy them via SaaS, "scalability isn't as big a deal as it used to be. It costs $50,000 today to do what you needed $50 million to do ten years ago. That means it takes less capital today to create the same innovation. Ten years ago, that was a moat, a barrier to entry, but software vendors don't own that moat anymore."
  • The confluence of offshore programming, open source technologies and cloud-based infrastructures has significantly lowered the barriers to entry of launching a new venture -- not to mention all those newly minted tech millionaires willing to be angel investors.
  • "In the new paradigm, [most software] implementations are so much shorter, you don't have to think about that risk. You're not talking about three years and $20 million. You're talking about 75 days and $50,000. You implement little modules and get big wins along the way."
  •  
    "The idea of buying an enterprise application from a startup company might sound like anathema to a CIO. But Chris Laping, CIO of restaurant chain Red Robin, based in Greenwood Village, Colo., disagrees. He believes we're in the middle of a significant shift that favors startups -- moving from huge applications with extensive features to task-based activities, inspired by the apps running on mobile devices. Featured Resource Presented by Scribe Software 10 Best Practices for Integrating Data Data integration is often underestimated and poorly implemented, taking time and resources. Yet it Learn More Mirco Mueller concurs. He is an IT architect for St. Gallen, Switzerland-based Helvetia Swiss Life Insurance Co., which -- having been founded in 1858 -- is about as far from a startup as possible. He recently chose a SaaS tool from an unnamed startup over what he calls "a much more powerful but much more complex alternative. Its list of features is shorter than the feature list of the big companies, but in terms of agility, flexibility, ease of use and adjustable business model, it beat" all of its competitors. The biggest advantage of startups, in Mueller's opinion? "They have no technical historical burden, and they don't care about many technical dependencies. They deliver easy-to-use technology with relatively simple but powerful integration options." There's certainly no lack of applications available from new players. At a recent conference focusing on innovation, Microsoft Ventures principal Daniel Sumner noted that every month for the last 88 months, there's been a $1 billion valuation for one startup or another. That's seven years and counting. But as Silicon Valley skeptics like to point out, those are the ones you hear about. For every successful startup, there are at least three that fail, according to 2012 research by Harvard Business School professor Shikhar Ghosh. So why, then, would CIOs in their right mind take the risk of buying enterprise applic
Paul Merrell

Secret 'BADASS' Intelligence Program Spied on Smartphones - The Intercept - 0 views

  • British and Canadian spy agencies accumulated sensitive data on smartphone users, including location, app preferences, and unique device identifiers, by piggybacking on ubiquitous software from advertising and analytics companies, according to a document obtained by NSA whistleblower Edward Snowden. The document, included in a trove of Snowden material released by Der Spiegel on January 17, outlines a secret program run by the intelligence agencies called BADASS. The German newsweekly did not write about the BADASS document, attaching it to a broader article on cyberwarfare. According to The Intercept‘s analysis of the document, intelligence agents applied BADASS software filters to streams of intercepted internet traffic, plucking from that traffic unencrypted uploads from smartphones to servers run by advertising and analytics companies.
  • Programmers frequently embed code from a handful of such companies into their smartphone apps because it helps them answer a variety of questions: How often does a particular user open the app, and at what time of day? Where does the user live? Where does the user work? Where is the user right now? What’s the phone’s unique identifier? What version of Android or iOS is the device running? What’s the user’s IP address? Answers to those questions guide app upgrades and help target advertisements, benefits that help explain why tracking users is not only routine in the tech industry but also considered a best practice. For users, however, the smartphone data routinely provided to ad and analytics companies represents a major privacy threat. When combined together, the information fragments can be used to identify specific users, and when concentrated in the hands of a small number of companies, they have proven to be irresistibly convenient targets for those engaged in mass surveillance. Although the BADASS presentation appears to be roughly four years old, at least one player in the mobile advertising and analytics space, Google, acknowledges that its servers still routinely receive unencrypted uploads from Google code embedded in apps.
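  •  
    To see why these unencrypted uploads make such convenient targets, consider a hypothetical request of the kind described above; the endpoint, parameter names and values are invented for illustration, since real formats vary by vendor:

      GET /t?device_id=ab12cd34ef56&os=iOS_6.1&app=com.example.game&lat=51.50&lon=-0.12 HTTP/1.1
      Host: analytics.example.com

    Sent in the clear, a single line like this hands a passive observer a persistent device identifier, the user's location, and the app in use -- exactly the fields a filter like BADASS would pluck out of intercepted traffic.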
Paul Merrell

Google reveals where AT&T, Comcast, Time Warner Cable will next offer Gbps broadband * ... - 0 views

  • Google has named the next four areas in the US to get its gigabit-a-second fiber broadband. The advertising giant said on Tuesday it will next roll out high-speed connections to 18 cities in and around Atlanta, GA; Charlotte, NC; Raleigh-Durham, NC; and Nashville, TN. Charlotte city officials had indicated they were expecting to be named as one of the next places to feel Google's cable. The expansion will bring the total number of areas with Google Fiber deployments to seven: the California biz already offers fiber broadband in and around Kansas City, MO, Austin, TX, and Provo, UT.
  • Google charges $70 a month for gigabit internet, $120 if you want TV with it, or free if you're happy with 5Mbit/s for the downlink. Only the freebie option requires a $300 installation fee. Despite the price tag, the service is hotly anticipated in the few chosen cities. The presence of Google Fiber also has the side-effect of spurring rival carriers, such as AT&T, to offer their own high-speed broadband services in the area.
  • Later this year, the Chocolate Factory will also make its decision on where the next set of Fiber rollouts will take place. Five areas are being considered: Portland, OR; San Jose, CA; Salt Lake City, UT; Phoenix, AZ; and San Antonio, TX. ®
Paul Merrell

Canadian risks prison for not giving up phone's passcode - Yahoo News - 0 views

  • Montreal (AFP) - A Canadian charged for refusing to give border agents his smartphone passcode was expected Thursday to become the first to test whether border inspections can include information stored on devices. Alain Philippon, 38, risks up to a year in prison and a fine of up to Can$25,000 (US$20,000) if convicted of obstruction. He told local media that he refused to provide the passcode because he considered information on his smartphone to be "personal." Philippon was transiting through the port city of Halifax on his way home from a Caribbean vacation on Monday when he was selected for an in-depth exam.
  • "Philippon refused to divulge the passcode for his cell phone, preventing border services officers from performing their duties," Canada Border Services Agency said in an email. The agency insists that the Customs Act authorizes its officers to examine "all goods and conveyances including electronic devices, such as cell phones and laptops." But, according to legal experts, the issue of whether a traveler must reveal the password for an electronic device at a border crossing has not been tested in court. "(It's) one thing for them to inspect it, another thing for them to compel you to help them," Rob Currie, director of the Law and Technology Institute at Dalhousie University, told public broadcaster CBC. Philippon is scheduled to appear in court on May 12.
Paul Merrell

NZ Prime Minister John Key Retracts Vow to Resign if Mass Surveillance Is Shown - 0 views

  • In August 2013, as evidence emerged of the active participation by New Zealand in the “Five Eyes” mass surveillance program exposed by Edward Snowden, the country’s conservative Prime Minister, John Key, vehemently denied that his government engages in such spying. He went beyond mere denials, expressly vowing to resign if it were ever proven that his government engages in mass surveillance of New Zealanders. He issued that denial, and the accompanying resignation vow, in order to reassure the country over fears provoked by a new bill he advocated to increase the surveillance powers of that country’s spying agency, Government Communications Security Bureau (GCSB) — a bill that passed by one vote thanks to the Prime Minister’s guarantees that the new law would not permit mass surveillance.
  • Since then, a mountain of evidence has been presented that indisputably proves that New Zealand does exactly that which Prime Minister Key vehemently denied — exactly that which he said he would resign if it were proven was done. Last September, we reported on a secret program of mass surveillance at least partially implemented by the Key government that was designed to exploit the very law that Key was publicly insisting did not permit mass surveillance. At the time, Snowden, citing that report as well as his own personal knowledge of GCSB’s participation in the mass surveillance tool XKEYSCORE, wrote in an article for The Intercept: Let me be clear: any statement that mass surveillance is not performed in New Zealand, or that the internet communications are not comprehensively intercepted and monitored, or that this is not intentionally and actively abetted by the GCSB, is categorically false. . . . The prime minister’s claim to the public, that “there is no and there never has been any mass surveillance” is false. The GCSB, whose operations he is responsible for, is directly involved in the untargeted, bulk interception and algorithmic analysis of private communications sent via internet, satellite, radio, and phone networks.
  • A series of new reports last week by New Zealand journalist Nicky Hager, working with my Intercept colleague Ryan Gallagher, has added substantial proof demonstrating GCSB’s widespread use of mass surveillance. An article last week in The New Zealand Herald demonstrated that “New Zealand’s electronic surveillance agency, the GCSB, has dramatically expanded its spying operations during the years of John Key’s National Government and is automatically funnelling vast amounts of intelligence to the US National Security Agency.” Specifically, its “intelligence base at Waihopai has moved to ‘full-take collection,’ indiscriminately intercepting Asia-Pacific communications and providing them en masse to the NSA through the controversial NSA intelligence system XKeyscore, which is used to monitor emails and internet browsing habits.” Moreover, the documents “reveal that most of the targets are not security threats to New Zealand, as has been suggested by the Government,” but “instead, the GCSB directs its spying against a surprising array of New Zealand’s friends, trading partners and close Pacific neighbours.” A second report late last week published jointly by Hager and The Intercept detailed the role played by GCSB’s Waihopai base in aiding NSA’s mass surveillance activities in the Pacific (as Hager was working with The Intercept on these stories, his house was raided by New Zealand police for 10 hours, ostensibly to find Hager’s source for a story he published that was politically damaging to Key).
  • ...6 more annotations...
  • That the New Zealand government engages in precisely the mass surveillance activities Key vehemently denied is now barely in dispute. Indeed, a former director of GCSB under Key, Sir Bruce Ferguson, while denying any abuse of New Zealander’s communications, now admits that the agency engages in mass surveillance.
  • Meanwhile, Russel Norman, the head of the country’s Green Party, said in response to these stories that New Zealand is “committing crimes” against its neighbors in the Pacific by subjecting them to mass surveillance, and insists that the Key government broke the law because that dragnet necessarily includes the communications of New Zealand citizens when they travel in the region.
  • So now that it’s proven that New Zealand does exactly that which Prime Minister Key vowed would cause him to resign if it were proven, is he preparing his resignation speech? No: that’s something a political official with a minimal amount of integrity would do. Instead — even as he now refuses to say what he has repeatedly said before: that GCSB does not engage in mass surveillance — he’s simply retracting his pledge as though it were a minor irritant, something to be casually tossed aside:
  • When asked late last week whether New Zealanders have a right to know what their government is doing in the realm of digital surveillance, the Prime Minister said: “as a general rule, no.” And he expressly refuses to say whether New Zealand is doing that which he swore repeatedly it was not doing, as this excellent interview from Radio New Zealand sets forth: Interviewer: “Nicky Hager’s revelations late last week . . . have stoked fears that New Zealanders’ communications are being indiscriminately caught in that net. . . . The Prime Minister, John Key, has in the past promised to resign if it were found to be mass surveillance of New Zealanders . . . Earlier, Mr. Key was unable to give me an assurance that mass collection of communications from New Zealanders in the Pacific was not taking place.” PM Key: “No, I can’t. I read the transcript [of former GCSB Director Bruce Ferguson’s interview] – I didn’t hear the interview – but I read the transcript, and you know, look, there’s a variety of interpretations – I’m not going to critique–”
  • Interviewer: “OK, I’m not asking for a critique. Let’s listen to what Bruce Ferguson did tell us on Friday:” Ferguson: “The whole method of surveillance these days, is sort of a mass collection situation – individualized: that is mission impossible.” Interviewer: “And he repeated that several times, using the analogy of a net which scoops up all the information. . . . I’m not asking for a critique with respect to him. Can you confirm whether he is right or wrong?” Key: “Uh, well I’m not going to go and critique the guy. And I’m not going to give a view of whether he’s right or wrong” . . . . Interviewer: “So is there mass collection of personal data of New Zealand citizens in the Pacific or not?” Key: “I’m just not going to comment on where we have particular targets, except to say that where we go and collect particular information, there is always a good reason for that.”
  • From “I will resign if it’s shown we engage in mass surveillance of New Zealanders” to “I won’t say if we’re doing it” and “I won’t quit either way despite my prior pledges.” Listen to the whole interview, both to see the type of adversarial questioning to which U.S. political leaders are so rarely subjected and to see just how obfuscating Key’s answers are. The history of reporting from the Snowden archive has been one of serial dishonesty from numerous governments: such as the way European officials at first pretended to be outraged victims of NSA, only for it to be revealed that, in many ways, they are active collaborators in the very system they were denouncing. But, outside of the U.S. and U.K. themselves, the Key government has easily been the most dishonest over the last 20 months: one of the most shocking stories I’ve seen during this time was how the Prime Minister secretly planned to use the proposed 2013 law to implement mass surveillance at the very time he was persuading the public to support it by explicitly insisting that it would not allow mass surveillance. But overtly reneging on a public pledge to resign is a new level of political scandal. Key was just re-elected to his third term, and like any political official who stays in power too long, he has the despot’s mentality that he’s beyond all ethical norms and constraints. But by the admission of his own former GCSB chief, he has now been caught red-handed doing exactly what he swore to the public would cause him to resign if it were proven. If nothing else, the New Zealand media ought to treat that public deception from its highest political official with the seriousness it deserves.
  •  
    It seems the U.S. is not the only nation with liars for heads of state. 
Paul Merrell

After Brit spies 'snoop' on families' lawyers, UK govt admits: We flouted human rights ... - 0 views

  • The British government has admitted that its practice of spying on confidential communications between lawyers and their clients was a breach of the European Convention on Human Rights (ECHR). Details of the controversial snooping emerged in November: lawyers suing Blighty over its rendition of two Libyan families to be tortured by the late and unlamented Gaddafi regime claimed Her Majesty's own lawyers seemed to have access to the defense team's emails. The families' briefs asked for a probe by the secretive Investigatory Powers Tribunal (IPT), a move that led to Wednesday's admission. "The concession the government has made today relates to the agencies' policies and procedures governing the handling of legally privileged communications and whether they are compatible with the ECHR," a government spokesman said in a statement to the media, via the Press Association. "In view of recent IPT judgments, we acknowledge that the policies applied since 2010 have not fully met the requirements of the ECHR, specifically Article 8. This includes a requirement that safeguards are made sufficiently public."
  • The guidelines revealed by the investigation showed that MI5 – which handles the UK's domestic security – had free rein to spy on highly private and sensitive lawyer-client conversations between April 2011 and January 2014. MI6, which handles foreign intelligence, had no rules on the matter at all until 2011, and even those were considered void if "extremists" were involved. Britain's answer to the NSA, GCHQ, had rules against such spying, but they too were relaxed in 2011. "By allowing the intelligence agencies free rein to spy on communications between lawyers and their clients, the Government has endangered the fundamental British right to a fair trial," said Cori Crider, a director at the non-profit Reprieve and one of the lawyers for the Libyan families. "For too long, the security services have been allowed to snoop on those bringing cases against them when they speak to their lawyers. In doing so, they have violated a right that is centuries old in British common law. Today they have finally admitted they have been acting unlawfully for years."
  • Crider said it now seemed probable that UK snoopers had been listening in on the communications over the Libyan case. The British government hasn't admitted guilt, but it has at least acknowledged that it was doing something wrong – sort of. "It does not mean that there was any deliberate wrongdoing on the part of the security and intelligence agencies, which have always taken their obligation to protect legally privileged material extremely seriously," the government spokesman said. "Nor does it mean that any of the agencies' activities have prejudiced or in any way resulted in an abuse of process in any civil or criminal proceedings. The agencies will now work with the independent Interception of Communications Commissioner to ensure their policies satisfy all of the UK's human rights obligations." So that's all right, then.
  •  
    If you follow the "November" link you'll learn that yes, indeed, the UK government lawyers were happily getting the content of their adversaries' privileged attorney-client communications. Conspicuously, the promises of reform make no mention of what is surely a disbarment offense in the U.S., and I doubt that it's different in the UK. Discovery rules of procedure strictly limit how parties may obtain information from the other side. Wiretapping the other side's lawyers is not a permitted form of discovery. Hopefully, at least the government lawyers in the case in which the misbehavior was discovered have been referred for disciplinary action.  
Paul Merrell

ISPs take GCHQ to court in UK over mass surveillance | World news | theguardian.com - 0 views

  • Internet service providers from around the world are lodging formal complaints against the UK government's monitoring service, GCHQ, alleging that it uses "malicious software" to break into their networks. The claims from seven organisations based in six countries – the UK, Netherlands, US, South Korea, Germany and Zimbabwe – will add to international pressure on the British government following Edward Snowden's revelations about mass surveillance of the internet by UK and US intelligence agencies. The claims are being filed with the investigatory powers tribunal (IPT), the court in London that assesses complaints about the agencies' activities and misuse of surveillance by government organisations. Most of its hearings are held at least partially in secret.
  • The IPT is already considering a number of related submissions. Later this month it will investigate complaints by human rights groups about the way social media sites have been targeted by GCHQ. The government has defended the security services, pointing out that online searches are often routed overseas and those deemed "external communications" can be monitored without the need for an individual warrant. Critics say that such a legal interpretation sidesteps the need for traditional intercept safeguards. The latest claim is against both GCHQ, located near Cheltenham, and the Foreign Office. It is based on articles published earlier this year in the German magazine Der Spiegel. That report alleged that GCHQ had carried out an attack, codenamed Operation Socialist, on the Belgian telecoms group Belgacom, targeting individual employees with "malware (malicious software)". One of the techniques was a "man in the middle" attack, which, according to the documents filed at the IPT, bypasses modern encryption software and "operates by interposing the attacker [GCHQ] between two computers that believe that they are securely communicating with each other. In fact, each is communicating with GCHQ, who collect the communications, as well as relaying them in the hope that the interference will be undetected." (A toy sketch of this relay pattern appears below, after this entry's highlights.) The complaint alleges that the attacks were a breach of the Computer Misuse Act 1990 and an interference with the privacy rights of the employees under the European convention of human rights.
  • The organisations targeted, the submission states, were all "responsible and professional internet service providers". The claimants are: GreenNet Ltd, based in the UK, Riseup Networks in Seattle, Mango Email Service in Zimbabwe, Jinbonet in South Korea, Greenhost in the Netherlands, May First/People Link in New York and the Chaos Computer Club in Hamburg.
  • Among the programs said to have been operating were Turbine, which automates the injection of data and can infect millions of machines, and Warrior Pride, which enables microphones on iPhones and Android devices to be remotely activated.
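
    To make the "man in the middle" technique described above concrete, here is a minimal sketch of the interposition pattern in Python: a relay that sits between two endpoints, copies the traffic for the eavesdropper, and forwards it so neither side notices. This is an illustration only, not GCHQ's actual tooling; the listener address and upstream host are hypothetical, and a real attack on an encrypted link would additionally have to defeat certificate validation, which this plaintext sketch does not attempt.

    import socket
    import threading

    LISTEN_ADDR = ("0.0.0.0", 8080)              # where the victim is (mis)directed
    UPSTREAM_ADDR = ("real-server.example", 80)  # hypothetical host the victim meant to reach

    def relay(src, dst, label):
        """Copy bytes from src to dst, keeping a copy for the eavesdropper."""
        try:
            while True:
                data = src.recv(4096)
                if not data:
                    break
                print(f"[{label}] intercepted {len(data)} bytes")  # the "collection" step
                dst.sendall(data)                                  # the "relaying" step
        except OSError:
            pass  # one side closed; stop relaying
        finally:
            dst.close()

    def handle(client):
        upstream = socket.create_connection(UPSTREAM_ADDR)
        # Relay both directions concurrently; neither endpoint sees the proxy.
        threading.Thread(target=relay, args=(client, upstream, "client->server"), daemon=True).start()
        relay(upstream, client, "server->client")

    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(LISTEN_ADDR)
    listener.listen(5)
    while True:
        conn, _ = listener.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()

    The point of the sketch is the complaint's key claim: because both endpoints still reach each other, the interception leaves no obvious trace at either end.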
Paul Merrell

Court upholds NSA snooping | TheHill - 0 views

  • A district court in California has issued a ruling in favor of the National Security Agency in a long-running case over the spy agency’s collection of Internet records. The challenge against the controversial Upstream program was tossed out because additional defense from the government would have required “impermissible disclosure of state secret information,” Judge Jeffrey White wrote in his decision. Under the program — details of which were revealed through leaks from Edward Snowden and others — the NSA taps into the fiber cables that make up the backbone of the Internet and gathers information about people's online and phone communications. The agency then filters out communications of U.S. citizens, whose data is protected with legal defenses not extended to foreigners, and searches for “selectors” tied to a terrorist or other target. (A toy sketch of this filter-then-match sequence appears after this entry's notes.) In 2008, the Electronic Frontier Foundation (EFF) sued the government over the program on behalf of five AT&T customers, who said that the collection violated the constitutional protections to privacy and free speech.
  • But “substantial details” about the program still remain classified, White, an appointee of former President George W. Bush, wrote in his decision. Moving forward with the merits of a trial would risk “exceptionally grave damage to national security,” he added. The government has been “persuasive” in using its state secrets privilege, he continued, which allows it to withhold evidence from a case that could severely jeopardize national security. In addition to saying that the program appeared constitutional, the judge also found that the AT&T customers did not even have standing to sue the NSA over its data gathering. While they may be AT&T customers, White wrote that the evidence presented to the court was “insufficient to establish that the Upstream collection process operates in the manner” that they say it does, which makes it impossible to tell whether their information was indeed collected in the NSA program. The decision is a stinging rebuke to critics of the NSA, who have seen public interest in their cause slowly fade in the months since Snowden’s revelations.
  • The EFF on Tuesday evening said that it was considering next steps and noted that the court focused on just one program, not the totality of the NSA’s controversial operations. “It would be a travesty of justice if our clients are denied their day in court over the ‘secrecy’ of a program that has been front-page news for nearly a decade,” the group said in a statement. “We will continue to fight to end NSA mass surveillance.” The name of the case is Jewel v. NSA. 
  •  
    The article should have mentioned that the decision was on cross-motions for *partial* summary judgment. The Jewel case will proceed on other plaintiff claims. 
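
    As a rough illustration of the two-stage sequence the article describes for Upstream (filter out traffic attributed to U.S. persons, then keep only messages matching tasked "selectors"), here is a toy sketch in Python. The field names and selector values are hypothetical; the real system's criteria are classified, and nothing here reflects the actual implementation.

    # Hypothetical tasked selectors; the real ones are classified.
    SELECTORS = {"target@example.org", "+64-21-555-0000"}

    def upstream_filter(intercepts):
        """Yield only non-U.S.-person messages that match a tasked selector."""
        for msg in intercepts:
            if msg.get("us_person"):            # stage 1: drop protected traffic
                continue
            endpoints = {msg.get("sender"), msg.get("recipient")}
            if endpoints & SELECTORS:           # stage 2: match tasked selectors
                yield msg

    # Example: only the second record survives both stages.
    sample = [
        {"sender": "alice@example.com", "recipient": "bob@example.com", "us_person": True},
        {"sender": "target@example.org", "recipient": "carol@example.net", "us_person": False},
    ]
    print(list(upstream_filter(sample)))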