
Home / Future of the Web / Group items tagged: heart


Gonzalo San Gil, PhD.

Enemies of the Internet 2014: entities at the heart of censorship and surveillance | En... - 0 views

  •  
    "Natalia Radzina of Charter97, a Belarusian news website whose criticism of the government is often censored, was attending an OSCE-organized conference in Vienna on the Internet and media freedom in February 2013 when she ran into someone she would rather not have seen: a member of the Operations and Analysis Centre, a Belarusian government unit that coordinates Internet surveillance and censorship. It is entities like this, little known but often at the heart of surveillance and censorship systems in many countries, that Reporters Without Borders is spotlighting in this year's Enemies of the Internet report, which it is releasing, as usual, on World Day Against Cyber-Censorship (12 March)."
Gonzalo San Gil, PhD.

The impact of open cloud technologies on IT - The Inquirer - 0 views

  •  
    "Open source developments are at the heart of new cloud technologies, says Jim Zemlin (Fri May 29 2015): NOWHERE are we seeing more open source and collaborative development than in cloud computing."
Gonzalo San Gil, PhD.

At the Heart of OpenStack Evolution | Enterprise | LinuxInsider - 1 views

  •  
    "As it matures, OpenStack's parallel to Linux is clearer. Linux emerged 20 years ago as a somewhat exotic challenger to proprietary operating systems. Today, it is one of the most popular and widely used OSes. However, Linux still exists in a market of mixed use. It's likely that OpenStack will be subject to the same effect, becoming a viable option among a number of cloud infrastructures."
Gonzalo San Gil, PhD.

Creative Commons to pass one billion licensed works | Opensource.com - 0 views

  •  
    "At its heart, Creative Commons is a simple idea. It's the idea that when people share their creativity and knowledge with each other, amazing things can happen."
Gonzalo San Gil, PhD.

Why the Surveillance State Lives On - Michael Hirsh - POLITICO Magazine - 0 views

  •  
    "Once upon a time, Glenn Greenwald was a lonely voice in the blogging wilderness, and Edward Snowden was an isolated functionary at the heart of the American national-security state."
Gonzalo San Gil, PhD.

Maybe It's Time to Trust Microsoft -- Maybe Not | FOSS Force - 0 views

  •  
    "Ken Starks, The Heart of Linux: In this story, Microsoft is the cunning spider and Linux the intended victim, the fly. Everyone knows how the story begins. 'Will you walk into my parlour?' said the Spider to the Fly."
Gary Edwards

Skynet rising: Google acquires 512-qubit quantum computer; NSA surveillance to be turne... - 0 views

  •  
    "The ultimate code breakers" If you know anything about encryption, you probably also realize that quantum computers are the secret KEY to unlocking all encrypted files. As I wrote about last year here on Natural News, once quantum computers go into widespread use by the NSA, the CIA, Google, etc., there will be no more secrets kept from the government. All your files - even encrypted files - will be easily opened and read. Until now, most people believed this day was far away. Quantum computing is an "impractical pipe dream," we've been told by scowling scientists and "flat Earth" computer engineers. "It's not possible to build a 512-qubit quantum computer that actually works," they insisted. Don't tell that to Eric Ladizinsky, co-founder and chief scientist of a company called D-Wave. Because Ladizinsky's team has already built a 512-qubit quantum computer. And they're already selling them to wealthy corporations, too.

    DARPA, Northrup Grumman and Goldman Sachs

    In case you're wondering where Ladizinsky came from, he's a former employee of Northrup Grumman Space Technology (yes, a weapons manufacturer) where he ran a multi-million-dollar quantum computing research project for none other than DARPA - the same group working on AI-driven armed assault vehicles and battlefield robots to replace human soldiers. .... When groundbreaking new technology is developed by smart people, it almost immediately gets turned into a weapon. Quantum computing will be no different. This technology grants God-like powers to police state governments that seek to dominate and oppress the People. .....

    Google acquires "Skynet" quantum computers from D-Wave

    According to an article published in Scientific American, Google and NASA have now teamed up to purchase a 512-qubit quantum computer from D-Wave. The computer is called "D-Wave Two" because it's the second generation of the system. The first system was a 128-qubit computer. Gen two
  •  
    Normally, I'd be suspicious of anything published by Infowars because its editors are willing to publish really over the top stuff, but: [i] this is subject matter I've maintained an interest in over the years and I was aware that working quantum computers were imminent; and [ii] the pedigree on this particular information does not trace to Scientific American, as stated in the article. I've known Scientific American to publish at least one soothing and lengthy article on the subject of chlorinated dioxin hazard -- my specialty as a lawyer was litigating against chemical companies that generated dioxin pollution -- that was generated by known closet chemical industry advocates long since discredited and was totally lacking in scientific validity and contrary to established scientific knowledge. So publication in Scientific American doesn't pack a lot of weight with me. But checking the linked Scientific American article, I note that it was reprinted by permission from Nature, a peer-reviewed scientific journal and news organization that I trust much more. That said, the InfoWars version is a rewrite that contains lots of information of a sensationalist nature not in the Nature/Scientific American version, so heightened caution is still in order. Check the reprinted Nature version before getting too excited:

    "The D-Wave computer is not a 'universal' computer that can be programmed to tackle any kind of problem. But scientists have found they can usefully frame questions in machine-learning research as optimisation problems.

    "D-Wave has battled to prove that its computer really operates on a quantum level, and that it is better or faster than a conventional computer. Before striking the latest deal, the prospective customers set a series of tests for the quantum computer. D-Wave hired an outside expert in algorithm-racing, who concluded that the speed of the D-Wave Two was above average overall, and that it was 3,600 times faster than a leading conventional comput
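To give a concrete sense of what "framing questions as optimisation problems" means here: the sketch below brute-forces a tiny QUBO (quadratic unconstrained binary optimisation) instance, the problem class annealers like D-Wave's are built for. This is an illustration only; it exhaustively enumerates assignments and has nothing to do with D-Wave's actual hardware or API.

```python
from itertools import product

def qubo_energy(x, Q):
    """Energy of binary assignment x under QUBO weights Q (dict of (i, j) -> weight)."""
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

def brute_force_minimum(Q, n):
    """Exhaustively search all 2**n binary assignments for the lowest energy."""
    best = min(product([0, 1], repeat=n), key=lambda x: qubo_energy(x, Q))
    return best, qubo_energy(best, Q)

# Toy problem: reward turning each variable on, penalize having both on together.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 1.5}
assignment, energy = brute_force_minimum(Q, 2)
print(assignment, energy)  # -> (0, 1) -1.0
```

An annealer attacks the same energy landscape physically instead of by enumeration; the exponential 2**n search space is exactly why specialized hardware is interesting.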
Gonzalo San Gil, PhD.

The online royalty free public domain clip art - vector clip art online, royalty free &... - 1 views

  •  
    "Vector images: Animal Art Bird Black Blue Brown Button Cartoon Color Computer Flower Food Girl Gray Green Grey Heart Icon Image Logo Man Map Music New Orange Outline Pink Purple Red Sign Support Symbol Tree White Yellow Raster / stock photos: - A And Animated Arts Big Black Blue Design Dsc Edit Flag Flower Free Girl Green Head Icons Image Img Japanese Logo Man Music New Photo Picture Red Sea Ship Support The Tree View "
Gary Edwards

Out in the Open: Inside the Operating System Edward Snowden Used to Evade the NSA | Ent... - 0 views

  •  
    TAILS anonymous Operating System - excerpt: "When NSA whistle-blower Edward Snowden first emailed Glenn Greenwald, he insisted on using email encryption software called PGP for all communications. But this month, we learned that Snowden used another technology to keep his communications out of the NSA's prying eyes. It's called Tails. And naturally, nobody knows exactly who created it. Tails is a kind of computer-in-a-box. You install it on a DVD or USB drive, boot up the computer from the drive and, voila, you're pretty close to anonymous on the internet. At its heart, Tails is a version of the Linux operating system optimized for anonymity. It comes with several privacy and encryption tools, most notably Tor, an application that anonymizes a user's internet traffic by routing it through a network of computers run by volunteers around the world. Snowden, Greenwald and their collaborator, documentary film maker Laura Poitras, used it because, by design, Tails doesn't store any data locally. This makes it virtually immune to malicious software, and prevents someone from performing effective forensics on the computer after the fact. That protects both the journalists, and often more importantly, their sources. "The installation and verification has a learning curve to make sure it is installed correctly," Poitras told WIRED by e-mail. "But once the set up is done, I think it is very easy to use."

    An Operating System for Anonymity

    Originally developed as a research project by the U.S. Naval Research Laboratory, Tor has been used by a wide range of people who care about online anonymity: everyone from Silk Road drug dealers, to activists, whistleblowers, stalking victims and people who simply like their online privacy. Tails makes it much easier to use Tor and other privacy tools. Once you boot into Tails - which requires no special setup - Tor runs automatically. When you're done using it, you can boot back into your PC's normal operating
Gonzalo San Gil, PhD.

Attachmate says openSUSE lives, UNIX copyrights not sold to MS - 0 views

  •  
    [A lot of unanswered questions lingered after Attachmate announced that it has negotiated an agreement to acquire Linux vendor Novell earlier this month. The company has since issued official statements to clear up several notable points of concern. Attachmate intends to continue developing the SUSE platform and will support the community-driven openSUSE project. The company has also confirmed that it has retained the UNIX copyrights, the intellectual property at the heart of the SCO dispute. ...]
Gonzalo San Gil, PhD.

Beware: Piracy Defense Lawyers Can Be "Trolls" Too - TorrentFreak [# ! Note] - 1 views

  •  
    "Ernesto, February 8, 2016: Every month hundreds of people are sued for sharing copyrighted media through file-sharing networks, mostly BitTorrent. This practice is big business for copyright holders and lawyers alike. Unfortunately, however, not all defense attorneys appear to have the best interests of their clients at heart."
Gonzalo San Gil, PhD.

Microsoft and Linux: True Romance or Toxic Love? | Linux Journal [# ! Note] - 0 views

  •  
    "On the other hand, Microsoft continues to launch legal attacks on open-source projects directly and through puppet corporations. It's clear that Microsoft hasn't had some big moral change of heart over proprietary vs. free software, so why the public declarations of adoration? "
Paul Merrell

WhatsApp Encryption Said to Stymie Wiretap Order - The New York Times - 0 views

  • While the Justice Department wages a public fight with Apple over access to a locked iPhone, government officials are privately debating how to resolve a prolonged standoff with another technology company, WhatsApp, over access to its popular instant messaging application, officials and others involved in the case said. No decision has been made, but a court fight with WhatsApp, the world’s largest mobile messaging service, would open a new front in the Obama administration’s dispute with Silicon Valley over encryption, security and privacy. WhatsApp, which is owned by Facebook, allows customers to send messages and make phone calls over the Internet. In the last year, the company has been adding encryption to those conversations, making it impossible for the Justice Department to read or eavesdrop, even with a judge’s wiretap order.
  • As recently as this past week, officials said, the Justice Department was discussing how to proceed in a continuing criminal investigation in which a federal judge had approved a wiretap, but investigators were stymied by WhatsApp’s encryption. The Justice Department and WhatsApp declined to comment. The government officials and others who discussed the dispute did so on condition of anonymity because the wiretap order and all the information associated with it were under seal. The nature of the case was not clear, except that officials said it was not a terrorism investigation. The location of the investigation was also unclear.
  • To understand the battle lines, consider this imperfect analogy from the predigital world: If the Apple dispute is akin to whether the F.B.I. can unlock your front door and search your house, the issue with WhatsApp is whether it can listen to your phone calls. In the era of encryption, neither question has a clear answer. Some investigators view the WhatsApp issue as even more significant than the one over locked phones because it goes to the heart of the future of wiretapping. They say the Justice Department should ask a judge to force WhatsApp to help the government get information that has been encrypted. Others are reluctant to escalate the dispute, particularly with senators saying they will soon introduce legislation to help the government get data in a format it can read.
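The excerpt's core point, that end-to-end encryption leaves the relay (and any wiretap on it) holding only ciphertext, can be sketched with a toy example. Everything below is illustrative: textbook Diffie-Hellman over a small Mersenne prime and a SHA-256 keystream, not the vetted primitives (Curve25519, the Signal protocol) that real messengers such as WhatsApp use.

```python
import hashlib
import secrets

# Toy public parameters; a real system uses Curve25519, not a 127-bit prime.
P = 2**127 - 1  # a Mersenne prime
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Both sides derive the same key; the relay never sees either private value."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(secret.to_bytes(16, "big")).digest()

def xor_stream(key, data):
    """Toy symmetric cipher: XOR with a SHA-256 counter keystream (encrypt == decrypt)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

ciphertext = xor_stream(shared_key(alice_priv, bob_pub), b"meet at noon")
# The server relaying `ciphertext` (or a wiretap on it) holds no decryption key.
plaintext = xor_stream(shared_key(bob_priv, alice_pub), ciphertext)
print(plaintext)  # -> b'meet at noon'
```

The wiretap dilemma in the article follows directly: the service operator can hand over ciphertext and public keys, but without a private key those disclose nothing.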
Paul Merrell

Profiled From Radio to Porn, British Spies Track Web Users' Online Identities | Global ... - 0 views

  • One system builds profiles showing people’s web browsing histories. Another analyzes instant messenger communications, emails, Skype calls, text messages, cell phone locations, and social media interactions. Separate programs were built to keep tabs on “suspicious” Google searches and usage of Google Maps. The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens, all without a court order or judicial warrant.
  • The power of KARMA POLICE was illustrated in 2009, when GCHQ launched a top-secret operation to collect intelligence about people using the Internet to listen to radio shows. The agency used a sample of nearly 7 million metadata records, gathered over a period of three months, to observe the listening habits of more than 200,000 people across 185 countries, including the U.S., the U.K., Ireland, Canada, Mexico, Spain, the Netherlands, France, and Germany.
  • GCHQ’s documents indicate that the plans for KARMA POLICE were drawn up between 2007 and 2008. The system was designed to provide the agency with “either (a) a web browsing profile for every visible user on the Internet, or (b) a user profile for every visible website on the Internet.” The origin of the surveillance system’s name is not discussed in the documents. But KARMA POLICE is also the name of a popular song released in 1997 by the Grammy Award-winning British band Radiohead, suggesting the spies may have been fans. A verse repeated throughout the hit song includes the lyric, “This is what you’ll get, when you mess with us.”
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”

    THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs.
  • GCHQ vacuums up the website browsing histories using “probes” that tap into the international fiber-optic cables that transport Internet traffic across the world. A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” (a term the agency uses to refer to metadata records), with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held (41 percent) was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
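The documents' two pivots, "a web browsing profile for every visible user" or "a user profile for every visible website", are the same metadata records aggregated two ways. The sketch below shows how little code that aggregation needs once the events exist; the records and field names are invented for illustration.

```python
from collections import defaultdict

# Hypothetical metadata "events": no message content, only who visited what, when.
events = [
    {"user": "10.0.0.7", "site": "news.example",  "ts": 1251849600},
    {"user": "10.0.0.7", "site": "radio.example", "ts": 1251853200},
    {"user": "10.0.0.9", "site": "radio.example", "ts": 1251853260},
    {"user": "10.0.0.7", "site": "forum.example", "ts": 1251856800},
]

def browsing_profiles(events):
    """Pivot (a): aggregate raw events into a per-user browsing profile."""
    profiles = defaultdict(lambda: defaultdict(int))
    for e in events:
        profiles[e["user"]][e["site"]] += 1
    return {user: dict(sites) for user, sites in profiles.items()}

def audience(events, site):
    """Pivot (b): invert the same records into a per-site visitor list."""
    return sorted({e["user"] for e in events if e["site"] == site})

print(browsing_profiles(events)["10.0.0.7"])
print(audience(events, "radio.example"))  # -> ['10.0.0.7', '10.0.0.9']
```

The radio-listener operation described above is the second pivot applied at scale: pick a set of sites, extract everyone who touched them.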
Paul Merrell

Doug Mahugh : Miscellaneous links for 12-09-2008 - 0 views

  • If you've been at one of the recent DII workshops, you may recall that some of us from Microsoft have been talking about an upcoming converter interface that will allow you to add support for other formats to Office. I'm pleased to report that we've now published the documentation on MSDN for the External File Converter for SP2. The basic concept is that you convert incoming files to the Open XML format, and on save you convert Open XML to your format. Using this API, you can extend Office to support any format you'd like. The details are not for the faint of heart, but there is sample C++ source code available to help you get started.
  •  
    So now we learn some details about the new MS Office API(s) for unsupported file formats Microsoft promised a few months ago. Surprise, surprise! They're not for native file support. They're external process tools for converting to and from OOXML. That makes it sound as though Microsoft has no intention of coughing up the documentation for the native file support APIs despite its claim that it would document all APIs for Office (also required by U.S. v. Microsoft). The extra conversion step also practically guarantees more conversion artifacts. Do the new APIs provide interop for embedded scripts, etc.? My guess is no. There has to be a reason Microsoft chose to externalize the process rather than documenting the existing APIs. Limiting features available is still the most plausible scenario.
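The conversion-artifact worry in the annotation can be made concrete with a hypothetical round trip: any feature of the source format with no counterpart in the intermediate format is silently dropped on import, and the export path cannot restore it. The formats and field names below are invented; this is not Microsoft's actual converter interface.

```python
# Hypothetical source document with a feature (an embedded script) that the
# intermediate exchange format cannot represent.
SOURCE = {"text": "Quarterly report", "style": "heading", "script": "recalc_totals()"}

INTERMEDIATE_FIELDS = {"text", "style"}  # the common subset both converters speak

def to_intermediate(doc):
    """Import path: keep only the fields the intermediate format can carry."""
    return {k: v for k, v in doc.items() if k in INTERMEDIATE_FIELDS}

def from_intermediate(doc):
    """Export path: nothing here can restore what the import path discarded."""
    return dict(doc)

round_tripped = from_intermediate(to_intermediate(SOURCE))
lost = set(SOURCE) - set(round_tripped)
print(lost)  # -> {'script'}
```

Each extra conversion hop compounds this: whatever the narrowest format in the chain cannot express is gone for good.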
Gary Edwards

Collaboration Is At The Heart Of Open Source Content Management -- Open Source Content ... - 0 views

  •  
    As the economy tanks, open source proponents reflexively point to the low capital costs of acquiring open source software. But big customers want more than a bargain. They also want better. Thus, collaboration is more than just staying true to the open source credo of community and cooperation. It's also a smart business move. Drupal and Alfresco show us why.
Paul Merrell

Sir Tim Berners-Lee on 'Reinventing HTML' - 0 views

    • Paul Merrell
       
      Berners-Lee gives the obligatory lip service to participation of "other stakeholders," but the stark reality is that W3C is the captive of the major browser developers. One may still credit W3C staff and Berners-Lee for what they have accomplished despite that reality, but in an organization that sells votes the needs of "other stakeholders" will always be neglected.
  • Some things are clearer with hindsight of several years. It is necessary to evolve HTML incrementally. The attempt to get the world to switch to XML, including quotes around attribute values and slashes in empty tags and namespaces all at once didn't work. The large HTML-generating public did not move, largely because the browsers didn't complain. Some large communities did shift and are enjoying the fruits of well-formed systems, but not all. It is important to maintain HTML incrementally, as well as continuing a transition to well-formed world, and developing more power in that world.
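Berners-Lee's specifics, quotes around attribute values and slashes in empty tags, are exactly what separate legacy HTML from well-formed XML. A small sketch using Python's stdlib XML parser makes the difference visible (browsers, of course, use forgiving HTML parsers instead):

```python
import xml.etree.ElementTree as ET

legacy_html = '<p>Line one<br>Line two <img src=pic.png></p>'    # unclosed empty tags, unquoted attribute
xhtml = '<p>Line one<br/>Line two <img src="pic.png"/></p>'      # the well-formed equivalent

def is_well_formed(markup):
    """True if the markup parses as XML without error."""
    try:
        ET.fromstring(markup)
        return True
    except ET.ParseError:
        return False

print(is_well_formed(legacy_html))  # -> False
print(is_well_formed(xhtml))        # -> True
```

Because browsers never complained about the first form, the "large HTML-generating public" had no incentive to produce the second, which is the failure mode Berners-Lee describes.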
  • The plan is, informed by Webforms, to extend HTML forms. At the same time, there is a work item to look at how HTML forms (existing and extended) can be thought of as XForm equivalents, to allow an easy escalation path. A goal would be to have an HTML forms language which is a superset of the existing HTML language, and a subset of an XForms language with added HTML compatibility.
  • There will be no dependency of HTML work on the XHTML2 work.
    • Paul Merrell
       
      He just confirms that that incremental migration from HTML forms to XForms is entirely a pie-in-the-sky aspiration, not a plan.
  • This is going to be a very major collaboration on a very important spec, one of the crown jewels of web technology. Even though hundreds of people will be involved, we are evolving the technology which millions going on billions will use in the future. There won't seem like enough thankyous to go around some days.
    • Paul Merrell
       
      Perhaps a political reality. But I am 62 years old, have had three major heart attacks, and am still smoking cigarettes. I would like to experience interoperable web apps before I die. What does the incremental strategy do for me? I would much prefer to see Berners-Lee raising his considerable voice and stature against the dominance of the browser developers at W3C.
    • Paul Merrell
       
      Bye-bye XForms.
    • Paul Merrell
       
      This is the precise reason the major browser developers must be brought to heel rather than being catered to with a standard that serves only the needs of the browser developers and not the need of users for interoperable web applications. CSS is in the web app page templates, not in the markup that can be exchanged by web apps. Why can't MediaWiki exchange page content with Drupal? It's because HTML really sucks big time as a data exchange format. All the power is in the CSS site templates, not in what users can stick in HTML forms.
  • The perceived accountability of the HTML group has been an issue. Sometimes this was a departure from the W3C process, sometimes a sticking to it in principle, but not actually providing assurances to commenters. An issue was the formation of the breakaway WHAT WG, which attracted reviewers though it did not have a process or specific accountability measures itself.
  • Some things are very clear. It is really important to have real developers on the ground involved with the development of HTML. It is also really important to have browser makers intimately involved and committed. And also all the other stakeholders, including users and user companies and makers of related products.
Paul Merrell

Offline Web Apps, Dumb Idea or Really Dumb Idea? - 0 views

  • The amount of work it takes to "offline enable" a Web application is roughly similar to the amount of work it takes to "online enable" a desktop application.
  • I suspect this is the bitter truth that answers the questions asked in articles like The Frustratingly Unfulfilled Promise of Google Gears, where the author laments the lack of proliferation of offline Web applications built on Google Gears. When it first shipped I was looking forward to a platform like Google Gears but after I thought about the problem for a while, I realized that such a platform would be just as useful for "online enabling" desktop applications as it would be for "offline enabling" Web applications. Additionally, I came to the conclusion that the former is a lot more enabling to users than the latter. This is when I started becoming interested in Live Mesh as a Platform; this is one area where I think Microsoft's hearts and minds are in the right place. I want to see more applications like Outlook + RPC over HTTP, not "offline enabled" versions of Outlook Web Access.
Gary Edwards

Brendan's Roadmap Updates: Open letter to Microsoft's Chris Wilson and their fight to s... - 0 views

  • The history of ECMAScript since its beginnings in November 1996 shows that when Microsoft was behind in the market (against Netscape in 1996-1997), it moved aggressively in the standards body to evolve standards starting with ES1 through ES3. Once Microsoft dominated the market, the last edition of the standard was left to rot -- ES3 was finished in 1999 -- and even easy-to-fix standards conformance bugs in IE JScript went unfixed for eight years (so three years to go from Edition 1 to 3, then over eight to approach Edition 4). Now that the proposed 4th edition looks like a competitive threat, the world suddenly hears in detail about all those bugs, spun as differences afflicting "JavaScript" that should inform a new standard.
  • In my opinion the notion that we need to add features so that ajax programming would be easier is plain wrong. ajax is a hack and also the notion of a webapp is a hack. the web was created in a document centric view. All w3c standards are also based on the same document notion. The heart of the web, the HTTP protocol is designed to support a web of documents and as such is stateless. the proper solution, IMO, is not to evolve ES for the benefit of ajax and webapps, but rather generalize the notion of a document browser that connects to a web of documents to a general purpose client engine that connects to a network of internet applications. thus the current web (document) browser just becomes one such internet application.
  •  
    the obvious conflict of interest between the standards-based web and proprietary platforms advanced by Microsoft, and the rationales for keeping the web's client-side programming language small while the proprietary platforms rapidly evolve support for large languages, does not help maintain the fiction that only clashing high-level philosophies are involved here. Readers may not know that Ecma has no provision for "minor releases" of its standards, so any ES3.1 that was approved by TG1 would inevitably be given a whole edition number, presumably becoming the 4th Edition of ECMAScript. This is obviously contentious given all the years that the majority of TG1, sometimes even apparently including Microsoft representatives, has worked on ES4, and the developer expectations set by this long-standing effort. A history of Microsoft's post-ES3 involvement in the ECMAScript standard group, leading up to the overt split in TG1 in March, is summarized here.
Paul Merrell

LEAKED: Secret Negotiations to Let Big Brother Go Global | Wolf Street - 0 views

  • Much has been written, at least in the alternative media, about the Trans Pacific Partnership (TPP) and the Transatlantic Trade and Investment Partnership (TTIP), two multilateral trade treaties being negotiated between the representatives of dozens of national governments and armies of corporate lawyers and lobbyists (on which you can read more here, here and here). However, much less is known about the decidedly more secretive Trade in Services Act (TiSA), which involves more countries than either of the other two. At least until now, that is. Thanks to a leaked document jointly published by the Associated Whistleblowing Press and Filtrala, the potential ramifications of the treaty being hashed out behind hermetically sealed doors in Geneva are finally seeping out into the public arena.
  • If signed, the treaty would affect all services ranging from electronic transactions and data flow, to veterinary and architecture services. It would almost certainly open the floodgates to the final wave of privatization of public services, including the provision of healthcare, education and water. Meanwhile, already privatized companies would be prevented from a re-transfer to the public sector by a so-called barring “ratchet clause” – even if the privatization failed. More worrisome still, the proposal stipulates that no participating state can stop the use, storage and exchange of personal data relating to their territorial base. Here’s more from Rosa Pavanelli, general secretary of Public Services International (PSI):
  • The leaked documents confirm our worst fears that TiSA is being used to further the interests of some of the largest corporations on earth (…) Negotiation of unrestricted data movement, internet neutrality and how electronic signatures can be used strike at the heart of individuals’ rights. Governments must come clean about what they are negotiating in these secret trade deals. Fat chance of that, especially in light of the fact that the text is designed to be almost impossible to repeal, and is to be “considered confidential” for five years after being signed. What that effectively means is that the U.S. approach to data protection (read: virtually non-existent) could very soon become the norm across 50 countries spanning the breadth and depth of the industrial world.
  • The main players in the top-secret negotiations are the United States and all 28 members of the European Union. However, the broad scope of the treaty also includes Australia, Canada, Chile, Colombia, Costa Rica, Hong Kong, Iceland, Israel, Japan, Liechtenstein, Mexico, New Zealand, Norway, Pakistan, Panama, Paraguay, Peru, South Korea, Switzerland, Taiwan and Turkey. Combined they represent almost 70 percent of all trade in services worldwide. An explicit goal of the TiSA negotiations is to overcome the exceptions in GATS that protect certain non-tariff trade barriers, such as data protection. For example, the draft Financial Services Annex of TiSA, published by Wikileaks in June 2014, would allow financial institutions, such as banks, the free transfer of data, including personal data, from one country to another. As Ralf Bendrath, a senior policy advisor to the MEP Jan Philipp Albrecht, writes in State Watch, this would constitute a radical carve-out from current European data protection rules: