
Home/ Open Web/ Group items tagged internet-access


Paul Merrell

We're Halfway to Encrypting the Entire Web | Electronic Frontier Foundation

  • The movement to encrypt the web has reached a milestone. As of earlier this month, approximately half of Internet traffic is now protected by HTTPS. In other words, we are halfway to a web safer from the eavesdropping, content hijacking, cookie stealing, and censorship that HTTPS can protect against. Mozilla recently reported that the average volume of encrypted web traffic on Firefox now surpasses the average unencrypted volume.
  • Google Chrome’s figures on HTTPS usage are consistent with that finding, showing that over 50% of all pages loaded are protected by HTTPS across different operating systems.
  • This milestone is a combination of HTTPS implementation victories: from tech giants and large content providers, from small websites, and from users themselves.
  • Starting in 2010, EFF members have pushed tech companies to follow crypto best practices. We applauded when Facebook and Twitter implemented HTTPS by default, and when Wikipedia and several other popular sites later followed suit. Google has also put pressure on the tech community by using HTTPS as a signal in search ranking algorithms and, starting this year, showing security warnings in Chrome when users load HTTP sites that request passwords or credit card numbers. EFF’s Encrypt the Web Report also played a big role in tracking and encouraging specific practices. Recently other organizations have followed suit with more sophisticated tracking projects. For example, Secure the News and Pulse track HTTPS progress among news media sites and U.S. government sites, respectively.
  • But securing large, popular websites is only one part of a much bigger battle. Encrypting the entire web requires HTTPS implementation to be accessible to independent, smaller websites. Let’s Encrypt and Certbot have changed the game here, making what was once an expensive, technically demanding process into an easy and affordable task for webmasters across a range of resource and skill levels. Let’s Encrypt is a Certificate Authority (CA) run by the Internet Security Research Group (ISRG) and founded by EFF, Mozilla, and the University of Michigan, with Cisco and Akamai as founding sponsors. As a CA, Let’s Encrypt issues and maintains digital certificates that help web users and their browsers know they’re actually talking to the site they intended to. CAs are crucial to secure, HTTPS-encrypted communication, as these certificates verify the association between an HTTPS site and a cryptographic public key. Through EFF’s Certbot tool, webmasters can get a free certificate from Let’s Encrypt and automatically configure their server to use it. Since we announced that Let’s Encrypt was the web’s largest certificate authority last October, it has exploded from 12 million certs to over 28 million. Most of Let’s Encrypt’s growth has come from giving previously unencrypted sites their first-ever certificates. A large share of these leaps in HTTPS adoption are also thanks to major hosting companies and platforms--like WordPress.com, Squarespace, and dozens of others--integrating Let’s Encrypt and providing HTTPS to their users and customers.
  • Unfortunately, you can only use HTTPS on websites that support it--and about half of all web traffic is still with sites that don’t. However, when sites partially support HTTPS, users can step in with the HTTPS Everywhere browser extension. A collaboration between EFF and the Tor Project, HTTPS Everywhere makes your browser use HTTPS wherever possible. Some websites offer inconsistent support for HTTPS, use unencrypted HTTP as a default, or link from secure HTTPS pages to unencrypted HTTP pages. HTTPS Everywhere fixes these problems by rewriting requests to these sites to HTTPS, automatically activating encryption and HTTPS protection that might otherwise slip through the cracks.
  • Our goal is a universally encrypted web that makes a tool like HTTPS Everywhere redundant. Until then, we have more work to do. Protect your own browsing and websites with HTTPS Everywhere and Certbot, and spread the word to your friends, family, and colleagues to do the same. Together, we can encrypt the entire web.
  •  
    HTTPS connections don't work for you if you don't use them. If you're not using HTTPS Everywhere in your browser, you should be; it's your privacy that is at stake. And every encrypted communication you make adds to the backlog of encrypted data that the NSA and other internet voyeurs must process; because cracking encrypted messages is computationally intensive, the voyeurs do not have the resources to crack more than a tiny fraction. HTTPS Everywhere is a free extension for Firefox, Chrome, and Opera. You can get it here: https://www.eff.org/HTTPS-everywhere
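The "wherever possible" upgrading that HTTPS Everywhere performs can be sketched in a few lines of Python. This is a toy illustration rather than the extension's actual logic: the host list and URLs below are hypothetical, and the real extension ships thousands of curated per-site rulesets with much richer matching.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical hosts known to support HTTPS; the real extension
# uses curated per-site rulesets instead of a flat list.
HTTPS_CAPABLE = {"example.com", "www.example.com"}

def upgrade_to_https(url: str) -> str:
    """Rewrite an http:// URL to https:// when the host is known to support it."""
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in HTTPS_CAPABLE:
        return urlunsplit(("https",) + tuple(parts)[1:])
    return url

print(upgrade_to_https("http://example.com/login"))  # https://example.com/login
print(upgrade_to_https("http://plain-http.test/"))   # unchanged
```

A browser extension applies the same idea at request time, so a stray HTTP link on an otherwise secure site never leaves the machine unencrypted.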
Paul Merrell

Searching for the Future of Television - Technology Review

  • The company was testing one of the boldest bets in its 12-year history: Google TV. It is software that aims to give people an easy way to access everything available on regular television channels and the vast sea of content on the Internet, all on the biggest screen in the house—a bid to reinvent television for the Internet age. The initiative was set for its first public demonstration on May 20, and Google's geeks had to make sure the product would appeal to the average viewer, who watches about five hours of television a day. So week after week, the Google TV team would try out countless variations on everything from the look of the search results to the background colors for the screen, hoping to learn what worked best.
  •  
    In-depth look at Google TV and its market. Well worth the read. Is this at last the long-anticipated convergence of television and the Internet?
Paul Merrell

Gov. Mills signs nation's strictest internet privacy protection bill - Portland Press H...

  • Maine internet service providers will face the strictest consumer privacy protections in the nation under a bill signed Thursday by Gov. Janet Mills, but the new law will almost certainly be challenged in court. Several technology and communication trade groups warned in testimony before the Legislature that the measure may be in conflict with federal law and would likely be the subject of legal action.
  • The new law, which goes into effect on July 1, 2020, would require providers to ask for permission before they sell or share any of their customers’ data to a third party. The law would also apply to telecommunications companies that provide access to the internet via their cellular networks.
  • The law is modeled on a Federal Communications Commission rule, adopted under the administration of President Obama but overturned by the administration of President Trump in 2017. That rule blocked an ISP from selling a customer’s personal data, a practice that federal law does not otherwise prohibit.
  • The law is unlike any in the nation, as it requires an ISP to obtain consent from a consumer before sharing any data. Only California has a similar law on the books, but it requires consumers to “opt out” by asking their ISP to protect their data. Maine’s new law does not allow an ISP to offer a discounted rate to customers who agree to share or sell their data.
Paul Merrell

2nd Cir. Affirms That Creation of Full-Text Searchable Database of Works Is Fair Use | ...

  • The fair use doctrine permits the unauthorized digitization of copyrighted works in order to create a full-text searchable database, the U.S. Court of Appeals for the Second Circuit ruled June 10. Affirming summary judgment in favor of a consortium of university libraries, the court also ruled that the fair use doctrine permits the unauthorized conversion of those works into accessible formats for use by persons with disabilities, such as the blind.
  • The dispute is connected to the long-running conflict between Google Inc. and various authors of books that Google included in a mass digitization program. In 2004, Google began soliciting the participation of publishers in its Google Print for Publishers service, part of what was then called the Google Print project, aimed at making information available for free over the Internet. Subsequently, Google announced a new project, Google Print for Libraries. In 2005, Google Print was renamed Google Book Search and it is now known simply as Google Books. Under this program, Google made arrangements with several of the world's largest libraries to digitize the entire contents of their collections to create an online full-text searchable database. The announcement of this program triggered a copyright infringement action by the Authors Guild that continues to this day.
  • Turning to the fair use question, the court first concluded that the full-text search function of the Hathitrust Digital Library was a “quintessentially transformative use,” and thus constituted fair use. The court said: the result of a word search is different in purpose, character, expression, meaning, and message from the page (and the book) from which it is drawn. Indeed, we can discern little or no resemblance between the original text and the results of the HDL full-text search. There is no evidence that the Authors write with the purpose of enabling text searches of their books. Consequently, the full-text search function does not “supersede[ ] the objects [or purposes] of the original creation.” Turning to the fourth fair use factor—whether the use functions as a substitute for the original work—the court rejected the argument that such use represents lost sales to the extent that it prevents the future development of a market for licensing copies of works to be used in full-text searches. However, the court emphasized that the search function “does not serve as a substitute for the books that are being searched.”
  • Part of the deal between Google and the libraries included an offer by Google to hand over to the libraries their own copies of the digitized versions of their collections. In 2011, a group of those libraries announced the establishment of a new service, called the HathiTrust digital library, to which the libraries would contribute their digitized collections. This database of copies is to be made available for full-text searching and preservation activities. Additionally, it is intended to offer free access to works to individuals who have “print disabilities.” For works under copyright protection, the search function would return only a list of page numbers that a search term appeared on and the frequency of such appearance.
  • The court also rejected the argument that the database represented a threat of a security breach that could result in the full text of all the books becoming available for anyone to access. The court concluded that Hathitrust's assertions of its security measures were unrebutted. Thus, the full-text search function was found to be protected as fair use.
  • The court also concluded that allowing those with print disabilities access to the full texts of the works collected in the Hathitrust database was protected as fair use. Support for this conclusion came from the legislative history of the Copyright Act's fair use provision, 17 U.S.C. §107.
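The search behavior the court found transformative, returning page numbers and match frequencies rather than any expressive text, can be sketched with a toy inverted index. The book pages below are hypothetical; this is an illustration of the idea, not the HDL's actual implementation:

```python
from collections import defaultdict

def build_index(pages):
    """Map each term to {page_number: occurrences}, keeping no readable text."""
    index = defaultdict(lambda: defaultdict(int))
    for page_no, text in pages.items():
        for word in text.lower().split():
            index[word][page_no] += 1
    return index

# Hypothetical pages, for illustration only.
book = {1: "the whale surfaced", 2: "the whale and the whale again"}
index = build_index(book)

# A query returns only locations and counts, never the original prose.
print(dict(index["whale"]))  # {1: 1, 2: 2}
```

Because only page locations and counts survive, a query result resembles the page-number-and-frequency listing the court describes rather than a substitute for the book.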
Gary Edwards

Google Chrome OS: Web Platform To Rule Them All -- InformationWeek

  •  
    Some good commentary on Chrome OS from InformationWeek's Thomas Claburn. Excerpt: With Chrome OS, Google aims to make the Web the primary platform for software development... The fact that Chrome OS applications will be written using open Web standards like JavaScript, HTML, and CSS might seem like a liability because Web applications still aren't as capable as applications written for specific devices and operating systems. But Google is betting that will change and is working to effect the change on which its bet depends. Within a year or two, Web browsers will gain access to peripherals, through an infrastructure layer above the level of device drivers. Google's work with standards bodies is making that happen... According to Matt Womer, the "ubiquitous Web activity lead" for W3C, the Web standards consortium, Web protocol groups are working to codify ways to access peripherals like digital cameras, the messaging stack, calendar data, and contact data. There's now a JavaScript API that Web developers can use to get GPS information from mobile phones using the phone's browser, he points out. What that means is that device drivers for Chrome OS will emerge as HTML 5 and related standards mature. Without these, consumers would never use Chrome OS because devices like digital cameras wouldn't be able to transfer data. Womer said the standardization work could move quite quickly, but won't be done until there's an actual implementation. That would be Chrome OS... Chrome OS will sell itself to developers because, as Google puts it, writing applications for the Web gives "developers the largest user base of any platform."
Paul Merrell

Mystery Swirls Around Assange's Status At Ecuadorean Embassy

  • Midway through releasing a series of damaging disclosures about U.S. presidential contender Hillary Clinton, WikiLeaks founder Julian Assange says his hosts at the Ecuadorean Embassy in London abruptly cut him off from the internet. The news adds another layer of intrigue to an extraordinary campaign. “We can confirm Ecuador cut off Assange’s internet access Saturday, 5pm GMT, shortly after publication of Clinton’s Goldman Sachs (speeches),” the group said in a message posted to Twitter late Monday.
Paul Merrell

Assange Keeps Warning Of AI Censorship, And It's Time We Started Listening

  • Where power is not overtly totalitarian, wealthy elites have bought up all media, first in print, then radio, then television, and used it to advance narratives that are favorable to their interests. Not until humanity gained widespread access to the internet has our species had the ability to freely and easily share ideas and information on a large scale without regulation by the iron-fisted grip of power. This newfound ability arguably had a direct impact on the election for the most powerful elected office in the most powerful government in the world in 2016, as a leak publishing outlet combined with alternative and social media enabled ordinary Americans to tell one another their own stories about what they thought was going on in their country. This newly democratized narrative-generating power of the masses gave those in power an immense fright, and they’ve been working to restore the old order of power controlling information ever since. And the editor-in-chief of the aforementioned leak publishing outlet, WikiLeaks, has been repeatedly trying to warn us about this coming development.
  • In a statement that was recently read during the “Organising Resistance to Internet Censorship” webinar, sponsored by the World Socialist Web Site, Assange warned of how “digital super states” like Facebook and Google have been working to “re-establish discourse control”, giving authority over how ideas and information are shared back to those in power. Assange went on to say that the manipulative attempts of world power structures to regain control of discourse in the information age has been “operating at a scale, speed, and increasingly at a subtlety, that appears likely to eclipse human counter-measures.” What this means is that using increasingly more advanced forms of artificial intelligence, power structures are becoming more and more capable of controlling the ideas and information that people are able to access and share with one another, hide information which goes against the interests of those power structures and elevate narratives which support those interests, all of course while maintaining the illusion of freedom and lively debate.
  • To be clear, this is already happening. Due to a recent shift in Google’s “evaluation methods”, traffic to left-leaning and anti-establishment websites has plummeted, with sites like WikiLeaks, Alternet, Counterpunch, Global Research, Consortium News, Truthout, and WSWS losing up to 70 percent of the views they were getting prior to the changes. Powerful billionaire oligarchs Pierre Omidyar and George Soros are openly financing the development of “an automated fact-checking system” (AI) to hide “fake news” from the public.
  • To make matters even worse, there’s no way to know the exact extent to which this is going on, because we know that we can absolutely count on the digital super states in question to lie about it. In the lead-up to the 2016 election, Twitter CEO Jack Dorsey was asked point-blank if Twitter was obstructing the #DNCLeaks from trending, a hashtag people were using to build awareness of the DNC emails which had just been published by WikiLeaks, and Dorsey flatly denied it. More than a year later, we learned from a prepared testimony before the Senate Subcommittee on Crime and Terrorism by Twitter’s acting general counsel Sean J. Edgett that this was completely false and Twitter had indeed been doing exactly that to protect the interests of US political structures by sheltering the public from information allegedly gathered by Russian hackers.
  • Imagine going back to a world like the Middle Ages where you only knew the things your king wanted you to know, except you could still watch innocuous kitten videos on YouTube. That appears to be where we may be headed, and if that happens the possibility of any populist movement arising to hold power to account may be effectively locked out from the realm of possibility forever. To claim that these powerful new media corporations are just private companies practicing their freedom to determine what happens on their property is to bury your head in the sand and ignore the extent to which these digital super states are already inextricably interwoven with existing power structures. In a corporatist system of government, which America unquestionably has, corporate censorship is government censorship, of an even more pernicious strain than if Jeff Sessions were touring the country burning books. The more advanced artificial intelligence becomes, the more adept these power structures will become at manipulating us. Time to start paying very close attention to this.
Paul Merrell

Europe and Japan Aiming to Build 100Gbps Fibre Optic Internet - ISPreview UK

  • The European Commission (EC) and Japan have announced the launch of six joint research projects, supported by £15.3m+ (€18m) in funding, that aim to build networks which are “5000 times faster than today’s average European broadband ISP speed (100Gbps compared to 19.7Mbps)”. The telecoms experts among you will know that 100Gbps+ (Gigabits per second) fibre optic links are nothing new but most of these are major submarine or national cable links. The new effort appears to be looking further ahead, with a view to improving the efficiency of such networks and perhaps even bringing them closer to homes. It’s frequently noted that demand for data is putting a growing strain on broadband connections (the EU expects data traffic to grow 12-fold by 2018), which is partly fuelled by ever faster fixed line ISP and mobile broadband connectivity. But technology is always evolving to keep pace.
  • A quick glance at each of the projects reveals that this seems to be more about improving what already exists, yet in some circles even 100Gbps is beginning to look old-hat. Nevertheless, many of the improvements mentioned above will, if ever adopted, eventually filter down to benefit everybody. After all, several UK ISPs are already offering 1Gbps home connections (e.g. Hyperoptic, CityFibre / Fibreband in Bournemouth, Gigaclear etc.) and that’s only a factor of 100 slower than a 100Gbps link. In the realm of evolving internet access services that’s only a short hop, unless your infrastructure is still limited by a copper last mile. But there’s little point in having a 100Gbps link (don’t worry we won’t see this in homes for a fair few years) if the ISP can’t supply the capacity for it and that’s another part of the new effort. It’s important to stress that this is not about tackling today’s needs; it’s all about the future. Not so long ago we were still stuck on 50Kbps dialup.
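The speed comparisons quoted above are easy to check with a quick calculation:

```python
# Figures from the article: 100 Gbps target vs. the 19.7 Mbps EU average.
target_bps = 100e9
eu_average_bps = 19.7e6

print(round(target_bps / eu_average_bps))  # 5076, roughly "5000 times faster"

# A 1 Gbps home connection is a factor of 100 below the 100 Gbps target.
print(int(target_bps / 1e9))  # 100
```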
Paul Merrell

This project aims to make '404 not found' pages a thing of the past

  • The Internet is always changing. Sites are rising and falling, content is deleted, and bad URLs can lead to '404 Not Found' errors that are as helpful as a brick wall. A new project proposes to do away with dead 404 errors by implementing new HTML code that will help access prior versions of hyperlinked content. With any luck, that means that you’ll never have to run into a dead link again. The “404-No-More” project is backed by a formidable coalition including members from organizations like the Harvard Library Innovation Lab, Los Alamos National Laboratory, Old Dominion University, and the Berkman Center for Internet & Society. Part of the Knight News Challenge, which seeks to strengthen the Internet for free expression and innovation through a variety of initiatives, 404-No-More recently reached the semifinal stage. The project aims to cure so-called link rot, the process by which hyperlinks become useless over time because they point to addresses that are no longer available. If implemented, websites such as Wikipedia and other reference documents would be vastly improved. The new feature would also give Web authors a way to provide links that contain both archived copies of content and specific dates of reference, the sort of information that diligent readers have to hunt down on a website like Archive.org.
  • While it may sound trivial, link rot can actually have real ramifications. Nearly 50 percent of the hyperlinks in Supreme Court decisions no longer work, a 2013 study revealed. Losing footnotes and citations in landmark legal decisions can mean losing crucial information and context about the laws that govern us. The same study found that 70 percent of URLs within the Harvard Law Review and similar journals didn’t link to the originally cited information, considered a serious loss surrounding the discussion of our laws. The project’s proponents have come up with more potential uses as well. Activists fighting censorship will have an easier time combatting government takedowns, for instance. Journalists will be much more capable of researching dynamic Web pages. “If every hyperlink was annotated with a publication date, you could automatically view an archived version of the content as the author intended for you to see it,” the project’s authors explain. The ephemeral nature of the Web could no longer be used as a weapon. Roger Macdonald, a director at the Internet Archive, called the 404-No-More project “an important contribution to preservation of knowledge.”
  • The new feature would come in the form of introducing the mset attribute to the <a> element in HTML, which would allow users of the code to specify multiple dates and copies of content as an external resource. For instance, if both the date of reference and the location of a copy of the targeted content are known to an author, the mset attribute would pair the two with the hyperlink. The 404-No-More project’s goals are numerous, but the ultimate goal is to have mset become a new HTML standard for hyperlinks. “An HTML standard that incorporates archives for hyperlinks will loop in these efforts and make the Web better for everyone,” project leaders wrote, “activists, journalists, and regular ol’ everyday web users.”
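As a hypothetical sketch of what an mset-annotated link could look like: the attribute name comes from the proposal, while the value syntax shown here (a reference date paired with an archive URL) is an assumption for illustration only.

```python
def annotate_link(href: str, ref_date: str, archive_url: str) -> str:
    """Render an <a> tag with a hypothetical mset attribute pairing a
    reference date with an archived copy (illustrative syntax only)."""
    return f'<a href="{href}" mset="{ref_date} {archive_url}">link</a>'

print(annotate_link(
    "http://example.com/page",
    "2014-01-15",
    "https://web.archive.org/web/2014/http://example.com/page",
))
```

A browser that understood such an attribute could fall back to the archived copy, dated as the author intended, whenever the primary URL returned a 404.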
Paul Merrell

Internet Giants Erect Barriers to Spy Agencies - NYTimes.com

  • As fast as it can, Google is sealing up cracks in its systems that Edward J. Snowden revealed the N.S.A. had brilliantly exploited. It is encrypting more data as it moves among its servers and helping customers encode their own emails. Facebook, Microsoft and Yahoo are taking similar steps.
  • After years of cooperating with the government, the immediate goal now is to thwart Washington — as well as Beijing and Moscow. The strategy is also intended to preserve business overseas in places like Brazil and Germany that have threatened to entrust data only to local providers. Google, for example, is laying its own fiber optic cable under the world’s oceans, a project that began as an effort to cut costs and extend its influence, but now has an added purpose: to assure that the company will have more control over the movement of its customer data.
  • A year after Mr. Snowden’s revelations, the era of quiet cooperation is over. Telecommunications companies say they are denying requests to volunteer data not covered by existing law. A.T.&T., Verizon and others say that compared with a year ago, they are far more reluctant to cooperate with the United States government in “gray areas” where there is no explicit requirement for a legal warrant.
  • Eric Grosse, Google’s security chief, suggested in an interview that the N.S.A.'s own behavior invited the new arms race. “I am willing to help on the purely defensive side of things,” he said, referring to Washington’s efforts to enlist Silicon Valley in cybersecurity efforts. “But signals intercept is totally off the table,” he said, referring to national intelligence gathering. “No hard feelings, but my job is to make their job hard,” he added.
  • In Washington, officials acknowledge that covert programs are now far harder to execute because American technology companies, fearful of losing international business, are hardening their networks and saying no to requests for the kind of help they once quietly provided. Robert S. Litt, the general counsel of the Office of the Director of National Intelligence, which oversees all 17 American spy agencies, said on Wednesday that it was “an unquestionable loss for our nation that companies are losing the willingness to cooperate legally and voluntarily” with American spy agencies.
  • Many point to an episode in 2012, when Russian security researchers uncovered a state espionage tool, Flame, on Iranian computers. Flame, like the Stuxnet worm, is believed to have been produced at least in part by American intelligence agencies. It was created by exploiting a previously unknown flaw in Microsoft’s operating systems. Companies argue that others could have later taken advantage of this defect. Worried that such an episode undercuts confidence in its wares, Microsoft is now fully encrypting all its products, including Hotmail and Outlook.com, by the end of this year with 2,048-bit encryption, a stronger protection that would take a government far longer to crack. The software is protected by encryption both when it is in data centers and when data is being sent over the Internet, said Bradford L. Smith, the company’s general counsel.
  • Mr. Smith also said the company was setting up “transparency centers” abroad so that technical experts of foreign governments could come in and inspect Microsoft’s proprietary source code. That will allow foreign governments to check to make sure there are no “back doors” that would permit snooping by United States intelligence agencies. The first such center is being set up in Brussels. Microsoft has also pushed back harder in court. In a Seattle case, the government issued a “national security letter” to compel Microsoft to turn over data about a customer, along with a gag order to prevent Microsoft from telling the customer it had been compelled to provide its communications to government officials. Microsoft challenged the gag order as violating the First Amendment. The government backed down.
  • Hardware firms like Cisco, which makes routers and switches, have found their products a frequent subject of Mr. Snowden’s disclosures, and their business has declined steadily in places like Asia, Brazil and Europe over the last year. The company is still struggling to convince foreign customers that their networks are safe from hackers — and free of “back doors” installed by the N.S.A. The frustration, companies here say, is that it is nearly impossible to prove that their systems are N.S.A.-proof.
  • In one slide from the disclosures, N.S.A. analysts pointed to a sweet spot inside Google’s data centers, where they could catch traffic in unencrypted form. Next to a quickly drawn smiley face, an N.S.A. analyst, referring to an acronym for a common layer of protection, had noted, “SSL added and removed here!”
  • Facebook and Yahoo have also been encrypting traffic among their internal servers. And Facebook, Google and Microsoft have been moving to more strongly encrypt consumer traffic with so-called Perfect Forward Secrecy, specifically devised to make it more labor intensive for the N.S.A. or anyone to read stored encrypted communications. One of the biggest indirect consequences from the Snowden revelations, technology executives say, has been the surge in demands from foreign governments that saw what kind of access to user information the N.S.A. received — voluntarily or surreptitiously. Now they want the same.
  • The latest move in the war between intelligence agencies and technology companies arrived this week, in the form of a new Google encryption tool. The company released a user-friendly, email encryption method to replace the clunky and often mistake-prone encryption schemes the N.S.A. has readily exploited. But the best part of the tool was buried in Google’s code, which included a jab at the N.S.A.'s smiley-face slide. The code included the phrase: “ssl-added-and-removed-here-; - )”
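The Perfect Forward Secrecy mentioned above rests on generating a fresh key pair for every session, so recorded ciphertext cannot later be unlocked with any long-term key. A toy finite-field Diffie-Hellman exchange illustrates the idea; the parameters below are hypothetical demo values far too small for real use, where TLS relies on large groups or elliptic curves.

```python
import secrets

# Demo parameters only: a small prime and base, nowhere near real-world sizes.
P = 0xFFFFFFFB  # largest prime below 2**32
G = 5

def ephemeral_pair():
    """Fresh private/public pair per session; discarding the private part
    after the handshake is what denies eavesdroppers a way back in."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

a_priv, a_pub = ephemeral_pair()
b_priv, b_pub = ephemeral_pair()

# Each side combines its own private key with the other's public key
# and arrives at the same session secret.
assert pow(b_pub, a_priv, P) == pow(a_pub, b_priv, P)
print("shared secrets match")
```

An adversary who stores today's traffic and later seizes a server's long-term key still gains nothing, because the per-session private values above were never written down.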
Paul Merrell

NSA Spying Relies on AT&T's 'Extreme Willingness to Help' - ProPublica

  • The National Security Agency’s ability to spy on vast quantities of Internet traffic passing through the United States has relied on its extraordinary, decades-long partnership with a single company: the telecom giant AT&T. While it has been long known that American telecommunications companies worked closely with the spy agency, newly disclosed NSA documents show that the relationship with AT&T has been considered unique and especially productive. One document described it as “highly collaborative,” while another lauded the company’s “extreme willingness to help.”
  • AT&T’s cooperation has involved a broad range of classified activities, according to the documents, which date from 2003 to 2013. AT&T has given the NSA access, through several methods covered under different legal rules, to billions of emails as they have flowed across its domestic networks. It provided technical assistance in carrying out a secret court order permitting the wiretapping of all Internet communications at the United Nations headquarters, a customer of AT&T. The NSA’s top-secret budget in 2013 for the AT&T partnership was more than twice that of the next-largest such program, according to the documents. The company installed surveillance equipment in at least 17 of its Internet hubs on American soil, far more than its similarly sized competitor, Verizon. And its engineers were the first to try out new surveillance technologies invented by the eavesdropping agency. One document reminds NSA officials to be polite when visiting AT&T facilities, noting: “This is a partnership, not a contractual relationship.” The documents, provided by the former agency contractor Edward Snowden, were jointly reviewed by The New York Times and ProPublica.
  • It is not clear if the programs still operate in the same way today. Since the Snowden revelations set off a global debate over surveillance two years ago, some Silicon Valley technology companies have expressed anger at what they characterize as NSA intrusions and have rolled out new encryption to thwart them. The telecommunications companies have been quieter, though Verizon unsuccessfully challenged a court order for bulk phone records in 2014. At the same time, the government has been fighting in court to keep the identities of its telecom partners hidden. In a recent case, a group of AT&T customers claimed that the NSA’s tapping of the Internet violated the Fourth Amendment protection against unreasonable searches. This year, a federal judge dismissed key portions of the lawsuit after the Obama administration argued that public discussion of its telecom surveillance efforts would reveal state secrets, damaging national security.
Paul Merrell

New Leak Of Final TPP Text Confirms Attack On Freedom Of Expression, Public Health - 0 views

  • Offering a first glimpse of the secret 12-nation “trade” deal in its final form—and fodder for its growing ranks of opponents—WikiLeaks on Friday published the final negotiated text for the Trans-Pacific Partnership (TPP)’s Intellectual Property Rights chapter, confirming that the pro-corporate pact would harm freedom of expression by bolstering monopolies and would injure public health by blocking patient access to lifesaving medicines. The document is dated October 5, the same day it was announced in Atlanta, Georgia that the member states to the treaty had reached an accord after more than five years of negotiations. Aside from the WikiLeaks publication, the vast majority of the mammoth deal’s contents are still being withheld from the public—which a WikiLeaks press statement suggests is a strategic move by world leaders to forestall public criticism until after the Canadian election on October 19. Initial analyses suggest that many of the chapter’s more troubling provisions, such as broader patent and data protections that pharmaceutical companies use to delay generic competition, have stayed in place since draft versions were leaked in 2014 and 2015. Moreover, it codifies a crackdown on freedom of speech with rules allowing widespread internet censorship.
Paul Merrell

Comcast hints at plan for paid fast lanes after net neutrality repeal | Ars Technica - 0 views

  • For years, Comcast has been promising that it won't violate the principles of net neutrality, regardless of whether the government imposes any net neutrality rules. That meant that Comcast wouldn't block or throttle lawful Internet traffic and that it wouldn't create fast lanes in order to collect tolls from Web companies that want priority access over the Comcast network. This was one of the ways in which Comcast argued that the Federal Communications Commission should not reclassify broadband providers as common carriers, a designation that forces ISPs to treat customers fairly in other ways. The Title II common carrier classification that makes net neutrality rules enforceable isn't necessary because ISPs won't violate net neutrality principles anyway, Comcast and other ISPs have claimed. But with Republican Ajit Pai now in charge at the Federal Communications Commission, Comcast's stance has changed. While the company still says it won't block or throttle Internet content, it has dropped its promise about not instituting paid prioritization.
  • Instead, Comcast now vaguely says that it won't "discriminate against lawful content" or impose "anti-competitive paid prioritization." The change in wording suggests that Comcast may offer paid fast lanes to websites or other online services, such as video streaming providers, after Pai's FCC eliminates the net neutrality rules next month.
Paul Merrell

Web Browser Supports Time Travel (Library of Congress) - 0 views

  • September 24, 2010 -- For those who use the Mozilla Firefox browser, you now have the option to time travel through the web. MementoFox is a free extension that users can add to their browsers. The extension implements the Memento protocol, which the Los Alamos National Laboratory and Old Dominion University are developing to enable capture of and access to older versions of websites.
  • Memento allows a user to link resources, or web pages, with their previous versions automatically. The term Memento refers to an archival record of a resource as well as the technological framework that supports the ability to discover and browse older versions of Web resources. One of the challenges of researching older versions of Internet resources is searching for them in web archives. With the Firefox add-on, users can easily view older versions in the same browser without having to search across archives. After a URL is specified in the browser, the newly released extension allows users to set a target day using the slider bar (the "add-on"). The protocol will search archives across the web for previous versions of the URL. As long as those archives are available on a server accessible across the web, MementoFox will return the previous targeted version. MementoFox is available for download, and developers interested in working with the protocol can join the development group.
  • ...1 more annotation...
  • The Memento project receives support from the Library of Congress National Digital Information Infrastructure and Preservation Program.
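The time-travel mechanics described above boil down to HTTP content negotiation in the time dimension: a client asks a "TimeGate" for the version of a URL nearest a target date by sending an `Accept-Datetime` header (the Memento framework was later standardized as RFC 7089). A minimal sketch of building such a request with the Python standard library—the TimeGate endpoint below is a made-up placeholder, and this illustrates the protocol generally, not MementoFox's internals:

```python
from datetime import datetime, timezone
from email.utils import format_datetime
from urllib.request import Request

def timegate_request(url: str, when: datetime) -> Request:
    """Build (but do not send) a Memento TimeGate request asking for
    the version of `url` closest to `when`; Memento negotiates in
    time via the Accept-Datetime header (RFC 1123 date format)."""
    # "timegate.example" is a hypothetical endpoint, not a real service.
    req = Request("https://timegate.example/tg/" + url)
    req.add_header("Accept-Datetime", format_datetime(when, usegmt=True))
    return req

req = timegate_request(
    "http://www.loc.gov/",
    datetime(2010, 9, 24, tzinfo=timezone.utc),
)
print(req.get_header("Accept-datetime"))  # Fri, 24 Sep 2010 00:00:00 GMT
```

A conforming TimeGate would answer with a redirect to the archived snapshot ("memento") closest to that date, plus `Link` headers pointing at first, last, and neighboring versions.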
Paul Merrell

DOJ Pushes to Expand Hacking Abilities Against Cyber-Criminals - Law Blog - WSJ - 0 views

  • The U.S. Department of Justice is pushing to make it easier for law enforcement to get warrants to hack into the computers of criminal suspects across the country. The move, which would alter federal court rules governing search warrants, comes amid increases in cases related to computer crimes. Investigators say they need more flexibility to get warrants to allow hacking in such cases, especially when multiple computers are involved or the government doesn’t know where the suspect’s computer is physically located. The Justice Department effort is raising questions among some technology advocates, who say the government should focus on fixing the holes in computer software that allow such hacking instead of exploiting them. Privacy advocates also warn government spyware could end up on innocent people’s computers if remote attacks are authorized against equipment whose ownership isn’t clear.
  • The government’s push for rule changes sheds light on law enforcement’s use of remote hacking techniques, which are being deployed more frequently but have been protected behind a veil of secrecy for years. In documents submitted by the government to the judicial system’s rule-making body this year, the government discussed using software to find suspected child pornographers who visited a U.S. site and concealed their identity using a strong anonymization tool called Tor. The government’s hacking tools—such as sending an email embedded with code that installs spying software — resemble those used by criminal hackers. The government doesn’t describe these methods as hacking, preferring instead to use terms like “remote access” and “network investigative techniques.” Right now, investigators who want to search property, including computers, generally need to get a warrant from a judge in the district where the property is located, according to federal court rules. In a computer investigation, that might not be possible, because criminals can hide behind anonymizing technologies. In cases involving botnets—groups of hijacked computers—investigators might also want to search many machines at once without getting that many warrants.
  • Some judges have already granted warrants in cases when authorities don’t know where the machine is. But at least one judge has denied an application in part because of the current rules. The department also wants warrants to be allowed for multiple computers at the same time, as well as for searches of many related storage, email and social media accounts at once, as long as those accounts are accessed by the computer being searched. “Remote searches of computers are often essential to the successful investigation” of computer crimes, Acting Assistant Attorney General Mythili Raman wrote in a letter to the judicial system’s rulemaking authority requesting the change in September. The government tries to obtain these “remote access warrants” mainly to “combat Internet anonymizing techniques,” the department said in a memo to the authority in March. Some groups have raised questions about law enforcement’s use of hacking technologies, arguing that such tools mean the government is failing to help fix software problems exploited by criminals. “It is crucial that we have a robust public debate about how the Fourth Amendment and federal law should limit the government’s use of malware and spyware within the U.S.,” said Nathan Wessler, a staff attorney at the American Civil Liberties Union who focuses on technology issues.
  • ...1 more annotation...
  • A Texas judge who denied a warrant application last year cited privacy concerns associated with sending malware when the location of the computer wasn’t known. He pointed out that a suspect opening an email infected with spyware could be doing so on a public computer, creating risk of information being collected from innocent people. A former computer crimes prosecutor serving on an advisory committee of the U.S. Judicial Conference, which is reviewing the request, said he was concerned that allowing the search of multiple computers under a single warrant would violate the Fourth Amendment’s protections against overly broad searches. The proposed rule is set to be debated by the Judicial Conference’s Advisory Committee on Criminal Rules in early April, after which it would be opened to public comment.
Paul Merrell

Microsoft to host data in Germany to evade US spying | Naked Security - 0 views

  • Microsoft's new plan to keep the US government's hands off its customers' data: Germany will be a safe harbor in the digital privacy storm. Microsoft on Wednesday announced that beginning in the second half of 2016, it will give foreign customers the option of keeping data in new European facilities that, at least in theory, should shield customers from US government surveillance. It will cost more, according to the Financial Times, though pricing details weren't forthcoming. Microsoft Cloud - including Azure, Office 365 and Dynamics CRM Online - will be hosted from new datacenters in the German regions of Magdeburg and Frankfurt am Main. Access to data will be controlled by what the company called a German data trustee: T-Systems, a subsidiary of the independent German company Deutsche Telekom. Without the permission of Deutsche Telekom or customers, Microsoft won't be able to get its hands on the data. If it does get permission, the trustee will still control and oversee Microsoft's access.
  • Microsoft CEO Satya Nadella dropped the word "trust" into the company's statement: Microsoft’s mission is to empower every person and every individual on the planet to achieve more. Our new datacenter regions in Germany, operated in partnership with Deutsche Telekom, will not only spur local innovation and growth, but offer customers choice and trust in how their data is handled and where it is stored.
  • On Tuesday, at the Future Decoded conference in London, Nadella also announced that Microsoft would, for the first time, be opening two UK datacenters next year. The company's also expanding its existing operations in Ireland and the Netherlands. Officially, none of this has anything to do with the long-drawn-out squabbling over the transatlantic Safe Harbor agreement, which the EU's highest court struck down last month, calling the agreement "invalid" because it didn't protect data from US surveillance. No, Nadella said, the new datacenters and expansions are all about giving local businesses and organizations "transformative technology they need to seize new global growth." But as Diginomica reports, Microsoft EVP of Cloud and Enterprise Scott Guthrie followed up his boss’s comments by saying that yes, the driver behind the new datacenters is to let customers keep data close: We can guarantee customers that their data will always stay in the UK. Being able to very concretely tell that story is something that I think will accelerate cloud adoption further in the UK.
  • ...2 more annotations...
  • Microsoft and T-Systems' lawyers may well think that storing customer data in a German trustee data center will protect it from the reach of US law, but for all we know, that could be wishful thinking. Forrester cloud computing analyst Paul Miller: To be sure, we must wait for the first legal challenge. And the appeal. And the counter-appeal. As with all new legal approaches, we don’t know it is watertight until it is challenged in court. Microsoft and T-Systems’ lawyers are very good and say it's watertight. But we can be sure opposition lawyers will look for all the holes. By keeping data offshore - particularly in Germany, which has strong data privacy laws - Microsoft could avoid the situation it's now facing with the US demanding access to customer emails stored on a Microsoft server in Dublin. The US has argued that Microsoft, as a US company, comes under US jurisdiction, regardless of where it keeps its data.
  • Running away to Germany isn't a groundbreaking move; other US cloud services providers have already pledged expansion of their EU presences, including Amazon's plan to open a UK datacenter in late 2016 that will offer what CTO Werner Vogels calls "strong data sovereignty to local users." Other big data operators that have followed suit: Salesforce, which has already opened datacenters in the UK and Germany and plans to open one in France next year, as well as new EU operations pledged for the new year by NetSuite and Box. Can Germany keep the US out of its datacenters? Can Ireland? Time, and court cases, will tell.
  •  
    The European Union's Court of Justice decision in the Safe Harbor case --- and Edward Snowden --- are now officially downgrading the U.S. as a cloud data center location. NSA is good business for Europeans looking to displace American cloud service providers, as evidenced by Microsoft's decision. The legal test is whether Microsoft has "possession, custody, or control" of the data. From the info given in the article, it seems that Microsoft has done its best to dodge that bullet by moving data centers to Germany and placing their data under the control of a European company. Do ownership of the hardware and profits from its rent mean that Microsoft still has "possession, custody, or control" of the data? The fine print of the agreement with Deutsche Telekom and the customer EULAs will get a thorough going-over by the Dept. of Justice for evidence of Microsoft "control" of the data. That will be the crucial legal issue. The data centers in Germany may pass the test. But the notion that data centers in the UK can offer privacy is laughable; the UK's legal authority for GCHQ makes it even easier for GCHQ to get at the data than it is for the NSA in the U.S. It doesn't even require a court order.
Gary Edwards

Feds use keylogger to thwart PGP, Hushmail | News Blogs - CNET News - 0 views

  •  
    The more I learn about the Government's illegal and un-Constitutional surveillance activities, the worse it gets. As I read this article I couldn't help but wonder why the Government would want to disclose the warrantless activities as evidence in court. Clearly the Government wants to have their violations of carefully enumerated Constitutional protections of individual rights validated by the nation's courts. Scary stuff. excerpt: A recent court case provides a rare glimpse into how some federal agents deal with encryption: by breaking into a suspect's home or office, implanting keystroke-logging software, and spying on what happens from afar. An agent with the Drug Enforcement Administration persuaded a federal judge to authorize him to sneak into an Escondido, Calif., office believed to be a front for manufacturing the drug MDMA, or Ecstasy. The DEA received permission to copy the hard drives' contents and inject a keystroke logger into the computers. That was necessary, according to DEA Agent Greg Coffey, because the suspects were using PGP and the encrypted Web e-mail service Hushmail.com. Coffey asserted that the DEA needed "real-time and meaningful access" to "monitor the keystrokes" for PGP and Hushmail passphrases. The aggressive surveillance techniques employed by the DEA were part of a case that resulted in a ruling on Friday (PDF) by the 9th Circuit Court of Appeals, which primarily dealt with Internet surveillance through a wiretap conducted on a PacBell (now AT&T) business DSL line used by the defendants.
Gary Edwards

The Man Who Makes the Future: Wired Icon Marc Andreessen | Epicenter | Wired.com - 1 views

  •  
    Must-read interview. Marc Andreessen explains his five big ideas, taking us from the beginning of the Web, into the Cloud and beyond. Great stuff! ... (1) 1992 - Everyone Will Have the Web ... (2) 1995 - The Browser will be the Operating System ... (3) 1999 - Web business will live in the Cloud ... (4) 2004 - Everything will be Social ... (5) 2009 - Software will Eat the World excerpt: Technology is like water; it wants to find its level. So if you hook up your computer to a billion other computers, it just makes sense that a tremendous share of the resources you want to use-not only text or media but processing power too-will be located remotely. People tend to think of the web as a way to get information or perhaps as a place to carry out ecommerce. But really, the web is about accessing applications. Think of each website as an application, and every single click, every single interaction with that site, is an opportunity to be on the very latest version of that application. Once you start thinking in terms of networks, it just doesn't make much sense to prefer local apps, with downloadable, installable code that needs to be constantly updated.

    "We could have built a social element into Mosaic. But back then the Internet was all about anonymity."
    Anderson: Assuming you have enough bandwidth.

    Andreessen: That's the very big if in this equation. If you have infinite network bandwidth, if you have an infinitely fast network, then this is what the technology wants. But we're not yet in a world of infinite speed, so that's why we have mobile apps and PC and Mac software on laptops and phones. That's why there are still Xbox games on discs. That's why everything isn't in the cloud. But eventually the technology wants it all to be up there.

    Anderson: Back in 1995, Netscape began pursuing this vision by enabling the browser to do more.

    Andreessen: We knew that you would need some pro
Gary Edwards

Office to finally fully support ODF, Open XML, and PDF formats | ZDNet - 0 views

  •  
    The king of clicks returns! No doubt there was a time when the mere mention of ODF and the now legendary XML "document" format wars with Microsoft could drive click counts into the stratosphere. Sorry to say, though, those times are long gone. It's still a good story, even if the fate of mankind and the future of the Internet no longer hinges on the outcome. There is that question that continues to defy answer: "Did Microsoft win or lose?" So the mere announcement of supported formats in MSOffice XX is guaranteed to rev the clicks somewhat. Veteran ODF clickmeister SVN does make an interesting observation though: "The ironic thing is that, while this was as hotly debated an issue in the mid-2000s as mobile patents and cloud implementation are today, this news was barely noticed. That's a mistake. Updegrove points out, "document interoperability and vendor neutrality matter more now than ever before as paper archives disappear and literally all of human knowledge is entrusted to electronic storage." He concluded, "Only if documents can be easily exchanged and reliably accessed on an ongoing basis will competition in the present be preserved, and the availability of knowledge down through the ages be assured. Without robust, universally adopted document formats, both of those goals will be impossible to attain." Updegrove's right of course. Don't believe me? Go into your office's archives and try to bring up documents you wrote in the 90s in WordPerfect or papers your staff created in the 80s with WordStar. If you don't want to lose your institutional memory, open document standards support is more important than ever. "....................................... Sorry, but Updegrove is wrong. Woefully wrong. The Web is the future. Sure, interoperability matters, but only as far as the Web and the future of Cloud Computing are concerned. Sadly, neither ODF nor Open XML is Web ready. The language of the Web is famously HTML, now HTML5+
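Updegrove's archival argument rests on the container format being simple enough to open decades later. An ODF file is in fact an ordinary ZIP archive whose first entry must be an uncompressed `mimetype` file, which lets any tool identify the format without vendor software. A minimal standard-library sketch of that structure—the `content.xml` here is a bare placeholder skeleton for illustration, not a document a word processor would open:

```python
import io
import zipfile

ODT_MIME = "application/vnd.oasis.opendocument.text"

def make_minimal_odt() -> bytes:
    """Sketch of the ODF container: a ZIP whose first entry is an
    uncompressed 'mimetype' file, so the format can be sniffed by
    reading the first few bytes of the archive."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        # The mimetype entry must come first and must be stored, not deflated.
        z.writestr(zipfile.ZipInfo("mimetype"), ODT_MIME,
                   compress_type=zipfile.ZIP_STORED)
        # Placeholder body; a real document carries namespaced ODF XML here.
        z.writestr("content.xml",
                   "<?xml version='1.0'?><office:document-content/>")
    return buf.getvalue()

data = make_minimal_odt()
with zipfile.ZipFile(io.BytesIO(data)) as z:
    print(z.namelist())                    # ['mimetype', 'content.xml']
    print(z.read("mimetype").decode())     # application/vnd.oasis.opendocument.text
```

Because the container is plain ZIP plus XML, a future reader needs nothing proprietary to recover the text—which is precisely the longevity property the quoted passage is arguing about.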
Gary Edwards

WE'RE BLOWN AWAY: This Startup Could Literally Change The Entire Software Industry - Bu... - 0 views

  •  
    "Startup Numecent has come out of stealth mode today with some of the most impressive enterprise technology we've seen in a decade. Plus the company is interesting for other reasons, like its business model and its founder. Numecent offers something it calls "cloud paging" and, if successful, it could be a game-changer for enterprise software, video gaming, and smartphone apps. Red Hat thinks so. It has already partnered with the company to help it offer Windows software to Linux users. "Cloud paging" instantly "cloudifies" any software, even an operating system like Windows itself, says founder and CEO Osman Kent. It lets any software, with no modification, be delivered from the cloud and run as fast or faster than if the app was on your desktop. Lots of so-called "desktop virtualization" services work fast. But cloud-paging can even operate the cloud software if the PC gets disconnected from the network or Internet. It can also turn a smartphone into a server. That means a bunch of devices like tablets can run the software -- like a game -- off of the smartphone. Imagine showing up to a party and letting all your friends play the latest version of Halo from your phone. That's crazy cool. Cloudpaging can do all this because it doesn't use "pixel-streaming" technology like other virtualization tech. Instead it temporarily downloads bits of the application itself (instructions) and runs them on the device. It can almost magically predict which parts of the app the user will need, and downloads only those parts. For business owners, that's not even the best part. It also helps enterprises sidestep extra licensing fees associated with the cloud. For instance, Microsoft licenses its software by the device, not by the user, and, in many cases, charges a "Virtual Desktop Access" fee for each device using a virtual version of Windows. (For a bit of light reading, check out the Microsoft virtual desktop licensing white paper: PDF) Cloudpaging has what Kent calls "f
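The behavior the article describes—"temporarily downloads bits of the application itself (instructions)... and downloads only those parts"—is essentially demand paging over a network. A toy sketch of that idea follows; the remote store is simulated by an in-memory byte string, and the class, its parameters, and the tiny cache are illustrative assumptions about on-demand paging in general, not Numecent's actual technology:

```python
from collections import OrderedDict

class PagedBlob:
    """Toy model of cloud paging: the full payload lives 'remotely';
    only the pages actually touched are fetched, with a small LRU
    cache standing in for local storage."""

    def __init__(self, remote: bytes, page_size: int = 4, cache_pages: int = 2):
        self._remote = remote        # stands in for the cloud-side copy
        self._page = page_size
        self._cache = OrderedDict()  # page number -> page bytes, LRU order
        self._max = cache_pages
        self.fetches = 0             # pages that crossed the "network"

    def _fetch(self, n: int) -> bytes:
        self.fetches += 1
        start = n * self._page
        return self._remote[start:start + self._page]

    def read(self, offset: int, length: int) -> bytes:
        out = b""
        first = offset // self._page
        last = (offset + length - 1) // self._page
        for n in range(first, last + 1):
            if n not in self._cache:
                self._cache[n] = self._fetch(n)
                if len(self._cache) > self._max:
                    self._cache.popitem(last=False)  # evict least-recent page
            else:
                self._cache.move_to_end(n)           # mark as recently used
            out += self._cache[n]
        return out[offset - first * self._page:][:length]

blob = PagedBlob(b"HELLO CLOUD PAGING!", page_size=4)
print(blob.read(6, 5))   # b'CLOUD'  -- only pages 1 and 2 were fetched
print(blob.fetches)      # 2
```

A production system would add the predictive prefetching the article mentions (guessing which pages the program will touch next), but the core trick—run code locally while paging its bytes in on demand—is what distinguishes this approach from pixel-streaming virtualization.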