Future of the Web: group items matching "30" in title, tags, annotations or URL

Paul Merrell

Profiled: From Radio to Porn, British Spies Track Web Users' Online Identities | Global Research - Centre for Research on Globalization - 0 views

  • One system builds profiles showing people’s web browsing histories. Another analyzes instant messenger communications, emails, Skype calls, text messages, cell phone locations, and social media interactions. Separate programs were built to keep tabs on “suspicious” Google searches and usage of Google Maps. The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens, all without a court order or judicial warrant.
  • The power of KARMA POLICE was illustrated in 2009, when GCHQ launched a top-secret operation to collect intelligence about people using the Internet to listen to radio shows. The agency used a sample of nearly 7 million metadata records, gathered over a period of three months, to observe the listening habits of more than 200,000 people across 185 countries, including the U.S., the U.K., Ireland, Canada, Mexico, Spain, the Netherlands, France, and Germany.
  • GCHQ’s documents indicate that the plans for KARMA POLICE were drawn up between 2007 and 2008. The system was designed to provide the agency with “either (a) a web browsing profile for every visible user on the Internet, or (b) a user profile for every visible website on the Internet.” The origin of the surveillance system’s name is not discussed in the documents. But KARMA POLICE is also the name of a popular song released in 1997 by the Grammy Award-winning British band Radiohead, suggesting the spies may have been fans. A verse repeated throughout the hit song includes the lyric, “This is what you’ll get, when you mess with us.”
  • GCHQ vacuums up the website browsing histories using “probes” that tap into the international fiber-optic cables that transport Internet traffic across the world. A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” (a term the agency uses to refer to metadata records), with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held, 41 percent, was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.” There was a simple aim at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs.
  • The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
Paul Merrell

NSA Will Destroy Archived Metadata When Program Stops - 0 views

  • Four months from now, at the same time that the National Security Agency finally abandons the massive domestic telephone dragnet exposed by whistleblower Edward Snowden, it will also stop perusing the vast archive of data collected by the program. The NSA announced on Monday that it will expunge all the telephone metadata it previously swept up under Section 215 of the USA PATRIOT Act. The program was ruled illegal by a federal appeals court in May. In June, Congress voted to end the program, but gave the NSA until the end of November to phase it out. The historical metadata — records of American phone calls showing who called whom, when, and for how long — will be put out of the reach of analysts on November 29, although technical personnel will have access for three more months. The program started 14 years ago, and operated under rules requiring data be retained for five years, and then destroyed.
  • The only possible hold-up, ironically, would be if any of the civil lawsuits prompted by the program prohibit the destruction of the data. “The telephony metadata” will be “preserved solely because of preservation obligations in pending civil litigation,” the Office of the Director of National Intelligence announced. “As soon as possible, NSA will destroy the Section 215 bulk telephony metadata upon expiration of its litigation preservation obligations.” ACLU staff attorney Alex Abdo told The Intercept his organization is “pleased that the NSA intends to purge the call records it has collected illegally.” But, he added: “Even with today’s pledge, the devil may be in the details.”
Paul Merrell

Popular Security Software Came Under Relentless NSA and GCHQ Attacks - The Intercept - 0 views

  • The National Security Agency and its British counterpart, Government Communications Headquarters, have worked to subvert anti-virus and other security software in order to track users and infiltrate networks, according to documents from NSA whistleblower Edward Snowden. The spy agencies have reverse engineered software products, sometimes under questionable legal authority, and monitored web and email traffic in order to discreetly thwart anti-virus software and obtain intelligence from companies about security software and users of such software. One security software maker repeatedly singled out in the documents is Moscow-based Kaspersky Lab, which has a holding registered in the U.K., claims more than 270,000 corporate clients, and says it protects more than 400 million people with its products. British spies aimed to thwart Kaspersky software in part through a technique known as software reverse engineering, or SRE, according to a top-secret warrant renewal request. The NSA has also studied Kaspersky Lab’s software for weaknesses, obtaining sensitive customer information by monitoring communications between the software and Kaspersky servers, according to a draft top-secret report. The U.S. spy agency also appears to have examined emails inbound to security software companies flagging new viruses and vulnerabilities.
  • The efforts to compromise security software were of particular importance because such software is relied upon to defend against an array of digital threats and is typically more trusted by the operating system than other applications, running with elevated privileges that allow more vectors for surveillance and attack. Spy agencies seem to be engaged in a digital game of cat and mouse with anti-virus software companies; the U.S. and U.K. have aggressively probed for weaknesses in software deployed by the companies, which have themselves exposed sophisticated state-sponsored malware.
  • The requested warrant, provided under Section 5 of the U.K.’s 1994 Intelligence Services Act, must be renewed by a government minister every six months. The document published today is a renewal request for a warrant valid from July 7, 2008 until January 7, 2009. The request seeks authorization for GCHQ activities that “involve modifying commercially available software to enable interception, decryption and other related tasks, or ‘reverse engineering’ software.”
  • The NSA, like GCHQ, has studied Kaspersky Lab’s software for weaknesses. In 2008, an NSA research team discovered that Kaspersky software was transmitting sensitive user information back to the company’s servers, which could easily be intercepted and employed to track users, according to a draft of a top-secret report. The information was embedded in “User-Agent” strings included in the headers of Hypertext Transfer Protocol, or HTTP, requests. Such headers are typically sent at the beginning of a web request to identify the type of software and computer issuing the request.
  • According to the draft report, NSA researchers found that the strings could be used to uniquely identify the computing devices belonging to Kaspersky customers. They determined that “Kaspersky User-Agent strings contain encoded versions of the Kaspersky serial numbers and that part of the User-Agent string can be used as a machine identifier.” They also noted that the “User-Agent” strings may contain “information about services contracted for or configurations.” Such data could be used to passively track a computer to determine if a target is running Kaspersky software and thus potentially susceptible to a particular attack without risking detection.
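    A minimal TypeScript sketch of the passive-fingerprinting idea described above. The header layout and the "serial=" token here are hypothetical, invented for illustration; the report quoted in this article does not disclose the real structure of Kaspersky's User-Agent strings.

        // Toy passive fingerprinting over intercepted HTTP headers.
        // The "serial=" token is a hypothetical stand-in for the encoded
        // serial numbers the draft report describes.
        function extractMachineId(userAgent: string): string | null {
          const match = userAgent.match(/serial=([A-Za-z0-9+\/=]+)/);
          if (!match) return null; // target not running the product, or unknown format
          // The raw token itself can serve as a stable machine identifier,
          // with no need to ever contact the monitored computer.
          return match[1];
        }

        const ua = 'Mozilla/4.0 (compatible; MSIE 7.0) KasperskyAV/8.0 (serial=QUJDLTEyMzQ1)';
        console.log(extractMachineId(ua)); // "QUJDLTEyMzQ1"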
  • Another way the NSA targets foreign anti-virus companies appears to be to monitor their email traffic for reports of new vulnerabilities and malware. A 2010 presentation on “Project CAMBERDADA” shows the content of an email flagging a malware file, which was sent to various anti-virus companies by François Picard of the Montréal-based consulting and web hosting company NewRoma. The presentation of the email suggests that the NSA is reading such messages to discover new flaws in anti-virus software. Picard, contacted by The Intercept, was unaware his email had fallen into the hands of the NSA. He said that he regularly sends out notification of new viruses and malware to anti-virus companies, and that he likely sent the email in question to at least two dozen such outfits. He also said he never sends such notifications to government agencies. “It is strange the NSA would show an email like mine in a presentation,” he added.
  • The NSA presentation goes on to state that its signals intelligence yields about 10 new “potentially malicious files per day for malware triage.” This is a tiny fraction of the hostile software that is processed. Kaspersky says it detects 325,000 new malicious files every day, and an internal GCHQ document indicates that its own system “collect[s] around 100,000,000 malware events per day.” After obtaining the files, the NSA analysts “[c]heck Kaspersky AV to see if they continue to let any of these virus files through their Anti-Virus product.” The NSA’s Tailored Access Operations unit “can repurpose the malware,” presumably before the anti-virus software has been updated to defend against the threat.
  • The Project CAMBERDADA presentation lists 23 additional AV companies from all over the world under “More Targets!” Those companies include Check Point Software, a pioneering maker of corporate firewalls based in Israel, whose government is a U.S. ally. Notably omitted are the American anti-virus brands McAfee and Symantec and the British company Sophos.
  • As government spies have sought to evade anti-virus software, the anti-virus firms themselves have exposed malware created by government spies. Among them, Kaspersky appears to be the sharpest thorn in the side of government hackers. In the past few years, the company has proven to be a prolific hunter of state-sponsored malware, playing a role in the discovery and/or analysis of various pieces of malware reportedly linked to government hackers, including the superviruses Flame, which Kaspersky flagged in 2012; Gauss, also detected in 2012; Stuxnet, discovered by another company in 2010; and Regin, revealed by Symantec. In February, the Russian firm announced its biggest find yet: the “Equation Group,” an organization that has deployed espionage tools widely believed to have been created by the NSA and hidden on hard drives from leading brands, according to Kaspersky. In a report, the company called it “the most advanced threat actor we have seen” and “probably one of the most sophisticated cyber attack groups in the world.”
  • Hacks deployed by the Equation Group operated undetected for as long as 14 to 19 years, burrowing into the hard drive firmware of sensitive computer systems around the world, according to Kaspersky. Governments, militaries, technology companies, nuclear research centers, media outlets and financial institutions in 30 countries were among those reportedly infected. Kaspersky estimates that the Equation Group could have implants in tens of thousands of computers, but documents published last year by The Intercept suggest the NSA was scaling up its implant capabilities to potentially infect millions of computers with malware. Kaspersky’s adversarial relationship with Western intelligence services is sometimes framed in more sinister terms; the firm has been accused of working too closely with the Russian intelligence service FSB. That accusation is partly due to the company’s apparent success in uncovering NSA malware, and partly due to the fact that its founder, Eugene Kaspersky, was educated at a KGB-backed school in the 1980s before working for the Russian military.
  • Kaspersky has repeatedly denied the insinuations and accusations. In a recent blog post responding to a Bloomberg article, founder Eugene Kaspersky complained that his company was being subjected to “sensationalist … conspiracy theories,” sarcastically noting that “for some reason they forgot our reports” on an array of malware that traces back to Russian developers. He continued, “It’s very hard for a company with Russian roots to become successful in the U.S., European and other markets. Nobody trusts us — by default.”
  • Documents published with this article:
      • Kaspersky User-Agent Strings — NSA
      • Project CAMBERDADA — NSA
      • NDIST — GCHQ’s Developing Cyber Defence Mission
      • GCHQ Application for Renewal of Warrant GPW/1160
      • Software Reverse Engineering — GCHQ
      • Reverse Engineering — GCHQ Wiki
      • Malware Analysis & Reverse Engineering — ACNO Skill Levels — GCHQ
Paul Merrell

NSA Doesn't Want Court That Found Phone Dragnet Illegal to Actually Do Anything About It - 1 views

  • The National Security Agency doesn’t think it’s relevant that its dragnet of American telephone data — information on who’s calling whom, when, and for how long — was ruled illegal back in May. An American Civil Liberties Union lawsuit is asking the Second Circuit Court of Appeals, which reached that conclusion, to immediately enjoin the program. But the U.S. government responded on Monday evening, saying that Congressional passage of the USA Freedom Act trumped the earlier ruling. The Freedom Act ordered an end to the program — but with a six-month wind-down period.
  • The ACLU still maintains that even temporary revival is a blatant infringement on Americans’ legal rights. “We strongly disagree with the government’s claim that recent reform legislation was meant to give the NSA’s phone-records dragnet a new lease on life,” said Jameel Jaffer, the ACLU’s deputy legal director, in a statement. “The appeals court should order the NSA to end this surveillance now. It’s unlawful and it’s an entirely unnecessary intrusion into the privacy of millions of people.” On Monday, the Obama administration announced that at the same time the National Security Agency ends the dragnet, it will also stop perusing the vast archive of data collected by the program. Read the U.S. government brief responding to the ACLU below:
  •  
    Go ACLU!
Paul Merrell

Comcast-NBC: Internet issues bog down Comcast-NBC merger - latimes.com - 1 views

  • One company is the nation's biggest cable TV provider. The other owns a TV network, several popular cable channels and a movie studio. But when it comes to the $30-billion merger of Comcast Corp. and NBC Universal, the regulators and lawmakers who will decide the fate of the deal aren't focusing on the big screen or the small screen. They're looking at the Internet. Welcome to a media marriage, circa 2010.
Paul Merrell

Cloud computing with Amazon Web Services, Part 1: Introduction - 0 views

  • Cloud computing is a paradigm shift in how we architect and deliver scalable applications. In the past, successful companies spent precious time and resources building an infrastructure that in turn provided them a competitive advantage. It was frequently a case of "You build it first and they will come." In most cases, this approach: left large tracts of unused computing capacity that took up space in big data centers; required someone to babysit the servers; and had associated energy costs. The unused computing power wasted away, with no way to push it out to other companies or users who might be willing to pay for additional compute cycles. With cloud computing, excess computing capacity can be put to use and be profitably sold to consumers. This transformation of computing and IT infrastructure into a utility, which is available to all, somewhat levels the playing field.
  • According to Amazon’s estimates, businesses spend about 70 percent of their time on building and maintaining their infrastructures while using only 30 percent of their precious time actually working on the ideas that power their businesses.
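    The utility model the excerpt describes is easiest to see as an API call: capacity is rented on demand rather than built out in advance. A hedged sketch using the AWS SDK for JavaScript (v3), which postdates this 2008 article; the region and AMI id are placeholders.

        // Renting capacity on demand instead of building a data center.
        import { EC2Client, RunInstancesCommand } from "@aws-sdk/client-ec2";

        async function rentServer(): Promise<void> {
          const client = new EC2Client({ region: "us-east-1" });
          const result = await client.send(
            new RunInstancesCommand({
              ImageId: "ami-0123456789abcdef0", // placeholder machine image
              InstanceType: "t3.micro",
              MinCount: 1,
              MaxCount: 1, // raise when demand spikes, terminate when it falls
            })
          );
          console.log(result.Instances?.[0]?.InstanceId);
        }

        rentServer().catch(console.error);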
Gary Edwards

Runtime wars (2): Apple's answer to Flash, Silverlight and JavaFX « counternotions - 0 views

  • Apple’s Trojan horse in multi-platform, multimedia runtime is a piece of open source technology that’s already on Windows, Mac OS X, Linux, Adobe Flex/AIR, iPhone, iPod touch, Nokia S60 smartphones and Google’s new Android/Open Handset Alliance, with 30+ partners around the globe: WebKit 3.0.
  •  
    WebKit is Apple's Trojan Horse! Excellent introduction to WebKit presented in the context of Adobe and Microsoft RIAs.
Gary Edwards

Siding with HTML over XHTML, My Decision to Switch - Monday By Noon - 0 views

  • Publishing content on the Web is in no way limited to professional developers or designers; much of the reason the net is so active is that anyone can make a website. Sure, we (as knowledgeable professionals or hobbyists) all hope to make the Web a better place by doing our part in publishing documents with semantically rich, valid markup, but the reality is that those documents are rare. It’s important to keep in mind the true nature of the Internet: an open platform for information sharing.
  • XHTML2 has some very good ideas that I hope can become part of the web. However, it’s unrealistic to think that all web authors will switch to an XML-based syntax which demands that browsers stop processing the document on the first error. XML’s draconian policy was an attempt to clean up the web. This was done around 1996 when lots of invalid content entered the web. CSS took a different approach: instead of demanding that content isn’t processed, we defined rules for how to handle the undefined. It’s called “forward-compatible parsing” and means we can add new constructs without breaking the old. So, I don’t think XHTML is a realistic option for the masses. HTML 5 is it.
    • Gary Edwards
       
      Great quote from CSS expert Håkon Wium Lie.
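    The "forward-compatible parsing" rule Lie describes can be shown in a few lines: unknown or malformed declarations are skipped and parsing continues, where an XML processor would abort on the first error. A toy TypeScript sketch; the property whitelist is illustrative.

        const KNOWN_PROPERTIES = new Set(["color", "margin", "display"]); // toy subset

        function parseDeclarations(block: string): Map<string, string> {
          const out = new Map<string, string>();
          for (const decl of block.split(";")) {
            const [prop, value] = decl.split(":").map((s) => s.trim());
            if (!prop || value === undefined) continue; // malformed: skip, keep parsing
            if (!KNOWN_PROPERTIES.has(prop)) continue;  // unknown: skip, keep parsing
            out.set(prop, value);
          }
          return out;
        }

        // An old parser meeting new syntax degrades gracefully:
        console.log(parseDeclarations("color: red; gap: 4px; margin: 0"));
        // Map { "color" => "red", "margin" => "0" } -- "gap" is silently dropped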
  • @marbux: Of course I disagree with your interop assessment, but I wondered how it is that you’re missing the point. I think you confuse web applications with the legacy desktop client/server application model. And that confusion leads to the mistake of trying to transfer the desktop document model to one that could adequately service advancing web applications.
  •  
    A CMS expert argues for HTML over XHTML, explaining his reasons for switching. Excellent read! He nails the basics. For similar reasons, we moved from ODF to ePUB and then to CDF and finally to the advanced WebKit document model, where wikiWORD will make its stand.
  •  
    See also my comment on the same web page that explains why HTML 5 is NOT it for document exchange between web editing applications.
  •  
    Response to marbux supporting the WebKit layout/document model. Marbux argues that HTML5 is not interoperable, and CSS2 near useless. HTML5 fails regarding the interop web applications need. I respond by arguing that the only way to look at web applications is to consider that the browser layout engine is the web application layout engine! Web applications are actually written to the browser layout/document model, OR, to take advantage of browser plug-in capabilities. The interoperability marbux seeks is tied directly to the browser layout engine. In this context, the web format is simply a reflection of that layout engine. If there's an interop problem, it comes from browser madness differentials. The good news is that there are all kinds of efforts to close the browser gap: including WHATWG - HTML5, CSS3, W3C DOM, JavaScript Libraries, Google GWT (Java to JavaScript), Yahoo GUI, and my favorite, WebKit. The bad news is that the clock is ticking. Microsoft has pulled the trigger and the great migration of MSOffice client/server systems to the MS WebStack-Mesh architecture has begun. Key to this transition are the WPF-.NET proprietary formats, protocols and interfaces such as XAML, Silverlight, LINQ, and Smart Tags. New business processes are being written, and old legacy desktop bound processes are being transitioned to this emerging platform. The fight for the Open Web is on, with Microsoft threatening to transition their entire business desktop monopoly to a Web platform they own. ~ge~
Gary Edwards

The Monkey On Microsoft's Back - Forbes.com - 0 views

  • The new technology, dubbed TraceMonkey, promises to speed up Firefox's ability to deliver complex applications. The move heightens the threat posed by a nascent group of online alternatives to Microsoft's most profitable software: PC applications, like Microsoft Office, that allow Microsoft to burn hundreds of millions of dollars on efforts to seize control of the online world. Microsoft's Business Division, which gets 90% of its revenues from sales of Microsoft Office, spat out $12.4 billion in operating income for the fiscal year ending June 30. Google, however, is playing a parallel game, using profits from its online advertising business to fund alternatives to Microsoft's desktop offerings. Google already says it has "millions" of users for its free, Web-based alternative to desktop staples, including Microsoft's Word, Excel and PowerPoint software. The next version of Firefox, which could debut by the end of this year, promises to speed up such applications, thanks to a new technology built into the developer's version of the software last week. Right now, rich Web applications such as Google Gmail rely on a technology known as Javascript to turn them from lifeless Web pages into applications that respond as users mouse about a Web page. TraceMonkey aims to turn the most frequently used chunks of Javascript code embedded into Web pages into binary form--allowing computers to hustle through the most used bits of code--without waiting around to render all of the code into binary form.
  •  
    I did send a very lengthy comment to Brian Caulfield, the Forbes author of this article. Of course, I disagreed with his perspective. TraceMonkey is great, performing an acceleration of JavaScript in Firefox in much the same way that SquirrelFish accelerates WebKit browsers. What Brian misses, though, is the RiA war taking place both inside and outside the browser (RIA = fully functional Web applications that WILL replace the "client/server" apps model).
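    To make the TraceMonkey discussion concrete, here is the kind of code a tracing JIT accelerates: a hot, type-stable loop. This is an illustrative sketch, not code from the article; after a few interpreted iterations the engine records the loop body as a "trace" and compiles it to native code, so the remaining iterations bypass the interpreter.

        function sumOfSquares(n: number): number {
          let total = 0;
          for (let i = 0; i < n; i++) {
            // Every pass sees the same types (all numbers), which is what
            // keeps a recorded trace valid across millions of iterations.
            total += i * i;
          }
          return total;
        }

        console.log(sumOfSquares(1_000_000));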
Paul Merrell

EU considers spending €1 billion for satellite broadband technology - International Herald Tribune - 0 views

  • The €200 billion economic rescue plan being considered this week by European Union leaders includes a proposal to spend €1 billion on bringing high-speed Internet access to rural areas. The proposal is likely to pit the Continent's telecommunications operators against satellite companies, which say they are uniquely suited to expand the broadband, or high-speed, network to underserved parts of Eastern Europe and the Alps by the end of 2010.
  • But support for the plan by EU government leaders, who begin a two-day meeting to consider the rescue plan Thursday, is not assured. The money would come from unspent funds in the current EU budget, which under EU rules normally revert to member countries. Germany, which contributes the most to the EU budget and stands to get the largest refund if the project is rejected, opposes the expenditure.
  • Across the EU, 21.7 percent of residents had broadband Internet access in July, according to the commission; 107.6 million received service from a telephone DSL line or a cable television connection and 130,592 via satellite. Only 6 percent of EU residents on average received broadband via mobile phones.
  • Until now, Baugh said, satellite broadband had been hindered by the relatively high cost of the hardware consumers needed to gain access to the service. But recent advances have lowered the cost to roughly €400, including installation, from several thousand euros a few years ago. At about €30 a month, service packages are comparable to those of DSL and cable.
  •  
    A billion euros is chicken feed compared to other portions of the E.U. economic stimulus initiatives in the works that respond to the major recession under way. Still, this could be a significant foot in the door for satellite broadband in the E.U., perhaps enough to build out the infrastructure for a more serious challenge to cable and telephony broadband. But I wonder whether only a billion euros buys enough redundancy to gracefully handle a satellite's death once the system carries far more broadband users.
Paul Merrell

Web Hypertext Application Technology Working Group Demos from September 2008 - 0 views

  • HTML 5 demos from September 2008
  • The demos and segments of this talk are: <video> (00:35), postMessage() (05:40), localStorage (15:20), sessionStorage (21:00), Drag and Drop API (29:05), onhashchange (37:30), Form Controls (40:50), <canvas> (56:55), Validation (1:07:20), and Questions and Answers (1:09:35).
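    A minimal browser-side sketch (TypeScript) exercising three of the APIs demoed in the talk: localStorage, onhashchange, and postMessage. The messaging here targets the page's own window purely for illustration.

        // Persist a value across page loads.
        console.log("previous visit:", localStorage.getItem("lastVisit"));
        localStorage.setItem("lastVisit", new Date().toISOString());

        // React to fragment navigation without a page reload.
        window.onhashchange = () => {
          console.log("hash is now", location.hash);
        };

        // Cross-window messaging; "*" is acceptable for a demo, but real
        // code should pin the expected target origin.
        window.addEventListener("message", (event) => {
          console.log("got message:", event.data);
        });
        window.postMessage({ hello: "world" }, "*");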
Paul Merrell

Anti link-rot SaaS for web publishers -- WebCite - 0 views

  • The Problem Authors increasingly cite webpages and other digital objects on the Internet, which can "disappear" overnight. In one study published in the journal Science, 13% of Internet references in scholarly articles were inactive after only 27 months. Another problem is that cited webpages may change, so that readers see something different than what the citing author saw. The problem of unstable webcitations and the lack of routine digital preservation of cited digital objects has been referred to as an issue "calling for an immediate response" by publishers and authors [1]. An increasing number of editors and publishers ask that authors, when they cite a webpage, make a local copy of the cited webpage/webmaterial, and archive the cited URL in a system like WebCite®, to enable readers permanent access to the cited material.
  • What is WebCite®? WebCite®, a member of the International Internet Preservation Consortium, is an on-demand archiving system for webreferences (cited webpages and websites, or other kinds of Internet-accessible digital objects), which can be used by authors, editors, and publishers of scholarly papers and books, to ensure that cited webmaterial will remain available to readers in the future. If cited webreferences in journal articles, books etc. are not archived, future readers may encounter a "404 File Not Found" error when clicking on a cited URL. Try it! Archive a URL here. It's free and takes only 30 seconds. A WebCite®-enhanced reference is a reference which contains - in addition to the original live URL (which can and probably will disappear in the future, or its content may change) - a link to an archived copy of the material, exactly as the citing author saw it when he accessed the cited material.
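    In practice, an archiving client just submits the cited URL and records the snapshot link it gets back, then lists both in the reference. A hedged TypeScript sketch; the endpoint and query parameters are assumptions for illustration, not confirmed WebCite API documentation.

        async function archiveCitation(citedUrl: string, email: string): Promise<string> {
          // Assumed on-demand archiving endpoint (hypothetical parameters).
          const request =
            "http://www.webcitation.org/archive" +
            "?url=" + encodeURIComponent(citedUrl) +
            "&email=" + encodeURIComponent(email);
          const response = await fetch(request);
          // Simplification: a real client would parse the returned page for
          // the permanent snapshot URL to cite alongside the live URL.
          return response.url;
        }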
  •  
    Free service spun off from the University of Toronto's University Health Network. Automagic archiving of cited internet content, generation of citations that include the url for the archived copy. Now if Google would just make it easier to use its search cache copies for the same purpose ...
Paul Merrell

Learning from our Mistakes: The Failure of OpenID, AtomPub and XML on the Web - 1 views

  • At the turn of the last decade, XML could do no wrong. There was no problem that couldn’t be solved by applying XML to it and every technology was going to be replaced by it. XML was going to kill HTML. XML was going to kill CORBA, EJB and DCOM as we moved to web services. XML was a floor wax and a dessert topping. Unfortunately, after over a decade it is clear that XML has not and is unlikely to ever be the dominant way we create markup for consumption by browsers or how applications on the Web communicate. James Clark talks about this grim realization in his post “XML vs the Web.”
Gonzalo San Gil, PhD.

EFF to Court: Don't Let Government Hide Illegal Surveillance | Electronic Frontier Foundation - 2 views

  •  
    [Lawyers Fight for the Future of Lawsuits Challenging Massive Spying Program. Seattle - The Electronic Frontier Foundation (EFF) urged the 9th U.S. Circuit Court of Appeals today to preserve lawsuits challenging the government's illegal mass surveillance of millions of ordinary Americans. In oral arguments today, EFF asked the court to block the government's attempt to bury the suits with claims of state secrecy and an unconstitutional "immunity" law for telecoms that participated in the spying. ...]
Gonzalo San Gil, PhD.

The FCC may consider a stricter definition of broadband in the Netflix age - 1 views

  •  
    "(Pew Internet & American Life Project) What is high-speed Internet? Believe it or not, there is a technical definition. Currently, it's set at 4 megabits per second. Anything less, and in the government's view, you're not actually getting broadband-level speeds."
Paul Merrell

Prepare to Hang Up the Phone, Forever - WSJ.com - 0 views

  • At decade's end, the trusty landline telephone could be nothing more than a memory, if telecom giants AT&T Inc. and Verizon Communications Inc. have their way.
  • The two providers want to lay the crumbling POTS to rest and replace it with Internet Protocol-based systems that use the same wired and wireless broadband networks that bring Web access, cable programming and, yes, even your telephone service, into your homes. You may think you have a traditional landline because your home phone plugs into a jack, but if you have bundled your phone with Internet and cable services, you're making calls over an IP network, not twisted copper wires. California, Florida, Texas, Georgia, North Carolina, Wisconsin and Ohio are among states that agree telecom resources would be better redirected into modern telephone technologies and innovations, and will kill copper-based technologies in the next three years or so. Kentucky and Colorado are weighing similar laws, which force people to go wireless whether they want to or not. In Mantoloking, N.J., Verizon wants to replace the landline system, which Hurricane Sandy wiped out, with its wireless Voice Link. That would make it the first entire town to go landline-less, a move that isn't sitting well with all residents.
  • New Jersey's legislature, worried about losing data applications such as credit-card processing and alarm systems that wireless systems can't handle, wants a one-year moratorium to block that switch. It will vote on the measure this month. (Verizon tried a similar change in Fire Island, N.Y., when its copper lines were destroyed, but public opposition persuaded Verizon to install fiber-optic cable.) It's no surprise that landlines are unfashionable, considering many of us already have or are preparing to ditch them. More than 38% of adults and 45.5% of children live in households without a landline telephone, says the Centers for Disease Control and Prevention. That means two in every five U.S. homes, or 39%, are wireless, up from 26.6% three years ago. Moreover, a scant 8.5% of households relied only on a landline, while 2% were phoneless in 2013. Metropolitan residents have few worries about the end of landlines. High-speed wire and wireless services are abundant and work well, despite occasional dropped calls. Those living in rural areas, where cell towers are few and 4G capability limited, face different issues.
  • Safety is one of them. Call 911 from a landline and the emergency operator pinpoints your exact address, down to the apartment number. Wireless phones lack those specifics, and even with GPS navigation aren't as precise. Matters are worse in rural and even suburban areas that signals don't reach, sometimes because they're blocked by buildings or the landscape. That's of concern to the Federal Communications Commission, which oversees all forms of U.S. communications services. Universal access is a tenet of its mission, and, despite the state-by-state degradation of the mandate, it's unwilling to let telecom companies simply drop geographically undesirable customers. Telecom firms need FCC approval to ax services completely, and can't do so unless there is a viable competitor to pick up the slack. Last year AT&T asked to turn off its legacy network, which could create gaps in universal coverage and will force people off the grid to get a wireless provider.
  • AT&T and the FCC will soon begin trials to explore life without copper-wired landlines. Consumers will voluntarily test IP-connected networks and their impact on towns like Carbon Hill, Ala., population 2,071. They want to know how households will reach 911, how small businesses will connect to customers, how people with medical-monitoring devices or home alarms know they will always be connected to a reliable network, and what the costs are. "We cannot be a nation of opportunity without networks of opportunity," said FCC Chairman Tom Wheeler in unveiling the plan. "This pilot program will help us learn how fiber might be deployed where it is not now deployed…and how new forms of wireless can reach deep into the interior of rural America."
Paul Merrell

Google Says Website Encryption Will Now Influence Search Rankings - 0 views

  • Google will begin using website encryption, or HTTPS, as a ranking signal – a move which should prompt website developers who have dragged their heels on increased security measures, or who debated whether their website was “important” enough to require encryption, to make a change. Initially, HTTPS will only be a lightweight signal, affecting fewer than 1% of global queries, says Google. That means that the new signal won’t carry as much weight as other factors, including the quality of the content, the search giant noted, as Google means to give webmasters time to make the switch to HTTPS. Over time, however, encryption’s effect on search ranking may strengthen, as the company places more importance on website security. Google also promises to publish a series of best practices around TLS (HTTPS is also known as HTTP over TLS, or Transport Layer Security) so website developers can better understand what they need to do in order to implement the technology and what mistakes they should avoid. These tips will include things like what certificate type is needed, how to use relative URLs for resources on the same secure domain, best practices around allowing for site indexing, and more.
  • In addition, website developers can test their current HTTPS-enabled website using the Qualys Lab tool, says Google, and can direct further questions to Google’s Webmaster Help Forums where the company is already in active discussions with the broader community. The announcement has drawn a lot of feedback from website developers and those in the SEO industry – for instance, Google’s own blog post on the matter, shared in the early morning hours on Thursday, is already nearing 1,000 comments. For the most part, the community seems to support the change, or at least acknowledge that they felt that something like this was in the works and are not surprised. Google itself has been making moves to better securing its own traffic in recent months, which have included encrypting traffic between its own servers. Gmail now always uses an encrypted HTTPS connection which keeps mail from being snooped on as it moves from a consumer’s machine to Google’s data centers.
  • While HTTPS and site encryption have been a best practice in the security community for years, the revelation that the NSA has been tapping the cables, so to speak, to mine user information directly has prompted many technology companies to consider increasing their own security measures, too. Yahoo, for example, also announced in November its plans to encrypt its data center traffic. Now Google is helping to push the rest of the web to do the same.
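    One of the migration tips Google mentions above, relative URLs for resources on the same secure domain, can be sketched as a toy rewrite pass in TypeScript. This is a regex illustration, not a production HTML rewriter; in real code, escape the origin before interpolating it into the pattern.

        function relativizeSameOrigin(html: string, origin: string): string {
          // Turns src="http://example.com/js/app.js" into src="/js/app.js",
          // so the resource inherits the page's scheme after the HTTPS switch.
          const pattern = new RegExp(`(src|href)="http://${origin}(/[^"]*)"`, "g");
          return html.replace(pattern, '$1="$2"');
        }

        const page = '<script src="http://example.com/js/app.js"></script>';
        console.log(relativizeSameOrigin(page, "example.com"));
        // -> <script src="/js/app.js"></script>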
  •  
    The Internet continues to harden in the wake of the NSA revelations. This is a nice nudge by Google.
Paul Merrell

Obama to propose legislation to protect firms that share cyberthreat data - The Washington Post - 0 views

  • President Obama plans to announce legislation Tuesday that would shield companies from lawsuits for sharing computer threat data with the government in an effort to prevent cyberattacks. On the heels of a destructive attack at Sony Pictures Entertainment and major breaches at JPMorgan Chase and retail chains, Obama is intent on capitalizing on the heightened sense of urgency to improve the security of the nation’s networks, officials said. “He’s been doing everything he can within his executive authority to move the ball on this,” said a senior administration official who spoke on the condition of anonymity to discuss legislation that has not yet been released. “We’ve got to get something in place that allows both industry and government to work more closely together.”
  • The legislation is part of a broader package, to be sent to Capitol Hill on Tuesday, that includes measures to help protect consumers and students against cyberattacks and to give law enforcement greater authority to combat cybercrime. The provision’s goal is to “enshrine in law liability protection for the private sector for them to share specific information — cyberthreat indicators — with the government,” the official said. Some analysts questioned the need for such legislation, saying there are adequate measures in place to enable sharing between companies and the government and among companies.
  • “We think the current information-sharing regime is adequate,” said Mark Jaycox, legislative analyst at the Electronic Frontier Foundation, a privacy group. “More companies need to use it, but the idea of broad legal immunity isn’t needed right now.” The administration official disagreed. The lack of such immunity is what prevents many companies from greater sharing of data with the government, the official said. “We have heard that time and time again,” the official said. The proposal, which builds on a 2011 administration bill, grants liability protection to companies that provide indicators of cyberattacks and threats to the Department of Homeland Security.
  • But in a provision likely to raise concerns from privacy advocates, the administration wants to require DHS to share that information “in as near real time as possible” with other government agencies that have a cybersecurity mission, the official said. Those include the National Security Agency, the Pentagon’s Cyber Command, the FBI and the Secret Service. “DHS needs to take an active lead role in ensuring that unnecessary personal information is not shared with intelligence authorities,” Jaycox said. The debates over government surveillance prompted by disclosures from former NSA contractor Edward Snowden have shown that “the agencies already have a tremendous amount of unnecessary information,” he said.
  • It would reaffirm that federal racketeering law applies to cybercrimes and amend the Computer Fraud and Abuse Act by ensuring that “insignificant conduct” does not fall within the scope of the statute. A third element of the package is legislation Obama proposed Monday to help protect consumers and students against cyberattacks. The theft of personal financial information “is a direct threat to the economic security of American families, and we’ve got to stop it,” Obama said. The plan, unveiled in a speech at the Federal Trade Commission, would require companies to notify customers within 30 days after the theft of personal information is discovered. Right now, data breaches are handled under a patchwork of state laws that the president said are confusing and costly to enforce. Obama’s plan would streamline those into one clear federal standard and bolster requirements for companies to notify customers. Obama is proposing closing loopholes to make it easier to track down cybercriminals overseas who steal and sell identities. “The more we do to protect consumer information and privacy, the harder it is for hackers to damage our businesses and hurt our economy,” he said.
  • Efforts to pass information-sharing legislation have stalled in the past five years, blocked primarily by privacy concerns. The package also contains provisions that would allow prosecution for the sale of botnets or access to armies of compromised computers that can be used to spread malware, would criminalize the overseas sale of stolen U.S. credit card and bank account numbers, would expand federal law enforcement authority to deter the sale of spyware used to stalk people or commit identity theft, and would give courts the authority to shut down botnets being used for criminal activity, such as denial-of-service attacks.
  • The administration official stressed that the legislation will require companies to remove unnecessary personal information before furnishing it to the government in order to qualify for liability protection. It also will impose limits on the use of the data for cybersecurity crimes and instances in which there is a threat of death or bodily harm, such as kidnapping, the official said. And it will require DHS and the attorney general to develop guidelines for the federal government’s use and retention of the data. It will not authorize a company to take offensive cyber-measures to defend itself, such as “hacking back” into a server or computer outside its own network to track a breach. The bill also will provide liability protection to companies that share data with private-sector-developed organizations set up specifically for that purpose. Called information sharing and analysis organizations, these groups often are set up by particular industries, such as banking, to facilitate the exchange of data and best practices.
  • In October, Obama signed an order to protect consumers from identity theft by strengthening security features in credit cards and the terminals that process them. Marc Rotenberg, executive director of the Electronic Privacy Information Center, said there is concern that a federal standard would “preempt stronger state laws” about how and when companies have to notify consumers. The Student Digital Privacy Act would ensure that data entered would be used only for educational purposes. It would prohibit companies from selling student data to third-party companies for purposes other than education. Obama also plans to introduce a Consumer Privacy Bill of Rights. And the White House will host a summit on cybersecurity and consumer protection on Feb. 13 at Stanford University.
Paul Merrell

Remaining Snowden docs will be released to avert 'unspecified US war' - Cryptome * The Register - 1 views

  • All the remaining Snowden documents will be released next month, according to whistle-blowing site Cryptome, which said in a tweet that the release of the info by unnamed third parties would be necessary to head off an unnamed "war". Cryptome said it would "aid and abet" the release of "57K to 1.7M" new documents that had been "withheld for national security-public debate [sic]". The site clarified that it will not be publishing the documents itself. Transparency activists would welcome such a release, but such a move would be heavily criticised by intelligence agencies and military officials, who argue that Snowden's dump of secret documents has set US and allied (especially British) intelligence efforts back by years.
  • As things stand, the flow of Snowden disclosures is controlled by those who have access to the Snowden archive, which might possibly include Snowden confidants such as Glenn Greenwald and Laura Poitras. In some cases, even when these people release information to mainstream media organisations, it is then suppressed by these organisations after negotiation with the authorities. (In one such case, some key facts were later revealed by the Register.) "July is when war begins unless headed off by Snowden full release of crippling intel. After war begins not a chance of release," Cryptome tweeted on its official feed. "Warmongerers are on a rampage. So, yes, citizens holding Snowden docs will do the right thing," it said.
  • "For more on Snowden docs release in July watch for Ellsberg, special guest and others at HOPE, July 18-20: http://www.hope.net/schedule.html," it added.HOPE (Hackers On Planet Earth) is a well-regarded and long-running hacking conference organised by 2600 magazine. Previous speakers at the event have included Kevin Mitnick, Steve Wozniak and Jello Biafra.In other developments, ‪Cryptome‬ has started a Kickstarter fund to release its entire archive in the form of a USB stick archive. It wants t‪o‬ raise $100,000 to help it achieve its goal. More than $14,000 has already been raised.The funding drive follows a dispute between ‪Cryptome‬ and its host Network Solutions, which is owned by web.com. Access to the site was bl‪o‬cked f‪o‬ll‪o‬wing a malware infection last week. ‪Cryptome‬ f‪o‬under J‪o‬hn Y‪o‬ung criticised the host, claiming it had ‪o‬ver-reacted and had been sl‪o‬w t‪o‬ rest‪o‬re access t‪o‬ the site, which ‪Cryptome‬ criticised as a form of cens‪o‬rship.In resp‪o‬nse, ‪Cryptome‬ plans to more widely distribute its content across multiple sites as well as releasing the planned USB stick archive. ®
  •  
    Can't happen soon enough. 
Gonzalo San Gil, PhD.

Why the #LPI cannot (and never will) stop #piracy - 0 views

  •  
    "A principios de este mes entraba en vigor la última reforma sobre la Ley de Propiedad Intelectual que impide a las páginas web ofrecer enlaces a contenido audiovisual protegido por los derechos de autor." [# ! Vía, Gracias, Francisco Manuel Hernández Sosa's FB]