Home/ Open Web/ Group items tagged web-security

Paul Merrell

Leaked docs show spyware used to snoop on US computers | Ars Technica - 0 views

  • Software created by the controversial UK-based Gamma Group International was used to spy on computers that appear to be located in the United States, the UK, Germany, Russia, Iran, and Bahrain, according to a leaked trove of documents analyzed by ProPublica. It's not clear whether the surveillance was conducted by governments or private entities. Customer e-mail addresses in the collection appeared to belong to a German surveillance company, an independent consultant in Dubai, the Bosnian and Hungarian Intelligence services, a Dutch law enforcement officer, and the Qatari government.
  • The leaked files—which were posted online by hackers—are the latest in a series of revelations about how state actors including repressive regimes have used Gamma's software to spy on dissidents, journalists, and activist groups. The documents, leaked last Saturday, could not be readily verified, but experts told ProPublica they believed them to be genuine. "I think it's highly unlikely that it's a fake," said Morgan Marquis-Boire, a security researcher who, while at The Citizen Lab at the University of Toronto, had analyzed Gamma Group's software and who authored an article about the leak on Thursday. The documents confirm many details that have already been reported about Gamma, such as that its tools were used to spy on Bahraini activists. Some documents in the trove contain metadata tied to e-mail addresses of several Gamma employees. Bill Marczak, another Gamma Group expert at the Citizen Lab, said that several dates in the documents correspond to publicly known events—such as the day that a particular Bahraini activist was hacked.
  • The leaked files contain more than 40 gigabytes of confidential technical material, including software code, internal memos, strategy reports, and user guides for the Gamma Group software suite called FinFisher. FinFisher enables customers to monitor secure Web traffic, Skype calls, webcams, and personal files. It is installed as malware on targets' computers and cell phones. A price list included in the trove lists a license of the software at almost $4 million. The documents reveal that Gamma uses technology from a French company called Vupen Security that sells so-called computer "exploits." Exploits include techniques called "zero days" for "popular software like Microsoft Office, Internet Explorer, Adobe Acrobat Reader, and many more." Zero days are exploits that have not yet been detected by the software maker and therefore are not blocked.
  • Many of Gamma's product brochures have previously been published by the Wall Street Journal and Wikileaks, but the latest trove shows how the products are getting more sophisticated. In one document, engineers at Gamma tested a product called FinSpy, which inserts malware onto a user's machine, and found that it could not be blocked by most antivirus software. Documents also reveal that Gamma had been working to bypass encryption tools including a mobile phone encryption app, Silent Circle, and were able to bypass the protection given by hard-drive encryption products TrueCrypt and Microsoft's Bitlocker.
  • The documents also describe a "country-wide" surveillance product called FinFly ISP which promises customers the ability to intercept Internet traffic and masquerade as ordinary websites in order to install malware on a target's computer. The most recent date-stamp found in the documents is August 2, coinciding with the first tweet by a parody Twitter account, @GammaGroupPR, which first announced the hack and may be run by the hacker or hackers responsible for the leak. On Reddit, a user called PhineasFisher claimed responsibility for the leak. "Two years ago their software was found being widely used by governments in the middle east, especially Bahrain, to hack and spy on the computers and phones of journalists and dissidents," the user wrote. The name on the @GammaGroupPR Twitter account is also "Phineas Fisher." GammaGroup, the surveillance company whose documents were released, is no stranger to the spotlight. The security firm F-Secure first reported the purchase of FinFisher software by the Egyptian State Security agency in 2011. In 2012, Bloomberg News and The Citizen Lab showed how the company's malware was used to target activists in Bahrain. In 2013, the software company Mozilla sent a cease-and-desist letter to the company after a report by The Citizen Lab showed that a spyware-infected version of the Firefox browser manufactured by Gamma was being used to spy on Malaysian activists.
Gary Edwards

Discoverer of JSON Recommends Suspension of HTML5 | Web Security Journal - 0 views

  •  
    Fascinating conversation between Douglas Crockford and Jeremy Geelan. The issues are XSS (the cross-site scripting capabilities of HTML) and "the painful gap" in the HTML5 specification of the interface between JavaScript and the browser.

    I had to use the Evernote Clearly Chrome extension to read this page. Microsoft is running a huge JavaScript advertisement/pointer that totally blocks the page with no way of closing or escaping. Incredible. Clearly was able to knock it out though. Nicely done!

    The HTML5-XSS problem is very important, especially if you're someone like me who sees the HTML+ format (HTML5-CSS3-JSON-JavaScript-SVG/Canvas) as the undisputed Cloud Productivity Platform "compound document" model. The XSS discussion goes right to the heart of the matter of creating an HTML compound document in much the same way that an MSOffice productivity compound document worked. XSS mimics the functionality of embedded compound document components such as OLE, DDE, ODBC and Scripting. Crack open any client/server business document and it will be found to be loaded with these embedded components.

    It seems to me that any one of the Cloud Productivity Platform contenders could solve the HTML-XSS problem. I'm thinking Google Apps, Zoho, SalesForce.com, RackSpace and Amazon, with gApps and Zoho clearly leading the charge. Also let me add that RSS and XMPP (Jabber), while not normally mentioned with JSON, ought to be considered. Twitter uses RSS to transport and connect data. Jabber is of course a long-time favorite of mine.

    excerpt: The fundamental mistake in HTML5 was one of prioritization. It should have tackled the browser's most important problem first. Once the platform was secured, then shiny new features could be carefully added. There is much that is attractive about HTML5. But ultimately the thing that made the browser into a credible application delivery system was JavaScript, the ultimate workaround tool. There is a painful gap
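The XSS risk discussed above comes down to untrusted data reaching the browser unescaped. As a minimal sketch of the defensive half of the problem (the function and payload here are hypothetical, not from the article), output escaping in Python looks like this:

```python
import html

def render_comment(user_input: str) -> str:
    """Return an HTML fragment with user-supplied text safely escaped.

    Without escaping, input like '<script>...</script>' would execute
    in every visitor's browser -- the classic stored-XSS scenario the
    Crockford interview is worried about.
    """
    return '<p class="comment">' + html.escape(user_input) + '</p>'

# A malicious payload is rendered inert as visible text:
print(render_comment('<script>steal(document.cookie)</script>'))
```

Escaping on output is only one layer; the interview's point is that the HTML5 platform itself should have closed this gap before adding features.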
Gary Edwards

Adeptol Viewing Technology Features - 0 views

  •  
    Feature highlights:
    - Document support: more than 300 document types supported out of the box
    - Not a virtual printer: multitenant platform for high-end document viewing
    - No software installs: no need to install any additional software on the server
    - No ActiveX/plugins: no plugins, ActiveX controls, or applets need to be downloaded on the client side
    - Fully customizable: advanced API offers full customization and UI changes
    - Any OS, any programming language: install Adeptol Server on any OS and integrate with any programming language
    - Awards: Adeptol products receive industry awards and accolades year after year

    No ActiveX, no plug-ins, no software to download. Any OS, any browser, any programming language. That is the power of Adeptol. Adeptol can help you retain your customers and streamline your content integration efforts. Leverage Web 2.0 technologies to get a completely scalable content viewer that easily handles any type of content in virtually unlimited volume, with additional capabilities to support high-volume transaction and archive environments. Our enterprise-class infrastructure was built to meet the needs of the world's most demanding global enterprises. Based on AJAX technology, you can easily integrate the viewer into your application with complete ease.

    Support for all server platforms: can be installed on Windows (32bit/64bit) Server and Linux (32bit/64bit) Server. Integrate with any programming language: whether you work in .NET, C#, PHP, ColdFusion or JSP, Adeptol Viewer can be integrated easily in any programming language using the easy API set. It also comes with sample code for all languages to get you started. Browser compatibility: tested and verified for compatibility with 99% of the various browsers on different platforms. More than 300 Document T
Gary Edwards

Mobile Opportunity: Windows 8 - The Beginning of the End of Windows - 0 views

  •  
    Michael Mace provides the best analysis and insight yet concerning Windows 8 and what it means to Microsoft, Windows, and the future of the Web. Not sure I agree with the MSOffice future, but this is excellent thinking. Glad I stumbled on Michael Mace!

    excerpt: I've got to say, this is the first time in years that I've been deeply intrigued by something Microsoft announced. Not just because it looks cool (it does), but because I think it shows clever business strategy on Microsoft's part. And I can't even remember the last time I used the phrase "clever business strategy" and Microsoft in the same sentence.

    The announcement also has immense implications for the rest of the industry. Whether or not Windows 8 is a financial success for Microsoft, we've now crossed a critical threshold. The old Windows of mice and icons is officially obsolete. That resets the playing field for everybody in computing.

    The slow death of Windows: When Netscape first made the web important in personal computing, Microsoft responded by rapidly evolving Internet Explorer. That response was broadly viewed as successful, but in retrospect maybe it was too successful for Microsoft's good. It let the company go back to harvesting money from its Windows + Office monopoly, feeling pretty secure from potential challengers. Meanwhile, the focus of application innovation slipped away from Windows, toward web apps. New software was developed first on the Internet, rather than on Windows. Over time, Windows became more and more a legacy thing we kept because we needed backward compatibility, rather than a part of the next generation of computing. Windows was our past, the web was our future
Paul Merrell

How to Encrypt the Entire Web for Free - The Intercept - 0 views

  • If we’ve learned one thing from the Snowden revelations, it’s that what can be spied on will be spied on. Since the advent of what used to be known as the World Wide Web, it has been a relatively simple matter for network attackers—whether it’s the NSA, Chinese intelligence, your employer, your university, abusive partners, or teenage hackers on the same public WiFi as you—to spy on almost everything you do online. HTTPS, the technology that encrypts traffic between browsers and websites, fixes this problem—anyone listening in on that stream of data between you and, say, your Gmail window or bank’s web site would get nothing but useless random characters—but is woefully under-used. The ambitious new non-profit Let’s Encrypt aims to make the process of deploying HTTPS not only fast, simple, and free, but completely automatic. If it succeeds, the project will render vast regions of the internet invisible to prying eyes.
  • The benefits of using HTTPS are obvious when you think about protecting secret information you send over the internet, like passwords and credit card numbers. It also helps protect information like what you search for in Google, what articles you read, what prescription medicine you take, and messages you send to colleagues, friends, and family from being monitored by hackers or authorities. But there are less obvious benefits as well. Websites that don’t use HTTPS are vulnerable to “session hijacking,” where attackers can take over your account even if they don’t know your password. When you download software without encryption, sophisticated attackers can secretly replace the download with malware that hacks your computer as soon as you try installing it.
  • Encryption also prevents attackers from tampering with or impersonating legitimate websites. For example, the Chinese government censors specific pages on Wikipedia, the FBI impersonated The Seattle Times to get a suspect to click on a malicious link, and Verizon and AT&T injected tracking tokens into mobile traffic without user consent. HTTPS goes a long way in preventing these sorts of attacks. And of course there’s the NSA, which relies on the limited adoption of HTTPS to continue to spy on the entire internet with impunity. If companies want to do one thing to meaningfully protect their customers from surveillance, it should be enabling encryption on their websites by default.
  • Let’s Encrypt, which was announced this week but won’t be ready to use until the second quarter of 2015, describes itself as “a free, automated, and open certificate authority (CA), run for the public’s benefit.” It’s the product of years of work from engineers at Mozilla, Cisco, Akamai, Electronic Frontier Foundation, IdenTrust, and researchers at the University of Michigan. (Disclosure: I used to work for the Electronic Frontier Foundation, and I was aware of Let’s Encrypt while it was being developed.) If Let’s Encrypt works as advertised, deploying HTTPS correctly and using all of the best practices will be one of the simplest parts of running a website. All it will take is running a command. Currently, HTTPS requires jumping through a variety of complicated hoops that certificate authorities insist on in order to prove ownership of domain names. Let’s Encrypt automates this task in seconds, without requiring any human intervention, and at no cost.
  • The transition to a fully encrypted web won’t be immediate. After Let’s Encrypt is available to the public in 2015, each website will have to actually use it to switch over. And major web hosting companies also need to hop on board for their customers to be able to take advantage of it. If hosting companies start work now to integrate Let’s Encrypt into their services, they could offer HTTPS hosting by default at no extra cost to all their customers by the time it launches.
  •  
    Don't miss the video. And if you have a web site, urge your host service to begin preparing for Let's Encrypt. (See video on why it's good for them.)
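The "session hijacking" risk mentioned above arises largely because session cookies travel over unencrypted connections. As a minimal sketch (the function name and session value are illustrative, not any site's real code), Python's standard library can build a Set-Cookie header whose Secure flag keeps the cookie off plain-HTTP connections and whose HttpOnly flag hides it from page scripts:

```python
from http.cookies import SimpleCookie

def make_session_cookie(session_id: str) -> str:
    """Build a Set-Cookie header value for a session ID.

    'Secure' means browsers will only send the cookie over HTTPS,
    so a sniffer on open WiFi cannot capture it; 'HttpOnly' keeps
    it out of reach of JavaScript, blunting cookie theft via XSS.
    """
    cookie = SimpleCookie()
    cookie["session"] = session_id
    cookie["session"]["secure"] = True
    cookie["session"]["httponly"] = True
    return cookie["session"].OutputString()

print(make_session_cookie("abc123"))
```

Of course, these flags only help if the site actually serves HTTPS, which is exactly the gap Let's Encrypt aims to close.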
Paul Merrell

Google Says Website Encryption Will Now Influence Search Rankings - 0 views

  • Google will begin using website encryption, or HTTPS, as a ranking signal – a move which should prompt website developers who have dragged their heels on increased security measures, or who debated whether their website was “important” enough to require encryption, to make a change. Initially, HTTPS will only be a lightweight signal, affecting fewer than 1% of global queries, says Google. That means that the new signal won’t carry as much weight as other factors, including the quality of the content, the search giant noted, as Google means to give webmasters time to make the switch to HTTPS. Over time, however, encryption’s effect on search ranking may strengthen, as the company places more importance on website security. Google also promises to publish a series of best practices around TLS (HTTPS is also known as HTTP over TLS, or Transport Layer Security) so website developers can better understand what they need to do in order to implement the technology and what mistakes they should avoid. These tips will include things like what certificate type is needed, how to use relative URLs for resources on the same secure domain, best practices around allowing for site indexing, and more.
  • In addition, website developers can test their current HTTPS-enabled website using the Qualys Lab tool, says Google, and can direct further questions to Google’s Webmaster Help Forums where the company is already in active discussions with the broader community. The announcement has drawn a lot of feedback from website developers and those in the SEO industry – for instance, Google’s own blog post on the matter, shared in the early morning hours on Thursday, is already nearing 1,000 comments. For the most part, the community seems to support the change, or at least acknowledge that they felt that something like this was in the works and are not surprised. Google itself has been making moves to better secure its own traffic in recent months, which have included encrypting traffic between its own servers. Gmail now always uses an encrypted HTTPS connection which keeps mail from being snooped on as it moves from a consumer’s machine to Google’s data centers.
  • While HTTPS and site encryption have been a best practice in the security community for years, the revelation that the NSA has been tapping the cables, so to speak, to mine user information directly has prompted many technology companies to consider increasing their own security measures, too. Yahoo, for example, also announced in November its plans to encrypt its data center traffic. Now Google is helping to push the rest of the web to do the same.
  •  
    The Internet continues to harden in the wake of the NSA revelations. This is a nice nudge by Google.
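One of the migration chores Google's promised best practices touch on is using relative URLs for resources on the same secure domain, so pages don't keep hard-coded http:// references after the switch. As a hypothetical illustration of that chore (this is not Google's tooling, and the helper name is made up), a small rewrite helper might look like:

```python
from urllib.parse import urlparse

def relativize(resource_url: str, site_host: str) -> str:
    """Rewrite an absolute URL on our own host to a relative path,
    so the resource inherits whatever scheme the page is served with.
    Cross-origin URLs are left untouched."""
    parts = urlparse(resource_url)
    if parts.scheme in ("http", "https") and parts.netloc == site_host:
        return parts.path + (("?" + parts.query) if parts.query else "")
    return resource_url

print(relativize("http://example.com/js/app.js", "example.com"))  # same host: relative path
print(relativize("https://cdn.other.net/lib.js", "example.com"))  # cross-origin: unchanged
```

A sweep like this over templates avoids mixed-content warnings once the site moves to HTTPS.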
Paul Merrell

Most Agencies Falling Short on Mandate for Online Records - 0 views

  • Nearly 20 years after Congress passed the Electronic Freedom of Information Act Amendments (E-FOIA), only 40 percent of agencies have followed the law's instruction for systematic posting of records released through FOIA in their electronic reading rooms, according to a new FOIA Audit released today by the National Security Archive at www.nsarchive.org to mark Sunshine Week. The Archive team audited all federal agencies with Chief FOIA Officers as well as agency components that handle more than 500 FOIA requests a year — 165 federal offices in all — and found only 67 with online libraries populated with significant numbers of released FOIA documents and regularly updated.
  • Congress called on agencies to embrace disclosure and the digital era nearly two decades ago, with the passage of the 1996 "E-FOIA" amendments. The law mandated that agencies post key sets of records online, provide citizens with detailed guidance on making FOIA requests, and use new information technology to post online proactively records of significant public interest, including those already processed in response to FOIA requests and "likely to become the subject of subsequent requests." Congress believed then, and openness advocates know now, that this kind of proactive disclosure, publishing online the results of FOIA requests as well as agency records that might be requested in the future, is the only tenable solution to FOIA backlogs and delays. Thus the National Security Archive chose to focus on the e-reading rooms of agencies in its latest audit. Even though the majority of federal agencies have not yet embraced proactive disclosure of their FOIA releases, the Archive E-FOIA Audit did find that some real "E-Stars" exist within the federal government, serving as examples to lagging agencies that technology can be harnessed to create state-of-the art FOIA platforms. Unfortunately, our audit also found "E-Delinquents" whose abysmal web performance recalls the teletype era.
  • E-Delinquents include the Office of Science and Technology Policy at the White House, which, despite being mandated to advise the President on technology policy, does not embrace 21st century practices by posting any frequently requested records online. Another E-Delinquent, the Drug Enforcement Administration, insults its website's viewers by claiming that it "does not maintain records appropriate for FOIA Library at this time."
  • "The presumption of openness requires the presumption of posting," said Archive director Tom Blanton. "For the new generation, if it's not online, it does not exist." The National Security Archive has conducted fourteen FOIA Audits since 2002. Modeled after the California Sunshine Survey and subsequent state "FOI Audits," the Archive's FOIA Audits use open-government laws to test whether or not agencies are obeying those same laws. Recommendations from previous Archive FOIA Audits have led directly to laws and executive orders which have: set explicit customer service guidelines, mandated FOIA backlog reduction, assigned individualized FOIA tracking numbers, forced agencies to report the average number of days needed to process requests, and revealed the (often embarrassing) ages of the oldest pending FOIA requests. The surveys include:
  • The federal government has made some progress moving into the digital era. The National Security Archive's last E-FOIA Audit in 2007, "File Not Found," reported that only one in five federal agencies had put online all of the specific requirements mentioned in the E-FOIA amendments, such as guidance on making requests, contact information, and processing regulations. The new E-FOIA Audit finds the number of agencies that have checked those boxes is now much higher — 100 out of 165 — though many (66 of 165) have posted just the bare minimum, especially when posting FOIA responses. An additional 33 agencies even now do not post these types of records at all, clearly thwarting the law's intent.
  • The FOIAonline Members (Department of Commerce, Environmental Protection Agency, Federal Labor Relations Authority, Merit Systems Protection Board, National Archives and Records Administration, Pension Benefit Guaranty Corporation, Department of the Navy, General Services Administration, Small Business Administration, U.S. Citizenship and Immigration Services, and Federal Communications Commission) won their "E-Star" by making past requests and releases searchable via FOIAonline. FOIAonline also allows users to submit their FOIA requests digitally.
  • THE E-DELINQUENTS: WORST OVERALL AGENCIES In alphabetical order
  • Key Findings
  • Excuses Agencies Give for Poor E-Performance
  • Justice Department guidance undermines the statute. Currently, the FOIA stipulates that documents "likely to become the subject of subsequent requests" must be posted by agencies somewhere in their electronic reading rooms. The Department of Justice's Office of Information Policy defines these records as "frequently requested records… or those which have been released three or more times to FOIA requesters." Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice. The National Security Archive believes the addition of this "three or more times" language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse ("not requested three times yet!") not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post "all records, regardless of form or format that have been released in response to a FOIA request."
  • Disabilities Compliance. Despite the E-FOIA Act, many government agencies do not embrace the idea of posting their FOIA responses online. The most common reason agencies give is that it is difficult to post documents in a format that complies with the Americans with Disabilities Act, also referred to as being "508 compliant," and the 1998 Amendments to the Rehabilitation Act that require federal agencies "to make their electronic and information technology (EIT) accessible to people with disabilities." E-Star agencies, however, have proven that 508 compliance is no barrier when the agency has a will to post. All documents posted on FOIAonline are 508 compliant, as are the documents posted by the Department of Defense and the Department of State. In fact, every document created electronically by the US government after 1998 should already be 508 compliant. Even old paper records that are scanned to be processed through FOIA can be made 508 compliant with just a few clicks in Adobe Acrobat, according to this Department of Homeland Security guide (essentially OCRing the text, and including information about where non-textual fields appear). Even if agencies are insistent it is too difficult to OCR older documents that were scanned from paper, they cannot use that excuse with digital records.
  • Privacy. Another commonly articulated concern about posting FOIA releases online is that doing so could inadvertently disclose private information from "first person" FOIA requests. This is a valid concern, and this subset of FOIA requests should not be posted online. (The Justice Department identified "first party" requester rights in 1989. Essentially agencies cannot use the b(6) privacy exemption to redact information if a person requests it for him or herself. An example of a "first person" FOIA would be a person's request for his own immigration file.)

    Cost and Waste of Resources. There is also a belief that there is little public interest in the majority of FOIA requests processed, and hence it is a waste of resources to post them. This thinking runs counter to the governing principle of the Freedom of Information Act: that government information belongs to US citizens, not US agencies. As such, the reason that a person requests information is immaterial as the agency processes the request; the "interest factor" of a document should also be immaterial when an agency is required to post it online. Some think that posting FOIA releases online is not cost effective. In fact, the opposite is true. It's not cost effective to spend tens (or hundreds) of person hours to search for, review, and redact FOIA requests only to mail the documents to the requester and have them slip the release into a desk drawer and forget about it. That is a waste of resources. The released document should be posted online for any interested party to utilize. This will only become easier as FOIA processing systems evolve to automatically post the documents they track. The State Department earned its "E-Star" status demonstrating this very principle; it spent no new funds and did not hire contractors to build its Electronic Reading Room, instead building a self-sustaining platform that will save the agency time and money going forward.
malwaresecurity

Best WordPress Security | 10 Tips to Improve Your Website Security - 1 views

  •  
    However, if security controls are checked and boundaries tightened, websites can be effectively protected from malicious attacks. In this article, we will be discussing some tips to fix WordPress security issues. Read on to know more.
Paul Merrell

Save Firefox! | Electronic Frontier Foundation - 0 views

  • The World Wide Web Consortium (W3C), once the force for open standards that kept browsers from locking publishers to their proprietary capabilities, has changed its mission. Since 2013, the organization has provided a forum where today's dominant browser companies and the dominant entertainment companies can collaborate on a system to let our browsers control our behavior, rather than the other way around. This system, "Encrypted Media Extensions" (EME), uses standards-defined code to funnel video into a proprietary container called a "Content Decryption Module." For a new browser to support this new video streaming standard -- which major studios and cable operators are pushing for -- it would have to convince those entertainment companies or one of their partners to let them have a CDM, or this part of the "open" Web would not display in their new browser. This is the opposite of every W3C standard to date: once, all you needed to do to render content sent by a server was follow the standard, not get permission. If browsers had needed permission to render a page at the launch of Mozilla, the publishers would have frozen out this new, pop-up-blocking upstart. Kiss Firefox goodbye, in other words.
  • The W3C didn't have to do this. No copyright law says that making a video gives you the right to tell people who legally watch it how they must configure their equipment. But because of the design of EME, copyright holders will be able to use the law to shut down any new browser that tries to render the video without their permission. That's because EME is designed to trigger liability under section 1201 of the Digital Millennium Copyright Act (DMCA), which says that removing a digital lock that controls access to a copyrighted work without permission is an offense, even if the person removing the lock has the right to the content it restricts. In other words, once a video is sent with EME, a new company that unlocks it for its users can be sued, even if the users do nothing illegal with that video. We proposed that the W3C could protect new browsers by making their members promise not to use the DMCA to attack new entrants in the market, an idea supported by a diverse group of W3C members, but the W3C executive overruled us, saying the work would go forward with no safeguards for future competition. It's even worse than at first glance. The DMCA isn't limited to the USA: the US Trade Representative has spread DMCA-like rules to virtually every country that does business with America. Worse still: the DMCA is also routinely used by companies to threaten and silence security researchers who reveal embarrassing defects in their products. The W3C also declined to require its members to protect security researchers who discover flaws in EME, leaving every Web user exposed to vulnerabilities whose disclosure can only safely take place if the affected company decides to permit it.
  • The W3C needs credibility with people who care about the open Web and innovation in order to be viable. They are sensitive to this kind of criticism. We empathize. There are lots of good people working there, people who genuinely, passionately want the Web to stay open to everyone, and to be safe for its users. But the organization made a terrible decision when it opted to provide a home for EME, and an even worse one when it overruled its own members and declined protection for security research and new competitors. It needs to hear from you now. Please share this post, and spread the word. Help the W3C be the organization it is meant to be.
Paul Merrell

Thousands of HTML5 tests planned by Web consortium - 0 views

  • W3C is warning against drawing any conclusions based on the early tests, saying thousands of more HTML5 tests are planned. The goal of the tests is not to declare one browser a winner, but rather to help vendors and Web application developers ensure interoperability across all browsers, W3C says.
  • "We do expect to have tens of thousands of tests," says Philippe Le Hegaret, who oversees HTML activities for the W3C. 
  • The purpose of the HTML5 test suite is to help vendors and developers ensure that HTML5 applications work across all browsers. For example, a developer might check the test results before enabling a certain feature in an application, just to make sure it will work across IE9, Firefox, Chrome, Safari and Opera. Developers can build HTML5 applications today, but they have to keep in mind that they are early adopters and act accordingly, Le Hegaret says. "If you think HTML5 is perfectly stable today and you can use it without worrying about interoperability issues, I think you're going to fool yourself," he says. Although the first round of HTML5 tests focused on desktop browsers, Le Hegaret says HTML5 compatibility is advancing more rapidly on mobile devices such as iPhones and Androids.
    • Paul Merrell
       
      Note the continuing, indeed, escalating abuse of the term "interoperability" by W3C. "Interoperability" has both a legal and (happily, coinciding) technical meaning that involves round-tripping of information. ISO/IEC JTC 1 Directives defines the term in precisely the same way as the European Union's Court of First Instance did in the landmark Commission v. Microsoft antitrust case: "interoperability is understood to be the ability of two or more IT systems to *exchange* information at one or more standardised interfaces and to make *mutual use* of the information that has been exchanged." Web browsers do not do "interoperability;" there is no "exchange" and "mutual use" of the information exchanged. Web browsers do "compatibility," a one-way transfer of information that is broadcast from web servers; i.e., web browsers cannot send web pages to web servers.
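The pre-flight feature check Le Hegaret recommends for developers can be sketched as follows. This is a minimal illustration, not part of any W3C test suite: the capability tables and helper names are hypothetical, with plain objects standing in for the browser's "window" so the sketch also runs outside a browser.

```typescript
// Illustrative sketch: gate an HTML5 feature on runtime support and
// fall back gracefully when it is absent.
function hasFeature(name: string, scope: Record<string, unknown>): boolean {
  return name in scope;
}

function chooseStorage(scope: Record<string, unknown>): "localStorage" | "cookies" {
  // Use HTML5 Web Storage when present, otherwise fall back to cookies.
  return hasFeature("localStorage", scope) ? "localStorage" : "cookies";
}

// Simulated capability tables for a modern and a legacy browser:
const modernBrowser = { localStorage: {}, WebSocket: {} };
const legacyBrowser = { attachEvent: {} };

console.log(chooseStorage(modernBrowser)); // "localStorage"
console.log(chooseStorage(legacyBrowser)); // "cookies"
```

In a real page, "scope" would simply be "window", and the check would mirror whatever feature the test-suite results flagged as unevenly supported.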
Paul Merrell

Canada Casts Global Surveillance Dragnet Over File Downloads - The Intercept - 0 views

  • Canada’s leading surveillance agency is monitoring millions of Internet users’ file downloads in a dragnet search to identify extremists, according to top-secret documents. The covert operation, revealed Wednesday by CBC News in collaboration with The Intercept, taps into Internet cables and analyzes records of up to 15 million downloads daily from popular websites commonly used to share videos, photographs, music, and other files. The revelations about the spying initiative, codenamed LEVITATION, are the first from the trove of files provided by National Security Agency whistleblower Edward Snowden to show that the Canadian government has launched its own globe-spanning Internet mass surveillance system. According to the documents, the LEVITATION program can monitor downloads in several countries across Europe, the Middle East, North Africa, and North America. It is led by the Communications Security Establishment, or CSE, Canada’s equivalent of the NSA. (The Canadian agency was formerly known as “CSEC” until a recent name change.)
  • The latest disclosure sheds light on Canada’s broad existing surveillance capabilities at a time when the country’s government is pushing for a further expansion of security powers following attacks in Ottawa and Quebec last year. Ron Deibert, director of University of Toronto-based Internet security think tank Citizen Lab, said LEVITATION illustrates the “giant X-ray machine over all our digital lives.” “Every single thing that you do – in this case uploading/downloading files to these sites – that act is being archived, collected and analyzed,” Deibert said, after reviewing documents about the online spying operation for CBC News. David Christopher, a spokesman for Vancouver-based open Internet advocacy group OpenMedia.ca, said the surveillance showed “robust action” was needed to rein in the Canadian agency’s operations.
  • In a top-secret PowerPoint presentation, dated from mid-2012, an analyst from the agency jokes about how, while hunting for extremists, the LEVITATION system gets clogged with information on innocuous downloads of the musical TV series Glee. CSE finds some 350 “interesting” downloads each month, the presentation notes, a number that amounts to less than 0.0001 per cent of the total collected data. The agency stores details about downloads and uploads to and from 102 different popular file-sharing websites, according to the 2012 document, which describes the collected records as “free file upload,” or FFU, “events.” Only three of the websites are named: RapidShare, SendSpace, and the now defunct MegaUpload.
  • “The specific uses that they talk about in this [counter-terrorism] context may not be the problem, but it’s what else they can do,” said Tamir Israel, a lawyer with the University of Ottawa’s Canadian Internet Policy and Public Interest Clinic. Picking which downloads to monitor is essentially “completely at the discretion of CSE,” Israel added. The file-sharing surveillance also raises questions about the number of Canadians whose downloading habits could have been swept up as part of LEVITATION’s dragnet. By law, CSE isn’t allowed to target Canadians. In the LEVITATION presentation, however, two Canadian IP addresses that trace back to a web server in Montreal appear on a list of suspicious downloads found across the world. The same list includes downloads that CSE monitored in closely allied countries, including the United Kingdom, United States, Spain, Brazil, Germany and Portugal. It is unclear from the document whether LEVITATION has ever prevented any terrorist attacks. The agency cites only two successes of the program in the 2012 presentation: the discovery of a hostage video through a previously unknown target, and an uploaded document that contained the hostage strategy of a terrorist organization. The hostage in the discovered video was ultimately killed, according to public reports.
  • LEVITATION does not rely on cooperation from any of the file-sharing companies. A separate secret CSE operation codenamed ATOMIC BANJO obtains the data directly from internet cables that it has tapped into, and the agency then sifts out the unique IP address of each computer that downloaded files from the targeted websites. The IP addresses are valuable pieces of information to CSE’s analysts, helping to identify people whose downloads have been flagged as suspicious. The analysts use the IP addresses as a kind of search term, entering them into other surveillance databases that they have access to, such as the vast repositories of intercepted Internet data shared with the Canadian agency by the NSA and its British counterpart Government Communications Headquarters. If successful, the searches will return a list of results showing other websites visited by the people downloading the files – in some cases revealing associations with Facebook or Google accounts. In turn, these accounts may reveal the names and the locations of individual downloaders, opening the door for further surveillance of their activities.
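The percentage quoted in the presentation is consistent with the volumes reported. A quick sanity check, assuming a 30-day month (the monthly total is not stated in the document, so it is derived here):

```typescript
// Sanity-check the ratio quoted in the LEVITATION slides.
const downloadsPerDay = 15_000_000;   // "up to 15 million downloads daily"
const daysPerMonth = 30;              // assumed for the estimate
const interestingPerMonth = 350;      // "some 350 'interesting' downloads each month"

const totalPerMonth = downloadsPerDay * daysPerMonth;        // 450,000,000
const percent = (interestingPerMonth / totalPerMonth) * 100;

console.log(percent.toFixed(6)); // "0.000078" — under the 0.0001 per cent quoted
```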
Gary Edwards

The Future of Collaborative Networks : Aaron Fulkerson of MindTouch - 0 views

  •  
    MindTouch was by far and away the hottest property at the 2009 Web 2.0 Conference. And for good reason. They have figured out how to tap into the productivity value of enterprise collaborative networks. Most of their underlying stuff is based on REST-based data objects and services, but they also allow for proprietary data bindings. The key to MindTouch seemed to be the collaborative interface, which is easy to fall into and use: imagine a workgroup project centered around a Web page filled with data objects, graphics and content, with each object also having a collaborative conversation attached to it. Sounds complicated, but that's where the magic of MindTouch kicks in. It's simple. One of the things that most impressed me was an interactive graph placed on one of the wiki project pages. The graph was being fed data from a local Excel spreadsheet, and could be interacted with in real time. It was simple to change from a pie chart to a bar graph and so on. It was also possible to interact with the data itself and create what-if scenarios. Great stuff. With considerable persistence, though, I was able to discover from Aaron that this interactivity and graphical richness was due to a Silverlight plug-in! From the article: "..... Rather than focusing on socialization, one to one interactions and individual enrichment, businesses must be concerned with creating an information fabric within their organizations. This information fabric is a federation of content from the multiplicity of data and application silos utilized on a daily basis; such as, ERP, CRM, file servers, email, databases, web-services infrastructures, etc. When you make this information fabric easy to edit between groups of individuals in a dynamic, secure, governed and real-time manner, it creates a Collaborative Network." "This is very different from social networks or social software, which is focused entirely on enabling conversations. Collaborative Networks are focused on groups accessing and organiz
Paul Merrell

Notes from the Fight Against Surveillance and Censorship: 2014 in Review | Electronic F... - 0 views

  • 2014 in Review Series: Net Neutrality Takes a Wild Ride · 8 Stellar Surveillance Scoops · Web Encryption Gets Stronger and More Widespread · Big Patent Reform Wins in Court, Defeat (For Now) in Congress · International Copyright Law · More Time in the Spotlight for NSLs · The State of Free Expression Online · What We Learned About NSA Spying in 2014—And What We're Fighting to Expose in 2015 · "Fair Use Is Working!" · Email Encryption Grew Tremendously, but Still Needs Work · Spies Vs. Spied, Worldwide · The Fight in Congress to End the NSA's Mass Spying · Open Access Movement Broadens, Moves Forward · Stingrays Go Mainstream · Three Vulnerabilities That Rocked the Online Security World · Mobile Privacy and Security Takes Two Steps Forward, One Step Back · It Was a Pivotal Year in TPP Activism but the Biggest Fight Is Still to Come · The Government Spent a Lot of Time in Court Defending NSA Spying Last Year · Let's Encrypt (the Entire Web)
  •  
    The Electronic Frontier Foundation just dropped an incredible bunch of articles on the world in the form of their "2014 Year In Review" series. These are major contributions that place an awful lot of information in context. I thought I had been keeping a close eye on the same subject matter, but I'm only part way through the articles and am learning time after time that I had missed really important news having to do with digital freedom. I can't recommend these articles enough. So far, they are all must-read.
Paul Merrell

American Surveillance Now Threatens American Business - The Atlantic - 0 views

  • What does it look like when a society loses its sense of privacy? In the almost 18 months since the Snowden files first received coverage, writers and critics have had to guess at the answer. Does a certain trend, consumer complaint, or popular product epitomize some larger shift? Is trust in tech companies eroding—or is a subset just especially vocal about it? Polling would make those answers clear, but polling so far has been… confused. A new study, conducted by the Pew Internet Project last January and released last week, helps make the average American’s view of his or her privacy a little clearer. And their confidence in their own privacy is ... low. The study's findings—and the statistics it reports—stagger. Vast majorities of Americans are uncomfortable with how the government uses their data, how private companies use and distribute their data, and what the government does to regulate those companies. No summary can equal a recounting of the findings. Americans are displeased with government surveillance en masse:
  • A new study finds that a vast majority of Americans trust neither the government nor tech companies with their personal data.
  • According to the study, 70 percent of Americans are “at least somewhat concerned” with the government secretly obtaining information they post to social networking sites. Eighty percent of respondents agreed that “Americans should be concerned” with government surveillance of telephones and the web. They are also uncomfortable with how private corporations use their data: Ninety-one percent of Americans believe that “consumers have lost control over how personal information is collected and used by companies,” according to the study. Eighty percent of Americans who use social networks “say they are concerned about third parties like advertisers or businesses accessing the data they share on these sites.” And even though they’re squeamish about the government’s use of data, they want it to regulate tech companies and data brokers more strictly: 64 percent wanted the government to do more to regulate private data collection. Since June 2013, American politicians and corporate leaders have fretted over how much the leaks would cost U.S. businesses abroad.
  • “It’s clear the global community of Internet users doesn’t like to be caught up in the American surveillance dragnet,” Senator Ron Wyden said last month. At the same event, Google chairman Eric Schmidt agreed with him. “What occurred was a loss of trust between America and other countries,” he said, according to the Los Angeles Times. “It's making it very difficult for American firms to do business.” But never mind the world. Americans don’t trust American social networks. More than half of the poll’s respondents said that social networks were “not at all secure.” Only 40 percent of Americans believe email or texting is at least “somewhat” secure. Indeed, the communication technologies Americans trusted most were those where some protections have been enshrined in law (though the report didn’t ask about snail mail). That is: Talking on the telephone, whether on a landline or cell phone, is the only kind of communication that a majority of adults believe to be “very secure” or “somewhat secure.”
  • (That may seem a bit incongruous, because making a telephone call is one area where you can be almost sure you are being surveilled: The government has requisitioned mass call records from phone companies since 2001. But Americans appear, when discussing security, to differentiate between the contents of the call and data about it.) Last month, Ramsey Homsany, the general counsel of Dropbox, said that one big thing could take down the California tech scene. “We have built this incredible economic engine in this region of the country,” said Homsany in the Los Angeles Times, “and [mistrust] is the one thing that starts to rot it from the inside out.” According to this poll, the mistrust has already begun corroding—and is already, in fact, well advanced. We’ve always assumed that the great hurt to American business will come globally—that citizens of other nations will stop using tech companies’ services. And while, unlike citizens of other nations, they may not have other places to turn, they may stop putting sensitive or delicate information online.
Paul Merrell

ISPs say the "massive cost" of Snooper's Charter will push up UK broadband bills | Ars ... - 0 views

  • How much extra will you have to pay for the privilege of being spied on?
  • UK ISPs have warned MPs that the costs of implementing the Investigatory Powers Bill (aka the Snooper's Charter) will be much greater than the £175 million the UK government has allotted for the task, and that broadband bills will need to rise as a result. Representatives from ISPs and software companies told the House of Commons Science and Technology Committee that the legislation greatly underestimates the "sheer quantity" of data generated by Internet users these days. They also pointed out that distinguishing content from metadata is a far harder task than the government seems to assume. Matthew Hare, the chief executive of ISP Gigaclear, said with "a typical 1 gigabit connection to someone's home, over 50 terabytes of data per year [are] passing over it. If you say that a proportion of that is going to be the communications data—the record of who you communicate with, when you communicate or what you communicate—there would be the most massive and enormous amount of data that in future an access provider would be expected to keep. The indiscriminate collection of mass data across effectively every user of the Internet in this country is going to have a massive cost."
  • Moreover, the larger the cache of stored data, the more worthwhile it will be for criminals and state-backed actors to gain access and download that highly-revealing personal information for fraud and blackmail. John Shaw, the vice president of product management at British security firm Sophos, told the MPs: "There would be a huge amount of very sensitive personal data that could be used by bad guys.
  • The ISPs also challenged the government's breezy assumption that separating the data from the (equally revealing) metadata would be simple, not least because an Internet connection is typically being used for multiple services simultaneously, with data packets mixed together in a completely contingent way. Hare described a typical usage scenario for a teenager on their computer at home, where they are playing a game communicating with their friends using Steam; they are broadcasting the game using Twitch; and they may also be making a voice call at the same time too. "All those applications are running simultaneously," Hare said. "They are different applications using different servers with different services and different protocols. They are all running concurrently on that one machine." Even accessing a Web page is much more complicated than the government seems to believe, Hare pointed out. "As a webpage is loading, you will see that that webpage is made up of tens, or many tens, of individual sessions that have been created across the Internet just to load a single webpage. Bluntly, if you want to find out what someone is doing you need to be tracking all of that data all the time."
  • Hare raised another major issue. "If I was a software business ... I would be very worried that my customers would not buy my software any more if it had anything to do with security at all. I would be worried that a backdoor was built into the software by the [Investigatory Powers] Bill that would allow the UK government to find out what information was on that system at any point they wanted in the future." As Ars reported last week, the ability to demand that backdoors are added to systems, and a legal requirement not to reveal that fact under any circumstances, are two of the most contentious aspects of the new Investigatory Powers Bill. The latest comments from industry experts add to concerns that the latest version of the Snooper's Charter would inflict great harm on civil liberties in the UK, and also make security research well-nigh impossible here. To those fears can now be added undermining the UK software industry, as well as forcing the UK public to pay for the privilege of having their ISP carry out suspicionless surveillance.
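Hare's "50 terabytes per year over a typical 1 gigabit connection" is easy to put in context: it implies only a modest average throughput, yet still an enormous retention burden per subscriber. A back-of-the-envelope check, assuming decimal units and a 365-day year:

```typescript
// What average throughput does 50 TB/year imply for one subscriber?
const terabytesPerYear = 50;                     // Hare's figure per home
const bytesPerYear = terabytesPerYear * 1e12;    // decimal TB assumed
const secondsPerYear = 365 * 24 * 3600;          // 31,536,000

const avgBytesPerSec = bytesPerYear / secondsPerYear;
const avgMegabitsPerSec = (avgBytesPerSec * 8) / 1e6;

console.log(avgMegabitsPerSec.toFixed(1)); // "12.7" — about 1.3% of a 1 Gbit/s link
```

In other words, the 50 TB figure assumes the line sits mostly idle; a retention mandate scales with every one of those bytes regardless.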
Paul Merrell

New open-source router firmware opens your Wi-Fi network to strangers | Ars Technica - 0 views

  • We’ve often heard security folks explain their belief that one of the best ways to protect Web privacy and security on one's home turf is to lock down one's private Wi-Fi network with a strong password. But a coalition of advocacy organizations is calling such conventional wisdom into question. Members of the “Open Wireless Movement,” including the Electronic Frontier Foundation (EFF), Free Press, Mozilla, and Fight for the Future are advocating that we open up our Wi-Fi private networks (or at least a small slice of our available bandwidth) to strangers. They claim that such a random act of kindness can actually make us safer online while simultaneously facilitating a better allocation of finite broadband resources. The OpenWireless.org website explains the group’s initiative. “We are aiming to build technologies that would make it easy for Internet subscribers to portion off their wireless networks for guests and the public while maintaining security, protecting privacy, and preserving quality of access," its mission statement reads. "And we are working to debunk myths (and confront truths) about open wireless while creating technologies and legal precedent to ensure it is safe, private, and legal to open your network.”
  • One such technology, which EFF plans to unveil at the Hackers on Planet Earth (HOPE X) conference next month, is open-sourced router firmware called Open Wireless Router. This firmware would enable individuals to share a portion of their Wi-Fi networks with anyone nearby, password-free, as Adi Kamdar, an EFF activist, told Ars on Friday. Home network sharing tools are not new, and the EFF has been touting the benefits of open-sourcing Web connections for years, but Kamdar believes this new tool marks the second phase in the open wireless initiative. Unlike previous tools, he claims, EFF’s software will be free for all, will not require any sort of registration, and will actually make surfing the Web safer and more efficient.
  • Kamdar said that the new firmware utilizes smart technologies that prioritize the network owner's traffic over others', so good samaritans won't have to wait for Netflix to load because of strangers using their home networks. What's more, he said, "every connection is walled off from all other connections," so as to decrease the risk of unwanted snooping. Additionally, EFF hopes that opening one’s Wi-Fi network will, in the long run, make it more difficult to tie an IP address to an individual. “From a legal perspective, we have been trying to tackle this idea that law enforcement and certain bad plaintiffs have been pushing, that your IP address is tied to your identity. Your identity is not your IP address. You shouldn't be targeted by a copyright troll just because they know your IP address," said Kamdar.
  • While the EFF firmware will initially be compatible with only one specific router, the organization would like to eventually make it compatible with other routers and even, perhaps, develop its own router. “We noticed that router software, in general, is pretty insecure and inefficient," Kamdar said. “There are a few major players in the router space. Even though various flaws have been exposed, there have not been many fixes.”
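The owner-first prioritization Kamdar describes can be modeled, very roughly, as a strict two-level priority queue. This is an illustrative simplification: real firmware shapes traffic per-packet with rate limits, and the class and type names below are invented for the sketch.

```typescript
// Toy model of owner-over-guest traffic prioritization.
type Packet = { from: "owner" | "guest"; id: number };

class PriorityLink {
  private owner: Packet[] = [];
  private guest: Packet[] = [];

  enqueue(p: Packet): void {
    (p.from === "owner" ? this.owner : this.guest).push(p);
  }

  // Owner packets always drain first; guests get leftover capacity.
  dequeue(): Packet | undefined {
    return this.owner.shift() ?? this.guest.shift();
  }
}

const link = new PriorityLink();
link.enqueue({ from: "guest", id: 1 });
link.enqueue({ from: "owner", id: 2 });
link.enqueue({ from: "guest", id: 3 });

console.log(link.dequeue()?.id); // 2 — the owner's packet jumps the queue
console.log(link.dequeue()?.id); // 1
```

A strict-priority scheme like this is what lets the network owner's Netflix stream stay unaffected while guests use spare bandwidth.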
Paul Merrell

'Let's Encrypt' Project Strives To Make Encryption Simple - Slashdot - 0 views

  •  
    The blurb is a bit misleading. This is a project that's been under way since last year; what's new is that they're moving under the Linux Foundation umbrella for various non-technical support purposes. By sometime this summer, encrypting web site data and broadcasting it over https is slated to become a two-click process. Or on the Linux command line: $ sudo apt-get install lets-encrypt $ lets-encrypt example.com This is a project that grew out of public disgust with NSA surveillance, designed to flood the NSA (and other bad actors) with so much encrypted data that they will be able to decrypt only a tiny fraction (decryption without the decryption key takes gobs of computer cycles). The other half of the solution is already available: the HTTPS Everywhere extension for the Chrome, Firefox, and Opera web browsers, by the Electronic Frontier Foundation and the Tor Project, translates your every request for an http address into an effort to connect to an https address preferentially, before establishing an http connection if https is not available. HTTPS Everywhere is fast and does not noticeably add to your page loading time. If you'd like to effortlessly improve your online security and help burden the NSA, install HTTPS Everywhere. Get it at https://www.eff.org/https-everywhere
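The https-preferring behavior described above can be sketched as a URL-rewriting rule. This is a simplification under stated assumptions: the real extension ships curated per-site rulesets, which the hard-coded host list below merely stands in for.

```typescript
// Sketch of HTTPS-preferring URL rewriting. The known-https host list
// stands in for HTTPS Everywhere's per-site rulesets (an assumption
// for illustration, not the extension's actual mechanism).
const httpsCapableHosts = new Set(["www.eff.org", "example.com"]);

function upgradeUrl(url: string): string {
  const u = new URL(url);
  if (u.protocol === "http:" && httpsCapableHosts.has(u.hostname)) {
    u.protocol = "https:"; // rewrite before the request is ever sent
  }
  return u.toString();
}

console.log(upgradeUrl("http://www.eff.org/https-everywhere"));
// "https://www.eff.org/https-everywhere"
console.log(upgradeUrl("http://no-ruleset.example.net/"));
// unchanged: "http://no-ruleset.example.net/"
```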
Gary Edwards

oEmbed - Web OLE - 0 views

  •  
    OLE for the Web: oEmbed is a format for allowing an embedded representation of a URL on third party sites. The simple API allows a website to display embedded content (such as photos or videos) when a user posts a link to that resource, without having to parse the resource directly.
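A consumer request in the style the spec describes can be sketched as follows. The endpoint and resource URLs are hypothetical placeholders; only the standard oEmbed query parameters (url, format, maxwidth) are used.

```typescript
// Build an oEmbed API request for a posted link. Real providers publish
// their own endpoints (often discoverable via a
// <link rel="alternate" type="application/json+oembed"> tag on the page).
function buildOEmbedRequest(endpoint: string, resourceUrl: string,
                            maxwidth?: number): string {
  const u = new URL(endpoint);
  u.searchParams.set("url", resourceUrl);   // the link the user posted
  u.searchParams.set("format", "json");     // ask for a JSON response
  if (maxwidth !== undefined) u.searchParams.set("maxwidth", String(maxwidth));
  return u.toString();
}

const req = buildOEmbedRequest(
  "https://provider.example.com/oembed",
  "https://provider.example.com/photos/12345",
  640,
);
console.log(req);
// https://provider.example.com/oembed?url=https%3A%2F%2Fprovider.example.com%2Fphotos%2F12345&format=json&maxwidth=640
```

Fetching that URL would return a small JSON document (type, version, and the embeddable html or photo url), which the consuming site drops into the page without parsing the original resource.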
Gary Edwards

The Lowdown: Technology and Politics of HTML5 vs. Flash | Hidden Dimensions | The M... - 0 views

  •  
    An excellent but lightweight and breezy review of the Flash-Silverlight-Open Web HTML5 battle for the future of the Web. excerpt: At the top of the org chart, Apple's deprecation of Flash technology is all about politics. Apple doesn't want its mainstream video delivery system controlled by a third party. So Mr. Jobs backs up his politics with tidbits of technical truths. However, discovering the real truths about HTML5 and Flash is a bit harder, as this survey shows. On February 11th, I wrote an editorial, "What Should Apple Do About Adobe?" Part of the discussion related to Adobe's Flash Player on the Mac, updates and security. Inevitably, the comments escalated to a discussion of Steve Jobs's distaste for and blocking of Flash on the iPhone and iPad. The question is: is Apple's stance against Flash justified? Of course, any political argument needs only the barest of ideological arguments to sustain itself. More to the point is: can Apple fight this war and win based on the state of the art with HTML5? Again, Apple's CEO must believe he can win this war. There has to be some technical basis for that, or the war wouldn't be waged.
Gary Edwards

Kaazing | Kaazing WebSocket Gateway - 0 views

  •  
    Kaazing WebSocket Gateway is the world's only enterprise solution for full-duplex, high-performance communication over the Web using the HTML5 WebSocket standard. Designed to be the next evolutionary step in web communication, HTML5 WebSocket addresses the problems inherent with traditional Ajax and Comet solutions today. True real-time connectivity in the browser and on mobile devices is now a reality thanks to this exciting new standard. Kaazing WebSocket Gateway delivers these features and benefits with the performance, scalability, robustness, and security that enterprises demand.
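The full-duplex point is the crux: with WebSocket, either end may transmit at any time, whereas Ajax polling only moves data when the client asks. A toy in-memory model of that message flow (no networking; the class below is invented for illustration, standing in for the HTML5 WebSocket API):

```typescript
// Toy full-duplex channel: both ends can send unprompted, unlike
// request/response Ajax where the server must wait to be polled.
type Handler = (msg: string) => void;

class DuplexChannel {
  private aHandlers: Handler[] = []; // deliver here when B sends
  private bHandlers: Handler[] = []; // deliver here when A sends
  sendFromA(msg: string): void { this.bHandlers.forEach(h => h(msg)); }
  sendFromB(msg: string): void { this.aHandlers.forEach(h => h(msg)); }
  onA(h: Handler): void { this.aHandlers.push(h); }
  onB(h: Handler): void { this.bHandlers.push(h); }
}

const ch = new DuplexChannel();
const clientInbox: string[] = [];
ch.onA(msg => clientInbox.push(msg)); // A plays the browser/client side

ch.sendFromB("price update");         // server pushes with no request pending
ch.sendFromB("another update");
console.log(clientInbox);             // [ 'price update', 'another update' ]
```

In a real page the equivalent would be new WebSocket("wss://...") plus an onmessage handler; the gateway's job is to carry exactly this kind of unsolicited server push at scale.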