Open Web - Group items tagged Code

Gary Edwards

FeedHenry Secures $9M Funding Led By Intel Capital To Feed Boom in Mobile Enterprise | ...

  • FeedHenry provides a cloud Mobile Application Platform that simplifies the development, integration, deployment and management of secure mobile apps for business. This mobile platform-as-a-service (PaaS) allows apps to be developed in HTML5, JavaScript, and CSS and deployed to multiple mobile devices from a single code base. The node.js backend service offers a complete range of APIs designed to simplify and secure the connectivity of mobile apps to backend and third-party systems. The platform can be deployed to private, public or hybrid clouds. FeedHenry's PaaS offers developers speed of development, instant scalability, device and cloud independence, and the ability to easily integrate to backend information.

    If, say, a company uses both SharePoint and Salesforce inside a mobile app, getting that data into one app requires multiple levels of API integration. Because of the enormous boom in mobile and tablet apps, so-called 'back-end as a service' (BaaS) platforms like FeedHenry - which solve these problems - are expanding rapidly. Thus, today FeedHenry has secured $9M (€7M) in a funding round led by Intel Capital, alongside a "seven figure" investment from existing investor Kernel Capital. Other existing investors VMware Inc., Enterprise Ireland and private investors also participated and were joined by new investment from ACT Venture Capital. The funds will be used on an international roll-out. FeedHenry's mobile application platform - built between Ireland and the U.S. - helps businesses build mobile apps that integrate securely to their business through the cloud. This is a competitive market that includes StackMob, Usergrid, Appcelerator, Sencha.io, Applicasa, Parse, CloudMine, CloudyRec, iKnode, yorAPI, Buddy and ScottyApp.
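    A minimal sketch of the backend consolidation described above - one Node.js endpoint fanning out to SharePoint and Salesforce so the mobile app makes a single call. The URLs, tokens, and response shape are assumptions for illustration, not FeedHenry's actual API:

        // Hypothetical sketch: one mobile-facing endpoint aggregating two
        // backend systems, so the app needs a single integration point.
        const http = require('http');

        async function fetchJson(url, token) {
          // Node 18+ ships a global fetch
          const res = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
          if (!res.ok) throw new Error(`upstream ${url} failed: ${res.status}`);
          return res.json();
        }

        http.createServer(async (req, res) => {
          try {
            // Two upstream integrations on the server, one round trip for the app.
            const [crm, docs] = await Promise.all([
              fetchJson('https://example.my.salesforce.com/services/data/v58.0/query?q=SELECT+Name+FROM+Account', process.env.SF_TOKEN),
              fetchJson('https://example.sharepoint.com/_api/web/lists', process.env.SP_TOKEN),
            ]);
            res.writeHead(200, { 'Content-Type': 'application/json' });
            res.end(JSON.stringify({ accounts: crm, lists: docs }));
          } catch (err) {
            res.writeHead(502);
            res.end(JSON.stringify({ error: err.message }));
          }
        }).listen(3000);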
Paul Merrell

AT&T Mobility LLC, et al v. AU Optronics Corp., et al :: Ninth Circuit :: US Courts of ...

  • Justia.com Opinion Summary: Plaintiffs alleged that they purchased billions of dollars worth of mobile handsets containing defendants' LCD panels and that the prices they paid for those handsets were artificially inflated because defendants had orchestrated a global conspiracy to fix the prices of LCD panels. The district court certified to the court pursuant to 28 U.S.C. 1292(b) "the question whether the application of California antitrust law to claims against defendants based on purchases that occurred outside California would violate the Due Process Clause of the United States Constitution." Because the underlying conduct in this case involved not just the indirect purchase of price-fixed goods, but also the conspiratorial conduct that led to the sale of those goods, the court answered in the negative. To the extent a defendant's conspiratorial conduct was sufficiently connected to California, and was not "slight and casual," the application of California law to that conduct was "neither arbitrary nor fundamentally unfair," and the application of California law did not violate that defendant's rights under the Due Process Clause. Therefore, the court reversed the district court's order dismissing plaintiffs' California law claims and remanded for further proceedings.
  • This page includes the opinion of the Ninth U.S. Circuit Court of Appeals on an interlocutory appeal from a district court decision to dismiss two California state law causes of action from an ongoing case, leaving only the federal law causes of action. The Ninth Circuit disagreed, vacated the district court's decision, and remanded for consideration of the dismissal issue under the correct legal standard. This was a pro-plaintiff decision that makes it very likely that the case will continue with the state law causes of action reinstated against all or nearly all defendants. This is an unusually important price-fixing case with potentially disruptive effect among mobile device component manufacturers and, through a settlement or judgment's ripple effects, among manufacturers of other device components globally. Plaintiffs are several major voice/data communications services in the U.S., with the defendants being virtually all of the manufacturers of LCD panels used in mobile telephones. One must suspect that if price-fixing is in fact universal in the LCD panel manufacturing industry, price-fixing is likely common among manufacturers of other device components. According to the Ninth Circuit opinion, the plaintiffs' amended complaint includes detailed allegations of specific price-fixing agreements and price-sharing actions by principals or agents of each individual defendant company committed within the State of California, which suggests that plaintiffs have very strong evidence that the alleged conspiracy exists. This is a case to watch.
Paul Merrell

The NSA Leak Is Real, Snowden Documents Confirm

  • On Monday, a hacking group calling itself the “ShadowBrokers” announced an auction for what it claimed were “cyber weapons” made by the NSA. Based on never-before-published documents provided by the whistleblower Edward Snowden, The Intercept can confirm that the arsenal contains authentic NSA software, part of a powerful constellation of tools used to covertly infect computers worldwide. The provenance of the code has been a matter of heated debate this week among cybersecurity experts, and while it remains unclear how the software leaked, one thing is now beyond speculation: The malware is covered with the NSA’s virtual fingerprints and clearly originates from the agency. The evidence that ties the ShadowBrokers dump to the NSA comes in an agency manual for implanting malware, classified top secret, provided by Snowden, and not previously available to the public. The draft manual instructs NSA operators to track their use of one malware program using a specific 16-character string, “ace02468bdf13579.” That exact same string appears throughout the ShadowBrokers leak in code associated with the same program, SECONDDATE.
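    The attribution here reduces to string matching at scale. A rough sketch (not The Intercept's actual tooling) of scanning a dump for that identifier:

        // Hypothetical sketch: recursively scan a directory of leaked files for
        // the 16-character tracking string the NSA manual ties to SECONDDATE.
        const fs = require('fs');
        const path = require('path');

        const MARKER = 'ace02468bdf13579';

        function scan(dir) {
          for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
            const p = path.join(dir, entry.name);
            if (entry.isDirectory()) {
              scan(p);
            } else if (fs.readFileSync(p, 'latin1').includes(MARKER)) {
              // 'latin1' keeps binary bytes intact for the string search
              console.log('marker found in', p);
            }
          }
        }

        scan(process.argv[2] || '.');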
Paul Merrell

EFF to Court: Don't Undermine Legal Protections for Online Platforms that Enable Free S...

  • EFF filed a brief in federal court arguing that a lower court’s ruling jeopardizes the online platforms that make the Internet a robust platform for users’ free speech. The brief, filed in the U.S. Court of Appeals for the Ninth Circuit, argues that 47 U.S.C. § 230, enacted as part of the Communications Decency Act and known simply as “Section 230,” broadly protects online platforms, including review websites, when they aggregate or otherwise edit users’ posts. Generally, Section 230 provides legal immunity for online intermediaries that host or republish speech by protecting them against a range of laws that might otherwise be used to hold them legally responsible for what others say and do. Section 230’s immunity directly led to the development of the platforms everyone uses today, allowing people to upload videos to their favorite platforms such as YouTube, as well as leave reviews on Amazon or Yelp. It also incentivizes the creation of new platforms that can host users’ content, leading to more innovation that enables the robust free speech found online. The lower court’s decision in Consumer Cellular v. ConsumerAffairs.com, however, threatens to undermine the broad protections of Section 230, EFF’s brief argues.
  • In the case, Consumer Cellular alleged, among other things, that ConsumerAffairs.com should be held liable for aggregating negative reviews about its business into a star rating. It also alleged that ConsumerAffairs.com edited or otherwise deleted certain reviews of Consumer Cellular in bad faith. Courts and the text of Section 230, however, plainly allow platforms to edit or aggregate user-generated content into summaries or star ratings without incurring legal liability, EFF’s brief argues. It goes on: “And any function protected by Section 230 remains so regardless of the publisher’s intent.” By allowing Consumer Cellular’s claims against ConsumerAffairs.com to proceed, the lower court seriously undercut Section 230’s legal immunity for online platforms. If the decision is allowed to stand, EFF’s brief argues, then platforms may take steps to further censor or otherwise restrict user content out of fear of being held liable. That outcome, EFF warns, could seriously diminish the Internet’s ability to serve as a diverse forum for free speech. The Internet is constructed of and depends upon intermediaries. The many varied online intermediary platforms, including Twitter, Reddit, YouTube, and Instagram, all give a single person, with minimal resources, almost anywhere in the world the ability to communicate with the rest of the world. Without intermediaries, that speaker would need technical skill and money that most people lack to disseminate their message. If our legal system fails to robustly protect intermediaries, it fails to protect free speech online.
Gary Edwards

The Man Who Makes the Future: Wired Icon Marc Andreessen | Epicenter | Wired.com

  • Must-read interview. Marc Andreessen explains his five big ideas, taking us from the beginning of the Web into the Cloud and beyond. Great stuff!
    (1) 1992 - Everyone Will Have the Web
    (2) 1995 - The Browser will be the Operating System
    (3) 1999 - Web business will live in the Cloud
    (4) 2004 - Everything will be Social
    (5) 2009 - Software will Eat the World

    Excerpt: Technology is like water; it wants to find its level. So if you hook up your computer to a billion other computers, it just makes sense that a tremendous share of the resources you want to use - not only text or media but processing power too - will be located remotely. People tend to think of the web as a way to get information or perhaps as a place to carry out ecommerce. But really, the web is about accessing applications. Think of each website as an application, and every single click, every single interaction with that site, is an opportunity to be on the very latest version of that application. Once you start thinking in terms of networks, it just doesn't make much sense to prefer local apps, with downloadable, installable code that needs to be constantly updated.

    "We could have built a social element into Mosaic. But back then the Internet was all about anonymity."
    Anderson: Assuming you have enough bandwidth.

    Andreessen: That's the very big if in this equation. If you have infinite network bandwidth, if you have an infinitely fast network, then this is what the technology wants. But we're not yet in a world of infinite speed, so that's why we have mobile apps and PC and Mac software on laptops and phones. That's why there are still Xbox games on discs. That's why everything isn't in the cloud. But eventually the technology wants it all to be up there.

    Anderson: Back in 1995, Netscape began pursuing this vision by enabling the browser to do more.

    Andreessen: We knew that you would need some pro
Paul Merrell

DARPA seeks the Holy Grail of search engines

  • The scientists at DARPA say the current methods of searching the Internet for all manner of information just won't cut it in the future. Today the agency announced a program that would aim to totally revamp Internet search and "revolutionize the discovery, organization and presentation of search results." Specifically, the goal of DARPA's Memex program is to develop software that will enable domain-specific indexing of public web content and domain-specific search capabilities. According to the agency the technologies developed in the program will also provide the mechanisms for content discovery, information extraction, information retrieval, user collaboration, and other areas needed to address distributed aggregation, analysis, and presentation of web content.
  • Memex also aims to produce search results that are more immediately useful to specific domains and tasks, and to improve the ability of military, government and commercial enterprises to find and organize mission-critical publicly available information on the Internet. "The current one-size-fits-all approach to indexing and search of web content limits use to the business case of web-scale commercial providers," the agency stated.
  • The Memex program will address the need to move beyond a largely manual process of searching for exact text in a centralized index, including overcoming shortcomings such as:
    - limited scope and richness of indexed content, which may not include relevant components of the deep web such as temporary pages, pages behind forms, etc.;
    - an impoverished index, which may not include shared content across pages, normalized content, automatic annotations, content aggregation, analysis, etc.;
    - basic search interfaces, where every session is independent, there is no collaboration or history beyond the search term, and nearly exact text input is required;
    - standard practice for interacting with the majority of web content, which remains one-at-a-time manual queries that return federated lists of results.
    Memex would ultimately apply to any public domain content; initially, DARPA said it intends to develop Memex to address a key Defense Department mission: fighting human trafficking. Human trafficking is a factor in many types of military, law enforcement and intelligence investigations and has a significant web presence to attract customers. The use of forums, chats, advertisements, job postings, hidden services, etc., continues to enable a growing industry of modern slavery. An index curated for the counter-trafficking domain, along with configurable interfaces for search and analysis, would enable new opportunities to uncover and defeat trafficking enterprises.
  • DARPA said the Memex program gets its name and inspiration from a hypothetical device described in "As We May Think," a 1945 article for The Atlantic Monthly written by Vannevar Bush, director of the U.S. Office of Scientific Research and Development (OSRD) during World War II. Envisioned as an analog computer to supplement human memory, the memex (a combination of "memory" and "index") would store and automatically cross-reference all of the user's books, records and other information. This cross-referencing, which Bush called associative indexing, would enable users to quickly and flexibly search huge amounts of information and more efficiently gain insights from it. The memex presaged and encouraged scientists and engineers to create hypertext, the Internet, personal computers, online encyclopedias and other major IT advances of the last seven decades, DARPA stated.
  •  
    DoD announces that they want to go beyond Google. Lots more detail in the proposal description linked from the article. Interesting tidbits: [i] the dark web is a specific target; [ii] they want the ability to crawl web pages blocked by robots.txt; [iii] they want to be able to search page source code and comments. 
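    Item [iii] is the technically interesting one: indexing what conventional crawlers throw away. A toy sketch of that idea (assumptions: Node 18+ with global fetch; regexes are a crude stand-in for real HTML parsing):

        // Toy domain-specific indexer: keep the HTML comments and inline
        // script source that a normal web-scale index would discard.
        async function indexPageSource(url) {
          const res = await fetch(url);
          const html = await res.text();
          const comments = [...html.matchAll(/<!--([\s\S]*?)-->/g)].map(m => m[1].trim());
          const scripts = [...html.matchAll(/<script[^>]*>([\s\S]*?)<\/script>/gi)].map(m => m[1].trim());
          return { url, comments, scripts };
        }

        indexPageSource('https://example.com')
          .then(doc => console.log(JSON.stringify(doc, null, 2)))
          .catch(console.error);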
Paul Merrell

Testosterone Pit - Home - The Other Reason Why IBM Throws A Billion At Linux ...

  • IBM announced today that it would throw another billion at Linux, the open-source operating system, to run its Power System servers. The first time it had thrown a billion at Linux was in 2001, when Linux was a crazy, untested, even ludicrous proposition for the corporate world. So the moolah back then didn’t go to Linux itself, which was free, but to related technologies across hardware, software, and service, including things like sales and advertising – and into IBM’s partnership with Red Hat which was developing its enterprise operating system, Red Hat Enterprise Linux. “It helped start a flurry of innovation that has never slowed,” said Jim Zemlin, executive director of the Linux Foundation. IBM claims that the investment would “help clients capitalize on big data and cloud computing with modern systems built to handle the new wave of applications coming to the data center in the post-PC era.” Some of the moolah will be plowed into the Power Systems Linux Center in Montpellier, France, which opened today. IBM’s first Power Systems Linux Center opened in Beijing in May. IBM may be trying to make hay of the ongoing revelations that have shown that the NSA and other intelligence organizations in the US and elsewhere have roped in American tech companies of all stripes with huge contracts to perfect a seamless spy network. They even include physical aspects of surveillance, such as license plate scanners and cameras, which are everywhere [read.... Surveillance Society: If You Drive, You Get Tracked].
  • Then another boon for IBM. Experts at the German Federal Office for Security in Information Technology (BSI) determined that Windows 8 is dangerous for data security. It allows Microsoft to control the computer remotely through a “special surveillance chip,” the wonderfully named Trusted Platform Module (TPM), and a backdoor in the software – with keys likely accessible to the NSA and possibly other third parties, such as the Chinese. Risks: “Loss of control over the operating system and the hardware” [read.... LEAKED: German Government Warns Key Entities Not To Use Windows 8 – Links The NSA].
  • It would be an enormous competitive advantage for an IBM salesperson to walk into a government or corporate IT department and sell Big Data servers that don’t run on Windows, but on Linux. With the Windows 8 debacle now in public view, IBM salespeople don’t even have to mention it. In the hope of stemming the pernicious revenue decline their employer has been suffering from, they can politely and professionally hype the security benefits of IBM’s systems and mention in passing the comforting fact that some of it would be developed in the Power Systems Linux Centers in Montpellier and Beijing. Alas, Linux too is tarnished. The backdoors are there, though the code can be inspected, unlike Windows code. And then there is Security-Enhanced Linux (SELinux), which was integrated into the Linux kernel in 2003. It provides a mechanism for supporting “access control” (a backdoor) and “security policies.” Who developed SELinux? Um, the NSA – which helpfully discloses some details on its own website (emphasis mine): The results of several previous research projects in this area have yielded a strong, flexible mandatory access control architecture called Flask. A reference implementation of this architecture was first integrated into a security-enhanced Linux® prototype system in order to demonstrate the value of flexible mandatory access controls and how such controls could be added to an operating system. The architecture has been subsequently mainstreamed into Linux and ported to several other systems, including the Solaris™ operating system, the FreeBSD® operating system, and the Darwin kernel, spawning a wide range of related work.
  • Among a slew of American companies who contributed to the NSA’s “mainstreaming” efforts: Red Hat. And IBM? Like just about all of our American tech heroes, it looks at the NSA and other agencies in the Intelligence Community as “the Customer” with deep pockets, ever increasing budgets, and a thirst for technology and data. Which brings us back to Windows 8 and TPM. A decade ago, a group was established to develop and promote Trusted Computing that governs how operating systems and the “special surveillance chip” TPM work together. And it too has been cooperating with the NSA. The founding members of this Trusted Computing Group, as it’s called facetiously: AMD, Cisco, Hewlett-Packard, Intel, Microsoft, and Wave Systems. Oh, I almost forgot ... and IBM. And so IBM might not escape, despite its protestations and slick sales presentations, the suspicion by foreign companies and governments alike that its Linux servers too have been compromised – like the cloud products of other American tech companies. And now, they’re going to pay a steep price for their cooperation with the NSA. Read...  NSA Pricked The “Cloud” Bubble For US Tech Companies
Gary Edwards

WhiteHat Aviator - The most secure browser online

  • "FREQUENTLY ASKED QUESTIONS
    What is WhiteHat Aviator? WhiteHat Aviator is the most secure, most private Web browser available anywhere. By default, it provides an easy way to bank, shop, and use social networks while stopping viruses from infecting computers, preventing accounts from being hacked, and blocking advertisers from invisibly spying on every click.
    Why do I need a secure Web browser? According to CA Technologies, 84 percent of hacker attacks in 2009 took advantage of vulnerabilities in Web browsers. Similarly, Symantec found that four of the top five vulnerabilities being exploited were client-side vulnerabilities that were frequently targeted by Web-based attacks. The fact is that when you visit any website you run the risk of having your surfing history, passwords, real name, workplace, home address, phone number, email, gender, political affiliation, sexual preferences, income bracket, education level, and medical history stolen - and your computer infected with viruses. Sadly, this happens on millions of websites every day. Before you have any chance at protecting yourself, other browsers force you to follow complicated how-to guides, modify settings that only serve advertising empires and install obscure third-party software.
    What makes WhiteHat Aviator so secure? WhiteHat Aviator is built on Chromium, the same open-source foundation used by Google Chrome. Chromium has several unique, powerful security features. One is a "sandbox" that prevents websites from stealing files off your computer or infecting it with viruses. As good as Chromium is, we went much further to create the safest online experience possible. WhiteHat Aviator comes ready-to-go with hardened security and privacy settings, giving hackers less to work with. And our browser downloads to you - without any hidden user-tracking functionality. Our default search engine is DuckDuckGo - not Google, which logs your activity. For good measure, Aviator integrates Disconnect
Paul Merrell

2nd Cir. Affirms That Creation of Full-Text Searchable Database of Works Is Fair Use | ...

  • The fair use doctrine permits the unauthorized digitization of copyrighted works in order to create a full-text searchable database, the U.S. Court of Appeals for the Second Circuit ruled June 10. Affirming summary judgment in favor of a consortium of university libraries, the court also ruled that the fair use doctrine permits the unauthorized conversion of those works into accessible formats for use by persons with disabilities, such as the blind.
  • The dispute is connected to the long-running conflict between Google Inc. and various authors of books that Google included in a mass digitization program. In 2004, Google began soliciting the participation of publishers in its Google Print for Publishers service, part of what was then called the Google Print project, aimed at making information available for free over the Internet. Subsequently, Google announced a new project, Google Print for Libraries. In 2005, Google Print was renamed Google Book Search and it is now known simply as Google Books. Under this program, Google made arrangements with several of the world's largest libraries to digitize the entire contents of their collections to create an online full-text searchable database. The announcement of this program triggered a copyright infringement action by the Authors Guild that continues to this day.
  • Turning to the fair use question, the court first concluded that the full-text search function of the HathiTrust Digital Library was a “quintessentially transformative use,” and thus constituted fair use. The court said: the result of a word search is different in purpose, character, expression, meaning, and message from the page (and the book) from which it is drawn. Indeed, we can discern little or no resemblance between the original text and the results of the HDL full-text search. There is no evidence that the Authors write with the purpose of enabling text searches of their books. Consequently, the full-text search function does not “supersede[ ] the objects [or purposes] of the original creation.” Turning to the fourth fair use factor — whether the use functions as a substitute for the original work — the court rejected the argument that such use represents lost sales to the extent that it prevents the future development of a market for licensing copies of works to be used in full-text searches. However, the court emphasized that the search function “does not serve as a substitute for the books that are being searched.”
  • Part of the deal between Google and the libraries included an offer by Google to hand over to the libraries their own copies of the digitized versions of their collections. In 2011, a group of those libraries announced the establishment of a new service, called the HathiTrust Digital Library, to which the libraries would contribute their digitized collections. This database of copies is to be made available for full-text searching and preservation activities. Additionally, it is intended to offer free access to works to individuals who have “print disabilities.” For works under copyright protection, the search function would return only a list of page numbers that a search term appeared on and the frequency of such appearance.
  • The court also rejected the argument that the database represented a threat of a security breach that could result in the full text of all the books becoming available for anyone to access. The court concluded that HathiTrust's assertions of its security measures were unrebutted. Thus, the full-text search function was found to be protected as fair use.
  • The court also concluded that allowing those with print disabilities access to the full texts of the works collected in the Hathitrust database was protected as fair use. Support for this conclusion came from the legislative history of the Copyright Act's fair use provision, 17 U.S.C. §107.
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant.
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables.
  • GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputing a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days to six months. It stores the content of communications for a shorter period of time, varying between three to 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated from January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
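    In code terms, the content/metadata line the documents draw is roughly hostname versus full address. A toy illustration using the article's own example URL:

        // The split described above: host alone is "metadata",
        // host plus path is "content".
        const visit = new URL('https://www.gchq.gov.uk/what_we_do');

        const metadata = visit.hostname;                  // www.gchq.gov.uk
        const content = visit.hostname + visit.pathname;  // www.gchq.gov.uk/what_we_do

        console.log({ metadata, content });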
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explain in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
Gary Edwards

Apple's HTML5 Promotion May Backfire - Neil McAllister

  • Return to the bad old days: Many of Apple's demos rely on "experimental" CSS3 properties to work. The exact implementation of these properties has yet to be hammered out, so browser vendors must use their best guess to determine how they should be rendered onscreen. Because of the ambiguity this introduces, it is the custom for browser vendors to attach a vendor-specific prefix to the CSS property names. Firefox uses "moz," while Safari uses "webkit," named for the browser's WebKit rendering engine. This means Web developers who want to use a specific experimental CSS feature must include the vendor-specific properties for each browser they want to support in their style sheets. It's a less than ideal situation, but the actual coding required is trivial. Apple chose not to bother for its HTML5 demo site. That would be bad enough. But Apple's demos don't work on Google's Chrome browser, either -- and Chrome also uses the "webkit" prefix for its experimental CSS3 properties (because it's also based on the WebKit rendering engine). Rather than detecting browser capabilities and degrading the user experience gracefully where features aren't supported -- as is the accepted best practice on modern browsers -- Apple chose to deliberately screen out any browser that doesn't self-identify as Safari. That's right: By forcing my browser's user agent string to identify as Safari 5, I was able to view many of the demos just fine in Firefox 3.6 on Windows. Seriously, Apple? I thought we left elaborate browser-detection scripts behind in the bad old days of the 1990s. I can't imagine anyone would want to start up the practice again, let alone one of the leading companies in the development of next-generation Web standards.
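    The accepted practice McAllister refers to is feature detection rather than user-agent screening. A minimal browser-side sketch (the property and prefix names are illustrative):

        // Check whether any vendor-prefixed variant of a CSS property exists,
        // then degrade gracefully instead of blocking unknown browsers.
        function supportsCssProperty(name) {
          const prefixes = ['', 'Webkit', 'Moz', 'O', 'ms'];
          const probe = document.createElement('div').style;
          return prefixes.some(prefix => {
            const candidate = prefix
              ? prefix + name.charAt(0).toUpperCase() + name.slice(1)
              : name;
            return candidate in probe;
          });
        }

        if (supportsCssProperty('transform')) {
          document.body.style.setProperty('transform', 'rotate(1deg)');
        } else {
          // fall back to a static presentation; don't screen the browser out
        }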
Gary Edwards

Sencha creates touch-screen UI development framework - SD Times: Software Development News

  • Ext JS, said Mullany, includes numerous UI elements and handlers that are built entirely from CSS3, HTML5 and JavaScript. As a result, applications built with Ext JS can run on any WebKit-based browser. Both Android and the iPhone use WebKit-based browsers, and RIM should soon offer one as well for BlackBerry users. Mullany said each interface element is built on top of CSS, and can therefore be skinned and modified by designers after creation. He also said this limits the size of the code that must be embedded in each page with Ext JS elements. Ext JS is available for free under the GPL. For commercial users, the software costs US$1,000 per developer per year.
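    The programming model is plain JavaScript configuration objects rendered to the DOM. A small Ext JS-flavored sketch (signatures vary by version; treat this as the shape of the API, not exact code):

        // Ext JS-style declarative widget: pure CSS/HTML/JS, no plugins.
        Ext.onReady(function () {
          new Ext.Button({
            text: 'Tap me',
            renderTo: Ext.getBody(),
            handler: function () {
              Ext.Msg.alert('Hello', 'Rendered entirely from HTML, CSS and JS.');
            }
          });
        });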
Gary Edwards

Yahoo Could Be A Big Winner In The Battle For Developers

  • Last year Bartz vented to me about Yahoo's infrastructure problems - the company, she explained, was a compilation of fundamentally disconnected vertical silos, each with its own P&L, codebase, infrastructure, and culture. It was nearly impossible to roll out products that cut across, say, Mail, Homepage, Finance, IM, Search, and Flickr, because each instance required custom integration and coding. Yahoo was literally broken underneath, even as it looked consistent at the UI layer. Add in the issues of internationalization, and you went from nearly impossible to "not even worth considering." That meant stagnation, and on more than one axis. For one, it means it's very hard to find leverage between your internal resources, or to roll out new products that build on more than one stack. For another, it means it's next to impossible to open your company's resources up to third-party developers (there's that word) who might want to add value to the ecosystem you've created.
Gary Edwards

RealObjects: Next Generation HTML-CSS Online Editor

  • Advanced XML, HTML5, XHTML and CSS3 editing with conversion to PDF, PDF/A and SVG. Excellent stuff. Good case studies. Lots of tools and document source code examples.
Gary Edwards

Topix Weblog: The Secret Source of Google's Power

  • Incredible, despite the title. It's the platform, stupid!

    Excerpt: Much is being written about Gmail, Google's new free webmail system. There's something deeper to learn about Google from this product than the initial reaction to the product features, however. Ignore for a moment the observations about Google leapfrogging their competitors with more user value and a new feature or two. Or Google diversifying away from search into other applications; they've been doing that for a while. Or the privacy red herring. No, the story is about seemingly incremental features that are actually massively expensive for others to match, and the platform that Google is building which makes it cheaper and easier for them to develop and run web-scale applications than anyone else. I've written before about Google's snippet service, which required that they store the entire web in RAM. All so they could generate a slightly better page excerpt than other search engines. Google has taken the last 10 years of systems software research out of university labs, and built their own proprietary, production-quality system. What is this platform that Google is building? It's a distributed computing platform that can manage web-scale datasets on 100,000-node server clusters. It includes a petabyte, distributed, fault-tolerant filesystem, distributed RPC code, probably network shared memory and process migration. And a datacenter management system which lets a handful of ops engineers effectively run 100,000 servers. Any of these projects could be the sole focus of a startup.
Gary Edwards

Google's HTML5 Crush | PCMag.com

  • Google I/O, on the other hand, is about more than just the Chrome browser - which was barely mentioned in the keynote. Mobile analyst Sascha Segan had a theory about Google's seeming HTML5 obsession. It's an open "standard." Talking about standards makes government regulatory bodies happy. Google, which grows bigger and more powerful by the minute, is under almost constant scrutiny - look at the trouble it's having completing its AdMob acquisition. If you talk open standards, the feds may assume that you're a company looking to do no harm and to work in harmony with everyone else. It's not a bad theory, but I don't buy it. When looked at alongside other announcements Google made yesterday, you see a company trying to rebuild the Web in its own image. Google wants you to use HTML5, but, like Microsoft, it likely wants you to build things its way. Don't be surprised if little pet tags start to creep in from all interested parties. And then there's video. Google introduced a brand new video codec that'll work, naturally, with HTML5 and, conceivably, Flash. It's called VP8. (A quick browser-support probe is sketched below.)
  • Adobe has already announced that they'll be adding VP8 support to Flash.
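    Whether a given browser will actually play VP8 through an HTML5 video element can be checked with the standard canPlayType probe; the codec string below is the conventional one for VP8-in-WebM:

        // Probe HTML5 video support for VP8 inside the WebM container.
        const video = document.createElement('video');

        // canPlayType returns "", "maybe", or "probably".
        const verdict = video.canPlayType('video/webm; codecs="vp8, vorbis"');

        console.log(verdict || 'no VP8/WebM support in this browser');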
Gary Edwards

Avatron Software: Air Sharing of Documents iPhone and iPAD

  • Viewing and printing of documents. Support for PDF, RTF, RTFD, iWork, MS Office (subject to iOS compatibility), Web archives, HTML, text, source code, and standard iOS multimedia. No discussion yet as to whether or not Avatron can support visual fixed/flow viewing of these supported formats.

    Some interesting support for mounting remote file servers - cloud storage systems like DropBox, Box.net, FTP and secure HTTPS. No WebDAV. Seems to be struggling to make that cross-over from iOS device to desktop to cloud-computing connectivity.
Gary Edwards

Mobile Enterprise: Android OS, Best Practices for Developing Mobile Strategies

  • Convert Content for Android OS: Making your content mobile friendly is harder than it sounds. However, more tools are emerging to help companies create content for multiple platforms, from iPads to smartphones, across a variety of operating systems. Recently, AppsGeyser privately launched a web platform that allows you to convert any web content to an Android app. With AppsGeyser, companies can create an Android app three ways:
    - grabbing any website content block or web widget;
    - copying and pasting HTML code, JavaScript, AJAX or Flash;
    - entering the URL of your website.

    Nifty tool for instantly converting web site widgets into Android apps. Looks like a new category of tools to make legacy Web services mobile-ready. Titanium
Gary Edwards

Notepad++ | 5.8.6

  • Notepad++ is a free (as in "free speech" and also as in "free beer") source code editor and Notepad replacement that supports several languages. Running in the MS Windows environment, its use is governed by the GPL License. Excellent recommendation from Marbux! Very powerful but easy-to-use OSS.

    Based on the powerful editing component Scintilla, Notepad++ is written in C++ and uses pure Win32 API and STL, which ensures a higher execution speed and smaller program size. By optimizing as many routines as possible without losing user friendliness, Notepad++ is trying to reduce the world's carbon dioxide emissions: when using less CPU power, the PC can throttle down and reduce power consumption, resulting in a greener environment.
Gary Edwards

GMailr: An Unofficial Javascript API for GMail - ReadWriteCloud

  • Google has pretty much given up on developing a JavaScript API for Gmail. There was once a Greasemonkey script Google developed for Gmail, but that broke and Google shows no sign of fixing it. James Yu is now trying to fix that scenario with GMailr, a JavaScript API for Gmail. It is made from the code he wrote for 0Boxer, an extension for Gmail that turns organizing your inbox into a game. Yu is also a lead developer at Scribd. Yu said developing the API took him on a path fraught with frustrations and dead ends. He writes that there is no supported official JavaScript API for Gmail. The Greasemonkey script is broken, and no one has yet released a frontend API for Gmail. He said he needed access to the various user actions in the UI, as the backend APIs were not going to work as he wished. He decided to write his own library from scratch.
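    Libraries in this space typically expose Gmail UI actions as events. A hypothetical sketch of the pattern - the init signature and event names are invented for illustration, not GMailr's documented API:

        // Hypothetical userscript-style wrapper around Gmail's UI.
        Gmailr.init(function (gmailr) {
          // React to user actions the library observes in Gmail's DOM.
          gmailr.observe('archive', function (thread) {
            console.log('archived thread:', thread.id);
          });

          gmailr.observe('numUnreadChange', function (count) {
            document.title = '(' + count + ') Inbox';
          });
        });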