
Home/ Future of the Web/ Group items tagged bad lead


Gonzalo San Gil, PhD.

Spotify Reminded of uTorrent Past After Branding Grooveshark 'Pirates' | TorrentFreak - 0 views

  •  
    " Andy on November 12, 2014 C: 0 Breaking Spotify's Daniel Ek has poured fuel onto the raging Taylor Swift controversy. While explaining how less availability of Swift's music will lead some to obtain it without paying, Ek labeled rival Grooveshark a 'pirate' service. Now Grooveshark is biting back by reminding Ek that he was once the CEO of uTorrent. " # ! what a bad memory... # ! ... for not to say #disloyal #competence, # ! ... or morbid envy of others success # ! ... where the first one failed... [# ! Or the fiasco of many so-called legit music services with their limited music supply: a few bucks in royalties will never compensate the broad exposition of artists through 'irregular' sites...]
Gary Edwards

ptsefton » OpenOffice.org is bad for the planet - 0 views

  •  
    ptsefton continues his rant that OpenOffice does not support the Open Web. He's been on this rant for so long that I'm wondering whether he really thinks there's a chance the lords of ODF and the OpenOffice source code are listening. In this post he describes how useless it is to submit his findings and frustrations with OOo in a bug report. Pretty funny stuff, even if you do end up joining the Michael Meeks trek along this trail of tears. Maybe there's another way?

    What would happen if pt moved from targeting the not so open OpenOffice, to target governments and enterprises trying to set future information system requirements?

    NY State is next up on this endless list. Most likely they will follow the lessons of exhaustive pilot studies conducted by Massachusetts, California, Belgium, Denmark and England, and end up mandating the use of both open standard "XML" formats, ODF and OOXML.

    The pilots concluded that there was a need for both XML formats, depending on the needs of different departments and workgroups. The pilot studies scream out a general rule of thumb: if your department has day-to-day business processes bound to MSOffice workgroups, then it makes sense to use MSOffice OOXML going forward. If there is no legacy MSOffice-bound workgroup or workflow, it makes sense to move to OpenOffice ODF.

    One thing the pilots make clear is that it is prohibitively costly and disruptive to try to replace MSOffice bound workgroups.

    What NY State might consider is that the Web is going to be an important part of their information systems future. What a surprise. Every pilot recognized and, indeed, emphasized this fact. Yet they fell short of the obvious conclusion: mandating that desktop applications provide native support for Open Web formats, protocols and interfaces!

    What's wrong with insisting that desktop applications and office suites support the rapidly advancing HTML+ technologies as well as the applicat
Gonzalo San Gil, PhD.

Driven by necessity, Mozilla to enable HTML5 DRM in Firefox | Ars Technica - 0 views

  •  
    "Mozilla announced today that it will follow the lead of Microsoft, Google, and Apple and implement support for the contentious HTML5 digital rights management specification called Encrypted Media Extensions (EME)."
Gary Edwards

Skynet rising: Google acquires 512-qubit quantum computer; NSA surveillance to be turne... - 0 views

  •  
    "The ultimate code breakers" If you know anything about encryption, you probably also realize that quantum computers are the secret KEY to unlocking all encrypted files. As I wrote about last year here on Natural News, once quantum computers go into widespread use by the NSA, the CIA, Google, etc., there will be no more secrets kept from the government. All your files - even encrypted files - will be easily opened and read. Until now, most people believed this day was far away. Quantum computing is an "impractical pipe dream," we've been told by scowling scientists and "flat Earth" computer engineers. "It's not possible to build a 512-qubit quantum computer that actually works," they insisted. Don't tell that to Eric Ladizinsky, co-founder and chief scientist of a company called D-Wave. Because Ladizinsky's team has already built a 512-qubit quantum computer. And they're already selling them to wealthy corporations, too. DARPA, Northrup Grumman and Goldman Sachs In case you're wondering where Ladizinsky came from, he's a former employee of Northrup Grumman Space Technology (yes, a weapons manufacturer) where he ran a multi-million-dollar quantum computing research project for none other than DARPA - the same group working on AI-driven armed assault vehicles and battlefield robots to replace human soldiers. .... When groundbreaking new technology is developed by smart people, it almost immediately gets turned into a weapon. Quantum computing will be no different. This technology grants God-like powers to police state governments that seek to dominate and oppress the People.  ..... Google acquires "Skynet" quantum computers from D-Wave According to an article published in Scientific American, Google and NASA have now teamed up to purchase a 512-qubit quantum computer from D-Wave. The computer is called "D-Wave Two" because it's the second generation of the system. The first system was a 128-qubit computer. Gen two
  •  
    Normally, I'd be suspicious of anything published by Infowars because its editors are willing to publish really over-the-top stuff, but: [i] this is subject matter I've maintained an interest in over the years, and I was aware that working quantum computers were imminent; and [ii] the pedigree on this particular information does not trace to Scientific American, as stated in the article.

    I've known Scientific American to publish at least one soothing and lengthy article on the subject of chlorinated dioxin hazard -- my specialty as a lawyer was litigating against chemical companies that generated dioxin pollution -- that was generated by known closet chemical industry advocates long since discredited, was totally lacking in scientific validity, and was contrary to established scientific knowledge. So publication in Scientific American doesn't pack a lot of weight with me.

    But checking the Scientific American linked article, I note that it was reprinted by permission from Nature, a peer-reviewed scientific journal and news organization that I trust much more.

    That said, the InfoWars version is a rewrite that contains lots of information of a sensationalist nature not in the Nature/Scientific American version, so heightened caution is still in order. Check the reprinted Nature version before getting too excited:

    "The D-Wave computer is not a 'universal' computer that can be programmed to tackle any kind of problem. But scientists have found they can usefully frame questions in machine-learning research as optimisation problems.

    "D-Wave has battled to prove that its computer really operates on a quantum level, and that it is better or faster than a conventional computer. Before striking the latest deal, the prospective customers set a series of tests for the quantum computer. D-Wave hired an outside expert in algorithm-racing, who concluded that the speed of the D-Wave Two was above average overall, and that it was 3,600 times faster than a leading conventional comput
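The Nature excerpt quoted above notes that the D-Wave machine is not a universal computer but is aimed at optimisation problems. As a rough illustration of the problem class quantum annealers target, here is a toy brute-force solver for a tiny QUBO (quadratic unconstrained binary optimisation), the form such machines accept. This sketch has nothing to do with D-Wave's actual hardware or programming interface; the matrix values are made up for illustration.

```python
from itertools import product

def solve_qubo(Q):
    """Brute-force minimiser for a small QUBO: find the binary
    vector x minimising sum_ij Q[i][j] * x[i] * x[j]."""
    n = len(Q)
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        e = sum(Q[i][j] * x[i] * x[j]
                for i in range(n) for j in range(n))
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Toy instance: each variable is rewarded for being on (diagonal),
# turning x0 and x1 on together is rewarded (Q[0][1]),
# turning x1 and x2 on together is penalised (Q[1][2]).
Q = [[-1, -1, 0],
     [0, -1, 2],
     [0, 0, -1]]
x, e = solve_qubo(Q)  # -> (1, 1, 0) with energy -3
```

An annealer explores this energy landscape physically instead of enumerating all 2^n assignments, which is the claimed speed advantage under test in the article.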
Paul Merrell

The Great SIM Heist: How Spies Stole the Keys to the Encryption Castle - 0 views

  • AMERICAN AND BRITISH spies hacked into the internal computer network of the largest manufacturer of SIM cards in the world, stealing encryption keys used to protect the privacy of cellphone communications across the globe, according to top-secret documents provided to The Intercept by National Security Agency whistleblower Edward Snowden. The hack was perpetrated by a joint unit consisting of operatives from the NSA and its British counterpart Government Communications Headquarters, or GCHQ. The breach, detailed in a secret 2010 GCHQ document, gave the surveillance agencies the potential to secretly monitor a large portion of the world’s cellular communications, including both voice and data. The company targeted by the intelligence agencies, Gemalto, is a multinational firm incorporated in the Netherlands that makes the chips used in mobile phones and next-generation credit cards. Among its clients are AT&T, T-Mobile, Verizon, Sprint and some 450 wireless network providers around the world. The company operates in 85 countries and has more than 40 manufacturing facilities. One of its three global headquarters is in Austin, Texas and it has a large factory in Pennsylvania. In all, Gemalto produces some 2 billion SIM cards a year. Its motto is “Security to be Free.”
  • With these stolen encryption keys, intelligence agencies can monitor mobile communications without seeking or receiving approval from telecom companies and foreign governments. Possessing the keys also sidesteps the need to get a warrant or a wiretap, while leaving no trace on the wireless provider’s network that the communications were intercepted. Bulk key theft additionally enables the intelligence agencies to unlock any previously encrypted communications they had already intercepted, but did not yet have the ability to decrypt.
  • Leading privacy advocates and security experts say that the theft of encryption keys from major wireless network providers is tantamount to a thief obtaining the master ring of a building superintendent who holds the keys to every apartment. “Once you have the keys, decrypting traffic is trivial,” says Christopher Soghoian, the principal technologist for the American Civil Liberties Union. “The news of this key theft will send a shock wave through the security community.”
  • According to one secret GCHQ slide, the British intelligence agency penetrated Gemalto’s internal networks, planting malware on several computers, giving GCHQ secret access. We “believe we have their entire network,” the slide’s author boasted about the operation against Gemalto. Additionally, the spy agency targeted unnamed cellular companies’ core networks, giving it access to “sales staff machines for customer information and network engineers machines for network maps.” GCHQ also claimed the ability to manipulate the billing servers of cell companies to “suppress” charges in an effort to conceal the spy agency’s secret actions against an individual’s phone. Most significantly, GCHQ also penetrated “authentication servers,” allowing it to decrypt data and voice communications between a targeted individual’s phone and his or her telecom provider’s network. A note accompanying the slide asserted that the spy agency was “very happy with the data so far and [was] working through the vast quantity of product.”
  • The U.S. and British intelligence agencies pulled off the encryption key heist in great stealth, giving them the ability to intercept and decrypt communications without alerting the wireless network provider, the foreign government or the individual user that they have been targeted. “Gaining access to a database of keys is pretty much game over for cellular encryption,” says Matthew Green, a cryptography specialist at the Johns Hopkins Information Security Institute. The massive key theft is “bad news for phone security. Really bad news.”
  •  
    Remember all those NSA claims that no evidence of their misbehavior has emerged? That one should never take wing again. Monitoring call content without the involvement of any court? Without a warrant? Without probable cause?  Was there even any Congressional authorization?  Wiretapping unequivocally requires a judicially-approved search warrant. It's going to be very interesting to learn the government's argument for this misconduct's legality. 
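The "master ring of keys" analogy above can be made concrete. The sketch below is a toy, not GSM cryptography: real handsets derive A5-family session keys from the SIM's secret key (Ki), whereas this example uses a made-up SHA-256 keystream. It only illustrates the article's core point that symmetric encryption protects nothing from a party who already holds the key: stolen keys decrypt passively captured traffic with no carrier cooperation and no trace.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode. NOT a real cipher --
    it stands in for the keystream a handset and network derive
    from the SIM's secret key."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR with the keystream; the same call encrypts and decrypts."""
    ks = keystream(key, nonce, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

# A handset encrypts traffic under its SIM's key...
ki = b"per-SIM secret key (illustrative)"
nonce = b"session-0001"
ciphertext = xor_crypt(ki, nonce, b"voice frames and signalling")

# ...an eavesdropper who stole Ki decrypts the captured ciphertext
# directly, without touching the wireless provider's network.
recovered = xor_crypt(ki, nonce, ciphertext)
```

This is also why bulk key theft retroactively unlocks previously intercepted recordings: the ciphertext sits on disk until the key arrives.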
Paul Merrell

CISA Security Bill: An F for Security But an A+ for Spying | WIRED - 0 views

  • When the Senate Intelligence Committee passed the Cybersecurity Information Sharing Act by a vote of 14 to 1, committee chairman Senator Richard Burr argued that it successfully balanced security and privacy. Fifteen new amendments to the bill, he said, were designed to protect internet users’ personal information while enabling new ways for companies and federal agencies to coordinate responses to cyberattacks. But critics within the security and privacy communities still have two fundamental problems with the legislation: First, they say, the proposed cybersecurity act won’t actually boost security. And second, the “information sharing” it describes sounds more than ever like a backchannel for surveillance.
  • On Tuesday the bill’s authors released the full, updated text of the CISA legislation passed last week, and critics say the changes have done little to assuage their fears about wanton sharing of Americans’ private data. In fact, legal analysts say the changes actually widen the backdoor leading from private firms to intelligence agencies. “It’s a complete failure to strengthen the privacy protections of the bill,” says Robyn Greene, a policy lawyer for the Open Technology Institute, which joined a coalition of dozens of non-profits and cybersecurity experts criticizing the bill in an open letter earlier this month. “None of the [privacy-related] points we raised in our coalition letter to the committee was effectively addressed.” The central concern of that letter was how the same data sharing meant to bolster cybersecurity for companies and the government opens massive surveillance loopholes. The bill, as worded, lets a private company share with the Department of Homeland Security any information construed as a cybersecurity threat “notwithstanding any other provision of law.” That means CISA trumps privacy laws like the Electronic Communication Privacy Act of 1986 and the Privacy Act of 1974, which restrict eavesdropping and sharing of users’ communications. And once the DHS obtains the information, it would automatically be shared with the NSA, the Department of Defense (including Cyber Command), and the Office of the Director of National Intelligence.
  • In a statement posted to his website yesterday, Senator Burr wrote that “Information sharing is purely voluntary and companies can only share cyber-threat information and the government may only use shared data for cybersecurity purposes.” But in fact, the bill’s data sharing isn’t limited to cybersecurity “threat indicators”—warnings of incoming hacker attacks, which is the central data CISA is meant to disseminate among companies and three-letter agencies. OTI’s Greene says it also gives companies a mandate to share with the government any data related to imminent terrorist attacks, weapons of mass destruction, or even other information related to violent crimes like robbery and carjacking. 
  • The latest update to the bill tacks on yet another kind of information, anything related to impending “serious economic harm.” All of those vague terms, Greene argues, widen the pipe of data that companies can send the government, expanding CISA into a surveillance system for the intelligence community and domestic law enforcement. “If information-sharing legislation does not include adequate privacy protections, then... it’s a surveillance bill by another name,” says Senator Ron Wyden.
  • “CISA goes far beyond [cybersecurity], and permits law enforcement to use information it receives for investigations and prosecutions of a wide range of crimes involving any level of physical force,” reads the letter from the coalition opposing CISA. “The lack of use limitations creates yet another loophole for law enforcement to conduct backdoor searches on Americans—including searches of digital communications that would otherwise require law enforcement to obtain a warrant based on probable cause. This undermines Fourth Amendment protections and constitutional principles.”
  •  
    I read the legislation. It's as bad for privacy as described in the article. And its drafting is incredibly sloppy.
Paul Merrell

This project aims to make '404 not found' pages a thing of the past - 0 views

  • The Internet is always changing. Sites are rising and falling, content is deleted, and bad URLs can lead to '404 Not Found' errors that are as helpful as a brick wall. A new project proposes to do away with dead 404 errors by implementing new HTML code that will help access prior versions of hyperlinked content. With any luck, that means that you’ll never have to run into a dead link again. The “404-No-More” project is backed by a formidable coalition including members from organizations like the Harvard Library Innovation Lab, Los Alamos National Laboratory, Old Dominion University, and the Berkman Center for Internet & Society. Part of the Knight News Challenge, which seeks to strengthen the Internet for free expression and innovation through a variety of initiatives, 404-No-More recently reached the semifinal stage. The project aims to cure so-called link rot, the process by which hyperlinks become useless over time because they point to addresses that are no longer available. If implemented, websites such as Wikipedia and other reference documents would be vastly improved. The new feature would also give Web authors a way to provide links that contain both archived copies of content and specific dates of reference, the sort of information that diligent readers have to hunt down on a website like Archive.org.
  • While it may sound trivial, link rot can actually have real ramifications. Nearly 50 percent of the hyperlinks in Supreme Court decisions no longer work, a 2013 study revealed. Losing footnotes and citations in landmark legal decisions can mean losing crucial information and context about the laws that govern us. The same study found that 70 percent of URLs within the Harvard Law Review and similar journals didn’t link to the originally cited information, considered a serious loss surrounding the discussion of our laws. The project’s proponents have come up with more potential uses as well. Activists fighting censorship will have an easier time combatting government takedowns, for instance. Journalists will be much more capable of researching dynamic Web pages. “If every hyperlink was annotated with a publication date, you could automatically view an archived version of the content as the author intended for you to see it,” the project’s authors explain. The ephemeral nature of the Web could no longer be used as a weapon. Roger Macdonald, a director at the Internet Archive, called the 404-No-More project “an important contribution to preservation of knowledge.”
  • The new feature would come in the form of introducing the mset attribute to the <a> element in HTML, which would allow users of the code to specify multiple dates and copies of content as an external resource. For instance, if both the date of reference and the location of a copy of targeted content are known by an author, the new code would look like this. The 404-No-More project’s goals are numerous, but the ultimate goal is to have mset become a new HTML standard for hyperlinks. “An HTML standard that incorporates archives for hyperlinks will loop in these efforts and make the Web better for everyone,” project leaders wrote, “activists, journalists, and regular ol’ everyday web users.”
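The article's inline code sample did not survive in this excerpt. Below is a hypothetical reconstruction of an mset-annotated link: the attribute name comes from the article, but the URL, date format, and value syntax shown here are illustrative assumptions, not the project's actual specification.

```html
<!-- Hypothetical sketch of the proposed mset attribute: a hyperlink
     carrying both a reference date and the location of an archived
     copy, so a dead href can fall back to the archive. -->
<a href="http://example.com/report"
   mset="2014-06-15 http://archive.example.org/20140615/report">
  the original report
</a>
```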
Gary Edwards

ES4 and the fight for the future of the Open Web - By Haavard - 0 views

  • Here, we have no better theory to explain why Microsoft is enthusiastic to spread C# onto the web via Silverlight, but not to give C# a run for its money in the open web standards by supporting ES4 in IE.The fact is, and we've heard this over late night truth-telling meetings between Mozilla principals and friends at Microsoft, that Microsoft does not think the web needs to change much. Or as one insider said to a Mozilla figure earlier this year: "we could improve the web standards, but what's in it for us?"
  •  
    Microsoft opposes the stunning collection of EcmaScript standards improvements to JavaScript ES3 known as "ES4". Brendan Eich, author of JavaScript and lead Mozilla developer, claims that Microsoft is stalling the advance of JavaScript to protect their proprietary advantages with Silverlight - WPF technologies. Opera developer "Haavard" asks the question, "Why would Microsoft do this?" Brendan Eich explains: Indeed Microsoft does not desire serious change to ES3, and we heard this inside TG1 in April. The words were (from my notes) more like this: "Microsoft does not think the web needs to change much". Except, of course, via Silverlight and WPF, which if not matched by evolution of the open web standards, will spread far and wide on the Web, as Flash already has. And that change to the Web is apparently just fine and dandy according to Microsoft. First, Microsoft does not think the Web needs to change much, but then they give us Silverlight and WPF? An amazing contradiction if I ever saw one. It is obvious that Microsoft wants to lock the Web to their proprietary technologies again. They want Silverlight, not some new open standard which further threatens their locked-in position. They will use dirty tricks - lies and deception - to convince people that they are in the right. Excellent discussion on how Microsoft participates in open standards groups to delay, stall and dumb down the Open Web formats, protocols and interfaces their competitors use. With their applications and services, Microsoft offers users a Hobson's choice: use the stalled, limited and dumbed-down Open Web standards, or use rich, fully featured and advanced but proprietary Silverlight-WPF technologies. Some choice.
Gonzalo San Gil, PhD.

Data snooping blunders by UK spies, cops led to wrongful arrests-watchdog | Ars Technic... - 0 views

  •  
    "IP address mistakes particularly troubling; likely to get worse under Snoopers' Charter. Glyn Moody - Sep 9, 2016 1:24 pm UTC "
Paul Merrell

EFF to Court: Don't Undermine Legal Protections for Online Platforms that Enable Free S... - 0 views

  • EFF filed a brief in federal court arguing that a lower court’s ruling jeopardizes the online platforms that make the Internet a robust platform for users’ free speech. The brief, filed in the U.S. Court of Appeals for the Ninth Circuit, argues that 47 U.S.C. § 230, enacted as part of the Communications Decency Act (known simply as “Section 230”) broadly protects online platforms, including review websites, when they aggregate or otherwise edit users’ posts. Generally, Section 230 provides legal immunity for online intermediaries that host or republish speech by protecting them against a range of laws that might otherwise be used to hold them legally responsible for what others say and do. Section 230’s immunity directly led to the development of the platforms everyone uses today, allowing people to upload videos to their favorite platforms such as YouTube, as well as leave reviews on Amazon or Yelp. It also incentivizes the creation of new platforms that can host users’ content, leading to more innovation that enables the robust free speech found online. The lower court’s decision in Consumer Cellular v. ConsumerAffairs.com, however, threatens to undermine the broad protections of Section 230, EFF’s brief argues.
  • In the case, Consumer Cellular alleged, among other things, that ConsumerAffairs.com should be held liable for aggregating negative reviews about its business into a star rating. It also alleged that ConsumerAffairs.com edited or otherwise deleted certain reviews of Consumer Cellular in bad faith. Courts and the text of Section 230, however, plainly allow platforms to edit or aggregate user-generated content into summaries or star ratings without incurring legal liability, EFF’s brief argues. It goes on: “And any function protected by Section 230 remains so regardless of the publisher’s intent.” By allowing Consumer Cellular’s claims against ConsumerAffairs.com to proceed, the lower court seriously undercut Section 230’s legal immunity for online platforms. If the decision is allowed to stand, EFF’s brief argues, then platforms may take steps to further censor or otherwise restrict user content out of fear of being held liable. That outcome, EFF warns, could seriously diminish the Internet’s ability to serve as a diverse forum for free speech. The Internet is constructed of and depends upon intermediaries. The many varied online intermediary platforms, including Twitter, Reddit, YouTube, and Instagram, all give a single person, with minimal resources, almost anywhere in the world the ability to communicate with the rest of the world. Without intermediaries, that speaker would need technical skill and money that most people lack to disseminate their message. If our legal system fails to robustly protect intermediaries, it fails to protect free speech online.
Gary Edwards

Siding with HTML over XHTML, My Decision to Switch - Monday By Noon - 0 views

  • Publishing content on the Web is in no way limited to professional developers or designers; much of the reason the net is so active is because anyone can make a website. Sure, we (as knowledgeable professionals or hobbyists) all hope to make the Web a better place by doing our part in publishing documents with semantically rich, valid markup, but the reality is that those documents are rare. It’s important to keep in mind the true nature of the Internet: an open platform for information sharing.
  • XHTML2 has some very good ideas that I hope can become part of the web. However, it’s unrealistic to think that all web authors will switch to an XML-based syntax which demands that browsers stop processing the document on the first error. XML’s draconian policy was an attempt to clean up the web. This was done around 1996 when lots of invalid content entered the web. CSS took a different approach: instead of demanding that content isn’t processed, we defined rules for how to handle the undefined. It’s called “forward-compatible parsing” and means we can add new constructs without breaking the old. So, I don’t think XHTML is a realistic option for the masses. HTML 5 is it.
    • Gary Edwards
       
      Great quote from CSS expert Hakon Wium Lie.
  • @marbux: Of course i disagree with your interop assessment, but I wondered how it is that you’re missing the point. I think you confuse web applications with legacy desktop – client/server application model. And that confusion leads to the mistake of trying to transfer the desktop document model to one that could adequately service advancing web applications.
  •  
    A CMS expert argues for HTML over XHTML, explaining his reasons for switching. Excellent read! He nails the basics. For similar reasons, we moved from ODF to ePUB and then to CDF and finally to the advanced WebKit document model, where wikiWORD will make its stand.
  •  
    See also my comment on the same web page that explains why HTML 5 is NOT it for document exchange between web editing applications.
  •  
    Response to marbux supporting the WebKit layout/document model. Marbux argues that HTML5 is not interoperable, and CSS2 near useless; HTML5 fails to provide the interop web applications need. I respond by arguing that the only way to look at web applications is to consider that the browser layout engine is the web application layout engine! Web applications are actually written to the browser layout/document model, OR, to take advantage of browser plug-in capabilities. The interoperability marbux seeks is tied directly to the browser layout engine. In this context, the web format is simply a reflection of that layout engine. If there's an interop problem, it comes from browser madness differentials. The good news is that there are all kinds of efforts to close the browser gap, including WHATWG - HTML5, CSS3, W3C DOM, JavaScript libraries, Google GWT (Java to JavaScript), Yahoo GUI, and my favorite, WebKit. The bad news is that the clock is ticking. Microsoft has pulled the trigger and the great migration of MSOffice client/server systems to the MS WebStack-Mesh architecture has begun. Key to this transition are the WPF-.NET proprietary formats, protocols and interfaces such as XAML, Silverlight, LINQ, and Smart Tags. New business processes are being written, and old legacy desktop-bound processes are being transitioned to this emerging platform. The fight for the Open Web is on, with Microsoft threatening to transition their entire business desktop monopoly to a Web platform they own. ~ge~
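The contrast drawn earlier between XML's draconian error handling and HTML's forward-compatible parsing can be demonstrated with Python's standard-library parsers (Python is incidental here; any strict-XML and tolerant-HTML parser pair shows the same split):

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

broken = "<p><b>unclosed bold</p>"  # invalid: <b> is never closed

# XML's draconian rule: one well-formedness error and processing stops.
try:
    ET.fromstring(broken)
    xml_ok = True
except ET.ParseError:
    xml_ok = False  # the XML parser refuses the document outright

# HTML's forward-compatible rule: recover from the error and keep going.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(broken)  # no exception; both <p> and <b> are seen
```

The XHTML2 proposal would have pushed every page through the first path; HTML 5 standardized the second, which is the argument quoted above for why "HTML 5 is it" for the mass of existing content.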
Gonzalo San Gil, PhD.

Verizon's "deteriorated" phone lines cited in demand for investigation | Ars Technica - 0 views

  •  
    "Deregulation led to higher prices and worse service, New York lawmakers claim. by Jon Brodkin - July 8 2014, 12:02am CEST" [# ! … the '#deregulation' that ONLY #providers want…]
Paul Merrell

NSA Director Finally Admits Encryption Is Needed to Protect Public's Privacy - 0 views

  • NSA Director Finally Admits Encryption Is Needed to Protect Public’s Privacy. The new stance denotes a growing awareness within the government that Americans are not comfortable with the State’s grip on their data. By Carey Wedler | AntiMedia | January 22, 2016
  • Rogers cited the recent Office of Personnel Management hack of over 20 million users as a reason to increase encryption rather than scale it back. “What you saw at OPM, you’re going to see a whole lot more of,” he said, referring to the massive hack that compromised the personal data of 20 million people who obtained background checks. Rogers’ comments, while forward-thinking, signify an about-face in his stance on encryption. In February 2015, he said he “shares [FBI] Director [James] Comey’s concern” about cell phone companies’ decision to add encryption features to their products. Comey has been one of the loudest critics of encryption. However, Rogers’ comments on Thursday now directly conflict with Comey’s stated position. The FBI director has publicly chastised encryption, as well as the companies that provide it. In 2014, he claimed Apple’s then-new encryption feature could lead the world to “a very dark place.” At a Department of Justice hearing in November, Comey testified that “Increasingly, the shadow that is ‘going dark’ is falling across more and more of our work.” Though he claimed, “We support encryption,” he insisted “we have a problem that encryption is crashing into public safety and we have to figure out, as people who care about both, to resolve it. So, I think the conversation’s in a healthier place.”
  • At the same hearing, Comey and Attorney General Loretta Lynch declined to comment on whether they had proof the Paris attackers used encryption. Even so, Comey recently lobbied for tech companies to do away with end-to-end encryption. However, his crusade has fallen on unsympathetic ears, both from the private companies he seeks to control — and from the NSA. Prior to Rogers’ statements in support of encryption Thursday, former NSA chief Michael Hayden said, “I disagree with Jim Comey. I actually think end-to-end encryption is good for America.” Still another former NSA chair has criticized calls for backdoor access to information. In October, Mike McConnell told a panel at an encryption summit that the United States is “better served by stronger encryption, rather than baking in weaker encryption.” Former Department of Homeland Security chief, Michael Chertoff, has also spoken out against government being able to bypass encryption.
  • Regardless of these individual defenses of encryption, the Intercept explained why these statements may be irrelevant: “Left unsaid is the fact that the FBI and NSA have the ability to circumvent encryption and get to the content too — by hacking. Hacking allows law enforcement to plant malicious code on someone’s computer in order to gain access to the photos, messages, and text before they were ever encrypted in the first place, and after they’ve been decrypted. The NSA has an entire team of advanced hackers, possibly as many as 600, camped out at Fort Meade.”
  • Rogers’ statements, of course, are not a full-fledged endorsement of privacy, nor can the NSA be expected to make it a priority. Even so, his new stance denotes a growing awareness within the government that Americans are not comfortable with the State’s grip on their data. “So spending time arguing about ‘hey, encryption is bad and we ought to do away with it’ … that’s a waste of time to me,” Rogers said Thursday. “So what we’ve got to ask ourselves is, with that foundation, what’s the best way for us to deal with it? And how do we meet those very legitimate concerns from multiple perspectives?”