
Home / Future of the Web / Group items tagged Analysis


Paul Merrell

Net neutrality comment fraud will be investigated by government | Ars Technica - 0 views

  • The US Government Accountability Office (GAO) will investigate the use of impersonation in public comments on the Federal Communications Commission's net neutrality repeal. Congressional Democrats requested the investigation last month, and the GAO has granted the request. While the investigation request was spurred by widespread fraud in the FCC's net neutrality repeal docket, Democrats asked the GAO to also "examine whether this shady practice extends to other agency rulemaking processes." The GAO will do just that, having told Democrats in a letter that it will "review the extent and pervasiveness of fraud and the misuse of American identities during federal rulemaking processes."
  • The GAO provides independent, nonpartisan audits and investigations for Congress. The GAO previously agreed to investigate DDoS attacks that allegedly targeted the FCC comment system, also in response to a request by Democratic lawmakers. The Democrats charged that Chairman Ajit Pai's FCC did not provide enough evidence that the attacks actually happened, and they asked the GAO to find out what evidence the FCC used to make its determination. Democrats also asked the GAO to examine whether the FCC is prepared to prevent future attacks. The DDoS investigation should happen sooner than the new one on comment fraud because the GAO accepted that request in October.
  • The FCC's net neutrality repeal received more than 22 million comments, but millions were apparently submitted by bots and falsely attributed to real Americans (including some dead ones) who didn't actually submit comments. Various analyses confirmed the widespread spam and fraud; one analysis found that 98.5 percent of unique comments opposed the repeal plan.
  • ...1 more annotation...
  • The FCC's comment system makes no attempt to verify submitters' identities, and allows bulk uploads so that groups collecting signatures for letters and petitions can get them on the docket easily. It was like that even before Pai took over as chair, but the fraud became far more pervasive in the proceeding that led to the repeal of net neutrality rules. Pai's FCC did not remove any fraudulent comments from the record. Democratic FCC Commissioner Jessica Rosenworcel called for a delay in the net neutrality repeal vote because of the fraud, but the Republican majority pushed the vote through as scheduled last month. New York Attorney General Eric Schneiderman has been investigating the comment fraud and says the FCC has stonewalled the investigation by refusing to provide evidence. Schneiderman is also leading a lawsuit to reverse the FCC's net neutrality repeal, and the comment fraud could play a role in the case. "We understand that the FCC's rulemaking process requires it to address all comments it receives, regardless of who submits them," Congressional Democrats said in their letter requesting a GAO investigation. "However, we do not believe any outside parties should be permitted to generate any comments to any federal governmental entity using information it knows to be false, such as the identities of those submitting the comments."
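The duplicate-detection side of those analyses is straightforward in principle: group comments by normalized text and flag texts submitted under many different names. A minimal, illustrative sketch (the comment data below is invented, and the real analyses also drew on metadata and stylometry, not just exact duplicates):

```python
from collections import Counter

def find_duplicate_comments(comments):
    """Group comments by normalized text and report texts submitted
    more than once -- a crude signal of bulk/bot submission."""
    normalized = [" ".join(c.lower().split()) for c in comments]
    counts = Counter(normalized)
    return {text: n for text, n in counts.items() if n > 1}

comments = [
    "I support net neutrality.",
    "I  support net Neutrality.",   # same text, different spacing/case
    "Repeal the Title II rules.",
    "I support net neutrality.",
]
print(find_duplicate_comments(comments))
# {'i support net neutrality.': 3}
```

Exact-match grouping like this only catches verbatim form letters; bot campaigns that template-vary their wording require fuzzier clustering.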
Paul Merrell

HART: Homeland Security's Massive New Database Will Include Face Recognition, DNA, and ... - 0 views

  • The U.S. Department of Homeland Security (DHS) is quietly building what will likely become the largest database of biometric and biographic data on citizens and foreigners in the United States. The agency’s new Homeland Advanced Recognition Technology (HART) database will include multiple forms of biometrics—from face recognition to DNA, data from questionable sources, and highly personal data on innocent people. It will be shared with federal agencies outside of DHS as well as state and local law enforcement and foreign governments. And yet, we still know very little about it. The records DHS plans to include in HART will chill and deter people from exercising their First Amendment protected rights to speak, assemble, and associate. Data like face recognition makes it possible to identify and track people in real time, including at lawful political protests and other gatherings. Other data DHS is planning to collect—including information about people’s “relationship patterns” and from officer “encounters” with the public—can be used to identify political affiliations, religious activities, and familial and friendly relationships. These data points are also frequently colored by conjecture and bias.
  • DHS currently collects a lot of data. Its legacy IDENT fingerprint database contains information on 220 million unique individuals and processes 350,000 fingerprint transactions every day. This is an exponential increase from 20 years ago, when IDENT contained information on only 1.8 million people. Between IDENT and other DHS-managed databases, the agency manages over 10 billion biographic records and adds 10-15 million more each week.
  • DHS’s new HART database will allow the agency to vastly expand the types of records it can collect and store. HART will support at least seven types of biometric identifiers, including face and voice data, DNA, scars and tattoos, and a blanket category for “other modalities.” It will also include biographic information, like name, date of birth, physical descriptors, country of origin, and government ID numbers. And it will include data we know to be highly subjective, including information collected from officer “encounters” with the public and information about people’s “relationship patterns.”
  • ...1 more annotation...
  • DHS’s face recognition roll-out is especially concerning. The agency uses mobile biometric devices that can identify faces and capture face data in the field, allowing its ICE (immigration) and CBP (customs) officers to scan everyone with whom they come into contact, whether or not those people are suspected of any criminal activity or an immigration violation. DHS is also partnering with airlines and other third parties to collect face images from travelers entering and leaving the U.S. When combined with data from other government agencies, these troubling collection practices will allow DHS to build a database large enough to identify and track all people in public places, without their knowledge—not just in places the agency oversees, like airports, but anywhere there are cameras. Police abuse of facial recognition technology is not a theoretical issue: it’s happening today. Law enforcement has already used face recognition on public streets and at political protests. During the protests surrounding the death of Freddie Gray in 2015, Baltimore Police ran social media photos against a face recognition database to identify protesters and arrest them. Recent Amazon promotional videos encourage police agencies to acquire that company’s face “Rekognition” capabilities and use them with body cameras and smart cameras to track people throughout cities. At least two U.S. cities are already using Rekognition. DHS compounds face recognition’s threat to anonymity and free speech by planning to include “records related to the analysis of relationship patterns among individuals.” We don’t know where DHS or its external partners will be getting these “relationship pattern” records, but they could come from social media profiles and posts, which the government plans to track by collecting social media user names from all foreign travelers entering the country.
David Hart

Web Crawler - 1 views

  •  
    Website scraping is a technique that involves retrieving unstructured data from web pages and converting it into structured data.
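A minimal, stdlib-only sketch of that idea: parse HTML and emit structured records. Real scrapers typically fetch pages over HTTP and use libraries such as BeautifulSoup or Scrapy; the page snippet below is invented for illustration:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Turn unstructured HTML into structured (text, href) records."""
    def __init__(self):
        super().__init__()
        self.links = []          # structured output: (anchor text, href)
        self._href = None        # href of the <a> currently open, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:   # only collect text inside an <a>
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append(("".join(self._text).strip(), self._href))
            self._href = None

page = '<p>See <a href="/docs">the docs</a> and <a href="/faq">FAQ</a>.</p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)
# [('the docs', '/docs'), ('FAQ', '/faq')]
```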
Gonzalo San Gil, PhD.

TTIP on its deathbed, but CETA moves forward despite growing concerns | Ars Technica UK - 0 views

  •  
    "At a key meeting in Bratislava last Friday, EU ministers effectively put the controversial Transatlantic Trade and Investment Partnership (TTIP) negotiations on hold, perhaps forever. "
Paul Merrell

The Ninth Circuit Holds - Correctly - That a Blogger Has the Same Defamation Protection as ... - 0 views

  • On January 17, a three-judge panel of the U.S. Court of Appeals for the Ninth Circuit ruled, as a matter of first impression, that First Amendment defamation rules apply equally to both the institutional press and individual speakers and writers, such as bloggers.
  • In reaching this conclusion, the Ninth Circuit analyzed two key prior Supreme Court precedents: New York Times v. Sullivan (a public official seeking damages for defamation must show “actual malice,” defined as a showing that the defendant published the defamatory statement with knowledge that it was false, or with reckless disregard as to whether it was false or not) and Gertz v. Robert Welch, Inc. (the First Amendment requires only a negligence standard for private defamation actions). Notably, Gertz involved an institutional media defendant, and the Gertz Court invoked the need to shield “the press and broadcast media from the rigors of strict liability for defamation.” Yet neither New York Times nor Gertz, as the Ninth Circuit noted, was expressly limited to the institutional press. Moreover, a number of other Supreme Court cases have rejected such a limitation: Bartnicki v. Vopper; Cohen v. Cowles Media Co.; First National Bank of Boston v. Bellotti; and Citizens United v. Federal Election Commission.
Gary Edwards

Meet OX Text, a collaborative, non-destructive alternative to Google Docs - Tech News a... - 0 views

  • The German software-as-a-service firm Open-Xchange, which provides apps that telcos and other service providers can bundle with their connectivity or hosting products, is adding a cloud-based office productivity toolset called OX Documents to its OX App Suite lineup. Open-Xchange has around 70 million users through its contracts with roughly 80 providers such as 1&1 Internet and Strato. Its OX App Suite takes the form of a virtual desktop of sorts that lets users centralize their email and file storage accounts and view all sorts of documents through a unified portal. However, as of an early April release, it will also include OX Text, a non-destructive, collaborative document editor that rivals Google Docs, and that has an interesting heritage of its own.
  • The team that created the HTML5- and JavaScript-based OX Text includes some of the core developers behind OpenOffice, the free alternative to Microsoft Office that passed from Sun Microsystems to Oracle before morphing into LibreOffice. The German developers we’re talking about hived off the project before LibreOffice happened, and ended up getting hired by Open-Xchange. “To them it was a once in a lifetime event, because we allowed them to start from scratch,” Open-Xchange CEO Rafael Laguna told me. “We said we wanted a fresh office productivity suite that runs inside the browser. In terms of the architecture and principles for the product, we wanted to make it fully round-trip capable, meaning whatever file format we run into needs to be retained.”
  • This is an extremely handy formatting and version control feature. Changes made to a document in OX Text get pushed through to Open-Xchange’s backend, where a changelog is maintained. “Power” Word features such as Smart Art or Charts, which are not necessarily supported by other productivity suites, are replaced with placeholders during editing and are there, as before, when the edited document is eventually downloaded. As the OX Text blurb says, “OX Text never damages your valuable work even if it does not understand it”.
  • ...1 more annotation...
  • “[This avoids] the big disadvantage of anything other than Microsoft Office,” Laguna said. “If you use OpenOffice with a .docx file, the whole document is converted, creating artefacts, then you convert it back. That’s one of the major reasons not everyone is using OpenOffice, and the same is true for Google Apps.” OX Text will be available as an extension to OX App Suite, which also includes calendaring and other productivity tools. However, it will also come out as a standalone product under both commercial licenses – effectively support-based subscriptions for Open-Xchange’s service provider customers – and open-source licenses, namely the GNU General Public License 2 and Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License, which will allow free personal, non-commercial use. A demo of App Suite, including the OX Text functionality, is available on the Open-Xchange site.
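The round-trip behavior described above (retaining features the editor cannot render, as placeholders, and restoring them verbatim on export) can be sketched in a few lines. This is an invented toy model, not OX Text's actual format or architecture:

```python
# Toy sketch of "non-destructive" round-tripping: elements the editor
# does not understand are swapped for opaque placeholders during
# editing and restored unchanged on export. All names are invented.
def load(doc):
    """Split a document into editable text and retained opaque blobs."""
    parts, blobs = [], {}
    for i, (kind, payload) in enumerate(doc):
        if kind == "text":
            parts.append(payload)
        else:
            key = f"[[placeholder-{i}]]"
            blobs[key] = (kind, payload)   # never parsed or converted
            parts.append(key)
    return parts, blobs

def export(parts, blobs):
    """Reassemble the document, restoring unsupported elements as-is."""
    return [blobs.get(p, ("text", p)) for p in parts]

doc = [("text", "Quarterly report"),
       ("smartart", b"\x00\x01binary-we-cannot-render"),
       ("text", "Summary section")]
parts, blobs = load(doc)
parts[2] = "Summary section (edited)"   # edit only what we understand
print(export(parts, blobs))
```

The key design point, as the article puts it, is that the unsupported element survives the edit cycle even though the editor never interprets it.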
Gonzalo San Gil, PhD.

Europe's net neutrality law gets breathing space as industry committee delays vote - Te... - 0 views

  •  
    [* #Our Time, Let's #Go!] "Summary: The two-week delay is apparently down to technicalities over translations, but those opposed to anti-neutrality amendments made by some in the committee say it will give everyone a chance to better examine the details of the text."
Gary Edwards

The True Story of How the Patent Bar Captured a Court and Shrank the Intellectual Commo... - 1 views

  • The change in the law wrought by the Federal Circuit can also be viewed substantively through the controversy over software patents. Throughout the 1960s, the USPTO refused to award patents for software innovations. However, several of the USPTO’s decisions were overruled by the patent-friendly U.S. Court of Customs and Patent Appeals, which ordered that software patents be granted. In Gottschalk v. Benson (1972) and Parker v. Flook (1978), the U.S. Supreme Court reversed the Court of Customs and Patent Appeals, holding that mathematical algorithms (and therefore software) were not patentable subject matter. In 1981, in Diamond v. Diehr, the Supreme Court upheld a software patent on the grounds that the patent in question involved a physical process—the patent was issued for software used in the molding of rubber. While affirming their prior ruling that mathematical formulas are not patentable in the abstract, the Court held that an otherwise patentable invention did not become unpatentable simply because it utilized a computer.
  • In the hands of the newly established Federal Circuit, however, this small scope for software patents in precedent was sufficient to open the floodgates. In a series of decisions culminating in State Street Bank v. Signature Financial Group (1998), the Federal Circuit broadened the criteria for patentability of software and business methods substantially, allowing protection as long as the innovation “produces a useful, concrete and tangible result.” Those broadened criteria led to an explosion of low-quality software patents, from Amazon’s 1-Click checkout system to Twitter’s pull-to-refresh feature on smartphones. The GAO estimates that more than half of all patents granted in recent years are software-related. Meanwhile, the Supreme Court continues to hold, as in Parker v. Flook, that computer software algorithms are not patentable, and has begun to push back against the Federal Circuit. In Bilski v. Kappos (2010), the Supreme Court once again held that abstract ideas are not patentable, and in Alice v. CLS (2014), it ruled that simply applying an abstract idea on a computer does not suffice to make the idea patent-eligible. It still is not clear what portion of existing software patents Alice invalidates, but it could be a significant one.
  • Supreme Court justices also recognize the Federal Circuit’s insubordination. In oral arguments in Carlsbad Technology v. HIF Bio (2009), Chief Justice John Roberts joked openly about it:
  • ...17 more annotations...
  • The Opportunity of the Commons
  • As a result of the Federal Circuit’s pro-patent jurisprudence, our economy has been flooded with patents that would otherwise not have been granted. If more patents meant more innovation, then we would now be witnessing a spectacular economic boom. Instead, we have been living through what Tyler Cowen has called a Great Stagnation. The fact that patents have increased while growth has not is known in the literature as the “patent puzzle.” As Michele Boldrin and David Levine put it, “there is no empirical evidence that [patents] serve to increase innovation and productivity, unless productivity is identified with the number of patents awarded—which, as evidence shows, has no correlation with measured productivity.”
  • While more patents have not resulted in faster economic growth, they have resulted in more patent lawsuits.
  • Software patents have characteristics that make them particularly susceptible to litigation. Unlike, say, chemical patents, software patents are plagued by a problem of description. How does one describe a software innovation in such a way that anyone searching for it will easily find it? As Christina Mulligan and Tim Lee demonstrate, chemical formulas are indexable, meaning that as the number of chemical patents grow, it will still be easy to determine if a molecule has been patented. Since software innovations are not indexable, they estimate that “patent clearance by all firms would require many times more hours of legal research than all patent lawyers in the United States can bill in a year. The result has been an explosion of patent litigation.” Software and business method patents, estimate James Bessen and Michael Meurer, are 2 and 7 times more likely to be litigated than other patents, respectively (4 and 13 times more likely than chemical patents).
  • Software patents make excellent material for predatory litigation brought by what are often called “patent trolls.”
  • Trolls use asymmetries in the rules of litigation to legally extort millions of dollars from innocent parties. For example, one patent troll, Innovatio IP Ventures, LLP, acquired patents that implicated Wi-Fi. In 2011, it started sending demand letters to coffee shops and hotels that offered wireless Internet access, offering to settle for $2,500 per location. This amount was far in excess of the 9.56 cents per device that Innovatio was entitled to under the “Fair, Reasonable, and Non-Discriminatory” licensing promises attached to their portfolio, but it was also much less than the cost of trial, and therefore it was rational for firms to pay. Cisco stepped in and spent $13 million in legal fees on the case, and settled on behalf of their customers for 3.2 cents per device. Other manufacturers had already licensed Innovatio’s portfolio, but that didn’t stop their customers from being targeted by demand letters.
  • Litigation cost asymmetries are magnified by the fact that most patent trolls are nonpracticing entities. This means that when patent infringement trials get to the discovery phase, they will cost the troll very little—a firm that does not operate a business has very few records to produce.
  • But discovery can cost a medium or large company millions of dollars. Using an event study methodology, James Bessen and coauthors find that infringement lawsuits by nonpracticing entities cost publicly traded companies $83 billion per year in stock market capitalization, while plaintiffs gain less than 10 percent of that amount.
  • Software patents also reduce innovation in virtue of their cumulative nature and the fact that many of them are frequently inputs into a single product. Law professor Michael Heller coined the phrase “tragedy of the anticommons” to refer to a situation that mirrors the well-understood “tragedy of the commons.” Whereas in a commons, multiple parties have the right to use a resource but not to exclude others, in an anticommons, multiple parties have the right to exclude others, and no one is therefore able to make effective use of the resource. The tragedy of the commons results in overuse of the resource; the tragedy of the anticommons results in underuse.
  • In order to cope with the tragedy of the anticommons, we should carefully investigate the opportunity of the commons. The late Nobelist Elinor Ostrom made a career of studying how communities manage shared resources without property rights. With appropriate self-governance institutions, Ostrom found again and again that a commons does not inevitably lead to tragedy—indeed, open access to shared resources can provide collective benefits that are not available under other forms of property management.
  • This suggests that—litigation costs aside—patent law could be reducing the stock of ideas rather than expanding it at current margins.
  • Advocates of extensive patent protection frequently treat the commons as a kind of wasteland. But considering the problems in our patent system, it is worth looking again at the role of well-tailored limits to property rights in some contexts. Just as we all benefit from real property rights that no longer extend to the highest heavens, we would also benefit if the scope of patent protection were more narrowly drawn.
  • Reforming the Patent System
  • This analysis raises some obvious possibilities for reforming the patent system. Diane Wood, Chief Judge of the 7th Circuit, has proposed ending the Federal Circuit’s exclusive jurisdiction over patent appeals—instead, the Federal Circuit could share jurisdiction with the other circuit courts. While this is a constructive suggestion, it still leaves the door open to the Federal Circuit playing “a leading role in shaping patent law,” which is the reason for its capture by patent interests. It would be better instead simply to abolish the Federal Circuit and return to the pre-1982 system, in which patents received no special treatment in appeals. This leaves open the possibility of circuit splits, which the creation of the Federal Circuit was designed to mitigate, but there are worse problems than circuit splits, and we now have them.
  • Another helpful reform would be for Congress to limit the scope of patentable subject matter via statute. New Zealand has done just that, declaring that software is “not an invention” to get around WTO obligations to respect intellectual property. Congress should do the same with respect to both software and business methods.
  • Finally, even if the above reforms were adopted, there would still be a need to address the asymmetries in patent litigation that result in predatory “troll” lawsuits. While the holding in Alice v. CLS arguably makes a wide swath of patents invalid, those patents could still be used in troll lawsuits because a ruling of invalidity for each individual patent might not occur until late in a trial. Current legislation in Congress addresses this class of problem by mandating disclosures, shifting fees in the case of spurious lawsuits, and enabling a review of the patent’s validity before a trial commences.
  • What matters for prosperity is not just property rights in the abstract, but good property-defining institutions. Without reform, our patent system will continue to favor special interests and forestall economic growth.
  •  
    "Libertarians intuitively understand the case for patents: just as other property rights internalize the social benefits of improvements to land, automobile maintenance, or business investment, patents incentivize the creation of new inventions, which might otherwise be undersupplied. So far, so good. But it is important to recognize that the laws that govern property, intellectual or otherwise, do not arise out of thin air. Rather, our political institutions, with all their virtues and foibles, determine the contours of property—the exact bundle of rights that property holders possess, their extent, and their limitations. Outlining efficient property laws is not a trivial problem. The optimal contours of property are neither immutable nor knowable a priori. For example, in 1946, the U.S. Supreme Court reversed the age-old common law doctrine that extended real property rights to the heavens without limit. The advent of air travel made such extensive property rights no longer practicable—airlines would have had to cobble together a patchwork of easements, acre by acre, for every corridor through which they flew, and they would have opened themselves up to lawsuits every time their planes deviated from the expected path. The Court rightly abridged property rights in light of these empirical realities. In defining the limits of patent rights, our political institutions have gotten an analogous question badly wrong. A single, politically captured circuit court with exclusive jurisdiction over patent appeals has consistently expanded the scope of patentable subject matter. This expansion has resulted in an explosion of both patents and patent litigation, with destructive consequences. "
  •  
    I added a comment to the page's article. Patents are antithetical to the precepts of Libertarianism and do not involve Natural Law rights. But I agree with the author that the Court of Appeals for the Federal Circuit should be abolished. It's a failed experiment.
Paul Merrell

Profiled From Radio to Porn, British Spies Track Web Users' Online Identities | Global ... - 0 views

  • One system builds profiles showing people’s web browsing histories. Another analyzes instant messenger communications, emails, Skype calls, text messages, cell phone locations, and social media interactions. Separate programs were built to keep tabs on “suspicious” Google searches and usage of Google Maps. The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens—all without a court order or judicial warrant.
  • The power of KARMA POLICE was illustrated in 2009, when GCHQ launched a top-secret operation to collect intelligence about people using the Internet to listen to radio shows. The agency used a sample of nearly 7 million metadata records, gathered over a period of three months, to observe the listening habits of more than 200,000 people across 185 countries, including the U.S., the U.K., Ireland, Canada, Mexico, Spain, the Netherlands, France, and Germany.
  • GCHQ’s documents indicate that the plans for KARMA POLICE were drawn up between 2007 and 2008. The system was designed to provide the agency with “either (a) a web browsing profile for every visible user on the Internet, or (b) a user profile for every visible website on the Internet.” The origin of the surveillance system’s name is not discussed in the documents. But KARMA POLICE is also the name of a popular song released in 1997 by the Grammy Award-winning British band Radiohead, suggesting the spies may have been fans. A verse repeated throughout the hit song includes the lyric, “This is what you’ll get, when you mess with us.”
  • ...3 more annotations...
  • GCHQ vacuums up the website browsing histories using “probes” that tap into the international fiber-optic cables that transport Internet traffic across the world. A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events”—a term the agency uses to refer to metadata records—with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held—41 percent—was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.” There was a simple aim at the heart of the top-secret program: record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs.
  • The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
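At its core, the profile-building these documents describe is aggregation of raw metadata "events" into per-user records. A toy sketch of that step (the event records and field names below are invented for illustration, not taken from the GCHQ documents):

```python
from collections import defaultdict

# Toy illustration of building browsing profiles from metadata events:
# count visits per site for each user. Data and schema are invented.
def build_profiles(events):
    profiles = defaultdict(lambda: defaultdict(int))
    for user, site, _timestamp in events:
        profiles[user][site] += 1          # visit counts per site
    return {u: dict(sites) for u, sites in profiles.items()}

events = [
    ("user-a", "news.example", "2009-03-01T10:00"),
    ("user-a", "radio.example", "2009-03-01T10:05"),
    ("user-a", "news.example", "2009-03-02T09:30"),
    ("user-b", "forum.example", "2009-03-01T11:00"),
]
print(build_profiles(events))
# {'user-a': {'news.example': 2, 'radio.example': 1}, 'user-b': {'forum.example': 1}}
```

What makes the real systems consequential is not this trivial aggregation but the scale of collection feeding it: tens of billions of such events per day, gathered without the subjects' knowledge.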
Gonzalo San Gil, PhD.

scancode-toolkit · GitHub - 0 views

  •  
    "ScanCode is a tool to scan code and detect licenses, copyrights and more. This open source code scanning tool helps you find and discover open source and third-party components in your code. "
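ScanCode's real detection engine uses large rule sets and approximate matching; as a rough illustration of the underlying idea only, keyword-based license matching can be sketched like this (the patterns are illustrative and deliberately minimal, not ScanCode's rules):

```python
import re

# Toy sketch of license detection by matching license-indicative text
# in a file. Real tools use far larger rule sets and fuzzy matching.
LICENSE_PATTERNS = {
    "MIT": re.compile(r"Permission is hereby granted, free of charge"),
    "GPL-2.0": re.compile(r"GNU General Public License.*version 2", re.S),
    "Apache-2.0": re.compile(r"Apache License,?\s+Version 2\.0"),
}

def detect_licenses(text):
    """Return sorted names of licenses whose marker text appears."""
    return sorted(name for name, pat in LICENSE_PATTERNS.items()
                  if pat.search(text))

header = '/* Licensed under the Apache License, Version 2.0 (the "License"); */'
print(detect_licenses(header))
# ['Apache-2.0']
```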
Paul Merrell

Security Experts Oppose Government Access to Encrypted Communication - The New York Times - 0 views

  • An elite group of security technologists has concluded that the American and British governments cannot demand special access to encrypted communications without putting the world’s most confidential data and critical infrastructure in danger. A new paper from the group, made up of 14 of the world’s pre-eminent cryptographers and computer scientists, is a formidable salvo in a skirmish between intelligence and law enforcement leaders, and technologists and privacy advocates. After Edward J. Snowden’s revelations — with security breaches and awareness of nation-state surveillance at a record high and data moving online at breakneck speeds — encryption has emerged as a major issue in the debate over privacy rights.
  • That has put Silicon Valley at the center of a tug of war. Technology companies including Apple, Microsoft and Google have been moving to encrypt more of their corporate and customer data after learning that the National Security Agency and its counterparts were siphoning off digital communications and hacking into corporate data centers.
  • Yet law enforcement and intelligence agency leaders argue that such efforts thwart their ability to monitor kidnappers, terrorists and other adversaries. In Britain, Prime Minister David Cameron threatened to ban encrypted messages altogether. In the United States, Michael S. Rogers, the director of the N.S.A., proposed that technology companies be required to create a digital key to unlock encrypted data, but to divide the key into pieces and secure it so that no one person or government agency could use it alone. The encryption debate has left both sides bitterly divided and in fighting mode. The group of cryptographers deliberately issued its report a day before James B. Comey Jr., the director of the Federal Bureau of Investigation, and Sally Quillian Yates, the deputy attorney general at the Justice Department, are scheduled to testify before the Senate Judiciary Committee on the concerns that they and other government agencies have that encryption technologies will prevent them from effectively doing their jobs.
  • The new paper is the first in-depth technical analysis of government proposals by leading cryptographers and security thinkers, including Whitfield Diffie, a pioneer of public key cryptography, and Ronald L. Rivest, the “R” in the widely used RSA public cryptography algorithm. In the report, the group said any effort to give the government “exceptional access” to encrypted communications was technically unfeasible and would leave confidential data and critical infrastructure like banks and the power grid at risk. Handing governments a key to encrypted communications would also require an extraordinary degree of trust. With government agency breaches now the norm — most recently at the United States Office of Personnel Management, the State Department and the White House — the security specialists said authorities could not be trusted to keep such keys safe from hackers and criminals. They added that if the United States and Britain mandated backdoor keys to communications, China and other governments in foreign markets would be spurred to do the same.
  • “Such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend,” the report said. “The costs would be substantial, the damage to innovation severe and the consequences to economic growth hard to predict. The costs to the developed countries’ soft power and to our moral authority would also be considerable.”
  •  
    Our system of government does not expect that every criminal will be apprehended and convicted. There are numerous values our society holds more important. Some examples: [i] a presumption of innocence unless guilt is established beyond any reasonable doubt; [ii] the requirement that government officials convince a neutral magistrate that they have probable cause to believe that a search or seizure will produce evidence of a crime; and [iii] the rule that many communications cannot be compelled to be disclosed and used in evidence, such as attorney-client communications, spousal communications, and priest-penitent communications. Moral of my story: the government needs a much stronger reason to justify interception of communications than saying, "some crooks will escape prosecution if we can't do that." We have a right to whisper to each other, concealing our communications from all others. Why does the right to whisper privately disappear if our whisperings are done electronically? The Supreme Court took its first step on a very slippery slope when it permitted wiretapping in Olmstead v. United States, 277 U.S. 438, 48 S. Ct. 564, 72 L. Ed. 944 (1928). https://goo.gl/LaZGHt It's been a long slide ever since. It's past time to revisit Olmstead and recognize that American citizens have the absolute right to communicate privately. "The President … recognizes that U.S. citizens and institutions should have a reasonable expectation of privacy from foreign or domestic intercept when using the public telephone system." - Brent Scowcroft, U.S. National Security Advisor, National Security Decision Memorandum 338 (1 September 1976) (Ford administration), http://www.fas.org/irp/offdocs/nsdm-ford/nsdm-338.pdf
Gonzalo San Gil, PhD.

Microsoft has built a Linux-based operating system | ITworld - 1 views

    • Gonzalo San Gil, PhD.
       
      # ! Everybody wants to be (or say they are) # ! #OpenSource. (http://www.bloomberg.com/news/articles/2015-06-08/apple-goes-open-source) # ! ... "Don't believe everything You hear..." # ! And, in this case, all moves of Apple and Microsoft towards Open Source are due to their appetite for the supercomputer market (actually, and traditionally, dominated by Open Source platforms...) http://www.datacenterdynamics.com/news-analysis/supercomputers-prefer-open-source-storage/77786.fullarticle
  •  
    "Pigs haven't taken flight; aliens haven't invaded; hell hasn't frozen over. But... Microsoft has created an OS powered by Linux. No, this is not The Onion; it's true. "
Gonzalo San Gil, PhD.

USA Freedom Act Passes: What We Celebrate, What We Mourn, and Where We Go Fro... - 0 views

  • The Senate passed the USA Freedom Act today by 67-32, marking the first time in over thirty years that both houses of Congress have approved a bill placing real restrictions and oversight on the National Security Agency’s surveillance powers. The weakening amendments to the legislation proposed by NSA defender Senate Majority Leader Mitch McConnell were defeated, and we have every reason to believe that President Obama will sign USA Freedom into law. Technology users everywhere should celebrate, knowing that the NSA will be a little more hampered in its surveillance overreach, and both the NSA and the FISA court will be more transparent and accountable than they were before the USA Freedom Act. It’s no secret that we wanted more. In the wake of the damning evidence of surveillance abuses disclosed by Edward Snowden, Congress had an opportunity to champion comprehensive surveillance reform and undertake a thorough investigation, like it did with the Church Committee. Congress could have tried to completely end mass surveillance and taken numerous other steps to rein in the NSA and FBI. This bill was the result of compromise and strong leadership by Sens. Patrick Leahy and Mike Lee and Reps. Robert Goodlatte, Jim Sensenbrenner, and John Conyers. It’s not the bill EFF would have written, and in light of the Second Circuit's thoughtful opinion, we withdrew our support from the bill in an effort to spur Congress to strengthen some of its privacy protections and out of concern about language added to the bill at the behest of the intelligence community. Even so, we’re celebrating. We’re celebrating because, however small, this bill marks a day that some said could never happen—a day when the NSA saw its surveillance power reduced by Congress. And we’re hoping that this could be a turning point in the fight to rein in the NSA.
Paul Merrell

Facebook's Deepface Software Has Gotten Them in Deep Trouble | nsnbc international - 0 views

  • In a Chicago court, several Facebook users filed a class-action lawsuit against the social media giant for allegedly violating its users’ privacy rights to acquire the largest privately held stash of biometric face-recognition data in the world. The court documents reveal claims that “Facebook began violating the Illinois Biometric Information Privacy Act (IBIPA) of 2008 in 2010, in a purported attempt to make the process of tagging friends easier.”
  • This was accomplished through the “tag suggestions” feature provided by Facebook, which “scans all pictures uploaded by users and identifies any Facebook friends they may want to tag.” The Facebook users maintain that this feature is a “form of data mining [that] violates user’s privacy”. One plaintiff called it a “brazen disregard for its users’ privacy rights,” through which Facebook has “secretly amassed the world’s largest privately held database of consumer biometrics data,” because “Facebook actively conceals” its use of “faceprint databases” to identify Facebook users in photos and “doesn’t disclose its wholesale biometrics data collection practices in its privacy policies, nor does it even ask users to acknowledge them.”
  • This would be a violation of the IBIPA, which states it is “unlawful to collect biometric data without written notice to the subject stating the purpose and length of the data collection, and without obtaining the subject’s written release.” Because all users are automatically part of the “faceprint” facial recognition program, this is an illegal act in the state of Illinois, according to the complaint. Jay Edelson, attorney for the plaintiffs, asserts the opt-out ability to prevent other Facebook users from tagging them in photos is “insufficient”.
  • Deepface is the name of the new technology Facebook researchers created in order to identify people in pictures, mimicking the way humans recognize the differences in each other’s faces. Facebook has already implemented facial recognition software (FRS) to suggest names for tagging photos; however, Deepface can “identify faces from a side view” as well as when the person is directly facing the camera. This is the impressive feature of Deepface, as previous FRS could only decipher faces in frontal views. Deepface displays 97.25% accuracy in identifying faces in photos, a remarkable feat considering humans have a 97.53% accuracy rate. To ensure accuracy, Deepface “conducts its analysis based on more than 120 million different parameters.” In 2013, Erin Egan, chief privacy officer for Facebook, said that this upgrade “would give users better control over their personal information, by making it easier to identify posted photos in which they appear.” Egan explained: “Our goal is to facilitate tagging so that people know when there are photos of them on our service.” Facebook has stated that it retains information about its users siphoned from all across the web; this data is used to increase Facebook’s profits, with the information being sold for marketing purposes.
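The excerpt reports Deepface's accuracy but not its mechanism. Systems of this kind typically map each face image to a numeric embedding and decide whether two images show the same person by comparing embedding similarity against a threshold. A minimal sketch of that comparison step, using hypothetical precomputed embeddings (this is not Facebook's actual code, data, or threshold):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Cosine of the angle between two embedding vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def same_person(emb_a: list[float], emb_b: list[float], threshold: float = 0.8) -> bool:
    # Faces whose embeddings are sufficiently similar are treated as a match.
    return cosine_similarity(emb_a, emb_b) >= threshold

# Hypothetical embeddings: two photos of "Alice" and one of "Bob".
alice_1 = [0.9, 0.1, 0.3]
alice_2 = [0.85, 0.15, 0.28]
bob = [0.1, 0.9, 0.2]
print(same_person(alice_1, alice_2))  # prints True
print(same_person(alice_1, bob))      # prints False
```

The reported accuracy figures correspond to how often such a match decision agrees with ground truth over a benchmark of labeled face pairs.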
Gonzalo San Gil, PhD.

Is Music Piracy The Problem… Or The Solution? - hypebot - 0 views

  •  
    "The point should be to build an audience that respects, loves, and appreciates you to the point where there is a clear demand for you and the product… to reach critical mass. With this in mind, piracy can play a huge role in acting as a catalyst for organic and valuable word-of-mouth promotion for the artist"
Paul Merrell

Android phones outsell iPhone 2-to-1, says research firm - Computerworld - 2 views

  • Android-powered smartphones outsold iPhones in the U.S. by almost 2-to-1 in the third quarter, a research firm said today.
  • "We started to see Android take off in 2009 when Verizon added the [Motorola] Droid," said Ross Rubin, the executive director of industry analysis for the NPD Group. "A big part of Android success is its carrier distribution. Once it got to the Verizon and Sprint customer bases, with their mature 3G networks, that's when we started to see it take off." According to NPD's surveys of U.S. retailers, Android phones accounted for 44% of all consumer smartphone sales in the third quarter, an increase of 11 percentage points over 2010's second quarter. Meanwhile, Apple's iOS, which powers the iPhone, was up one point to 23%.
Paul Merrell

Japan's Underground Datacenter - System News - 0 views

  • 00 meters under the ground in Japan, Sun, along with ten other IT firms, is building a datacenter. The datacenter is located at such a depth to take advantage of the cooler air, as a means of reducing the roughly 40% of its energy usage that goes to cooling. The datacenter will also be resilient to Japan’s earthquake potential, being built on the solid bedrock floor of the crater hollowed out for the project.
  • In the underground pictures it is clear that the Sun Modular Datacenter 20 is going to be a successful format for the datacenter because it is self contained and there is an abundant resource of ground water in the cave for a cooling system. The data center will be used by government agencies, it will serve as a service center for IT clients, and it will be used by businesses.
  • The Sun MD 20 is included in the design of this datacenter. For the earthquake analysis, the prototype was placed on a large shake table in California and put through a simulation of the Northridge earthquake of 1994. The results were conclusive. The location of Japan’s underground datacenter is still undisclosed.
Paul Merrell

New direction for 'JavaScript 2' | InfoWorld | Analysis | 2008-08-26 | By Paul Krill - 0 views

  • Standardization efforts for the next version of JavaScript have taken a sharp turn this month, with some key changes in the Web scripting technology's direction.
Paul Merrell

Mozilla, ARM and Others Eyeing a New Class of Device | OStatic - 0 views

  • I read with interest this item, along with analysis from Matt Asay about Mozilla, ARM, MontaVista Software and four other companies working together on a new category of device. The partners envision devices that sit between smartphones and laptops, and they sound very much like the Ultra-Mobile PC (UMPC) tablets, such as the ones Nokia makes.
  • The new device from the seven partners might be on sale by early 2009, according to Softpedia. Their story also makes this good point about the difference between this new effort and Nokia's tablet strategy: "Arm Inc. is creating a completely open platform that will be shared with the open-source community." If it is completely open, that could draw the interest of developers.
Gary Edwards

After Bill Gates, five possible futures for Microsoft | InfoWorld | Analysis | 2008-06-... - 0 views

  •  
    For most people, Bill Gates and Microsoft are one and the same. Gates has led Microsoft to global dominance in the 33 years since its founding, combining a strong opportunism -- getting the code for DOS to sell to IBM for the first PC and aping Apple's visual interface for the first Windows are the two best examples of Gates moving where the wind was soon to blow -- with a steady vision of desktop computers being as powerful as the mainframes that captured techies' imaginations in the 1970s. This is the intro to and overview of a series of articles describing the future of Microsoft through five possible scenarios. Under the lead article, "The Future of Microsoft," the series includes:
    * The "Borvell" scenario
    * The "slow decline" scenario
    * The "streaming" scenario
    * The "Oort services" scenario
    * The "Gates was right" scenario