Future of the Web / Group items matching "Click" in title, tags, annotations or url
Paul Merrell

Sun to Distribute Microsoft Live Search-Powered Toolbar as Part of Java Runtime Environment - System News - 0 views

  • Sun and Microsoft have agreed on a search distribution deal that will offer the MSN Toolbar, powered by Microsoft Live Search, to U.S.-based Internet Explorer users who download the Java Runtime Environment (JRE). This agreement gives Internet Explorer users downloading Sun’s JRE the option to download the MSN Toolbar for one-click access to Live Search features, as well as news, entertainment, sports and more from the MSN network and direct access to Windows Live Hotmail and Windows Live Messenger.
  • “This agreement with Sun Microsystems is another important milestone in our strategy to secure broad-scale distribution for our search offering, enabling millions more people to experience the benefits of Live Search,” said Yusuf Mehdi, senior vice president of the Online Audience Business at Microsoft. “With the vast array of Java software-based Web applications that are downloaded every month, this deal will expose Live Search to millions more Internet users and drive increased volume for our search advertisers.”
Paul Merrell

Anti link-rot SaaS for web publishers -- WebCite - 0 views

  • The Problem Authors increasingly cite webpages and other digital objects on the Internet, which can "disappear" overnight. In one study published in the journal Science, 13% of Internet references in scholarly articles were inactive after only 27 months. Another problem is that cited webpages may change, so that readers see something different than what the citing author saw. The problem of unstable webcitations and the lack of routine digital preservation of cited digital objects has been referred to as an issue "calling for an immediate response" by publishers and authors [1]. An increasing number of editors and publishers ask that authors, when they cite a webpage, make a local copy of the cited webpage/webmaterial, and archive the cited URL in a system like WebCite®, to enable readers permanent access to the cited material.
  • What is WebCite®? WebCite®, a member of the International Internet Preservation Consortium, is an on-demand archiving system for webreferences (cited webpages and websites, or other kinds of Internet-accessible digital objects), which can be used by authors, editors, and publishers of scholarly papers and books, to ensure that cited webmaterial will remain available to readers in the future. If cited webreferences in journal articles, books etc. are not archived, future readers may encounter a "404 File Not Found" error when clicking on a cited URL. Try it! Archive a URL here. It's free and takes only 30 seconds. A WebCite®-enhanced reference is a reference which contains - in addition to the original live URL (which can and probably will disappear in the future, or its content may change) - a link to an archived copy of the material, exactly as the citing author saw it when he accessed the cited material.
  •  
    Free service spun off from the University of Toronto's University Health Network. Automagic archiving of cited internet content, generation of citations that include the url for the archived copy. Now if Google would just make it easier to use its search cache copies for the same purpose ...
Paul Merrell

OASIS - News - 2008-05-30 - 0 views

  • Members Approve Web Services for Remote Portlets (WSRP) 2.0 as OASIS Standard. IBM, Sun Microsystems, Microsoft, Novell, Oracle, SAP, TIBCO, Vignette and Others Collaborate on Open Standard for Integrating Web Services into Portals
  • Boston, MA, USA; 30 May 2008 — OASIS, the international open standards consortium, today announced that its members have approved the Web Services for Remote Portlets (WSRP) version 2.0 as an OASIS Standard, a status that signifies the highest level of ratification. Developed through an open process by the OASIS WSRP Technical Committee, the new standard simplifies the effort required for aggregating applications, such as portals, to quickly integrate remote content and applications. "Vendors of aggregating applications no longer need to write special adapters to accommodate the variety of interfaces and protocols used by content providers," explained Rich Thompson of IBM, chair of the OASIS WSRP Technical Committee. "With WSRP, they can integrate remote content and applications with just a few mouse clicks and virtually no programming effort. WSRP version 2.0 adds those capabilities needed to fully integrate the remote components into the aggregated application."
Paul Merrell

Antitrust Week Continues: EU Slams Intel With $1.45b Fine - Law Blog - WSJ - 0 views

  • Most likely, we grant you, it was coincidence. But we couldn’t help notice the timing: Two days after the DOJ’s new antitrust head, Christine Varney, publicly repudiates her predecessors by pledging to ramp up enforcement on so-called “single-firm” monopolistic behavior, the European Union takes a sledgehammer to Intel Corp., fining it $1.45 billion for alleged monopolistic activity. The fine is the largest ever assessed for monopoly abuse. Click here for the WSJ story, from Charles Forelle; here for the NYT story; here for the FT story; here for the Commission’s statement; here for Intel’s response.
    • Paul Merrell
       
      See my earlier Diigo bookmark quoting the DG Competition statement that it had coordinated with the U.S. Justice Dept. in its simultaneous and ongoing investigation of Intel.
  • John Pheasant, an antitrust practitioner at Hogan & Hartson in London and Brussels, told the Law Blog that some of the evidence does “not look very good for Intel,” adding that “if the facts are there, this type of conduct is more likely to be regarded as abusive if practiced by a dominant company. . . .”
  • On Varney’s statement from earlier this week, Kroes said the Justice Department’s stance gave her a “huge positive feeling. The more competition authorities joining us in our competition philosophy, the better it is.”
Paul Merrell

NSA contractors use LinkedIn profiles to cash in on national security | Al Jazeera America - 0 views

  • NSA spies need jobs, too. And that is why many covert programs could be hiding in plain sight. Job websites such as LinkedIn and Indeed.com contain hundreds of profiles that reference classified NSA efforts, posted by everyone from career government employees to low-level IT workers who served in Iraq or Afghanistan. They offer a rare glimpse into the intelligence community's projects and how they operate. Now some researchers are using the same kinds of big-data tools employed by the NSA to scrape public LinkedIn profiles for classified programs. But the presence of so much classified information in public view raises serious concerns about security — and about the intelligence industry as a whole. “I’ve spent the past couple of years searching LinkedIn profiles for NSA programs,” said Christopher Soghoian, the principal technologist with the American Civil Liberties Union’s Speech, Privacy and Technology Project.
  • On Aug. 3, The Wall Street Journal published a story about the FBI’s growing use of hacking to monitor suspects, based on information Soghoian provided. The next day, Soghoian spoke at the Defcon hacking conference about how he uncovered the existence of the FBI’s hacking team, known as the Remote Operations Unit (ROU), using the LinkedIn profiles of two employees at James Bimen Associates, with which the FBI contracts for hacking operations. “Had it not been for the sloppy actions of a few contractors updating their LinkedIn profiles, we would have never known about this,” Soghoian said in his Defcon talk. Those two contractors were not the only ones being sloppy.
  • And there are many more. A quick search of Indeed.com using three code names unlikely to return false positives — Dishfire, XKeyscore and Pinwale — turned up 323 résumés. The same search on LinkedIn turned up 48 profiles mentioning Dishfire, 18 mentioning XKeyscore and 74 mentioning Pinwale. Almost all these people appear to work in the intelligence industry. Network-mapping the data: Fabio Pietrosanti of the Hermes Center for Transparency and Digital Human Rights noticed all the code names on LinkedIn last December. While sitting with M.C. McGrath at the Chaos Communication Congress in Hamburg, Germany, Pietrosanti began searching the website for classified program names — and getting serious results. McGrath was already developing Transparency Toolkit, a Web application for investigative research, and knew he could improve on Pietrosanti’s off-the-cuff methods.
  • “I was, like, huh, maybe there’s more we can do with this — actually get a list of all these profiles that have these results and use that to analyze the structure of which companies are helping with which programs, which people are helping with which programs, try to figure out in what capacity, and learn more about things that we might not know about,” McGrath said. He set up a computer program called a scraper to search LinkedIn for public profiles that mention known NSA programs, contractors or jargon — such as SIGINT, the agency’s term for “signals intelligence” gleaned from intercepted communications. Once the scraper found the name of an NSA program, it searched nearby for other words in all caps. That allowed McGrath to find the names of unknown programs, too. Once McGrath had the raw data — thousands of profiles in all, with 70 to 80 different program names — he created a network graph that showed the relationships between specific government agencies, contractors and intelligence programs. Of course, the data are limited to what people are posting on their LinkedIn profiles. Still, the network graph gives a sense of which contractors work on several NSA programs, which ones work on just one or two, and even which programs military units in Iraq and Afghanistan are using. And that is just the beginning.
  • Click on the image to view an interactive network illustration of the relationships between specific national security surveillance programs in red, and government organizations or private contractors in blue.
  •  
    What a giggle, public spying on NSA and its contractors using Big Data. The interactive network graph with its sidebar display of relevant data derived from LinkedIn profiles is just too delightful. 
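
The excerpts above sketch the scraping approach in plain terms: search public profiles for known program code names, treat nearby all-caps tokens as candidate unknown programs, and build a graph linking employers to programs. A minimal Python sketch of that idea, assuming the profiles have already been scraped into a list of dicts (the field names and seed terms are illustrative, not Transparency Toolkit's actual code):

```python
import re
from collections import defaultdict

SEED_PROGRAMS = {"DISHFIRE", "XKEYSCORE", "PINWALE"}   # known code names used as seeds
ALL_CAPS = re.compile(r"\b[A-Z]{4,}\b")                # crude pattern for candidate code names

def mine_profiles(profiles):
    """profiles: iterable of {'company': str, 'text': str} dicts from a prior scrape."""
    graph = defaultdict(set)   # company -> programs mentioned by its employees
    candidates = set()         # unknown all-caps terms seen alongside known programs
    for profile in profiles:
        terms = set(ALL_CAPS.findall(profile["text"]))
        hits = terms & SEED_PROGRAMS
        if hits:
            graph[profile["company"]] |= hits
            candidates |= terms - SEED_PROGRAMS   # e.g. SIGINT, possible new program names
    return graph, candidates
```

The candidate list still needs manual vetting, since the all-caps heuristic also matches ordinary acronyms (NATO, HTML, USMC); as the article notes, the raw LinkedIn data only hints at which terms are real programs.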
Paul Merrell

F.B.I. Director to Call 'Dark' Devices a Hindrance to Crime Solving in a Policy Speech - NYTimes.com - 0 views

  • In his first major policy speech as director of the F.B.I., James B. Comey on Thursday plans to wade deeper into the debate between law enforcement agencies and technology companies about new programs intended to protect personal information on communication devices. Mr. Comey will say that encryption technologies used on these devices, like the new iPhone, have become so sophisticated that crimes will go unsolved because law enforcement officers will not be able to get information from them, according to a senior F.B.I. official who provided a preview of the speech. The speech was prompted, in part, by the new encryption technology on the iPhone 6, which was released last month. The phone encrypts emails, photos and contacts, thwarting intelligence and law enforcement agencies, like the National Security Agency and F.B.I., from gaining access to it, even if they have court approval.
  • The F.B.I. has long had concerns about devices “going dark” — when technology becomes so sophisticated that the authorities cannot gain access. But now, Mr. Comey said he believes that the new encryption technology has evolved to the point that it will adversely affect crime solving. He will say in the speech that these new programs will most severely affect state and local law enforcement agencies, because they are the ones who most often investigate crimes like kidnappings and robberies in which getting information from electronic devices in a timely manner is essential to solving the crime.
  • They also do not have the resources that are available to the F.B.I. and other federal intelligence and law enforcement authorities in order to get around the programs. Mr. Comey will cite examples of crimes that the authorities were able to solve because they gained access to a phone. “He is going to call for a discussion on this issue and ask whether this is the path we want to go down,” said the senior F.B.I. official. “He is not going to accuse the companies of designing the technologies to prevent the F.B.I. from accessing them. But, he will say that this is a negative byproduct and we need to work together to fix it.”
  • Mr. Comey is scheduled to give the speech — titled “Going Dark: Are Technology, Privacy and Public Safety on a Collision Course?” — at the Brookings Institution in Washington.
  • In the interview that aired on “60 Minutes” on Sunday, Mr. Comey said that “the notion that we would market devices that would allow someone to place themselves beyond the law troubles me a lot.” He said that it was the equivalent of selling cars with trunks that could never be opened, even with a court order. “The notion that people have devices, again, that with court orders, based on a showing of probable cause in a case involving kidnapping or child exploitation or terrorism, we could never open that phone?” he said. “My sense is that we've gone too far when we've gone there.”
  •  
    I'm informed that Comey will also call for legislation outlawing communication by whispering because of technical difficulties in law enforcement monitoring of such communications. 
Paul Merrell

Most Agencies Falling Short on Mandate for Online Records - 1 views

  • Nearly 20 years after Congress passed the Electronic Freedom of Information Act Amendments (E-FOIA), only 40 percent of agencies have followed the law's instruction for systematic posting of records released through FOIA in their electronic reading rooms, according to a new FOIA Audit released today by the National Security Archive at www.nsarchive.org to mark Sunshine Week. The Archive team audited all federal agencies with Chief FOIA Officers as well as agency components that handle more than 500 FOIA requests a year — 165 federal offices in all — and found only 67 with online libraries populated with significant numbers of released FOIA documents and regularly updated.
  • Congress called on agencies to embrace disclosure and the digital era nearly two decades ago, with the passage of the 1996 "E-FOIA" amendments. The law mandated that agencies post key sets of records online, provide citizens with detailed guidance on making FOIA requests, and use new information technology to post online proactively records of significant public interest, including those already processed in response to FOIA requests and "likely to become the subject of subsequent requests." Congress believed then, and openness advocates know now, that this kind of proactive disclosure, publishing online the results of FOIA requests as well as agency records that might be requested in the future, is the only tenable solution to FOIA backlogs and delays. Thus the National Security Archive chose to focus on the e-reading rooms of agencies in its latest audit. Even though the majority of federal agencies have not yet embraced proactive disclosure of their FOIA releases, the Archive E-FOIA Audit did find that some real "E-Stars" exist within the federal government, serving as examples to lagging agencies that technology can be harnessed to create state-of-the art FOIA platforms. Unfortunately, our audit also found "E-Delinquents" whose abysmal web performance recalls the teletype era.
  • E-Delinquents include the Office of Science and Technology Policy at the White House, which, despite being mandated to advise the President on technology policy, does not embrace 21st century practices by posting any frequently requested records online. Another E-Delinquent, the Drug Enforcement Administration, insults its website's viewers by claiming that it "does not maintain records appropriate for FOIA Library at this time."
  • "The presumption of openness requires the presumption of posting," said Archive director Tom Blanton. "For the new generation, if it's not online, it does not exist." The National Security Archive has conducted fourteen FOIA Audits since 2002. Modeled after the California Sunshine Survey and subsequent state "FOI Audits," the Archive's FOIA Audits use open-government laws to test whether or not agencies are obeying those same laws. Recommendations from previous Archive FOIA Audits have led directly to laws and executive orders which have: set explicit customer service guidelines, mandated FOIA backlog reduction, assigned individualized FOIA tracking numbers, forced agencies to report the average number of days needed to process requests, and revealed the (often embarrassing) ages of the oldest pending FOIA requests. The surveys include:
  • The federal government has made some progress moving into the digital era. The National Security Archive's last E-FOIA Audit in 2007, " File Not Found," reported that only one in five federal agencies had put online all of the specific requirements mentioned in the E-FOIA amendments, such as guidance on making requests, contact information, and processing regulations. The new E-FOIA Audit finds the number of agencies that have checked those boxes is now much higher — 100 out of 165 — though many (66 in 165) have posted just the bare minimum, especially when posting FOIA responses. An additional 33 agencies even now do not post these types of records at all, clearly thwarting the law's intent.
  • The FOIAonline Members (Department of Commerce, Environmental Protection Agency, Federal Labor Relations Authority, Merit Systems Protection Board, National Archives and Records Administration, Pension Benefit Guaranty Corporation, Department of the Navy, General Services Administration, Small Business Administration, U.S. Citizenship and Immigration Services, and Federal Communications Commission) won their "E-Star" by making past requests and releases searchable via FOIAonline. FOIAonline also allows users to submit their FOIA requests digitally.
  • THE E-DELINQUENTS: WORST OVERALL AGENCIES In alphabetical order
  • Key Findings
  • Excuses Agencies Give for Poor E-Performance
  • Justice Department guidance undermines the statute. Currently, the FOIA stipulates that documents "likely to become the subject of subsequent requests" must be posted by agencies somewhere in their electronic reading rooms. The Department of Justice's Office of Information Policy defines these records as "frequently requested records… or those which have been released three or more times to FOIA requesters." Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice. The National Security Archive believes the addition of this "three or more times" language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse ("not requested three times yet!") not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post "all records, regardless of form or format that have been released in response to a FOIA request."
  • Disabilities Compliance. Despite the E-FOIA Act, many government agencies do not embrace the idea of posting their FOIA responses online. The most common reason agencies give is that it is difficult to post documents in a format that complies with the Americans with Disabilities Act, also referred to as being "508 compliant," and the 1998 Amendments to the Rehabilitation Act that require federal agencies "to make their electronic and information technology (EIT) accessible to people with disabilities." E-Star agencies, however, have proven that 508 compliance is no barrier when the agency has a will to post. All documents posted on FOIAonline are 508 compliant, as are the documents posted by the Department of Defense and the Department of State. In fact, every document created electronically by the US government after 1998 should already be 508 compliant. Even old paper records that are scanned to be processed through FOIA can be made 508 compliant with just a few clicks in Adobe Acrobat, according to this Department of Homeland Security guide (essentially OCRing the text, and including information about where non-textual fields appear). Even if agencies are insistent it is too difficult to OCR older documents that were scanned from paper, they cannot use that excuse with digital records.
  • Privacy. Another commonly articulated concern about posting FOIA releases online is that doing so could inadvertently disclose private information from "first person" FOIA requests. This is a valid concern, and this subset of FOIA requests should not be posted online. (The Justice Department identified "first party" requester rights in 1989. Essentially agencies cannot use the b(6) privacy exemption to redact information if a person requests it for him or herself. An example of a "first person" FOIA would be a person's request for his own immigration file.) Cost and Waste of Resources. There is also a belief that there is little public interest in the majority of FOIA requests processed, and hence it is a waste of resources to post them. This thinking runs counter to the governing principle of the Freedom of Information Act: that government information belongs to US citizens, not US agencies. As such, the reason that a person requests information is immaterial as the agency processes the request; the "interest factor" of a document should also be immaterial when an agency is required to post it online. Some think that posting FOIA releases online is not cost effective. In fact, the opposite is true. It's not cost effective to spend tens (or hundreds) of person hours to search for, review, and redact FOIA requests only to mail it to the requester and have them slip it into their desk drawer and forget about it. That is a waste of resources. The released document should be posted online for any interested party to utilize. This will only become easier as FOIA processing systems evolve to automatically post the documents they track. The State Department earned its "E-Star" status demonstrating this very principle, and spent no new funds and did not hire contractors to build its Electronic Reading Room, instead it built a self-sustaining platform that will save the agency time and money going forward.
Paul Merrell

How to Encrypt the Entire Web for Free - The Intercept - 0 views

  • If we’ve learned one thing from the Snowden revelations, it’s that what can be spied on will be spied on. Since the advent of what used to be known as the World Wide Web, it has been a relatively simple matter for network attackers—whether it’s the NSA, Chinese intelligence, your employer, your university, abusive partners, or teenage hackers on the same public WiFi as you—to spy on almost everything you do online. HTTPS, the technology that encrypts traffic between browsers and websites, fixes this problem—anyone listening in on that stream of data between you and, say, your Gmail window or bank’s web site would get nothing but useless random characters—but is woefully under-used. The ambitious new non-profit Let’s Encrypt aims to make the process of deploying HTTPS not only fast, simple, and free, but completely automatic. If it succeeds, the project will render vast regions of the internet invisible to prying eyes.
  • Encryption also prevents attackers from tampering with or impersonating legitimate websites. For example, the Chinese government censors specific pages on Wikipedia, the FBI impersonated The Seattle Times to get a suspect to click on a malicious link, and Verizon and AT&T injected tracking tokens into mobile traffic without user consent. HTTPS goes a long way in preventing these sorts of attacks. And of course there’s the NSA, which relies on the limited adoption of HTTPS to continue to spy on the entire internet with impunity. If companies want to do one thing to meaningfully protect their customers from surveillance, it should be enabling encryption on their websites by default.
  • Let’s Encrypt, which was announced this week but won’t be ready to use until the second quarter of 2015, describes itself as “a free, automated, and open certificate authority (CA), run for the public’s benefit.” It’s the product of years of work from engineers at Mozilla, Cisco, Akamai, Electronic Frontier Foundation, IdenTrust, and researchers at the University of Michigan. (Disclosure: I used to work for the Electronic Frontier Foundation, and I was aware of Let’s Encrypt while it was being developed.) If Let’s Encrypt works as advertised, deploying HTTPS correctly and using all of the best practices will be one of the simplest parts of running a website. All it will take is running a command. Currently, HTTPS requires jumping through a variety of complicated hoops that certificate authorities insist on in order to prove ownership of domain names. Let’s Encrypt automates this task in seconds, without requiring any human intervention, and at no cost.
  • The benefits of using HTTPS are obvious when you think about protecting secret information you send over the internet, like passwords and credit card numbers. It also helps protect information like what you search for in Google, what articles you read, what prescription medicine you take, and messages you send to colleagues, friends, and family from being monitored by hackers or authorities. But there are less obvious benefits as well. Websites that don’t use HTTPS are vulnerable to “session hijacking,” where attackers can take over your account even if they don’t know your password. When you download software without encryption, sophisticated attackers can secretly replace the download with malware that hacks your computer as soon as you try installing it.
  • The transition to a fully encrypted web won’t be immediate. After Let’s Encrypt is available to the public in 2015, each website will have to actually use it to switch over. And major web hosting companies also need to hop on board for their customers to be able to take advantage of it. If hosting companies start work now to integrate Let’s Encrypt into their services, they could offer HTTPS hosting by default at no extra cost to all their customers by the time it launches.
  •  
    Don't miss the video. And if you have a web site, urge your host service to begin preparing for Let's Encrypt. (See video on why it's good for them.)
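
The excerpts above describe what HTTPS provides once a certificate is deployed: an encrypted channel plus a verified server identity, which is exactly what a Let's Encrypt certificate would supply. A short Python sketch, using only the standard library and an illustrative hostname, that opens a verified TLS connection and prints what the handshake established:

```python
import socket
import ssl

def inspect_https(hostname: str, port: int = 443) -> None:
    """Connect over TLS, verify the server certificate, and report the session details."""
    context = ssl.create_default_context()          # validates the certificate chain and hostname
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
            print("protocol:", tls.version())       # e.g. TLSv1.3
            print("cipher:", tls.cipher()[0])
            print("subject:", dict(item[0] for item in cert["subject"]))
            print("issuer:", dict(item[0] for item in cert["issuer"]))

if __name__ == "__main__":
    inspect_https("example.com")                    # illustrative hostname only
```

If the certificate does not match the hostname or is not signed by a trusted authority, the handshake raises an error instead of silently connecting, which is the property that blocks the impersonation attacks described above.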
Paul Merrell

U.S. Embedded Spyware Overseas, Report Claims - NYTimes.com - 0 views

  • The United States has found a way to permanently embed surveillance and sabotage tools in computers and networks it has targeted in Iran, Russia, Pakistan, China, Afghanistan and other countries closely watched by American intelligence agencies, according to a Russian cybersecurity firm. In a presentation of its findings at a conference in Mexico on Monday, Kaspersky Lab, the Russian firm, said that the implants had been placed by what it called the “Equation Group,” which appears to be a veiled reference to the National Security Agency and its military counterpart, United States Cyber Command.
  • It linked the techniques to those used in Stuxnet, the computer worm that disabled about 1,000 centrifuges in Iran’s nuclear enrichment program. It was later revealed that Stuxnet was part of a program code-named Olympic Games and run jointly by Israel and the United States. Kaspersky’s report said that Olympic Games had similarities to a much broader effort to infect computers well beyond those in Iran. It detected particularly high infection rates in computers in Iran, Pakistan and Russia, three countries whose nuclear programs the United States routinely monitors.
  • Some of the implants burrow so deep into the computer systems, Kaspersky said, that they infect the “firmware,” the embedded software that preps the computer’s hardware before the operating system starts. It is beyond the reach of existing antivirus products and most security controls, Kaspersky reported, making it virtually impossible to wipe out.
  • In many cases, it also allows the American intelligence agencies to grab the encryption keys off a machine, unnoticed, and unlock scrambled contents. Moreover, many of the tools are designed to run on computers that are disconnected from the Internet, which was the case in the computers controlling Iran’s nuclear enrichment plants.
Gonzalo San Gil, PhD.

Missteps in Europe's Online Privacy Bill - NYTimes.com - 0 views

  •  
    "By THE EDITORIAL BOARD December 21, 2015 The European Union could soon adopt a law that would strengthen online privacy protections for consumers, but it would come at a cost to free expression and leave a redacted history for Internet users."
Alexandra IcecreamApps

Top Google Chrome Extensions for Better Browsing - Icecream Tech Digest - 1 views

  •  
    Google Chrome browser has become widely popular thanks to its high speed, elegant, minimalistic interface, and in-built translator; and, well, it is a Google product after all. Thanks to its fame and tons of users, the number of available extensions…
Paul Merrell

Apple's New Challenge: Learning How the U.S. Cracked Its iPhone - The New York Times - 0 views

  • Now that the United States government has cracked open an iPhone that belonged to a gunman in the San Bernardino, Calif., mass shooting without Apple’s help, the tech company is under pressure to find and fix the flaw. But unlike other cases where security vulnerabilities have cropped up, Apple may face a higher set of hurdles in ferreting out and repairing the particular iPhone hole that the government hacked. The challenges start with the lack of information about the method that the law enforcement authorities, with the aid of a third party, used to break into the iPhone of Syed Rizwan Farook, an attacker in the San Bernardino rampage last year. Federal officials have refused to identify the person, or organization, who helped crack the device, and have declined to specify the procedure used to open the iPhone. Apple also cannot obtain the device to reverse-engineer the problem, the way it would in other hacking situations.
  •  
    It would make a very interesting Freedom of Information Act case if Apple sued under that Act to force disclosure of the iPhone security defect the FBI exploited. I know of no interpretation of the law enforcement FOIA exemption that would justify FBI disclosure of the information. It might be alleged that the information is the trade secret of the company that disclosed the defect and exploit to the FBI, but there's a very strong argument that the fact that the information was shared with the FBI waived the trade secrecy claim. And the notion that government is entitled to collect product security defects and exploit them without informing the exploited product's company of the specific defect is extremely weak. Were I Tim Cook, I would have already told my lawyers to get cracking on filing the FOIA request with the FBI to get the legal ball rolling.
Paul Merrell

Facebook's New 'Supreme Court' Could Revolutionize Online Speech - Lawfare - 0 views

  • The Supreme Court of Facebook is about to become a reality. When Facebook CEO Mark Zuckerberg first mentioned the idea of an independent oversight body to determine the boundaries of acceptable speech on the platform—“almost like a Supreme Court,” he said—in an April 2018 interview with Vox, it sounded like an offhand musing.  But on Nov. 15, responding to a New York Times article documenting how Facebook’s executives have dealt with the company’s scandal-ridden last few years, Zuckerberg published a blog post announcing that Facebook will “create a new way for people to appeal content decisions to an independent body, whose decisions would be transparent and binding.” Supreme Court of Facebook-like bodies will be piloted early next year in regions around the world, and the “court” proper is to be established by the end of 2019, he wrote.
Paul Merrell

California Passes Sweeping Law to Protect Online Privacy - The New York Times - 0 views

  • California has passed a digital privacy law granting consumers more control over and insight into the spread of their personal information online, creating one of the most significant regulations overseeing the data-collection practices of technology companies in the United States. The bill raced through the State Legislature without opposition on Thursday and was signed into law by Gov. Jerry Brown, just hours before a deadline to pull from the November ballot an initiative seeking even tougher oversight over technology companies. The new law grants consumers the right to know what information companies are collecting about them, why they are collecting that data and with whom they are sharing it. It gives consumers the right to tell companies to delete their information as well as to not sell or share their data. Businesses must still give consumers who opt out the same quality of service. It also makes it more difficult to share or sell data on children younger than 16. The legislation, which goes into effect in January 2020, makes it easier for consumers to sue companies after a data breach. And it gives the state’s attorney general more authority to fine companies that don’t adhere to the new regulations.
  • The California law is not as expansive as Europe’s General Data Protection Regulation, or G.D.P.R., a new set of laws restricting how tech companies collect, store and use personal data. But Aleecia M. McDonald, an incoming assistant professor at Carnegie Mellon University who specializes in privacy policy, said California’s privacy measure was one of the most comprehensive in the United States, since most existing laws — and there are not many — do little to limit what companies can do with consumer information.
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 1 views

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputing a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days to six months. It stores the content of communications for a shorter period of time, varying between three to 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated from January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explain in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
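
One of the excerpts above draws the content/metadata line at the boundary between a URL's host and its path. That split maps directly onto standard URL parsing; a tiny Python illustration using the article's own example address (the scheme is added only so the parser can find the host):

```python
from urllib.parse import urlsplit

url = "http://www.gchq.gov.uk/what_we_do"   # the example address from the article
parts = urlsplit(url)
print(parts.hostname)   # 'www.gchq.gov.uk'  -> the part treated as metadata
print(parts.path)       # '/what_we_do'      -> the part that makes the record "content"
```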
Paul Merrell

Trump's Blocking of Twitter Users Is Unconstitutional, Judge Says - The New York Times - 0 views

  • Apart from the man himself, perhaps nothing has defined President Trump’s political persona more than Twitter. But on Wednesday, one of Mr. Trump’s Twitter habits — his practice of blocking critics on the service, preventing them from engaging with his account — was declared unconstitutional by a federal judge in Manhattan. Judge Naomi Reice Buchwald, addressing a novel issue about how the Constitution applies to social media platforms and public officials, found that the president’s Twitter feed is a public forum. As a result, she ruled that when Mr. Trump or an aide blocked seven plaintiffs from viewing and replying to his posts, he violated the First Amendment. If the principle undergirding Wednesday’s ruling in Federal District Court stands, it is likely to have implications far beyond Mr. Trump’s feed and its 52 million followers, said Jameel Jaffer, the Knight First Amendment Institute’s executive director and the counsel for the plaintiffs. Public officials throughout the country, from local politicians to governors and members of Congress, regularly use social media platforms like Twitter and Facebook to interact with the public about government business.
Paul Merrell

Federal Court Rules Suspicionless Searches of Travelers' Phones and Laptops Unconstitutional | Electronic Frontier Foundation - 1 views

  • In a major victory for privacy rights at the border, a federal court in Boston ruled today that suspicionless searches of travelers’ electronic devices by federal agents at airports and other U.S. ports of entry are unconstitutional. The ruling came in a lawsuit, Alasaad v. McAleenan, filed by the American Civil Liberties Union (ACLU), Electronic Frontier Foundation (EFF), and ACLU of Massachusetts, on behalf of 11 travelers whose smartphones and laptops were searched without individualized suspicion at U.S. ports of entry. “This ruling significantly advances Fourth Amendment protections for millions of international travelers who enter the United States every year,” said Esha Bhandari, staff attorney with the ACLU’s Speech, Privacy, and Technology Project. “By putting an end to the government’s ability to conduct suspicionless fishing expeditions, the court reaffirms that the border is not a lawless place and that we don’t lose our privacy rights when we travel.”
  • The district court order puts an end to the asserted authority of Customs and Border Protection (CBP) and Immigration and Customs Enforcement (ICE) to search and seize travelers’ devices for purposes far afield from the enforcement of immigration and customs laws. Border officers must now demonstrate individualized suspicion of illegal contraband before they can search a traveler’s device. The number of electronic device searches at U.S. ports of entry has increased significantly. Last year, CBP conducted more than 33,000 searches, almost four times the number from just three years prior. International travelers returning to the United States have reported numerous cases of abusive searches in recent months. While searching through the phone of Zainab Merchant, a plaintiff in the Alasaad case, a border agent knowingly rifled through privileged attorney-client communications. An immigration officer at Boston Logan Airport reportedly searched an incoming Harvard freshman’s cell phone and laptop, reprimanded the student for friends’ social media postings expressing views critical of the U.S. government, and denied the student entry into the country following the search. For the order: https://www.eff.org/document/alasaad-v-nielsen-summary-judgment-order For more on this case: https://www.eff.org/cases/alasaad-v-duke
Paul Merrell

Facebook to Pay $550 Million to Settle Facial Recognition Suit - The New York Times - 2 views

  • Facebook said on Wednesday that it had agreed to pay $550 million to settle a class-action lawsuit over its use of facial recognition technology in Illinois, giving privacy groups a major victory that again raised questions about the social network’s data-mining practices. The case stemmed from Facebook’s photo-labeling service, Tag Suggestions, which uses face-matching software to suggest the names of people in users’ photos. The suit said the Silicon Valley company violated an Illinois biometric privacy law by harvesting facial data for Tag Suggestions from the photos of millions of users in the state without their permission and without telling them how long the data would be kept. Facebook has said the allegations have no merit. Under the agreement, Facebook will pay $550 million to eligible Illinois users and for the plaintiffs’ legal fees. The sum dwarfs the $380.5 million that the Equifax credit reporting agency agreed this month to pay to settle a class-action case over a 2017 consumer data breach.
Paul Merrell

House Lawmakers Condemn Big Tech's 'Monopoly Power' and Urge Their Breakups - The New York Times - 0 views

  • House lawmakers who spent the last 16 months investigating the practices of the world’s largest technology companies said on Tuesday that Amazon, Apple, Facebook and Google had exercised and abused their monopoly power and called for the most sweeping changes to antitrust laws in half a century. In a 449-page report that was presented by the House Judiciary Committee’s Democratic leadership, lawmakers said the four companies had turned from “scrappy” start-ups into “the kinds of monopolies we last saw in the era of oil barons and railroad tycoons.” The lawmakers said the companies had abused their dominant positions, setting and often dictating prices and rules for commerce, search, advertising, social networking and publishing.
  • To amend the inequities, the lawmakers recommended restoring competition by effectively breaking up the companies, emboldening the agencies that police market concentration and throwing up hurdles for the companies to acquire start-ups. They also proposed reforming antitrust laws, in the biggest potential shift since the Hart-Scott-Rodino Act of 1976 created stronger reviews of big mergers.
Paul Merrell

Google and Facebook fined $240 million for making cookies hard to refuse | Malwarebytes Labs - 0 views

  • French privacy watchdog, the Commission Nationale de l’Informatique et des Libertés (CNIL), has hit Google with a 150 million euro fine and Facebook with a 60 million euro fine, because their websites—google.fr, youtube.com, and facebook.com—don’t make refusing cookies as easy as accepting them. The CNIL carried out an online investigation after receiving complaints from users about the way cookies were handled on these sites. It found that while the sites offered buttons for allowing immediate acceptance of cookies, the sites didn’t implement an equivalent solution to let users refuse them. Several clicks were required to refuse all cookies, against a single one to accept them. In addition to the fines, the companies have been given three months to provide Internet users in France with a way to refuse cookies that’s as simple as accepting them. If they don’t, the companies will have to pay a penalty of 100,000 euros for each day they delay.
  • EU data protection regulators’ powers have increased significantly since the General Data Protection Regulation (GDPR) took effect in May 2018. This EU law allows watchdogs to levy penalties of as much as 4% of a company’s annual global sales. The restricted committee, the body in charge of sanctions, considered that the process regarding cookies affects the freedom of consent of Internet users and constitutes an infringement of the French Data Protection Act, which demands that it should be as easy to refuse cookies as to accept them. Since March 31, 2021, when the deadline set for websites and mobile applications to comply with the new rules on cookies expired, the CNIL has adopted nearly 100 corrective measures (orders and sanctions) related to non-compliance with the legislation on cookies.