Future of the Web: Group items matching "deleted" in title, tags, annotations or URL

Paul Merrell

Archiveteam - 0 views

  • HISTORY IS OUR FUTURE. And we've been trashing our history. Archive Team is a loose collective of rogue archivists, programmers, writers and loudmouths dedicated to saving our digital heritage. Since 2009 this variant force of nature has caught wind of shutdowns, shutoffs, mergers, and plain old deletions - and done our best to save the history before it's lost forever. Along the way, we've gotten attention, resistance, press and discussion, but most importantly, we've gotten the message out: IT DOESN'T HAVE TO BE THIS WAY. This website is intended to be an offloading point and information depot for a number of archiving projects, all related to saving websites or data that is in danger of being lost. Besides serving as a hub for team-based pulling down and mirroring of data, this site will provide advice on managing your own data and rescuing it from the brink of destruction. Currently Active Projects (Get Involved Here!): Archive Team is recruiting. Want to code for Archive Team? Here's a starting point.
  • Who We Are and how you can join our cause! Deathwatch is where we keep track of sites that are sickly, dying or dead. Fire Drill is where we keep track of sites that seem fine but a lot depends on them. Projects is a comprehensive list of AT endeavors. Philosophy describes the ideas underpinning our work. Some Starting Points: The Introduction is an overview of basic archiving methods. Why Back Up? Because they don't care about you. Back Up your Facebook Data Learn how to liberate your personal data from Facebook. Software will assist you in regaining control of your data by providing tools for information backup, archiving and distribution. Formats will familiarise you with the various data formats, and how to ensure your files will be readable in the future. Storage Media is about where to get it, what to get, and how to use it. Recommended Reading links to other sites for further information. Frequently Asked Questions is where we answer common questions.
  •  
    The Archive Team Warrior is a virtual archiving appliance. You can run it to help with the ArchiveTeam archiving efforts. It will download sites and upload them to our archive - and it's really easy to do! The warrior is a virtual machine, so there is no risk to your computer. The warrior will only use your bandwidth and some of your disk space. It will get tasks from and report progress to the Tracker. Basic usage: The warrior runs on Windows, OS X and Linux using a virtual machine. You'll need one of: VirtualBox (recommended) or VMware Workstation/Player (free-gratis for personal use); see below for alternative virtual machines. Archive Team partners with and contributes lots of archives to the Wayback Machine. Here's how you can help by contributing some bandwidth if you run an always-on box with an internet connection.
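
The Warrior described above is a prebuilt virtual machine, but the underlying idea of pulling down at-risk pages before they disappear can be illustrated in a few lines of Python. This is only a minimal, hypothetical sketch of a single-page grab (the URL and output folder are placeholders); Archive Team's real pipelines preserve far more (headers, assets, WARC metadata) and coordinate work through the Tracker.

```python
import pathlib
import urllib.request
from urllib.parse import urlparse

def grab_page(url: str, out_dir: str = "rescued") -> pathlib.Path:
    """Download a single page and save the raw bytes locally.

    A toy stand-in for "pulling down and mirroring" data; real archiving
    also preserves response headers, linked assets, and WARC metadata.
    """
    parsed = urlparse(url)
    # Build a filesystem-safe name from the host and path.
    name = (parsed.netloc + parsed.path).strip("/").replace("/", "_") or "index"
    out_path = pathlib.Path(out_dir) / f"{name}.html"
    out_path.parent.mkdir(parents=True, exist_ok=True)

    with urllib.request.urlopen(url, timeout=30) as resp:
        out_path.write_bytes(resp.read())
    return out_path

if __name__ == "__main__":
    # Placeholder URL: point this at a page you are allowed to mirror.
    print(grab_page("https://example.com/"))
```
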
Paul Merrell

Microsoft Pitches Technology That Can Read Facial Expressions at Political Rallies - 1 views

  • On the 21st floor of a high-rise hotel in Cleveland, in a room full of political operatives, Microsoft’s Research Division was advertising a technology that could read each facial expression in a massive crowd, analyze the emotions, and report back in real time. “You could use this at a Trump rally,” a sales representative told me. At both the Republican and Democratic conventions, Microsoft sponsored event spaces for the news outlet Politico. Politico, in turn, hosted a series of Microsoft-sponsored discussions about the use of data technology in political campaigns. And throughout Politico’s spaces in both Philadelphia and Cleveland, Microsoft advertised an array of products from “Microsoft Cognitive Services,” its artificial intelligence and cloud computing division. At one exhibit, titled “Realtime Crowd Insights,” a small camera scanned the room, while a monitor displayed the captured image. Every five seconds, a new image would appear with data annotated for each face — an assigned serial number, gender, estimated age, and any emotions detected in the facial expression. When I approached, the machine labeled me “b2ff” and correctly identified me as a 23-year-old male.
  • “Realtime Crowd Insights” is an Application Programming Interface (API), or a software tool that connects web applications to Microsoft’s cloud computing services. Through Microsoft’s emotional analysis API — a component of Realtime Crowd Insights — applications send an image to Microsoft’s servers. Microsoft’s servers then analyze the faces and return emotional profiles for each one. In a November blog post, Microsoft said that the emotional analysis could detect “anger, contempt, fear, disgust, happiness, neutral, sadness or surprise.” Microsoft’s sales representatives told me that political campaigns could use the technology to measure the emotional impact of different talking points — and political scientists could use it to study crowd response at rallies.
  • Facial recognition technology — the identification of faces by name — is already widely used in secret by law enforcement, sports stadiums, retail stores, and even churches, despite being of questionable legality. As early as 2002, facial recognition technology was used at the Super Bowl to cross-reference the 100,000 attendees to a database of the faces of known criminals. The technology is controversial enough that in 2013, Google tried to ban the use of facial recognition apps in its Google Glass system. But “Realtime Crowd Insights” is not true facial recognition — it could not identify me by name, only as “b2ff.” It did, however, store enough data on each face that it could continuously identify it with the same serial number, even hours later. The display demonstrated that capability by distinguishing between the number of total faces it had seen, and the number of unique serial numbers.
  • Instead, “Realtime Crowd Insights” is an example of facial characterization technology — where computers analyze faces without necessarily identifying them. Facial characterization has many positive applications — it has been tested in the classroom, as a tool for spotting struggling students, and Microsoft has boasted that the tool will even help blind people read the faces around them. But facial characterization can also be used to assemble and store large profiles of information on individuals, even anonymously.
  • Alvaro Bedoya, a professor at Georgetown Law School and expert on privacy and facial recognition, has hailed that code of conduct as evidence that Microsoft is trying to do the right thing. But he pointed out that it leaves a number of questions unanswered — as illustrated in Cleveland and Philadelphia. “It’s interesting that the app being shown at the convention ‘remembered’ the faces of the people who walked by. That would seem to suggest that their faces were being stored and processed without the consent that Microsoft’s policy requires,” Bedoya said. “You have to wonder: What happened to the face templates of the people who walked by that booth? Were they deleted? Or are they still in the system?” Microsoft officials declined to comment on exactly what information is collected on each face and what data is retained or stored, instead referring me to their privacy policy, which does not address the question. Bedoya also pointed out that Microsoft’s marketing did not seem to match the consent policy. “It’s difficult to envision how companies will obtain consent from people in large crowds or rallies.”
  •  
    But nobody is saying that the output of this technology can't be combined with the output of facial recognition technology to let them monitor you individually AND track your emotions. Fortunately, others are fighting back with knowledge and tech to block facial recognition. http://goo.gl/JMQM2W
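
The "Realtime Crowd Insights" annotation above describes an HTTP API: a client posts an image, and the service returns per-face attributes such as a serial number, estimated age, gender, and emotion scores. The sketch below shows the general shape of such a call; the endpoint, authentication header, and response fields are assumptions for illustration, not Microsoft's actual Cognitive Services interface.

```python
import json
import urllib.request

# Hypothetical endpoint and key header; the real Cognitive Services
# URLs, auth scheme, and field names may differ.
ENDPOINT = "https://api.example-faces.test/v1/analyze"
API_KEY = "replace-with-your-key"

def analyze_crowd_photo(image_path: str) -> list[dict]:
    """Send a photo to a face-analysis service and return per-face records."""
    with open(image_path, "rb") as f:
        body = f.read()

    req = urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/octet-stream",
            "Subscription-Key": API_KEY,  # assumed auth header
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        faces = json.loads(resp.read())

    # Assumed response shape: one record per detected face, e.g.
    # {"id": "b2ff", "gender": "male", "age": 23,
    #  "emotions": {"happiness": 0.7, "neutral": 0.2, ...}}
    return faces

if __name__ == "__main__":
    for face in analyze_crowd_photo("crowd.jpg"):
        top_emotion = max(face["emotions"], key=face["emotions"].get)
        print(face["id"], face.get("age"), top_emotion)
```
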
Paul Merrell

In Hearing on Internet Surveillance, Nobody Knows How Many Americans Impacted in Data Collection | Electronic Frontier Foundation - 0 views

  • The Senate Judiciary Committee held an open hearing today on the FISA Amendments Act, the law that ostensibly authorizes the digital surveillance of hundreds of millions of people both in the United States and around the world. Section 702 of the law, scheduled to expire next year, is designed to allow U.S. intelligence services to collect signals intelligence on foreign targets related to our national security interests. However—thanks to the leaks of many whistleblowers including Edward Snowden, the work of investigative journalists, and statements by public officials—we now know that the FISA Amendments Act has been used to sweep up data on hundreds of millions of people who have no connection to a terrorist investigation, including countless Americans. What do we mean by “countless”? As became increasingly clear in the hearing today, the exact number of Americans impacted by this surveillance is unknown. Senator Franken asked the panel of witnesses, “Is it possible for the government to provide an exact count of how many United States persons have been swept up in Section 702 surveillance? And if not the exact count, then what about an estimate?”
  • The lack of information makes rigorous oversight of the programs all but impossible. As Senator Franken put it in the hearing today, “When the public lacks even a rough sense of the scope of the government’s surveillance program, they have no way of knowing if the government is striking the right balance, whether we are safeguarding our national security without trampling on our citizens’ fundamental privacy rights. But the public can’t know if we succeed in striking that balance if they don’t even have the most basic information about our major surveillance programs."  Senator Patrick Leahy also questioned the panel about the “minimization procedures” associated with this type of surveillance, the privacy safeguard that is intended to ensure that irrelevant data and data on American citizens is swiftly deleted. Senator Leahy asked the panel: “Do you believe the current minimization procedures ensure that data about innocent Americans is deleted? Is that enough?”  David Medine, who recently announced his pending retirement from the Privacy and Civil Liberties Oversight Board, answered unequivocally:
  • Elizabeth Goitein, the Brennan Center director whose articulate and thought-provoking testimony was the highlight of the hearing, noted that at this time an exact number would be difficult to provide. However, she asserted that an estimate should be possible for most if not all of the government’s surveillance programs. None of the other panel participants—which included David Medine and Rachel Brand of the Privacy and Civil Liberties Oversight Board as well as Matthew Olsen of IronNet Cybersecurity and attorney Kenneth Wainstein—offered an estimate. Today’s hearing reaffirmed that it is not only the American people who are left in the dark about how many people or accounts are impacted by the NSA’s dragnet surveillance of the Internet. Even vital oversight committees in Congress like the Senate Judiciary Committee are left to speculate about just how far-reaching this surveillance is. It's part of the reason why we urged the House Judiciary Committee to demand that the Intelligence Community provide the public with a number. 
  • Senator Leahy, they don’t. The minimization procedures call for the deletion of innocent Americans’ information upon discovery to determine whether it has any foreign intelligence value. But what the board’s report found is that in fact information is never deleted. It sits in the databases for 5 years, or sometimes longer. And so the minimization doesn’t really address the privacy concerns of incidentally collected communications—again, where there’s been no warrant at all in the process… In the United States, we simply can’t read people’s emails and listen to their phone calls without court approval, and the same should be true when the government shifts its attention to Americans under this program. One of the most startling exchanges from the hearing today came toward the end of the session, when Senator Dianne Feinstein—who also sits on the Intelligence Committee—seemed taken aback by Ms. Goitein’s mention of “backdoor searches.” 
  • Feinstein: Wow, wow. What do you call it? What’s a backdoor search? Goitein: Backdoor search is when the FBI or any other agency targets a U.S. person for a search of data that was collected under Section 702, which is supposed to be targeted against foreigners overseas. Feinstein: Regardless of the minimization that was properly carried out. Goitein: Well the data is searched in its unminimized form. So the FBI gets raw data, the NSA, the CIA get raw data. And they search that raw data using U.S. person identifiers. That’s what I’m referring to as backdoor searches. It’s deeply concerning that any member of Congress, much less a member of the Senate Judiciary Committee and the Senate Intelligence Committee, might not be aware of the problem surrounding backdoor searches. In April 2014, the Director of National Intelligence acknowledged the searches of this data, which Senators Ron Wyden and Mark Udall termed “the ‘back-door search’ loophole in section 702.” The public was so incensed that the House of Representatives passed an amendment to that year's defense appropriations bill effectively banning the warrantless backdoor searches. Nonetheless, in the hearing today it seemed like Senator Feinstein might not recognize or appreciate the serious implications of allowing U.S. law enforcement agencies to query the raw data collected through these Internet surveillance programs. Hopefully today’s testimony helped convince the Senator that there is more to this topic than what she’s hearing in jargon-filled classified security briefings.
  •  
    The 4th Amendment: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and *particularly describing the place to be searched, and the* persons or *things to be seized."* So much for the particularized description of the place to be searched and the things to be seized. Fah! Who needs a Constitution, anyway ....
Gonzalo San Gil, PhD.

Apple Stole My Music. No, Seriously. | vellumatlanta - 1 views

  •  
    "May 4, 2016 / jamespinkstone "The software is functioning as intended," said Amber. "Wait," I asked, "so it's supposed to delete my personal files from my internal hard drive without asking my permission?" "Yes," she replied."
Paul Merrell

A Secret Catalogue of Government Gear for Spying on Your Cellphone - 0 views

  • THE INTERCEPT HAS OBTAINED a secret, internal U.S. government catalogue of dozens of cellphone surveillance devices used by the military and by intelligence agencies. The document, thick with previously undisclosed information, also offers rare insight into the spying capabilities of federal law enforcement and local police inside the United States. The catalogue includes details on the Stingray, a well-known brand of surveillance gear, as well as Boeing “dirt boxes” and dozens of more obscure devices that can be mounted on vehicles, drones, and piloted aircraft. Some are designed to be used at static locations, while others can be discreetly carried by an individual. They have names like Cyberhawk, Yellowstone, Blackfin, Maximus, Cyclone, and Spartacus. Within the catalogue, the NSA is listed as the vendor of one device, while another was developed for use by the CIA, and another was developed for a special forces requirement. Nearly a third of the entries focus on equipment that seems to have never been described in public before.
  • The Intercept obtained the catalogue from a source within the intelligence community concerned about the militarization of domestic law enforcement. (The original is here.) A few of the devices can house a “target list” of as many as 10,000 unique phone identifiers. Most can be used to geolocate people, but the documents indicate that some have more advanced capabilities, like eavesdropping on calls and spying on SMS messages. Two systems, apparently designed for use on captured phones, are touted as having the ability to extract media files, address books, and notes, and one can retrieve deleted text messages. Above all, the catalogue represents a trove of details on surveillance devices developed for military and intelligence purposes but increasingly used by law enforcement agencies to spy on people and convict them of crimes. The mass shooting earlier this month in San Bernardino, California, which President Barack Obama has called “an act of terrorism,” prompted calls for state and local police forces to beef up their counterterrorism capabilities, a process that has historically involved adapting military technologies to civilian use. Meanwhile, civil liberties advocates and others are increasingly alarmed about how cellphone surveillance devices are used domestically and have called for a more open and informed debate about the trade-off between security and privacy — despite a virtual blackout by the federal government on any information about the specific capabilities of the gear.
  • “We’ve seen a trend in the years since 9/11 to bring sophisticated surveillance technologies that were originally designed for military use — like Stingrays or drones or biometrics — back home to the United States,” said Jennifer Lynch, a senior staff attorney at the Electronic Frontier Foundation, which has waged a legal battle challenging the use of cellphone surveillance devices domestically. “But using these technologies for domestic law enforcement purposes raises a host of issues that are different from a military context.”
  • MANY OF THE DEVICES in the catalogue, including the Stingrays and dirt boxes, are cell-site simulators, which operate by mimicking the towers of major telecom companies like Verizon, AT&T, and T-Mobile. When someone’s phone connects to the spoofed network, it transmits a unique identification code and, through the characteristics of its radio signals when they reach the receiver, information about the phone’s location. There are also indications that cell-site simulators may be able to monitor calls and text messages. In the catalogue, each device is listed with guidelines about how its use must be approved; the answer is usually via the “Ground Force Commander” or under one of two titles in the U.S. code governing military and intelligence operations, including covert action.
  • But domestically the devices have been used in a way that violates the constitutional rights of citizens, including the Fourth Amendment prohibition on illegal search and seizure, critics like Lynch say. They have regularly been used without warrants, or with warrants that critics call overly broad. Judges and civil liberties groups alike have complained that the devices are used without full disclosure of how they work, even within court proceedings.
Paul Merrell

Cy Vance's Proposal to Backdoor Encrypted Devices Is Riddled With Vulnerabilities | Just Security - 0 views

  • Less than a week after the attacks in Paris — while the public and policymakers were still reeling, and the investigation had barely gotten off the ground — Cy Vance, Manhattan’s District Attorney, released a policy paper calling for legislation requiring companies to provide the government with backdoor access to their smartphones and other mobile devices. This is the first concrete proposal of this type since September 2014, when FBI Director James Comey reignited the “Crypto Wars” in response to Apple’s and Google’s decisions to use default encryption on their smartphones. Though Comey seized on Apple’s and Google’s decisions to encrypt their devices by default, his concerns are primarily related to end-to-end encryption, which protects communications that are in transit. Vance’s proposal, on the other hand, is only concerned with device encryption, which protects data stored on phones. It is still unclear whether encryption played any role in the Paris attacks, though we do know that the attackers were using unencrypted SMS text messages on the night of the attack, and that some of them were even known to intelligence agencies and had previously been under surveillance. But regardless of whether encryption was used at some point during the planning of the attacks, as I lay out below, prohibiting companies from selling encrypted devices would not prevent criminals or terrorists from being able to access unbreakable encryption. Vance’s primary complaint is that Apple’s and Google’s decisions to provide their customers with more secure devices through encryption interferes with criminal investigations. He claims encryption prevents law enforcement from accessing stored data like iMessages, photos and videos, Internet search histories, and third party app data. He makes several arguments to justify his proposal to build backdoors into encrypted smartphones, but none of them hold water.
  • Before addressing the major privacy, security, and implementation concerns that his proposal raises, it is worth noting that while an increase in use of fully encrypted devices could interfere with some law enforcement investigations, it will help prevent far more crimes — especially smartphone theft, and the consequent potential for identity theft. According to Consumer Reports, in 2014 there were more than two million victims of smartphone theft, and nearly two-thirds of all smartphone users either took no steps to secure their phones or their data or failed to implement passcode access for their phones. Default encryption could reduce instances of theft because perpetrators would no longer be able to break into the phone to steal the data.
  • Vance argues that creating a weakness in encryption to allow law enforcement to access data stored on devices does not raise serious concerns for security and privacy, since in order to exploit the vulnerability one would need access to the actual device. He considers this an acceptable risk, claiming it would not be the same as creating a widespread vulnerability in encryption protecting communications in transit (like emails), and that it would be cheap and easy for companies to implement. But Vance seems to be underestimating the risks involved with his plan. It is increasingly important that smartphones and other devices are protected by the strongest encryption possible. Our devices and the apps on them contain astonishing amounts of personal information, so much that an unprecedented level of harm could be caused if a smartphone or device with an exploitable vulnerability is stolen, not least in the forms of identity fraud and credit card theft. We bank on our phones, and have access to credit card payments with services like Apple Pay. Our contact lists are stored on our phones, including phone numbers, emails, social media accounts, and addresses. Passwords are often stored on people’s phones. And phones and apps are often full of personal details about their lives, from food diaries to logs of favorite places to personal photographs. Symantec conducted a study, where the company spread 50 “lost” phones in public to see what people who picked up the phones would do with them. The company found that 95 percent of those people tried to access the phone, and while nearly 90 percent tried to access private information stored on the phone or in other private accounts such as banking services and email, only 50 percent attempted contacting the owner.
  • Vance attempts to downplay this serious risk by asserting that anyone can use the “Find My Phone” or Android Device Manager services that allow owners to delete the data on their phones if stolen. However, this does not stand up to scrutiny. These services are effective only when an owner realizes their phone is missing and can take swift action on another computer or device. This delay ensures some period of vulnerability. Encryption, on the other hand, protects everyone immediately and always. Additionally, Vance argues that it is safer to build backdoors into encrypted devices than it is to do so for encrypted communications in transit. It is true that there is a difference in the threats posed by the two types of encryption backdoors that are being debated. However, some manner of widespread vulnerability will inevitably result from a backdoor to encrypted devices. Indeed, the NSA and GCHQ reportedly hacked into a database to obtain cell phone SIM card encryption keys in order to defeat the security protecting users’ communications and activities and to conduct surveillance. Clearly, the reality is that the threat of such a breach, whether from a hacker or a nation state actor, is very real. Even if companies go the extra mile and create a different means of access for every phone, such as a separate access key for each phone, significant vulnerabilities will be created. It would still be possible for a malicious actor to gain access to the database containing those keys, which would enable them to defeat the encryption on any smartphone they took possession of. Additionally, the cost of implementation and maintenance of such a complex system could be high.
  • Privacy is another concern that Vance dismisses too easily. Despite Vance’s arguments otherwise, building backdoors into device encryption undermines privacy. Our government does not impose a similar requirement in any other context. Police can enter homes with warrants, but there is no requirement that people record their conversations and interactions just in case they someday become useful in an investigation. The conversations that we once had through disposable letters and in-person conversations now happen over the Internet and on phones. Just because the medium has changed does not mean our right to privacy has.
  • In addition to his weak reasoning for why it would be feasible to create backdoors to encrypted devices without creating undue security risks or harming privacy, Vance makes several flawed policy-based arguments in favor of his proposal. He argues that criminals benefit from devices that are protected by strong encryption. That may be true, but strong encryption is also a critical tool used by billions of average people around the world every day to protect their transactions, communications, and private information. Lawyers, doctors, and journalists rely on encryption to protect their clients, patients, and sources. Government officials, from the President to the directors of the NSA and FBI, and members of Congress, depend on strong encryption for cybersecurity and data security. There are far more innocent Americans who benefit from strong encryption than there are criminals who exploit it. Encryption is also essential to our economy. Device manufacturers could suffer major economic losses if they are prohibited from competing with foreign manufacturers who offer more secure devices. Encryption also protects major companies from corporate and nation-state espionage. As more daily business activities are done on smartphones and other devices, they may now hold highly proprietary or sensitive information. Those devices could be targeted even more than they are now if all that has to be done to access that information is to steal an employee’s smartphone and exploit a vulnerability the manufacturer was required to create.
  • Vance also suggests that the US would be justified in creating such a requirement since other Western nations are contemplating requiring encryption backdoors as well. Regardless of whether other countries are debating similar proposals, we cannot afford a race to the bottom on cybersecurity. Heads of the intelligence community regularly warn that cybersecurity is the top threat to our national security. Strong encryption is our best defense against cyber threats, and following in the footsteps of other countries by weakening that critical tool would do incalculable harm. Furthermore, even if the US or other countries did implement such a proposal, criminals could gain access to devices with strong encryption through the black market. Thus, only innocent people would be negatively affected, and some of those innocent people might even become criminals simply by trying to protect their privacy by securing their data and devices. Finally, Vance argues that David Kaye, UN Special Rapporteur for Freedom of Expression and Opinion, supported the idea that court-ordered decryption doesn’t violate human rights, provided certain criteria are met, in his report on the topic. However, in the context of Vance’s proposal, this seems to conflate the concepts of court-ordered decryption and of government-mandated encryption backdoors. The Kaye report was unequivocal about the importance of encryption for free speech and human rights. The report concluded that:
  • States should promote strong encryption and anonymity. National laws should recognize that individuals are free to protect the privacy of their digital communications by using encryption technology and tools that allow anonymity online. … States should not restrict encryption and anonymity, which facilitate and often enable the rights to freedom of opinion and expression. Blanket prohibitions fail to be necessary and proportionate. States should avoid all measures that weaken the security that individuals may enjoy online, such as backdoors, weak encryption standards and key escrows. Additionally, the group of intelligence experts that was hand-picked by the President to issue a report and recommendations on surveillance and technology, concluded that: [R]egarding encryption, the U.S. Government should: (1) fully support and not undermine efforts to create encryption standards; (2) not in any way subvert, undermine, weaken, or make vulnerable generally available commercial software; and (3) increase the use of encryption and urge US companies to do so, in order to better protect data in transit, at rest, in the cloud, and in other storage.
  • The clear consensus among human rights experts and several high-ranking intelligence experts, including the former directors of the NSA, Office of the Director of National Intelligence, and DHS, is that mandating encryption backdoors is dangerous.
    Unaddressed Concerns: Preventing Encrypted Devices from Entering the US and the Slippery Slope
    In addition to the significant faults in Vance’s arguments in favor of his proposal, he fails to address the question of how such a restriction would be effectively implemented. There is no effective mechanism for preventing code from becoming available for download online, even if it is illegal. One critical issue the Vance proposal fails to address is how the government would prevent, or even identify, encrypted smartphones when individuals bring them into the United States. DHS would have to train customs agents to search the contents of every person’s phone in order to identify whether it is encrypted, and then confiscate the phones that are. Legal and policy considerations aside, this kind of policy is, at the very least, impractical. Preventing strong encryption from entering the US is not like preventing guns or drugs from entering the country — encrypted phones aren’t immediately obvious as is contraband. Millions of people use encrypted devices, and tens of millions more devices are shipped to and sold in the US each year.
  • Finally, there is a real concern that if Vance’s proposal were accepted, it would be the first step down a slippery slope. Right now, his proposal only calls for access to smartphones and devices running mobile operating systems. While this policy in and of itself would cover a number of commonplace devices, it may eventually be expanded to cover laptop and desktop computers, as well as communications in transit. The expansion of this kind of policy is even more worrisome when taking into account the speed at which technology evolves and becomes widely adopted. Ten years ago, the iPhone did not even exist. Who is to say what technology will be commonplace in 10 or 20 years that is not even around today. There is a very real question about how far law enforcement will go to gain access to information. Things that once seemed like merely science fiction, such as wearable technology and artificial intelligence that could be implanted in and work with the human nervous system, are now available. If and when there comes a time when our “smart phone” is not really a device at all, but is rather an implant, surely we would not grant law enforcement access to our minds.
  • Policymakers should dismiss Vance’s proposal to prohibit the use of strong encryption to protect our smartphones and devices in order to ensure law enforcement access. Undermining encryption, regardless of whether it is protecting data in transit or at rest, would take us down a dangerous and harmful path. Instead, law enforcement and the intelligence community should be working to alter their skills and tactics in a fast-evolving technological world so that they are not so dependent on information that will increasingly be protected by encryption.
Paul Merrell

As Belgium threatens fines, Facebook's defence of tracking visitors rings hollow | nsnbc international - 0 views

  • Facebook has been ordered by a Belgian court to stop tracking non-Facebook users when they visit the Facebook site. Facebook has been given 48 hours to stop the tracking or face possible fines of up to 250,000 Euro a day.
  • Facebook has said that it will appeal the ruling, claiming that since their European headquarters are situated in Ireland, they should only be bound by the Irish Data Protection Regulator. Facebook’s chief of security Alex Stamos has posted an explanation about why non-Facebook users are tracked when they visit the site. The tracking issue centres around the creation of a “cookie” called “datr” whenever anyone visits a Facebook page. This cookie contains an identification number that identifies the same browser returning each time to different Facebook pages. Once created, the cookie will last 2 years unless the user explicitly deletes it. The cookie is created for all visitors to Facebook, irrespective of whether they are a Facebook user or even whether they are logged into Facebook at the time. According to Stamos, the measure is needed to: prevent the creation of fake and spammy accounts; reduce the risk of someone’s account being taken over by someone else; protect people’s content from being stolen; and stop denial of service attacks against Facebook.
  • The principle behind this is that if you can identify requests that arrive at the site for whatever reason, abnormal patterns may unmask people creating fake accounts, hijacking a real account or just issuing so many requests that it overwhelms the site. Stamos’ defence of tracking users is that they have been using it for the past 5 years and nobody had complained until now, that it was common practice and that there was little harm because the data was not collected for any purpose other than security. The dilemma raised by Facebook’s actions is a common one in the conflicting spheres of maintaining privacy and maintaining security. It is obvious that if you can identify all visitors to a site, then it is possible to determine more information about what they are doing than if they were anonymous. The problem with this from a moral perspective is that everyone is being tagged, irrespective of whether their intent was going to be malicious or not. It is essentially compromising the privacy of the vast majority for the sake of a much smaller likelihood of bad behaviour.
  •  
    I checked and sure enough: five Facebook cookies even though I have no Facebook account. They're gone now, and I've created an exception blocking Facebook from planting more cookies on my systems. 
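
The "datr" mechanism described in this item amounts to handing every visitor a long-lived, browser-identifying cookie, whether or not they have a Facebook account. Below is a minimal, generic sketch of that pattern (not Facebook's code); the cookie name and the roughly two-year lifetime follow the article, while everything else is an assumption for illustration.

```python
import secrets
from datetime import datetime, timedelta, timezone

COOKIE_NAME = "datr"            # name taken from the article
LIFETIME = timedelta(days=730)  # roughly two years

def identifying_cookie(request_cookies: dict[str, str]) -> str | None:
    """Return a Set-Cookie header value if the visitor has no ID yet.

    Every browser that lacks the cookie gets a fresh random identifier,
    regardless of whether the visitor has an account or is logged in.
    """
    if COOKIE_NAME in request_cookies:
        return None  # browser already tagged; nothing to set

    browser_id = secrets.token_urlsafe(16)
    expires = (datetime.now(timezone.utc) + LIFETIME).strftime(
        "%a, %d %b %Y %H:%M:%S GMT"
    )
    return f"{COOKIE_NAME}={browser_id}; Expires={expires}; Path=/; Secure; HttpOnly"

if __name__ == "__main__":
    print(identifying_cookie({}))                  # first visit: gets tagged
    print(identifying_cookie({"datr": "abc123"}))  # returning browser: None
```
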
Paul Merrell

Everything You Need to Know About AOL's Zombie Apocalypse | nsnbc international - 0 views

  • America Online (AOL) will be resurrecting Verizon’s zombie cookies because they are fabulous data-trackers that cannot be “killed”. AOL wants to boost their ad revenue regardless of the infringement on customer privacy they pose and the enabling of hacker attacks they can facilitate.
  •  
    "The zombie cookies will allow AOL to "acquire demographic data on users" while simultaneously using their own advertising network to track user browsing history, use pf apps on smartphones and their geo-location coordinates. Earlier this year, ProPublica released a report regarding the advertising company called Turn and their zombie cookies that are used by large tech firms to "come back to life" even after users have deleted them. In the ProPublica report, it was revealed that Turn is "taking advantage of a hidden undeletable number that Verizon uses to monitor customers' habits on their smartphones and tablets" by respawning those "tracking cookies that users have deleted." Called unique identifier headers (UIDHs), or perma-cookies, this sneaky monitoring of customers is used "to help marketers create more targeted ads based on their customers' unique browsing habits." In 2012, UIDHs were used by Verizon to provide a way for advertisers with "demographic and third-party interest-based segments" to help them deliver "relevant ads" based on mobile devices' unique identifiers. Shockingly, more than 100 million Verizon customers were affected by this snooping by the corporation, tracking individual customer usage and reporting the findings to the federal government and advertising corporations."
Paul Merrell

Thousands Join Legal Fight Against UK Surveillance - And You Can, Too - The Intercept - 1 views

  • Thousands of people are signing up to join an unprecedented legal campaign against the United Kingdom’s leading electronic surveillance agency. On Monday, London-based human rights group Privacy International launched an initiative enabling anyone across the world to challenge covert spying operations involving Government Communications Headquarters, or GCHQ, the National Security Agency’s British counterpart. The campaign was made possible following a historic court ruling earlier this month that deemed intelligence sharing between GCHQ and the NSA to have been unlawful because of the extreme secrecy shrouding it.
  • Consequently, members of the public now have a rare opportunity to take part in a lawsuit against the spying in the Investigatory Powers Tribunal, a special British court that handles complaints about surveillance operations conducted by law enforcement and intelligence agencies. Privacy International is allowing anyone who wants to participate to submit their name, email address and phone number through a page on its website. The group plans to use the details to lodge a case with GCHQ and the court that will seek to discover whether each participant’s emails or phone calls have been covertly obtained by the agency in violation of the privacy and freedom of expression provisions of the European Convention on Human Rights. If it is established that any of the communications have been unlawfully collected, the court could force GCHQ to delete them from its vast repositories of intercepted data.
  • By Tuesday evening, more than 10,000 people had already signed up to the campaign, a spokesman for Privacy International told The Intercept. In a statement announcing the campaign on Monday, Eric King, deputy director of Privacy International, said: “The public have a right to know if they were illegally spied on, and GCHQ must come clean on whose records they hold that they should never have had in the first place. “We have known for some time that the NSA and GCHQ have been engaged in mass surveillance, but never before could anyone explicitly find out if their phone calls, emails, or location histories were unlawfully shared between the U.S. and U.K. “There are few chances that people have to directly challenge the seemingly unrestrained surveillance state, but individuals now have a historic opportunity to finally hold GCHQ accountable for their unlawful actions.”
Paul Merrell

The Newest Reforms on SIGINT Collection Still Leave Loopholes | Just Security - 0 views

  • Director of National Intelligence James Clapper this morning released a report detailing new rules aimed at reforming the way signals intelligence is collected and stored by certain members of the United States Intelligence Community (IC). The long-awaited changes follow up on an order announced by President Obama one year ago that laid out the White House’s principles governing the collection of signals intelligence. That order, commonly known as PPD-28, purports to place limits on the use of data collected in bulk and to increase privacy protections related to the data collected, regardless of nationality. Accordingly, most of the changes presented as “new” by Clapper’s office  (ODNI) stem directly from the guidance provided in PPD-28, and so aren’t truly new. And of the biggest changes outlined in the report, there are still large exceptions that appear to allow the government to escape the restrictions with relative ease. Here’s a quick rundown.
  • National security letters (NSLs). The report also states that the FBI’s gag orders related to NSLs expire three years after the opening of a full-blown investigation or three years after an investigation’s close, whichever is earlier. However, these expiration dates can be easily overridden by an FBI Special Agent in Charge or a Deputy Assistant FBI Director who finds that the statutory standards for secrecy about the NSL continue to be satisfied (which at least one court has said isn’t a very high bar). This exception also doesn’t address concerns that NSL gag orders lack adequate due process protections, lack basic judicial oversight, and may violate the First Amendment.
  • Retention policy for non-U.S. persons. The new rules say that the IC must now delete information about “non-U.S. persons” that’s been gathered via signals intelligence after five years. However, there is a loophole that will let spies hold onto that information indefinitely whenever the Director of National Intelligence determines (after considering the views of the ODNI’s Civil Liberties Protection Officer) that retaining information is in the interest of national security. The new rules don’t say whether the exceptions will be directed at entire groups of people or individual surveillance targets.
    Section 215 metadata. Updates to the rules concerning the use of data collected under Section 215 of the Patriot Act include the requirement that the Foreign Intelligence Surveillance Court (rather than authorized NSA officials) must determine spies have “reasonable, articulable suspicion” prior to querying Section 215 data, outside of emergency circumstances. What qualifies as an emergency for these purposes? We don’t know. Additionally, the IC is now limited to two “hops” in querying the database. This means that spies can only play two degrees of Kevin Bacon, instead of the previously allowed three degrees, with the contacts of anyone targeted under Section 215. The report doesn’t explain what would prevent the NSA (or other agency using the 215 databases) from getting around this limit by redesignating a phone number found in the first or second hop as a new “target,” thereby allowing the agency to continue the contact chain.
  • The report also details the ODNI’s and IC’s plans for the future, including: (1) Working with Congress to reauthorize bulk collection under Section 215. (2) Updating agency guidelines under Executive Order 12333 “to protect the privacy and civil liberties of U.S. persons.” (3) Producing another annual report in January 2016 on the IC’s progress in implementing signals intelligence reforms. These plans raise more questions than they answer. Given the considerable doubts about Section 215’s effectiveness, why is the ODNI pushing for its reauthorization? And what will the ODNI consider appropriate privacy protections under Executive Order 12333?
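
The two-"hop" limit mentioned above is easiest to picture as a breadth-first walk over a graph of call records: start at the target's identifier and expand outward at most twice. The sketch below uses made-up numbers and a toy in-memory graph; it only illustrates the hop-counting idea, not how agencies actually query Section 215 data.

```python
from collections import deque

def contact_chain(call_graph: dict[str, set[str]], target: str, max_hops: int = 2) -> set[str]:
    """Return every identifier reachable from `target` within `max_hops` hops."""
    reached = {target}
    frontier = deque([(target, 0)])
    while frontier:
        number, hops = frontier.popleft()
        if hops == max_hops:
            continue  # stop expanding once the hop limit is reached
        for contact in call_graph.get(number, set()):
            if contact not in reached:
                reached.add(contact)
                frontier.append((contact, hops + 1))
    return reached - {target}

if __name__ == "__main__":
    # Toy call-record graph: who called whom.
    graph = {
        "target": {"alice"},
        "alice": {"bob"},
        "bob": {"carol"},  # carol is three hops out
    }
    print(contact_chain(graph, "target"))  # {'alice', 'bob'} but not 'carol'
```
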
Paul Merrell

Verizon Will Now Let Users Kill Previously Indestructible Tracking Code - ProPublica - 0 views

  • Verizon says it will soon offer customers a way to opt out from having their smartphone and tablet browsing tracked via a hidden un-killable tracking identifier. The decision came after a ProPublica article revealed that an online advertiser, Turn, was exploiting the Verizon identifier to respawn tracking cookies that users had deleted. Two days after the article appeared, Turn said it would suspend the practice of creating so-called "zombie cookies" that couldn't be deleted. But Verizon couldn't assure users that other companies might not also exploit the number - which was transmitted automatically to any website or app a user visited from a Verizon-enabled device - to build dossiers about people's behavior on their mobile devices. Verizon subsequently updated its website to note Turn's decision and declared that it would "work with other partners to ensure that their use of [the undeletable tracking number] is consistent with the purposes we intended." Previously, its website had stated: "It is unlikely that sites and ad entities will attempt to build customer profiles."
  • However, policing the hundreds of companies in the online tracking business was likely to be a difficult task for Verizon. And so, on Monday, Verizon followed in the footsteps of AT&T, which had already declared in November that it would stop inserting the hidden undeletable number in its users' Web traffic. In a statement emailed to reporters on Friday, Verizon said, "We have begun working to expand the opt-out to include the identifier referred to as the UIDH, and expect that to be available soon." Previously, users who opted out from Verizon's program were told that information about their demographics and Web browsing behavior would no longer be shared with advertisers, but that the tracking number would still be attached to their traffic. For more coverage, read ProPublica's previous reporting on Verizon's indestructible tracking and how one company used the tool to create zombie cookies.
  •  
    Good for Pro Publica!
Gonzalo San Gil, PhD.

Sub Pop artist creates music-streaming site to mock Pandora, Spotify | Ars Technica [# ! check -and delete- lead...] - 0 views

  •  
    "On Tuesday, Josh Tillman, the lead singer and songwriter of the band Father John Misty, announced a phony, satirical music-streaming service called Streamline Audio Protocol, or, better put, SAP. ... On the site, Tillman calls his music-delivery system "a new signal-to-audio process by which popular albums are 'sapped' of their performances, original vocal, atmosphere, and other distracting affectations so the consumer can decide quickly and efficiently whether they like a musical composition, based strictly on its formal attributes, enough to spend money on it. ..."
Paul Merrell

The Latest Rules on How Long NSA Can Keep Americans' Encrypted Data Look Too Familiar | Just Security - 0 views

  • Does the National Security Agency (NSA) have the authority to collect and keep all encrypted Internet traffic for as long as is necessary to decrypt that traffic? That was a question first raised in June 2013, after the minimization procedures governing telephone and Internet records collected under Section 702 of the Foreign Intelligence Surveillance Act were disclosed by Edward Snowden. The issue quickly receded into the background, however, as the world struggled to keep up with the deluge of surveillance disclosures. The Intelligence Authorization Act of 2015, which passed Congress this last December, should bring the question back to the fore. It established retention guidelines for communications collected under Executive Order 12333 and included an exception that allows NSA to keep ‘incidentally’ collected encrypted communications for an indefinite period of time. This creates a massive loophole in the guidelines. NSA’s retention of encrypted communications deserves further consideration today, now that these retention guidelines have been written into law. It has become increasingly clear over the last year that surveillance reform will be driven by technological change—specifically by the growing use of encryption technologies. Therefore, any legislation touching on encryption should receive close scrutiny.
  • Section 309 of the intel authorization bill describes “procedures for the retention of incidentally acquired communications.” It establishes retention guidelines for surveillance programs that are “reasonably anticipated to result in the acquisition of [telephone or electronic communications] to or from a United States person.” Communications to or from a United States person are ‘incidentally’ collected because the U.S. person is not the actual target of the collection. Section 309 states that these incidentally collected communications must be deleted after five years unless they meet a number of exceptions. One of these exceptions is that “the communication is enciphered or reasonably believed to have a secret meaning.” This exception appears to be directly lifted from NSA’s minimization procedures for data collected under Section 702 of FISA, which were declassified in 2013. 
  • While Section 309 specifically applies to collection taking place under E.O. 12333, not FISA, several of the exceptions described in Section 309 closely match exceptions in the FISA minimization procedures. That includes the exception for “enciphered” communications. Those minimization procedures almost certainly served as a model for these retention guidelines and will likely shape how this new language is interpreted by the Executive Branch. Section 309 also asks the heads of each relevant member of the intelligence community to develop procedures to ensure compliance with new retention requirements. I expect those procedures to look a lot like the FISA minimization guidelines.
  • This language is broad, circular, and technically incoherent, so it takes some effort to parse appropriately. When the minimization procedures were disclosed in 2013, this language was interpreted by outside commentators to mean that NSA may keep all encrypted data that has been incidentally collected under Section 702 for at least as long as is necessary to decrypt that data. Is this the correct interpretation? I think so. It is important to realize that the language above isn’t just broad. It seems purposefully broad. The part regarding relevance seems to mirror the rationale NSA has used to justify its bulk phone records collection program. Under that program, all phone records were relevant because some of those records could be valuable to terrorism investigations and (allegedly) it isn’t possible to collect only those valuable records. This is the “to find a needle a haystack, you first have to have the haystack” argument. The same argument could be applied to encrypted data and might be at play here.
  • This exception doesn’t just apply to encrypted data that might be relevant to a current foreign intelligence investigation. It also applies to cases in which the encrypted data is likely to become relevant to a future intelligence requirement. This is some remarkably generous language. It seems one could justify keeping any type of encrypted data under this exception. Upon close reading, it is difficult to avoid the conclusion that these procedures were written carefully to allow NSA to collect and keep a broad category of encrypted data under the rationale that this data might contain the communications of NSA targets and that it might be decrypted in the future. If NSA isn’t doing this today, then whoever wrote these minimization procedures wanted to at least ensure that NSA has the authority to do this tomorrow.
  • There are a few additional observations that are worth making regarding these nominally new retention guidelines and Section 702 collection. First, the concept of incidental collection as it has typically been used makes very little sense when applied to encrypted data. The way that NSA’s Section 702 upstream “about” collection is understood to work is that technology installed on the network does some sort of pattern match on Internet traffic; say that an NSA target uses example@gmail.com to communicate. NSA would then search content of emails for references to example@gmail.com. This could notionally result in a lot of incidental collection of U.S. persons’ communications whenever the email that references example@gmail.com is somehow mixed together with emails that have nothing to do with the target. This type of incidental collection isn’t possible when the data is encrypted because it won’t be possible to search and find example@gmail.com in the body of an email. Instead, example@gmail.com will have been turned into some alternative, indecipherable string of bits on the network. Incidental collection shouldn’t occur because the pattern match can’t occur in the first place. This demonstrates that, when communications are encrypted, it will be much harder for NSA to search Internet traffic for a unique ID associated with a specific target.
  • This lends further credence to the conclusion above: rather than doing targeted collection against specific individuals, NSA is collecting, or plans to collect, a broad class of data that is encrypted. For example, NSA might collect all PGP encrypted emails or all Tor traffic. In those cases, NSA could search Internet traffic for patterns associated with specific types of communications, rather than specific individuals’ communications. This would technically meet the definition of incidental collection because such activity would result in the collection of communications of U.S. persons who aren’t the actual targets of surveillance. Collection of all Tor traffic would entail a lot of this “incidental” collection because the communications of NSA targets would be mixed with the communications of a large number of non-target U.S. persons. However, this “incidental” collection is inconsistent with how the term is typically used, which is to refer to over-collection resulting from targeted surveillance programs. If NSA were collecting all Tor traffic, that activity wouldn’t actually be targeted, and so any resulting over-collection wouldn’t actually be incidental. Moreover, greater use of encryption by the general public would result in an ever-growing amount of this type of incidental collection.
  • This type of collection would also be inconsistent with representations of Section 702 upstream collection that have been made to the public and to Congress. Intelligence officials have repeatedly suggested that search terms used as part of this program have a high degree of specificity. They have also argued that the program is an example of targeted rather than bulk collection. ODNI General Counsel Robert Litt, in a March 2014 meeting before the Privacy and Civil Liberties Oversight Board, stated that “there is either a misconception or a mischaracterization commonly repeated that Section 702 is a form of bulk collection. It is not bulk collection. It is targeted collection based on selectors such as telephone numbers or email addresses where there’s reason to believe that the selector is relevant to a foreign intelligence purpose.” The collection of Internet traffic based on patterns associated with types of communications would be bulk collection; more akin to NSA’s collection of phone records en masse than it is to targeted collection focused on specific individuals. Moreover, this type of collection would certainly fall within the definition of bulk collection provided just last week by the National Academy of Sciences: “collection in which a significant portion of the retained data pertains to identifiers that are not targets at the time of collection.”
  • The Section 702 minimization procedures, which will serve as a template for any new retention guidelines established for E.O. 12333 collection, create a large loophole for encrypted communications. With everything from email to Internet browsing to real-time communications moving to encrypted formats, an ever-growing amount of Internet traffic will fall within this loophole.
  •  
    Tucked into a budget authorization act in December without press notice, Section 309 (the Act is linked from the article) appears to grant very broad authority for the NSA to intercept any form of telephone or other electronic information in bulk. There are far more exceptions to the five-year retention limitation than just the exception for encrypted information. When reading this, keep in mind that the U.S. intelligence community plays semantic games to obfuscate what it does. One of its word plays is that communications are not "collected" until an analyst looks at or listens to particular data, even though the data will be searched to find information countless times before it becomes "collected." That searching was the major basis for a decision by the U.S. District Court in Washington, D.C. that bulk collection of telephone communications was unconstitutional: under the Fourth Amendment, a "search" or "seizure" requiring a judicial warrant occurs no later than when the information is intercepted. That case is on appeal, has been briefed and argued, and a decision could come any time now. Similar cases are pending in two other courts of appeals. Also, an important definition from the new Intelligence Authorization Act: "(a) DEFINITIONS.-In this section: (1) COVERED COMMUNICATION.-The term ''covered communication'' means any nonpublic telephone or electronic communication acquired without the consent of a person who is a party to the communication, including communications in electronic storage."
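
A minimal sketch of the searchability point made in the first annotation above: it is not a depiction of any actual collection system, it assumes the third-party Python cryptography package purely for illustration, and it reuses the example selector from the annotation. The point is simply that a byte-level pattern match succeeds against plaintext but finds nothing in ciphertext.

    # Why selector matching fails on encrypted traffic (illustrative only).
    # Requires the third-party "cryptography" package: pip install cryptography
    from cryptography.fernet import Fernet

    SELECTOR = b"example@gmail.com"  # the example selector from the annotation

    plaintext_email = b"Please forward this to example@gmail.com by Friday."

    # A monitor scanning unencrypted traffic can match the selector directly.
    print(SELECTOR in plaintext_email)   # True

    # Once the message is encrypted end to end, the byte pattern is gone.
    key = Fernet.generate_key()          # key held only by the endpoints
    ciphertext = Fernet(key).encrypt(plaintext_email)
    print(SELECTOR in ciphertext)        # False: nothing left to pattern-match

    # Only a party holding the key can recover the original content.
    print(Fernet(key).decrypt(ciphertext) == plaintext_email)  # True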
Gonzalo San Gil, PhD.

Universal Music Can Delete Any SoundCloud Track Without Oversight | TorrentFreak - 0 views

  •  
    [# ! delete remotely... without oversight? #imagine ... any '#whateveholder' indiscriminately #removing #content from disks w@rldwide...] " Ernesto on July 3, 2014 C: 17 News In a bid to tackle alleged infringement, popular music sharing platform SoundCloud is offering unlimited removal powers to certain copyright holders. Responding to a complaint from a UK DJ the company admitted that Universal Music can delete any and all SoundCloud tracks without oversight."
Gonzalo San Gil, PhD.

Dangerous Ruling: EU Says Google Must Help People Disappear Stuff They Don't Like From The Internet | Techdirt - 0 views

  •  
    "from the right-to-be-forgotten dept For years now we've explained why Europe's concept of a "right to be forgotten" is a terrible, dangerous and impossible idea. The basic idea is that if you were involved in something that you're not happy about later, you can demand that the incident be stricken from the record... everywhere. It's a clear attack on free speech -- allowing people to censor others from saying truthful and accurate things about someone. "
Paul Merrell

This project aims to make '404 not found' pages a thing of the past - 0 views

  • The Internet is always changing. Sites are rising and falling, content is deleted, and bad URLs can lead to '404 Not Found' errors that are as helpful as a brick wall. A new project proposes to do away with 404 errors by implementing new HTML code that will help access prior versions of hyperlinked content. With any luck, that means that you’ll never have to run into a dead link again. The “404-No-More” project is backed by a formidable coalition including members from organizations like the Harvard Library Innovation Lab, Los Alamos National Laboratory, Old Dominion University, and the Berkman Center for Internet & Society. Part of the Knight News Challenge, which seeks to strengthen the Internet for free expression and innovation through a variety of initiatives, 404-No-More recently reached the semifinal stage. The project aims to cure so-called link rot, the process by which hyperlinks become useless over time because they point to addresses that are no longer available. If implemented, websites such as Wikipedia and other reference documents would be vastly improved. The new feature would also give Web authors a way to provide links that contain both archived copies of content and specific dates of reference, the sort of information that diligent readers have to hunt down on a website like Archive.org.
  • While it may sound trivial, link rot can actually have real ramifications. Nearly 50 percent of the hyperlinks in Supreme Court decisions no longer work, a 2013 study revealed. Losing footnotes and citations in landmark legal decisions can mean losing crucial information and context about the laws that govern us. The same study found that 70 percent of URLs within the Harvard Law Review and similar journals didn’t link to the originally cited information, a serious loss for the discussion of our laws. The project’s proponents have come up with more potential uses as well. Activists fighting censorship will have an easier time combating government takedowns, for instance. Journalists will be much more capable of researching dynamic Web pages. “If every hyperlink was annotated with a publication date, you could automatically view an archived version of the content as the author intended for you to see it,” the project’s authors explain. The ephemeral nature of the Web could no longer be used as a weapon. Roger Macdonald, a director at the Internet Archive, called the 404-No-More project “an important contribution to preservation of knowledge.”
  • The new feature would come in the form of introducing the mset attribute to the <a> element in HTML, which would allow users of the code to specify multiple dates and copies of content as an external resource. For instance, if both the date of reference and the location of a copy of the targeted content are known by an author, the new code might look something like the sketch shown after these notes. The 404-No-More project’s goals are numerous, but the ultimate goal is to have mset become a new HTML standard for hyperlinks. “An HTML standard that incorporates archives for hyperlinks will loop in these efforts and make the Web better for everyone,” project leaders wrote, “activists, journalists, and regular ol’ everyday web users.”
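
The article’s example snippet did not survive in this excerpt, so the fragment below is only a hypothetical reconstruction: the mset attribute name comes from the proposal described above, but the value format (a reference date paired with an archive URL) and the URLs themselves are assumptions made for illustration.

    <!-- Hypothetical illustration only: attribute syntax and URLs are assumed, -->
    <!-- not taken from the 404-No-More proposal itself. -->
    <a href="http://example.com/report"
       mset="2014-07-03 https://web.archive.org/web/20140703000000/http://example.com/report">
      the report as originally cited
    </a>

Whatever syntax the project finally settles on, the idea is the same: the author records both the date the link was made and the location of an archived copy, so a browser or crawler can fall back to the archive when the original address stops resolving.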
Paul Merrell

Dropbox: Condoleeza Rice appointment won't alter privacy pledge - CNET - 1 views

  •  
    The straw that broke this camel's back. On top of having an absolutely horrible security model, Dropbox elects Condi Rice to its board of directors. I just completed transfer of my files to another service (in the E.U. where U.S. court orders don't reach) and deleted my Dropbox account.  
Paul Merrell

Cloud Computing: The Nine Features of an Ideal PaaS Cloud - 0 views

  • What sort of cloud computer(s) should we be building or expecting from vendors? Are there issues of lock-in that should concern customers of either SaaS clouds or PaaS clouds? I’ve been thinking about this problem as the CEO of a PaaS cloud computing company for some time. Clouds should be open. They shouldn’t be proprietary. More broadly, I believe no vendor currently does everything that’s required to serve customers well. What’s required for such a cloud? I think an ideal PaaS cloud would have the following nine features:
  • 1. Virtualization Layer Network Stability
  • 2. API for Creation, Deletion, Cloning of Instances (see the sketch after this list)
  • 3. Application Layer Interoperability
  • 4. State Layer Interoperability
  • 5. Application Services (e.g. email infrastructure, payments infrastructure)
  • 6. Automatic Scale (deploy and forget about it)
  • 7. Hardware Load Balancing
  • 8. Storage as a Service
  • 9. “Root”, If Required
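
A purely hypothetical sketch of what feature 2 (an API for creating, deleting, and cloning instances) might look like to a developer. Every name here is invented for illustration and the control plane is faked in memory; it does not describe any particular vendor’s API.

    # Hypothetical instance-lifecycle API (feature 2); all names are assumptions.
    import uuid
    from dataclasses import dataclass, field, replace

    @dataclass
    class Instance:
        image: str
        size: str
        instance_id: str = field(default_factory=lambda: uuid.uuid4().hex)

    class InstanceAPI:
        """In-memory stand-in for a PaaS control plane."""

        def __init__(self) -> None:
            self._instances: dict[str, Instance] = {}

        def create(self, image: str, size: str = "small") -> Instance:
            inst = Instance(image=image, size=size)
            self._instances[inst.instance_id] = inst
            return inst

        def clone(self, instance_id: str) -> Instance:
            source = self._instances[instance_id]
            copy = replace(source, instance_id=uuid.uuid4().hex)
            self._instances[copy.instance_id] = copy
            return copy

        def delete(self, instance_id: str) -> None:
            del self._instances[instance_id]

    # Usage: create an instance, clone it for a staging copy, then delete the original.
    api = InstanceAPI()
    original = api.create(image="ubuntu-22.04")
    backup = api.clone(original.instance_id)
    api.delete(original.instance_id)

An interoperable surface along these lines, offered consistently across vendors, is what would keep the lock-in concerns raised above in check.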
Gonzalo San Gil, PhD.

Just Delete Me | A directory of direct links to delete your account from web services. - 3 views

  •  
    "A directory of direct links to delete your account from web services."
Paul Merrell

Blink! Google Is Forking WebKit - Slashdot - 0 views

  • "In a blog post titled Blink: A rendering engine for the Chromium project, Google has announced that Chromium (the open source backend for Chrome) will be switching to Blink, a new WebKit-based web rendering engine. Quoting: 'Chromium uses a different multi-process architecture than other WebKit-based browsers, and supporting multiple architectures over the years has led to increasing complexity for both the WebKit and Chromium projects. This has slowed down the collective pace of innovation... This was not an easy decision. We know that the introduction of a new rendering engine can have significant implications for the web. Nevertheless, we believe that having multiple rendering engines—similar to having multiple browsers—will spur innovation and over time improve the health of the entire open web ecosystem. ... In the short term, Blink will bring little change for web developers. The bulk of the initial work will focus on internal architectural improvements and a simplification of the codebase. For example, we anticipate that we’ll be able to remove 7 build systems and delete more than 7,000 files—comprising more than 4.5 million lines—right off the bat. Over the long term a healthier codebase leads to more stability and fewer bugs.'"