
Future of the Web: Group items tagged cloud-services


Paul Merrell

Gmail leaves Google Apps admins nervous | InfoWorld | News | 2008-08-15 | By Juan Carlo... - 0 views

  • Because vendors host applications in their own datacenters, companies don't have to concern themselves with hardware provisioning and software maintenance. By living in the Internet "cloud," these hosted applications simplify sharing and collaboration among employees. However, the experience of users living through the recent Google Apps outages could serve as a deterrent to some IT and business managers who might not be ready to ditch conventional software packages that are installed on their servers.
  •  
    Google Apps goes down three times in less than a week.
Paul Merrell

Bloomberg.com: News - 0 views

  • Christine A. Varney, nominated by President Barack Obama to be the U.S.’s next antitrust chief, has described Google Inc. as a monopolist that will dominate online computing services the way Microsoft Corp. ruled software.
  • Varney, 53, lobbied the Clinton administration on behalf of Netscape Communications Corp. to urge antitrust enforcers to sue Microsoft.
  • Still, Google is “quickly gathering market power in what I would call an online computing environment in the clouds,” she said, using a software industry term for software that is based on the Internet rather than in individual personal computers. “When all our enterprises move to computing in the clouds and there is a single firm that is offering a comprehensive solution,” Varney said, “you are going to see the same repeat of Microsoft.”
  • As in the Microsoft case, “there will be companies that will begin to allege that Google is discriminating” against them by “not allowing their products to interoperate with Google’s products,” Varney said.
Gary Edwards

Meteor: The NeXT Web - 0 views

  •  
    "Writing software is too hard and it takes too long. It's time for a new way to write software - especially application software, the user-facing software we use every day to talk to people and keep track of things. This new way should be radically simple. It should make it possible to build a prototype in a day or two, and a real production app in a few weeks. It should make everyday things easy, even when those everyday things involve hundreds of servers, millions of users, and integration with dozens of other systems. It should be built on collaboration, specialization, and division of labor, and it should be accessible to the maximum number of people. Today, there's a chance to create this new way - to build a new platform for cloud applications that will become as ubiquitous as previous platforms such as Unix, HTTP, and the relational database. It is not a small project. There are many big problems to tackle, such as: How do we transition the web from a "dumb terminal" model that is based on serving HTML, to a client/server model that is based on exchanging data? How do we design software to run in a radically distributed environment, where even everyday database apps are spread over multiple data centers and hundreds of intelligent client devices, and must integrate with other software at dozens of other organizations? How do we prepare for a world where most web APIs will be push-based (realtime), rather than polling-driven? In the face of escalating complexity, how can we simplify software engineering so that more people can do it? How will software developers collaborate and share components in this new world? Meteor is our audacious attempt to solve all of these big problems, at least for a certain large class of everyday applications. We think that success will come from hard work, respect for history and "classically beautiful" engineering patterns, and a philosophy of generally open and collaborative development. " .............. "It is not a
  •  
    "How do we transition the web from a "dumb terminal" model that is based on serving HTML, to a client/server model that is based on exchanging data?" From a litigation aspect, the best bet I know of is antitrust litigation against the W3C and the WHATWG Working Group for implementing a non-interoperable specification. See e.g., Commission v. Microsoft, No. T-167/08, European Community Court of First Instance (Grand Chamber Judgment of 17 September, 2007), para. 230, 374, 421, http://preview.tinyurl.com/chsdb4w (rejecting Microsoft's argument that "interoperability" has a 1-way rather than 2-way meaning; information technology specifications must be disclosed with sufficient specificity to place competitors on an "equal footing" in regard to interoperability; "the 12th recital to Directive 91/250 defines interoperability as 'the ability to exchange information and mutually to use the information which has been exchanged'"). Note that the Microsoft case was prosecuted on the E.U.'s "abuse of market power" law that corresponds to the U.S. Sherman Act § 2 (monopolies). But undoubtedly the E.U. courts would apply the same standard to "agreements among undertakings" in restraint of trade, counterpart to the Sherman Act's § 1 (conspiracies in restraint of trade), the branch that applies to development of voluntary standards by competitors. But better to innovate and obsolete HTML, I think. DG Competition and the DoJ won't prosecute such cases soon. For example, Obama ran for office promising to "reinvigorate antitrust enforcement" but his DoJ has yet to file its first antitrust case against a big company. Nb., virtually the same definition of interoperability announced by the Court of First Instance is provided by ISO/IEC JTC-1 Directives, annex I ("eye"), which is applicable to all international standards in the IT sector: "... interoperability is understood to be the ability of two or more IT systems to exchange information at one or more standardised interfaces
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 1 views

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant.
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables.
    GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days to six months. It stores the content of communications for a shorter period of time, varying between three to 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated from January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explains in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
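The content/metadata line the article draws for web addresses maps directly onto the parts of a parsed URL. A minimal illustration in JavaScript, using the article's own example address:

```js
// Full address = content; first part (the host) = metadata, per the rules
// described in the article.
const visited = new URL('https://www.gchq.gov.uk/what_we_do');

console.log(visited.hostname); // "www.gchq.gov.uk" -- treated as metadata
console.log(visited.href);     // the full address  -- treated as content
```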
Paul Merrell

Sun, Microsoft tout fruits of cooperation - CNET News - 0 views

  • The software will be incorporated into future versions of the companies' products--likely in 2006, Ballmer said. For now, it's the most concrete example of cooperation between the companies whose fierce competition was blunted somewhat by a 2004 agreement to settle legal issues, share patents and make their software interoperable.
  • Next up will be cooperation in a number of other domains: storage software and hardware; unified systems management; Web services standards for messaging and event-tracking; and Windows terminal services that let PCs act like thin clients by leaving the heavy lifting of computing to central servers.
  •  
    From 2005, a year after Sun and Microsoft became partners in Microsoft's assault on the Web.
Paul Merrell

Dare Obasanjo aka Carnage4Life - Not Turtles, AtomPub All the Way Down - 0 views

  • I don't think the Atom publishing protocol can be considered the universal protocol for talking to remote databases given that cloud storage vendors like Amazon and database vendors like Oracle don't support it yet. That said, this is definitely a positive trend. Back in the RSS vs. Atom days I used to get frustrated that people were spending so much time reinventing the wheel with an RSS clone when the real gaping hole in the infrastructure was a standard editing protocol. It took a little longer than I expected (Sam Ruby started talking about it in 2003) but the effort has succeeded way beyond my wildest dreams. All I wanted was a standard editing protocol for blogs and content management systems and we've gotten so much more.
  • Microsoft is using AtomPub as the interface to a wide breadth of services and products, as George Moore points out in his post "A Unified Standards-Based Protocols and Tooling Platform for Storage from Microsoft."
  • And a few weeks after George's post even more was revealed in posts such as this one about FeedSync and Live Mesh, where we find out: "Congratulations to the Live Mesh team, who announced their Live Mesh Technology Preview release earlier this evening! Amit Mital gives a detailed overview in this post on http://dev.live.com. You can read all about it in the usual places...so why do I mention it here? FeedSync is one of the core parts of the Live Mesh platform. One of the key values of Live Mesh is that your data flows to all of your devices. And rather than being hidden away in a single service, any properly authenticated user has full bidirectional sync capability. As I discussed in the Introduction to FeedSync, this really makes 'your stuff yours'." Okay, FeedSync isn't really AtomPub, but it does use the Atom syndication format, so I count that as a win for Atom+APP as well. As time goes on, I hope we'll see even more products and services that support Atom and AtomPub from Microsoft. Standardization at the protocol layer means we can move innovation up the stack.
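For readers unfamiliar with the protocol under discussion: AtomPub (RFC 5023) creates resources by POSTing an Atom entry document to a collection URI, and a conforming server answers 201 Created with a Location header pointing at the new member. A sketch using JavaScript's fetch — the collection URL is hypothetical:

```js
// AtomPub create operation: POST an Atom entry to a (hypothetical) collection.
const entry = `<?xml version="1.0" encoding="utf-8"?>
<entry xmlns="http://www.w3.org/2005/Atom">
  <title>First Post</title>
  <author><name>Example Author</name></author>
  <content type="text">Hello, AtomPub.</content>
</entry>`;

const res = await fetch('https://blog.example.com/collection', {
  method: 'POST',
  headers: { 'Content-Type': 'application/atom+xml;type=entry' },
  body: entry,
});

// Expect 201 and the URI of the newly created member.
console.log(res.status, res.headers.get('location'));
```

(Run as an ES module for top-level await.)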
Matteo Spreafico

Yahoo! Search BOSS - YDN - 0 views

  • BOSS (Build your Own Search Service) is Yahoo!'s open search web services platform. The goal of BOSS is simple: to foster innovation in the search industry. Developers, start-ups, and large Internet companies can use BOSS to build and launch web-scale search products that utilize the entire Yahoo! Search index.
Gary Edwards

Nokia and Google: Too much emphasis on the mobile OS? | ge TalkBack on ZDNet - 0 views

  • Although it appears that the mobile hardware providers are competing through the development of incompatible platforms, I think there's reason to be hopeful. There seems to be movement towards a universal web application model able to join the legacy Web with an Open-Web future where devices, desktops, web-stacks, and clouds connect, access, exchange and collaborate with all kinds of information systems. Above the metal, at the web application layer, there is a war between competing runtime engines. The recent Web 2.0 Conference was a showcase for Sun JavaFX, Adobe RiA, and Microsoft .NET Silverlight. The exhibitors' floor featured a large and prominent Microsoft Silverlight-Mesh island surrounded by Flex RiA providers, with currents of IT and developers asking the same question: Can Adobe run with Microsoft?
  •  
    Interesting discussion about a universal web application layer able to work across devices, browsers and web service systems. I responded with a very lengthy post about WebKit.
Gary Edwards

After Bill Gates, five possible futures for Microsoft | InfoWorld | Analysis | 2008-06-... - 0 views

  •  
    For most people, Bill Gates and Microsoft are one and the same. Gates has led Microsoft to global dominance in the 33 years since its founding, combining a strong opportunism -- getting the code for DOS to sell to IBM for the first PC and aping Apple's visual interface for the first Windows are the two best examples of Gates' moving where the wind was soon to blow -- with a steady vision of desktop computers being as powerful as the mainframes that captured techies' imaginations in the 1970s.
    This is the intro and overview to a series of articles describing the future of Microsoft through five possible scenarios. Under the lead article, "The Future of Microsoft," the series includes:
    * The "Borvell" scenario
    * The "slow decline" scenario
    * The "streaming" scenario
    * The "Oort services" scenario
    * The "Gates was right" scenario
Gary Edwards

Adamac Attack!: Evolution Revolution - 0 views

  • HTTP as a universal calling convention is pretty interesting. We already have tons of web services in the cloud using HTTP to communicate with one another - why not extend this to include local code talking with other components. The iPhone already supports a form of this IPC using the URL handlers, basically turning your application into a web server. BugLabs exposes interfaces to its various embedded device modules through web services. It has even been suggested in the literature that every object could embed a web server. Why not use this mechanism for calling that object's methods?
  •  
    Given the increasing number of platforms supporting Javascript + HTTP + HTML5, it's not inconceivable that "write-once, run anywhere" might come closer to fruition with this combo than Java ever achieved. Here's how this architecture plays out in my mind. Javascript is the core programming language. Using a HTTP transport and JSON data format, components in different processes can perform RPCs to one another. HTML5 features like local storage and the application cache allow for an offline story (the latest build of Safari on iPhone supports this). And of course, HTML + CSS allows for a common UI platform.
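As a rough sketch of the architecture the note above describes — components exposing their methods as tiny web servers and calling one another with HTTP + JSON — here is what one such component might look like in Node.js. The port and method names are illustrative:

```js
const http = require('http');

// A "component" whose methods are callable over HTTP, in the spirit of the
// post's suggestion that every object could embed a web server.
const methods = {
  add: ({ a, b }) => a + b,
};

http.createServer((req, res) => {
  let body = '';
  req.on('data', (chunk) => (body += chunk));
  req.on('end', () => {
    const name = req.url.slice(1); // e.g. POST /add invokes methods.add
    if (req.method === 'POST' && methods[name]) {
      const result = methods[name](JSON.parse(body));
      res.writeHead(200, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify({ result }));
    } else {
      res.writeHead(404);
      res.end();
    }
  });
}).listen(9090);
```

Any peer — another process, a browser, a phone app — can then call it with fetch('http://localhost:9090/add', { method: 'POST', body: JSON.stringify({ a: 2, b: 3 }) }), which is the JSON-over-HTTP RPC the post envisions.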
Paul Merrell

Edward Snowden Explains How To Reclaim Your Privacy - 0 views

  • Micah Lee: What are some operational security practices you think everyone should adopt? Just useful stuff for average people. Edward Snowden: [Opsec] is important even if you’re not worried about the NSA. Because when you think about who the victims of surveillance are, on a day-to-day basis, you’re thinking about people who are in abusive spousal relationships, you’re thinking about people who are concerned about stalkers, you’re thinking about children who are concerned about their parents overhearing things. It’s to reclaim a level of privacy. The first step that anyone could take is to encrypt their phone calls and their text messages. You can do that through the smartphone app Signal, by Open Whisper Systems. It’s free, and you can just download it immediately. And anybody you’re talking to now, their communications, if it’s intercepted, can’t be read by adversaries. [Signal is available for iOS and Android, and, unlike a lot of security tools, is very easy to use.] You should encrypt your hard disk, so that if your computer is stolen the information isn’t obtainable to an adversary — pictures, where you live, where you work, where your kids are, where you go to school. [I’ve written a guide to encrypting your disk on Windows, Mac, and Linux.] Use a password manager. One of the main things that gets people’s private information exposed, not necessarily to the most powerful adversaries, but to the most common ones, are data dumps. Your credentials may be revealed because some service you stopped using in 2007 gets hacked, and your password that you were using for that one site also works for your Gmail account. A password manager allows you to create unique passwords for every site that are unbreakable, but you don’t have the burden of memorizing them. [The password manager KeePassX is free, open source, cross-platform, and never stores anything in the cloud.]
  • The other thing there is two-factor authentication. The value of this is if someone does steal your password, or it’s left or exposed somewhere … [two-factor authentication] allows the provider to send you a secondary means of authentication — a text message or something like that. [If you enable two-factor authentication, an attacker needs both your password as the first factor and a physical device, like your phone, as your second factor, to login to your account. Gmail, Facebook, Twitter, Dropbox, GitHub, Battle.net, and tons of other services all support two-factor authentication.]
  • We should armor ourselves using systems we can rely on every day. This doesn’t need to be an extraordinary lifestyle change. It doesn’t have to be something that is disruptive. It should be invisible, it should be atmospheric, it should be something that happens painlessly, effortlessly. This is why I like apps like Signal, because they’re low friction. It doesn’t require you to re-order your life. It doesn’t require you to change your method of communications. You can use it right now to talk to your friends.
  • Lee: What do you think about Tor? Do you think that everyone should be familiar with it, or do you think that it’s only a use-it-if-you-need-it thing? Snowden: I think Tor is the most important privacy-enhancing technology project being used today. I use Tor personally all the time. We know it works from at least one anecdotal case that’s fairly familiar to most people at this point. That’s not to say that Tor is bulletproof. What Tor does is it provides a measure of security and allows you to disassociate your physical location. … But the basic idea, the concept of Tor that is so valuable, is that it’s run by volunteers. Anyone can create a new node on the network, whether it’s an entry node, a middle router, or an exit point, on the basis of their willingness to accept some risk. The voluntary nature of this network means that it is survivable, it’s resistant, it’s flexible. [Tor Browser is a great way to selectively use Tor to look something up and not leave a trace that you did it. It can also help bypass censorship when you’re on a network where certain sites are blocked. If you want to get more involved, you can volunteer to run your own Tor node, as I do, and support the diversity of the Tor network.]
  • Lee: So that is all stuff that everybody should be doing. What about people who have exceptional threat models, like future intelligence-community whistleblowers, and other people who have nation-state adversaries? Maybe journalists, in some cases, or activists, or people like that? Snowden: So the first answer is that you can’t learn this from a single article. The needs of every individual in a high-risk environment are different. And the capabilities of the adversary are constantly improving. The tooling changes as well. What really matters is to be conscious of the principles of compromise. How can the adversary, in general, gain access to information that is sensitive to you? What kinds of things do you need to protect? Because of course you don’t need to hide everything from the adversary. You don’t need to live a paranoid life, off the grid, in hiding, in the woods in Montana. What we do need to protect are the facts of our activities, our beliefs, and our lives that could be used against us in manners that are contrary to our interests. So when we think about this for whistleblowers, for example, if you witnessed some kind of wrongdoing and you need to reveal this information, and you believe there are people that want to interfere with that, you need to think about how to compartmentalize that.
  • Tell no one who doesn’t need to know. [Lindsay Mills, Snowden’s girlfriend of several years, didn’t know that he had been collecting documents to leak to journalists until she heard about it on the news, like everyone else.] When we talk about whistleblowers and what to do, you want to think about tools for protecting your identity, protecting the existence of the relationship from any type of conventional communication system. You want to use something like SecureDrop, over the Tor network, so there is no connection between the computer that you are using at the time — preferably with a non-persistent operating system like Tails, so you’ve left no forensic trace on the machine you’re using, which hopefully is a disposable machine that you can get rid of afterward, that can’t be found in a raid, that can’t be analyzed or anything like that — so that the only outcome of your operational activities are the stories reported by the journalists. [SecureDrop is a whistleblower submission system. Here is a guide to using The Intercept’s SecureDrop server as safely as possible.]
  • And this is to be sure that whoever has been engaging in this wrongdoing cannot distract from the controversy by pointing to your physical identity. Instead they have to deal with the facts of the controversy rather than the actors that are involved in it. Lee: What about for people who are, like, in a repressive regime and are trying to … Snowden: Use Tor. Lee: Use Tor? Snowden: If you’re not using Tor you’re doing it wrong. Now, there is a counterpoint here where the use of privacy-enhancing technologies in certain areas can actually single you out for additional surveillance through the exercise of repressive measures. This is why it’s so critical for developers who are working on security-enhancing tools to not make their protocols stand out.
  •  
    Lots more in the interview that I didn't highlight. This is a must-read.
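Snowden's two-factor recommendation above rests on a small, well-defined protocol: most authenticator apps implement TOTP (RFC 6238), a one-time code computed from a shared secret and the current time. A minimal sketch in Node.js, for illustration only — real systems base32-decode the shared secret and should use an audited library:

```js
const crypto = require('crypto');

// Time-based one-time password (TOTP, RFC 6238), the usual second factor.
function totp(secret, step = 30, digits = 6) {
  // Counter = number of 30-second intervals since the Unix epoch.
  const counter = Math.floor(Date.now() / 1000 / step);
  const buf = Buffer.alloc(8);
  buf.writeBigUInt64BE(BigInt(counter));

  // HMAC the counter with the shared secret (HOTP core, RFC 4226).
  const hmac = crypto.createHmac('sha1', secret).update(buf).digest();

  // Dynamic truncation: take 4 bytes at an offset given by the low nibble.
  const offset = hmac[hmac.length - 1] & 0x0f;
  const code = (hmac.readUInt32BE(offset) & 0x7fffffff) % 10 ** digits;
  return String(code).padStart(digits, '0');
}

console.log(totp('shared-secret')); // six digits, changes every 30 seconds
```

Because the code depends on both the secret and the clock, a stolen password alone is useless without the device holding the secret — the property described in the interview.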
Paul Merrell

Do Not Track Implementation Guide Launched | Electronic Frontier Foundation - 1 views

  • Today we are releasing the implementation guide for EFF’s Do Not Track (DNT) policy. For years users have been able to set a Do Not Track signal in their browser, but there has been little guidance for websites as to how to honor that request. EFF’s DNT policy sets out a meaningful response for servers to follow, and this guide provides details about how to apply it in practice. At its core, DNT protects user privacy by excluding the use of unique identifiers for cross-site tracking, and by limiting the retention period of log data to ten days. This short retention period gives sites the time they need for debugging and security purposes, and to generate aggregate statistical data. From this baseline, the policy then allows exceptions when the user's interactions with the site—e.g., to post comments, make a purchase, or click on an ad—necessitate collecting more information. The site is then free to retain any data necessary to complete the transaction. We believe this approach balances users’ privacy expectations with the ability of websites to deliver the functionality users want. Websites often integrate third-party content and rely on third-party services (like content delivery networks or analytics), and this creates the potential for user data to be leaked despite the best intentions of the site operator. The guide identifies potential pitfalls and catalogs providers of compliant services. It is common, for example, to embed media from platforms like YouTube, SoundCloud, and Twitter, all of which track users whenever their widgets are loaded. Fortunately, Embedly, which offers control over the appearance of embeds, also supports DNT via its API, displaying a poster instead and loading the widget only if the user clicks on it knowingly.
  • Knowledge makes the difference between willing tracking and non-consensual tracking. Users should be able to choose whether they want to give up their privacy in exchange for using a site or a particular feature. This means sites need to be transparent about their practices. A great example of this is our biggest adopter, Medium, which does not track DNT users who browse the site and gives clear information about tracking to users when they choose to log in. This is their previous log-in panel; the DNT language is currently being added to their new interface.
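Mechanically, honoring the signal described above starts with reading one request header. A minimal sketch of a server that assigns a unique identifier only when the browser has not sent DNT: 1 — the cookie name and port are illustrative, and the full EFF policy imposes further requirements (such as the ten-day log retention):

```js
const http = require('http');
const crypto = require('crypto');

http.createServer((req, res) => {
  const headers = { 'Content-Type': 'text/html' };
  // Node lowercases header names; "DNT: 1" asks not to be tracked.
  if (req.headers.dnt !== '1') {
    // No DNT signal: a unique ID may be set, subject to the site's policy.
    headers['Set-Cookie'] = `uid=${crypto.randomUUID()}; Max-Age=864000`;
  }
  res.writeHead(200, headers);
  res.end('<p>Hello</p>');
}).listen(8000);
```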
Paul Merrell

Japan's Underground Datacenter - System News - 0 views

  • Deep underground in Japan, Sun, along with ten other IT firms, is building a datacenter. The datacenter is located at such a depth to take advantage of the cooler air, a means of cutting the roughly 40% of energy usage that goes to cooling. The datacenter will also be resilient to Japan's earthquake potential by being built on the solid bedrock floor of the cavern hollowed out for the project.
  • In the underground pictures it is clear that the Sun Modular Datacenter 20 is going to be a successful format for the datacenter because it is self contained and there is an abundant resource of ground water in the cave for a cooling system. The data center will be used by government agencies, it will serve as a service center for IT clients, and it will be used by businesses.
  • The Sun MD 20 is included in the design of this datacenter. In the earthquake analysis, the prototype was placed on a large shake table in California and put through a simulation of the 1994 Northridge earthquake. The results were conclusive. The location of Japan's underground datacenter is still undisclosed.
Gary Edwards

The Next Battle for the Desktop : Portable RiA Runtime Engines - 0 views

shared by Gary Edwards on 06 Nov 08
  • The choices for desktop runtimes will be more flexible and will largely be driven by the type of applications rather than the type of platform. It’s likely that desktop computers will eventually ship with two or three different runtimes and that consumers will be more or less ignorant of which one they are using. What will determine the success of one desktop runtime over others will be the execution and development environment. Desktop runtimes that provide the most processing power, speed of execution, and security will dominate. In this scenario the end-user is no longer the customer, it's independent software developers and Integrated Software Vendors that are of primary importance. It’s the developers who will choose the platform on which they create cross-platform applications – the consumer will be largely ignorant of the choices made.  With the exception of download and install differences, the applications will look the same to end-users.
    • Gary Edwards
       
      "It's independent application developers and integrated software vendors that determine which RiA platforms will prevail. Will this group value "cross-platform" RiA? Or will they go for integrated cloud services designed to drive down the cost of development and implementation? Integration into existing business systems i think will trump cross-platform concerns. For sure Microsoft is betting the farm on this.
  •  
    The computer desktop - as was the case with newspapers before there was radio, and radio before there was television - has become the high ground from which empires are built. While dominance of the desktop has been maintained for the last decade or more by Microsoft, which at one point represented 95% of the desktops used by all consumers, the future is less certain. It will not be a single operating system that prevails; in the end, it will be desktop runtimes that become the most important platforms. A desktop runtime is a platform that provides a consistent runtime environment regardless of the underlying operating system. Desktop runtimes are already extending beyond their primary target platform, the desktop, to the Fourth Screen - smart phones.
Paul Merrell

Technology News: Tech Law: Court Ruling Grants Email the Cloak of Privacy - 0 views

  • The Sixth Circuit Court of Appeals has handed down a ruling that delights privacy advocates and Fourth Amendment purists: In U.S. v. Warshak, it found that the government should have obtained a search warrant before seizing and searching defendant Stephen Warshak's emails, which were stored by email service providers.
  • It is an important ruling, because it is the first time a federal court of appeals has extended the Fourth Amendment to email with such careful consideration, and it is likely to be influential on both legal and practical levels.
  • The decision is particularly important because the Stored Communications Act does allow the government to secretly obtain emails without a warrant in many situations, according to the EFF, which filed an amicus brief in the case.
Paul Merrell

Open Government Data Initiative - 0 views

  • The Open Government Data Initiative (OGDI) is an initiative led by Microsoft's Public Sector Developer Evangelism team. OGDI uses the Azure Services Platform to make it easier to publish and use a wide variety of public data from government agencies. OGDI is also a free, open source ‘starter kit’ (coming soon) with code that can be used to publish data on the Internet in a Web-friendly format with easy-to-use, open APIs. OGDI-based web APIs can be accessed from a variety of client technologies such as Silverlight, Flash, JavaScript, PHP, Python, Ruby, mapping web sites, etc. Whether you are a business wishing to use government data, a government developer, or a ‘citizen developer’, these open APIs will enable you to build innovative applications, visualizations and mash-ups that empower people through access to government information. This site is built using the OGDI starter kit software assets and provides interactive access to some publicly-available data sets along with sample code and resources for writing applications using the OGDI APIs.
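Since the page promises easy-to-use, open APIs reachable from JavaScript, consuming an OGDI-style endpoint should reduce to a single HTTP request. A hedged sketch — the URL and response shape below are hypothetical, not OGDI's documented interface:

```js
// Hypothetical OGDI-style query: a public dataset served over HTTP as JSON.
const res = await fetch('https://ogdi.example.gov/v1/datasets/crime-stats?format=json');
const rows = await res.json();

// Show the first few records of the data set.
for (const row of rows.slice(0, 5)) {
  console.log(row);
}
```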
Paul Merrell

Tech Companies Reel as NSA's Spying Tarnishes Reputations - Bloomberg - 0 views

  • U.S. technology companies are in danger of losing more business to foreign competitors if the National Security Agency’s power to spy on customers isn’t curbed, researchers with the New America Foundation said in a report today. The report, by the foundation’s Open Technology Institute, called for prohibiting the NSA from collecting data in bulk, while letting companies report more details about what information they give the government. Senate legislation introduced today would fulfill some recommendations by the institute, a Washington-based advocacy group that has been critical of NSA programs.