Future of the Web: group items tagged "built"

Paul Merrell

The Cover Pages: Alfresco and Joomla Provide Integration Based on CMIS - 0 views

  • Alfresco Software and Joomlatools today announced the first integration based on Content Management Interoperability Services (CMIS). The Alfresco:Joomla! integration module was built using the draft CMIS REST API to allow organizations running Joomla-based web sites to access Alfresco's robust open source content management repository.
  • The integration, built using the CMIS REST API, will enable millions of Joomla web sites to access the powerful back-end content repository services of Alfresco, ensuring security, compliance, and auditability. Users will be able to more effectively manage, preview and track increasing volumes of content and digital assets on collaborative Joomla web sites using Alfresco's content library. Similarly Alfresco users will be able to search, publish, share, download, and edit content directly on Joomla sites.
  • The proposed CMIS standard is currently being advanced by an OASIS technical committee and will enable anyone to develop content applications on open source Alfresco and deploy them on SharePoint, EMC, IBM, or OpenText. In September 2008, Alfresco released the industry's first draft implementation of the CMIS specification. The company has also recently made available the CMIS Developer Toolbox, which includes a working implementation and contains resources to assist developers in the CMIS community to start creating portable content applications, based on the draft specification.
  •  
    Hey, maybe web apps will be able to hold two-way conversations some day after all? :-) (A rough sketch of what a CMIS exchange could look like follows below.)
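
A rough, hypothetical sketch of a client exchange with a CMIS repository over the draft AtomPub/REST binding, in TypeScript. The service URL, credentials, and query-collection path are illustrative assumptions rather than Alfresco's documented endpoints; the cmis:query body and application/cmisquery+xml content type follow the draft CMIS 1.0 AtomPub binding.

```typescript
// Sketch: discover a CMIS repository via its AtomPub service document,
// then run a CMIS QL query. Endpoint path and credentials are assumptions.
const CMIS_SERVICE_URL = "https://example.org/alfresco/cmisatom"; // hypothetical
const AUTH = "Basic " + Buffer.from("admin:admin").toString("base64"); // hypothetical

async function getServiceDocument(): Promise<string> {
  const res = await fetch(CMIS_SERVICE_URL, { headers: { Authorization: AUTH } });
  if (!res.ok) throw new Error(`Service document request failed: ${res.status}`);
  return res.text(); // AtomPub XML listing repositories and their collections
}

async function query(statement: string): Promise<string> {
  // The AtomPub binding defines a query collection that accepts a cmis:query
  // document. The statement is not XML-escaped here; fine for this sketch.
  const body = `<?xml version="1.0" encoding="UTF-8"?>
<cmis:query xmlns:cmis="http://docs.oasis-open.org/ns/cmis/core/200908/">
  <cmis:statement>${statement}</cmis:statement>
  <cmis:maxItems>10</cmis:maxItems>
</cmis:query>`;
  const res = await fetch(`${CMIS_SERVICE_URL}/query`, { // path is an assumption
    method: "POST",
    headers: { Authorization: AUTH, "Content-Type": "application/cmisquery+xml" },
    body,
  });
  return res.text(); // Atom feed of matching documents
}

query("SELECT cmis:name FROM cmis:document WHERE cmis:name LIKE 'press%'")
  .then(console.log)
  .catch(console.error);
```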
Paul Merrell

Microsoft debuts early test version of Oxite open source blogging engine | Open Source ... - 0 views

  • WordPress has new competition. Microsoft’s Codeplex team has developed an open source blogging engine that can support simple blogs as well as large web sites such as its own MIX Online.
  • “Oxite was developed carefully and painstakingly to be a great blogging platform, or a starting point for your own web site project with CMS needs,” according to Microsoft.com.
  • Oxite offers support for multiple blogs per site. “Oxite includes the ability to create and edit an arbitrary set of pages on your site. Want an ‘about’ page? You got it. Need a special page about your dogs, with sub-pages for each of those special animals? Yep, no worries,” Microsoft continues. “The ability to add pages as a child of another page is all built in. The web-based editing and creation interface lets you put whatever HTML you want onto your pages, and the built-in authentication system means that only you will be able to edit them.”
  •  
    The ability to create child pages of a parent page is something I haven't seen before in a blogging app. There are plenty of CMSes that offer such features, but blogs have been an exception. (A minimal sketch of such a page tree follows below.)
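
Since page hierarchies are rare in blog engines, here is a minimal sketch of the parent/child page model the quote describes. It is purely illustrative; it is not Oxite's actual data model or API.

```typescript
// Minimal parent/child page tree, as described in the quote above.
interface Page {
  slug: string;
  html: string;     // arbitrary HTML body, as the quote describes
  children: Page[];
}

function addChild(parent: Page, slug: string, html: string): Page {
  const child: Page = { slug, html, children: [] };
  parent.children.push(child);
  return child;
}

// Resolve a request path like "about/dogs/rex" by walking the tree.
function resolve(root: Page, path: string): Page | undefined {
  let node: Page | undefined = root;
  for (const part of path.split("/").filter(Boolean)) {
    node = node?.children.find((p) => p.slug === part);
  }
  return node;
}

const site: Page = { slug: "", html: "<h1>Home</h1>", children: [] };
const about = addChild(site, "about", "<p>About this site</p>");
const dogs = addChild(about, "dogs", "<p>Our dogs</p>");
addChild(dogs, "rex", "<p>Rex's page</p>");
console.log(resolve(site, "about/dogs/rex")?.html); // "<p>Rex's page</p>"
```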
Gary Edwards

The Next Battle for the Desktop : Portable RiA Runtime Engines - 0 views

shared by Gary Edwards on 06 Nov 08
  • The choices for desktop runtimes will be more flexible and will largely be driven by the type of applications rather than the type of platform. It’s likely that desktop computers will eventually ship with two or three different runtimes and that consumers will be more or less ignorant of which one they are using. What will determine the success of one desktop runtime over others will be the execution and development environment. Desktop runtimes that provide the most processing power, speed of execution, and security will dominate. In this scenario, the end-user is no longer the customer; it's independent software developers and integrated software vendors that are of primary importance. It’s the developers who will choose the platform on which they create cross-platform applications – the consumer will be largely ignorant of the choices made. With the exception of download and install differences, the applications will look the same to end-users.
    • Gary Edwards
       
      "It's independent application developers and integrated software vendors that determine which RiA platforms will prevail. Will this group value "cross-platform" RiA? Or will they go for integrated cloud services designed to drive down the cost of development and implementation? Integration into existing business systems i think will trump cross-platform concerns. For sure Microsoft is betting the farm on this.
  •  
    The computer desktop - as was the case with newspapers before there was radio, and radio before there was television - has become the high ground from which empires are built. While dominance of the desktop has been maintained for the last decade or more by Microsoft, which at one point represented 95% of the desktops used by all consumers, the future is less certain. It will not be a single operating system that prevails; in the end, it will be desktop runtimes that become the most important platforms. A desktop runtime is a platform that provides a consistent runtime environment regardless of the underlying operating system. Desktop runtimes are already extending beyond their primary target platform, the desktop, to the Fourth Screen - smart phones.
Paul Merrell

Google bulges old time news archive | The Register - 0 views

  • Google is redoubling efforts to offer a digital archive of the world's newspapers. Two years ago, the search giant began indexing the existing digital archives of papers like The New York Times and The Washington Post, and today, with a post to The Official Google Blog, the company said it's now working with other publishers to bring a much broader range of old newsprint into the project.
  • In addition to the old ads, you'll find new ads. Digitized papers will be joined by familiar AdSense text, and Google will split the revenue with the papers' publishers.
  •  
    There's a change in Google's business model indicated by that last paragraph: sharing Google ad revenues with publishers. Publishers have been suing Google in Europe and the U.S. for indexing their web site news content. Is sharing Google AdSense revenue with publishers the compromise that will bring the world an explosion of information previously unavailable online in easily searchable form? Most newspapers' archives are not available online, and for far too many that are, a subscription is required to search even a single newspaper's archives (e.g., the New York Times). It sounds like Google may have its sights set on eroding the subscription business model that the news business -- along with advertising -- has been built around for centuries. This announcement might mark a paradigm shift.
Gary Edwards

Surfin' Safari WebKit: The SquirrelFish JavaScript VM - 0 views

  • WebKit’s core JavaScript engine just got a new interpreter, code-named SquirrelFish. SquirrelFish is fast—much faster than WebKit’s previous interpreter. Check out the numbers. On the SunSpider JavaScript benchmark, SquirrelFish is 1.6 times faster than WebKit’s previous interpreter.
  •  
    More good stuff from WebKit! (A toy register-machine sketch follows below.)
  •  
    SquirrelFish is a register-based, direct-threaded, high-level bytecode engine, with a sliding register window calling convention. It lazily generates bytecodes from a syntax tree, using a simple one-pass compiler with built-in copy propagation.
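
For a sense of what "register-based bytecode" means, here is a toy register machine in TypeScript. SquirrelFish itself is a direct-threaded C++ interpreter with a sliding register window, which TypeScript cannot express; a switch-based dispatch loop is only a rough analogue, and the opcodes and instruction encoding below are invented for illustration.

```typescript
// Toy register-based bytecode machine. Instructions are flat tuples:
// [opcode, dst, a, b]. Not SquirrelFish's real encoding.
enum Op { LoadConst, Add, Mul, Ret }

type Code = number[];

function run(code: Code, consts: number[]): number {
  const reg = new Float64Array(16); // the "register file"
  let pc = 0;
  for (;;) {
    const op = code[pc], dst = code[pc + 1], a = code[pc + 2], b = code[pc + 3];
    pc += 4;
    switch (op) {
      case Op.LoadConst: reg[dst] = consts[a]; break;
      case Op.Add:       reg[dst] = reg[a] + reg[b]; break;
      case Op.Mul:       reg[dst] = reg[a] * reg[b]; break;
      case Op.Ret:       return reg[dst];
    }
  }
}

// Compute (2 + 3) * 4 -- the kind of bytecode a one-pass compiler
// might emit while walking a syntax tree.
const code: Code = [
  Op.LoadConst, 0, 0, 0,  // r0 = consts[0] (2)
  Op.LoadConst, 1, 1, 0,  // r1 = consts[1] (3)
  Op.Add,       0, 0, 1,  // r0 = r0 + r1
  Op.LoadConst, 1, 2, 0,  // r1 = consts[2] (4)
  Op.Mul,       0, 0, 1,  // r0 = r0 * r1
  Op.Ret,       0, 0, 0,
];
console.log(run(code, [2, 3, 4])); // 20
```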
Gonzalo San Gil, PhD.

Enterprise Linux or Fedora? Product or project: choose for yourself - 2 views

  •  
    [Product or project: choose for yourself A few years ago there was just one Red Hat Linux. As acceptance grew and Linux reached further into enterprise computing, one Red Hat Linux product could no longer be all things to all users. That's why in 2002 Red Hat created Red Hat Enterprise Linux. Stable, supported, certified -- Red Hat Enterprise Linux has become the Linux standard. The Fedora Project was introduced in late 2003. Built for and with the help of the open source community, the Fedora Project is for developers and high-tech enthusiasts using Linux in non-critical computing environments. Which Linux is right for you? See for yourself.]
Gary Edwards

Mashups turn into an industry as offerings mature | Hinchcliffe Enterprise Web 2.0 | Z... - 0 views

  •  
    Dion has lots to say about the recent Web 2.0 Conference. In this article he covers nine significant announcements from companies specializing in Web-based mashups and the related tools for building ad hoc Web applications. This year's Web 2.0 was filled with Web developer oriented services, but my favorite was MindTouch. Perhaps because their focus was that of directly engaging end users in the customization of business processes. Yes, the creation of data objects is clearly in the realm of trained developers. And for sure many tools were announced at Web 2.0 to further the much needed wiring of data objects. But once wired and available, services like MindTouch, I think, will become the way end users interact and create new business productivity methods. Great coverage.

    "...... For awareness and understanding of the fast-growing world of mashups are significant challenges as IT practitioners, business strategists, and software vendors attempt to grapple with what's facing up to be the biggest challenge of all: The habits and expectations of the larger part of a generation of workers who don't yet realize mashups are poised to change many things about the software landscape on the Web and in the workplace. Generational changes can be difficult for businesses to embrace successfully, and while evidence that mashups are remaking the business world are still very much emerging, they certainly hold the promise..."

    ".... while the life of the average Web developer has been greatly improved by the availability of a wide variety of useful open APIs, the average user of the Web hasn't been a direct beneficiary except through the increase in Web apps that are built on the mashup model. And that's because the tools that empower users to weave together existing Web parts and open APIs into the exact solutions they need are just now becoming easy enough and robust enough to readily enable these scenarios. And that doesn't include the variety of
Paul Merrell

Microsoft begins paving path for IT and cloud integration | Cloud Computing - InfoWorld - 1 views

  •  
    Microsoft last week launched its first serious effort to build IT into its cloud plans by introducing technologies that help connect existing corporate networks and cloud services to make them look like a single infrastructure. The concept began to come together at Microsoft's Professional Developers Conference. The company is attempting to show that it wants to move beyond the first wave of the cloud trend, which is defined by the availability of raw computing power supplied by Microsoft and competitors such as Amazon and Google. Microsoft's goal is to supply tools, middleware, and services so users can run applications that span corporate and cloud networks, especially those built with Microsoft's Azure cloud operating system.
Gonzalo San Gil, PhD.

Do Personal Computers Come With NSA Surveillance Devices Built-In As Standard? | Techdirt - 0 views

  •  
    "from the tinfoil-hat dept As Techdirt reported last year, one of the most bizarre episodes in the unfolding story of the Snowden leaks was when two experts from the UK's GCHQ oversaw the destruction of the Guardian's computers that held material provided by Snowden"
Gonzalo San Gil, PhD.

Why Facebook Just Launched Its Own 'Dark Web' Site | WIRED [+ TOR IS THE NSA http://lwn... - 2 views

  •  
    "Facebook has never had much of a reputation for letting users hide their identities online. But now the world's least anonymous website has just joined the Web's most anonymous network." [# ! Just a #PR #Campaign… # ! … as, You'll learn soon… TOR IS THE NSA Posted Jul 9, 2008 21:13 UTC (Wed) by dulles (guest, #45450) Parent article: GNU/Linux free software tools to preserve your online privacy, anonymity and security (FSM) # ! Anyway, since long ago, You Must Know that there is no privacy in # ! a Network built by others -Governments and Big Companies # ! among 'em. # ! Don' come to The Web expecting privacy, as You won't look for # ! intimacy in a Stadium Full of Pe@ple… # ! … but meet the places You get in.]
Gonzalo San Gil, PhD.

Glasgow University built a cloud platform from Raspberry Pi's and Lego ~ Linux and Life - 1 views

  •  
    "The University of Glasgow has created a working model of a multi-million pound cloud computing platform using Lego bricks and Raspberry Pi mini-computers."
Paul Merrell

Utah lawmaker questions city water going to NSA - 0 views

  • SALT LAKE CITY – A Utah lawmaker concerned about government spying on its citizens is questioning whether city water service should be cut off to a massive National Security Agency data storage facility outside Salt Lake City. Republican Rep. Marc Roberts, of Santaquin, said there are serious questions about privacy and surveillance surrounding the center, and several Utah residents who spoke at a legislative committee hearing Wednesday agreed. During the last legislative session, lawmakers opted to hold off on Roberts' bill to shut off the facility's water and decided to study it during the interim. "This is not a bill just about a data center. This is a bill about civil rights," web developer Joe Levi said. "This is a bill that needs to be taken up and needs to be taken seriously." Pete Ashdown, founder of Salt Lake City-based Internet provider XMission, called the center a stain upon the state and its technology industry. "I do encourage you to stand up and do something about it," he said. Lawmakers said they aren't considering shutting down the $1.7 billion facility, but the committee chair acknowledged the concerns and said there might be another way to get the point across. "We may look at some type of a strong message to give our representatives to take back to Congress," said Republican Sen. David Hinkins, of Orangeville.
  • The NSA's largest data storage center in the U.S. was built in Utah over 37 other locations because of open land and cheap electricity. The center sits on a National Guard base about 25 miles south of Salt Lake City in the town of Bluffdale. NSA officials said the center is key to protecting national security networks and allowing U.S. authorities to watch for cyber threats. Beyond that, the agency has offered few details. The center attracted much discussion and concern after revelations last year that the NSA has been collecting millions of U.S. phone records and digital communications stored by major Internet providers.
  • Cybersecurity experts say the nondescript Utah facility is a giant storehouse for phone calls, emails and online records that have been secretly collected. Outside the computer storehouses are large coolers that keep the machines from overheating. The coolers use large amounts of water, which the nearby city of Bluffdale sells to the center at a discounted rate. City records released earlier this year showed monthly water use was much less than the 1 million gallons a day that the U.S. Army Corps of Engineers predicted the center would need, causing some to wonder if the center was fully operational. NSA officials have refused to say if the center is up and running after its scheduled opening in October 2013 was stalled by electrical problems. City utility records showed the NSA has been making monthly minimum payments of about $30,000 to Bluffdale. The city manager said that pays for more water than the center used. The state of Nevada shut off water to the site of the proposed Yucca Mountain nuclear waste dump 90 miles northwest of Las Vegas in 2002, after months of threats. The project didn't run dry because the Energy Department built a 1-million-gallon tank and a small well for the site. Department officials said the stored water, plus 400,000 gallons stored in other tanks at the Nevada Test Site, provided time for scientists to continue experiments and design work at the site.
  •  
    Hey, go for their electricity too! But what do we do with the Bluffdale facility after we abolish the NSA? Turn it over to Internet Archives, with a $1 billion endowment for maintenance? Free and permanent web sites for everyone?  
Paul Merrell

Detekt Is Free Software That Spots Computer Spyware - Businessweek - 0 views

  • For more than two years, researchers and rights activists have tracked the proliferation and abuse of computer spyware that can watch people in their homes and intercept their e-mails. Now they’ve built a tool that can help the targets protect themselves. The free, downloadable software, called Detekt, searches computers for the presence of malicious programs that have been built to evade detection. The spyware ranges from government-grade products used by intelligence and police agencies to hacker staples known as RATs—remote administration tools. Detekt, which was developed by security researcher Claudio Guarnieri, is being released in a partnership with advocacy groups Amnesty International, Digitale Gesellschaft, the Electronic Frontier Foundation, and Privacy International. Guarnieri says his tool finds hidden spy programs by seeking unique patterns on computers that indicate a specific malware is running. He warns users not to expect his program (which is available only for Windows machines) to find all spyware, and notes that the release of Detekt could spur malware developers to further cloak their code.
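
Detekt's actual detection logic isn't shown in the article, but the approach it describes (seeking unique patterns that indicate a specific malware) can be sketched as a signature scan over files. The signatures below are placeholders, not Detekt's real patterns, and real tools also inspect the memory of running processes, which this sketch does not.

```typescript
// Schematic signature scan: look for byte sequences associated with a
// specific malware family in files under a directory tree.
import { readFileSync, readdirSync } from "node:fs";
import { join } from "node:path";

const SIGNATURES: { name: string; bytes: Buffer }[] = [
  // Placeholder pattern only; real signatures come from malware analysis.
  { name: "ExampleRAT", bytes: Buffer.from("6465616462656566", "hex") },
];

function scanFile(path: string): string[] {
  const data = readFileSync(path);
  return SIGNATURES.filter((s) => data.includes(s.bytes)).map((s) => s.name);
}

function scanDir(dir: string): void {
  for (const entry of readdirSync(dir, { withFileTypes: true })) {
    const path = join(dir, entry.name);
    if (entry.isDirectory()) scanDir(path);
    else if (entry.isFile()) {
      const hits = scanFile(path);
      if (hits.length) console.log(`${path}: matches ${hits.join(", ")}`);
    }
  }
}

scanDir(process.env.SCAN_ROOT ?? "."); // directory to sweep
```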
Gary Edwards

Meteor: The NeXT Web - 0 views

  •  
    "Writing software is too hard and it takes too long. It's time for a new way to write software - especially application software, the user-facing software we use every day to talk to people and keep track of things. This new way should be radically simple. It should make it possible to build a prototype in a day or two, and a real production app in a few weeks. It should make everyday things easy, even when those everyday things involve hundreds of servers, millions of users, and integration with dozens of other systems. It should be built on collaboration, specialization, and division of labor, and it should be accessible to the maximum number of people. Today, there's a chance to create this new way - to build a new platform for cloud applications that will become as ubiquitous as previous platforms such as Unix, HTTP, and the relational database. It is not a small project. There are many big problems to tackle, such as: How do we transition the web from a "dumb terminal" model that is based on serving HTML, to a client/server model that is based on exchanging data? How do we design software to run in a radically distributed environment, where even everyday database apps are spread over multiple data centers and hundreds of intelligent client devices, and must integrate with other software at dozens of other organizations? How do we prepare for a world where most web APIs will be push-based (realtime), rather than polling-driven? In the face of escalating complexity, how can we simplify software engineering so that more people can do it? How will software developers collaborate and share components in this new world? Meteor is our audacious attempt to solve all of these big problems, at least for a certain large class of everyday applications. We think that success will come from hard work, respect for history and "classically beautiful" engineering patterns, and a philosophy of generally open and collaborative development. " .............. "It is not a
  •  
    "How do we transition the web from a "dumb terminal" model that is based on serving HTML, to a client/server model that is based on exchanging data?" From a litigation aspect, the best bet I know of is antitrust litigation against the W3C and the WHATWG Working Group for implementing a non-interoperable specification. See e.g., Commission v. Microsoft, No. T-167/08, European Community Court of First Instance (Grand Chamber Judgment of 17 September, 2007), para. 230, 374, 421, http://preview.tinyurl.com/chsdb4w (rejecting Microsoft's argument that "interoperability" has a 1-way rather than 2-way meaning; information technology specifications must be disclosed with sufficient specificity to place competitors on an "equal footing" in regard to interoperability; "the 12th recital to Directive 91/250 defines interoperability as 'the ability to exchange information and mutually to use the information which has been exchanged'"). Note that the Microsoft case was prosecuted on the E.U.'s "abuse of market power" law that corresponds to the U.S. Sherman Act § 2 (monopolies). But undoubtedly the E.U. courts would apply the same standard to "agreements among undertakings" in restraint of trade, counterpart to the Sherman Act's § 1 (conspiracies in restraint of trade), the branch that applies to development of voluntary standards by competitors. But better to innovate and obsolete HTML, I think. DG Competition and the DoJ won't prosecute such cases soon. For example, Obama ran for office promising to "reinvigorate antitrust enforcement" but his DoJ has yet to file its first antitrust case against a big company. Nb., virtually the same definition of interoperability announced by the Court of First Instance is provided by ISO/IEC JTC-1 Directives, annex I ("eye"), which is applicable to all international standards in the IT sector: "... interoperability is understood to be the ability of two or more IT systems to exchange information at one or more standardised interfaces
Gonzalo San Gil, PhD.

Midori in Launchpad - 0 views

  •  
    [ # Join #midori on irc.freenode.net for discussions about bugs and development. Project statistics: https://www.ohloh.net/p/midori # Midori is a fast and lightweight web browser that uses the WebKit rendering engine and the GTK+ interface. Midori is a fast little WebKit browser with support for HTML5. It can manage many open tabs and windows. The URL bar completes history, bookmarks, search engines and open tabs out of the box. Web developers can use the powerful web inspector that is a part of WebKit. Individual pages can easily be turned into web apps and new profiles can be created on demand. A number of extensions are included by default: * Adblock with support for ABP filter lists and custom rules is built-in. * You can download files with Aria2 or SteadyFlow. * User scripts and styles support a la Greasemonkey. * Managing cookies and scripts via NoJS and Cookie Security Manager. * Switching open tabs in a vertical panel or a popup window.]
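
To illustrate the built-in Adblock support mentioned above, here is a sketch of how one common ABP filter form, ||host^ (a domain-anchored rule), can be matched against request URLs. This simplifies the real syntax (the ^ separator matches more than end-of-host) and is not Midori's actual adblock code.

```typescript
// Match an Adblock Plus-style "||host^" network filter against a URL.
function matchesDomainAnchor(filter: string, url: string): boolean {
  if (!filter.startsWith("||") || !filter.endsWith("^")) return false;
  const host = filter.slice(2, -1);        // e.g. "ads.example.com"
  const requestHost = new URL(url).hostname;
  // "||" matches the named host itself or any subdomain of it.
  return requestHost === host || requestHost.endsWith("." + host);
}

const rule = "||ads.example.com^";
console.log(matchesDomainAnchor(rule, "https://ads.example.com/banner.js"));      // true
console.log(matchesDomainAnchor(rule, "https://tracker.ads.example.com/p.gif")); // true
console.log(matchesDomainAnchor(rule, "https://example.com/index.html"));        // false
```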
Gonzalo San Gil, PhD.

Getting Started with Docker | Linux.com - 0 views

  •  
    "Tuesday, 15 December 2015 07:46 Carla Schroder |Exclusive cowsay Figure1: Whalesay. Docker is the excellent new container application that is generating much buzz and many silly stock photos of shipping containers. Containers are not new; so, what's so great about Docker? Docker is built on Linux Containers (LXC). It runs on Linux, is easy to use, and is resource-efficient."
Paul Merrell

Exclusive: Tim Berners-Lee tells us his radical new plan to upend the - 1 views

  • “The intent is world domination,” Berners-Lee says with a wry smile. The British-born scientist is known for his dry sense of humor. But in this case, he is not joking.This week, Berners-Lee will launch Inrupt, a startup that he has been building, in stealth mode, for the past nine months. Backed by Glasswing Ventures, its mission is to turbocharge a broader movement afoot, among developers around the world, to decentralize the web and take back power from the forces that have profited from centralizing it. In other words, it’s game on for Facebook, Google, Amazon. For years now, Berners-Lee and other internet activists have been dreaming of a digital utopia where individuals control their own data and the internet remains free and open. But for Berners-Lee, the time for dreaming is over.
  • In a post published this weekend, Berners-Lee explains that he is taking a sabbatical from MIT to work full time on Inrupt. The company will be the first major commercial venture built off of Solid, a decentralized web platform he and others at MIT have spent years building.
  • If all goes as planned, Inrupt will be to Solid what Netscape once was for many first-time users of the web: an easy way in. And as with Netscape, Berners-Lee hopes Inrupt will be just the first of many companies to emerge from Solid.
  • ...4 more annotations...
  • On his screen, there is a simple-looking web page with tabs across the top: Tim’s to-do list, his calendar, chats, address book. He built this app–one of the first on Solid–for his personal use. It is simple, spare. In fact, it’s so plain that, at first glance, it’s hard to see its significance. But to Berners-Lee, this is where the revolution begins. The app, using Solid’s decentralized technology, allows Berners-Lee to access all of his data seamlessly–his calendar, his music library, videos, chat, research. It’s like a mashup of Google Drive, Microsoft Outlook, Slack, Spotify, and WhatsApp. The difference here is that, on Solid, all the information is under his control. Every bit of data he creates or adds on Solid exists within a Solid pod–which is an acronym for personal online data store. These pods are what give Solid users control over their applications and information on the web. Anyone using the platform will get a Solid identity and Solid pod. This is how people, Berners-Lee says, will take back the power of the web from corporations.
  • For example, one idea Berners-Lee is currently working on is a way to create a decentralized version of Alexa, Amazon’s increasingly ubiquitous digital assistant. He calls it Charlie. Unlike with Alexa, on Charlie people would own all their data. That means they could trust Charlie with, for example, health records, children’s school events, or financial records. That is the kind of machine Berners-Lee hopes will spring up all over Solid to flip the power dynamics of the web from corporation to individuals.
  • Berners-Lee believes Solid will resonate with the global community of developers, hackers, and internet activists who bristle over corporate and government control of the web. “Developers have always had a certain amount of revolutionary spirit,” he observes. Circumventing government spies or corporate overlords may be the initial lure of Solid, but the bigger draw will be something even more appealing to hackers: freedom. In the centralized web, data is kept in silos–controlled by the companies that build them, like Facebook and Google. In the decentralized web, there are no silos. Starting this week, developers around the world will be able to start building their own decentralized apps with tools through the Inrupt site. Berners-Lee will spend this fall crisscrossing the globe, giving tutorials and presentations to developers about Solid and Inrupt.
  • When asked about this, Berners-Lee says flatly: “We are not talking to Facebook and Google about whether or not to introduce a complete change where all their business models are completely upended overnight. We are not asking their permission.”Game on.
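
The pod model described in the annotations above is concrete in at least one respect: a Solid pod is an HTTP-addressable data store, so a public resource in a pod can be read with an ordinary GET. A minimal sketch follows, assuming a hypothetical pod URL; private resources would need an authenticated Solid session, omitted here.

```typescript
// Read a public resource from a (hypothetical) Solid pod over plain HTTP.
const POD_RESOURCE = "https://alice.example.org/public/todos.ttl"; // assumption

async function readPodResource(url: string): Promise<string> {
  const res = await fetch(url, {
    headers: { Accept: "text/turtle" }, // Solid resources are typically RDF
  });
  if (!res.ok) throw new Error(`Pod read failed: ${res.status}`);
  return res.text();
}

readPodResource(POD_RESOURCE)
  .then((turtle) => console.log(turtle)) // Turtle triples describing the data
  .catch(console.error);
```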
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 1 views

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant.
  • ...17 more annotations...
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying between three and 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated from January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explain in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
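
The content/metadata line drawn in the excerpt above (the full address is content, the host alone is metadata) maps directly onto the parts of a URL, which the standard WHATWG URL API separates:

```typescript
// Split a visited URL into the "metadata" part (host) and the
// "content" part (host plus path), per the distinction described above.
const visited = new URL("https://www.gchq.gov.uk/what_we_do");

const metadataPart = visited.hostname;                     // "www.gchq.gov.uk"
const contentPart = visited.hostname + visited.pathname;   // "www.gchq.gov.uk/what_we_do"

console.log({ metadataPart, contentPart });
```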
Paul Merrell

Google confirms that advanced backdoor came preinstalled on Android devices | Ars Technica - 0 views

  • Criminals in 2017 managed to get an advanced backdoor preinstalled on Android devices before they left the factories of manufacturers, Google researchers confirmed on Thursday. Triada first came to light in 2016 in articles published by Kaspersky here and here, the first of which said the malware was "one of the most advanced mobile Trojans" the security firm's analysts had ever encountered. Once installed, Triada's chief purpose was to install apps that could be used to send spam and display ads. It employed an impressive kit of tools, including rooting exploits that bypassed security protections built into Android and the means to modify the Android OS' all-powerful Zygote process. That meant the malware could directly tamper with every installed app. Triada also connected to no fewer than 17 command and control servers. In July 2017, security firm Dr. Web reported that its researchers had found Triada built into the firmware of several Android devices, including the Leagoo M5 Plus, Leagoo M8, Nomu S10, and Nomu S20. The attackers used the backdoor to surreptitiously download and install modules. Because the backdoor was embedded into one of the OS libraries and located in the system section, it couldn't be deleted using standard methods, the report said. On Thursday, Google confirmed the Dr. Web report, although it stopped short of naming the manufacturers. Thursday's report also said the supply chain attack was pulled off by one or more partners the manufacturers used in preparing the final firmware image used in the affected devices.
Paul Merrell

Can Dweb Save The Internet? 06/03/2019 - 0 views

  • On a mysterious farm just above the Pacific Ocean, the group who built the internet is inviting a small number of friends to a semi-secret gathering. They describe it as a camp “where diverse people can freely exchange ideas about the technologies, laws, markets, and agreements we need to move forward.” Forward indeed. It wasn’t that long ago that the internet was an open network of computers, blogs, sites, and posts. But then something happened -- and the open web was taken over by private, for-profit, closed networks. Facebook isn’t the web. YouTube isn’t the web. Google isn’t the web. They’re for-profit businesses that are looking to sell audiences to advertisers. Brewster Kahle is one of the early web innovators who built the Internet Archive as a public storehouse to protect the web’s history. Along with web luminaries such as Sir Tim Berners-Lee and Vint Cerf, he is working to protect and rebuild the open nature of the web. “We demonstrated that the web had failed instead of served humanity, as it was supposed to have done,” Berners-Lee told Vanity Fair. The web has “ended up producing -- [through] no deliberate action of the people who designed the platform -- a large-scale emergent phenomenon which is anti-human.”
  • So, they’re out to fix it, working on what they call the Dweb. The “d” in Dweb stands for distributed. In distributed systems, no one entity has control over the participation of any other entity. Berners-Lee is building a platform called Solid, designed to give people control over their own data. Other global projects also have the goal of taking back the public web. Mastodon is decentralized Twitter. Peertube is a decentralized alternative to YouTube. This July 18-21, web activists plan to convene at the Decentralized Web Summit in San Francisco. Back in 2016, Kahle convened an early group of builders, archivists, policymakers, and journalists. He issued a challenge to use decentralized technologies to “Lock the Web Open.” It’s hard to imagine he knew then how quickly the web would become a closed network. Last year's Dweb gathering convened more than 900 developers, activists, artists, researchers, lawyers, and students. Kahle opened the gathering by reminding attendees that the web used to be a place where everyone could play. “Today, I no longer feel like a player, I feel like I’m being played. Let’s build a decentralized web, let’s build a system we can depend on, a system that doesn’t feel creepy,” he said, according to IEEE Spectrum. With the rising tide of concerns about how social networks have hacked our democracy, Kahle and his Dweb community will gather with increasing urgency around their mission. The internet began with an idealist mission to connect people and information for good. Today's web has yet to achieve that goal, but just maybe Dweb will build an internet more robust and open than the current infrastructure allows. That’s a mission worth fighting for.