
Open Web: Group items tagged web-2.0


Paul Merrell

White House preparing Data.gov 2.0 -- Government Computer News

  • White House officials plan to release Version 2.0 of the new government data portal, Data.gov, in the next couple of months, federal chief information officer Vivek Kundra said today. The federal Web site, which makes government data available for public reuse, will likely feature new tagging capabilities and an expanded array of information tools, Kundra said. Data.gov, which debuted May 21, has 87,000 data feeds from various government agencies. That number is expected to top 100,000 by next week, Kundra said.
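A hedged aside: as of this writing, the Data.gov catalog is searchable through a CKAN JSON API at catalog.data.gov. That API postdates this 2009 article, so the endpoint below is an assumption about the current site, not what Kundra described. A minimal TypeScript sketch of querying the catalog:

```ts
// Hedged sketch: search today's CKAN-based Data.gov catalog.
// The endpoint is an assumption about the current site, not the 2009 portal.
const CATALOG = "https://catalog.data.gov/api/3/action/package_search";

async function searchDatasets(query: string): Promise<void> {
  const res = await fetch(`${CATALOG}?q=${encodeURIComponent(query)}&rows=5`);
  const body = await res.json(); // CKAN wraps results in { success, result }
  for (const pkg of body.result.results) {
    console.log(pkg.title); // dataset titles matching the query
  }
}

searchDatasets("energy");
```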
Paul Merrell

M of A - Assad Says The "Boy In The Ambulance" Is Fake - This Proves It

  • Re: Major net hack - it's not necessarily off topic. .gov is herding web sites into its own little DNS animal farms so it can properly protect the public from that dangerous 'information' stuff in time of emergency. CloudFlare is the biggest abattoir... er, animal farm. CloudFlare is kind of like a protection racket. If you pay their outrageous fees, you will be 'protected' from DDoS attacks. Since CloudFlare is the preferred covert .gov tool of censorship and content control (when things go south), they are trying to drive as many sites as possible into their digital panopticons. Who the hell is CloudFlare? ISUCKER: BIG BROTHER INTERNET CULTURE On top of that, CloudFlare’s CEO Matthew Prince made a weird, glib admission that he decided to start the company only after the Department of Homeland Security gave him a call in 2007 and suggested he take the technology behind Project Honey Pot one step further… And that makes CloudFlare a whole different story: people who sign up for the service are allowing CloudFlare to monitor, observe and scrutinize all of their site's traffic, which makes it much easier for intel or law enforcement agencies to collect info on websites without having to hack or request the logs from each hosting company separately. But there's more. Because CloudFlare doesn't just passively monitor internet traffic but works like a dynamic firewall to selectively block traffic from sources it deems to be "hostile," website operators are giving it a whole lotta power over who gets to see their content. The whole point of CloudFlare is to restrict access to websites from specific locations/IP addresses on the fly, without notifying or bothering the website owner with the details. It all boils down to a question of trust: do you trust a shady company with known intel/law enforcement connections to make that decision?
  • And here is an added bonus for the paranoid: because CloudFlare partially caches websites and delivers them to web surfers via its own servers, the company also has the power to serve up redacted versions of the content to specific users. CloudFlare is perfect: it can implement censorship on the fly, without anyone getting wise to it! Right now CloudFlare says it monitors nearly 1/5 of all Internet visits. [<-- this] An astounding claim for a company most people haven't even heard of. And techie bloggers seem very excited about getting as much Internet traffic routed through them as possible! See? Plausible deniability. A couple of degrees of separation. Yet when the Borg Queen wants to start WWIII next year, she can order the DHS Stasi to order outfits like CloudFlare to do the proper 'shaping' of internet traffic to filter out unwanted information. How far is any exposé of propaganda like Dusty Boy going to spread if nobody can get to sites like MoA? You'll be able to get to all kinds of tweets and NGO sites crying about Dusty Boy 2.0, but you won't see a tweet or a web site calling them out on their lies. Will you even know they interviewed Assad? Will you know the activist 'photographer' is a paid NGO shill or that he's pals with al Zenki? Nope, not if .gov can help it. (A sketch of the gatekeeping model described here follows below.)
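Whatever one makes of the politics, the mechanism the comment describes is a reverse proxy that sits between visitors and the origin server and can block or alter traffic per client. This is not CloudFlare's code, just a minimal TypeScript (Node) sketch of that gatekeeping model, with a hypothetical origin host and an example blocklist:

```ts
// Minimal sketch of a filtering reverse proxy (not CloudFlare's actual code):
// requests are screened by client IP before being forwarded to the origin.
import http from "node:http";

const ORIGIN_HOST = "origin.example.internal"; // hypothetical origin server
const BLOCKED = new Set(["203.0.113.7"]);      // example blocklist (TEST-NET-3 address)

const proxy = http.createServer((req, res) => {
  const clientIp = req.socket.remoteAddress ?? "";
  if (BLOCKED.has(clientIp)) {
    res.writeHead(403); // refuse sources deemed "hostile"; the origin never sees them
    res.end("Forbidden");
    return;
  }
  // Forward everything else to the origin unchanged.
  const upstream = http.request(
    { host: ORIGIN_HOST, path: req.url, method: req.method, headers: req.headers },
    (upRes) => {
      res.writeHead(upRes.statusCode ?? 502, upRes.headers);
      upRes.pipe(res); // the proxy could just as easily rewrite this body
    }
  );
  req.pipe(upstream);
});

proxy.listen(8080);
```

The point of the sketch is that whoever operates the proxy decides, per request, who sees the origin content, which is exactly the power the comment is worried about.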
Paul Merrell

Cover Pages: Content Management Interoperability Services (CMIS)

  • On October 06, 2008, OASIS issued a public call for participation in a new technical committee chartered to define specifications for use of Web services and Web 2.0 interfaces to enable information sharing across content management repositories from different vendors. The OASIS Content Management Interoperability Services (CMIS) TC will build upon existing specifications to "define a domain model and bindings that are designed to be layered on top of existing Content Management systems and their existing programmatic interfaces. The TC will not prescribe how specific features should be implemented within those Enterprise Content Management (ECM) systems. Rather it will seek to define a generic/universal set of capabilities provided by an ECM system and a set of services for working with those capabilities." As of February 17, 2010, the CMIS technical work had received broad support through TC participation, industry analyst opinion, and declarations of interest from major companies. Some of these include Adobe, Adullact, AIIM, Alfresco, Amdocs, Anakeen, ASG Software Solutions, Booz Allen Hamilton, Capgemini, Citytech, Content Technologies, Day Software, dotCMS, Ektron, EMC, EntropySoft, ESoCE-NET, Exalead, FatWire, Fidelity, Flatirons, fme AG, Genus Technologies, Greenbytes GmbH, Harris, IBM, ISIS Papyrus, KnowledgeTree, Lexmark, Liferay, Magnolia, Mekon, Microsoft, Middle East Technical University, Nuxeo, Open Text, Oracle, Pearson, Quark, RSD, SAP, Saperion, Structured Software Systems (3SL), Sun Microsystems, Tanner AG, TIBCO Software, Vamosa, Vignette, and WeWebU Software. Early commentary from industry analysts and software engineers is positive about the value proposition in standardizing an enterprise content-centric management specification. The OASIS announcement of November 17, 2008 includes endorsements. Principal use cases motivating the CMIS technical work include collaborative content applications, portals leveraging content management repositories, mashups, and searching a content repository.
  • I should have posted before about CMIS, an emerging standard with a great deal of buy-in from vendors large and small. I've been watching the buzz grow via Robin Cover's Daily XML links service. It's now on my "need to watch" list.
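For the curious: CMIS defines a domain model plus concrete bindings, including a REST/AtomPub binding whose entry point is an AtomPub service document. A hedged TypeScript sketch of fetching that document; the server URL is hypothetical, though real repositories (Alfresco, Nuxeo, and others) publish equivalent endpoints:

```ts
// Hedged sketch: retrieve a CMIS repository's AtomPub service document,
// the entry point of the REST binding. The URL below is a placeholder.
const CMIS_ATOMPUB_URL = "https://ecm.example.com/cmis/atom"; // assumption

async function getRepositoryInfo(): Promise<string> {
  const res = await fetch(CMIS_ATOMPUB_URL, {
    headers: { Accept: "application/atomsvc+xml" }, // AtomPub service document media type
  });
  if (!res.ok) throw new Error(`CMIS request failed: ${res.status}`);
  return res.text(); // XML listing repositories, collections, and capabilities
}

getRepositoryInfo().then(console.log).catch(console.error);
```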
Paul Merrell

Dr. Dobb's | Other Voices: An HTML5 Primer | June 03, 2010

  • With Google and Apple strongly supporting HTML5 as the solution for rich applications for the Internet, it's become the buzzword of the month -- particularly after Google I/O. Given its hot currency, though, it's not surprising that the term is starting to become unhinged from reality. Already, we're starting to see job postings requiring "HTML5 experience," and people pointing to everything from simple JavaScript animations to CSS3 effects as examples of HTML5. Just as "AJAX" and "Web 2.0" became handy (and widely misused) shorthand for "next-generation" web development in the mid-2000s, HTML5 is now becoming the next overloaded term. And although there are many excellent resources out there describing details of HTML5, including the core specification itself, they are generally technical and many of them are now out of sync with the current state of the specs. So, I thought a primer on HTML5 might be in order.
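One way to keep the terminology straight is runtime feature detection. A short TypeScript sketch (browser-side) that checks for genuine HTML5 features alongside a CSS3 property that is often mislabeled as HTML5:

```ts
// Illustrative sketch: distinguish real HTML5 features (canvas, video,
// localStorage) from things commonly but wrongly filed under "HTML5".
function detectHtml5Features(): Record<string, boolean> {
  return {
    canvas: !!document.createElement("canvas").getContext,
    video: !!document.createElement("video").canPlayType,
    localStorage: typeof window.localStorage !== "undefined",
    // CSS transitions belong to CSS3, not the HTML5 spec proper:
    cssTransitions: "transition" in document.documentElement.style,
  };
}

console.table(detectHtml5Features());
```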
Paul Merrell

InfoQ: WS-I closes its doors. What does this mean for WS-*?

  • The Web Services Interoperability Organization (WS-I) has just announced that it has completed its mission and will be transitioning all further efforts to OASIS. As their recent press release states: The release of WS-I member-approved final materials for Basic Profile (BP) 1.2 and 2.0, and Reliable Secure Profile (RSP) 1.0 fulfills WS-I's last milestone as an organization. By publishing the final three profiles, WS-I marks the completion of its work. Stewardship over WS-I's assets, operations and mission will transition to OASIS (Organization for the Advancement of Structured Information Standards), a group of technology vendors and customers that drive development and adoption of open standards. Now, at any other time this kind of statement from a standards organization might pass without much comment. However, with the rise of REST, a range of non-WS approaches to SOA, and the fact that most of the WS-* standards have not been covered by WS-I, is this a reflection of the new position Web Services finds itself in, over a decade after it began? Perhaps this was inevitable, given that over the past few years there has been a lot more emphasis on interoperability within the various WS-* working groups? Or are the days of interactions across heterogeneous SOAP implementations in the past?
  • So the question remains: has interoperability pretty much been achieved for WS-* through WS-I and the improvements made with the way in which the specifications and standards are developed today, or has the real interoperability challenge moved elsewhere, still to be addressed?
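For context, the interoperability in question is at the wire level: the WS-I Basic Profile constrains how SOAP messages are formed so that any vendor's stack can consume them. A hedged TypeScript sketch of a hand-built SOAP 1.1 request of the kind BP 1.x was written to keep portable; the endpoint, namespace, and operation are hypothetical:

```ts
// Hedged sketch: a hand-rolled SOAP 1.1 call. Endpoint, namespace, and
// operation are placeholders, not a real service.
const ENDPOINT = "https://services.example.com/quote"; // assumption

const envelope = `<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="urn:example:quotes">
      <Symbol>OASI</Symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>`;

async function callSoap(): Promise<string> {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: {
      // BP 1.x pins SOAP 1.1 to text/xml and a quoted SOAPAction value:
      "Content-Type": "text/xml; charset=utf-8",
      SOAPAction: '"urn:example:quotes/GetQuote"',
    },
    body: envelope,
  });
  return res.text(); // the SOAP response envelope, as XML
}

callSoap().then(console.log).catch(console.error);
```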