Future of the Web: group items matching "w3c" in title, tags, annotations, or URL

Paul Merrell

Variability in Specifications

  • This document analyzes how design decisions of a specification's conformance model may affect its implementability and the interoperability of its implementations. To do so, it introduces the concept of variability - how much implementations conforming to a given specification may vary among themselves - and presents a set of well-known dimensions of variability. Its goal is to raise awareness of the potential cost that some benign-looking decisions may have on interoperability and to provide guidance on how to avoid these pitfalls by better understanding the mechanisms induced by variability.
Gary Edwards

Good News for Ajax and the Open Web - The Browser Wars Are Back

  • For much of this decade, Web browsing has been dominated by Microsoft's Internet Explorer (IE), which at its height achieved market share numbers approaching 95%, with the result that Microsoft owned a de facto standard for the Web and held effective veto power over the future of HTML. During much of this period, Microsoft suspended development of IE, with the result that virtually no new features appeared within the world's dominant browser from 2001 to 2006. But while IE was sleeping, one of the biggest phenomena of the computer age happened: Ajax. Clever Web developers discovered gold in them there mountains. Using Ajax techniques, Web developers could create desktop-like rich user interfaces right in the browser. Not only that, Ajax was evolutionary. Ajax offered an incremental path from the industry's existing HTML-based infrastructure and know-how, allowing Web developers to add rich Ajax elements to an existing HTML page.
  • A companion community effort helping to accelerate the adoption of open standards is the Web Standards Project (http://www.webstandards.org), which is producing a set of "acid tests" that verify browser support for Open Web technologies, such as HTML, CSS and JavaScript. Acid2 is focused mainly on CSS support, and is now supported by Opera, Safari/WebKit, and IE. Acid3 (http://www.webstandards.org/action/acid3) tests DOM scripting, CSS rendering,
    • Gary Edwards
       
      The amazing thing about Ajax and the Open Web is the way WHATWG, WebKit, and the Web Standards "ACID" work has accelerated Open Web Standards, pushing far beyond the work of the glacial W3C.
  • Runtime Advocacy Task Force
  •  
    Lengthy article from the OpenAjax Alliance summarizing HTML, Ajax and the future of the Open Web. Very well referenced, with lots of whitepapers and links. A minimal Ajax sketch follows at the end of this entry.
  •  
    A good summary of the Open Web's future.
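The excerpt above describes the Ajax technique in prose; its core is small enough to show directly. Below is a minimal sketch, assuming a hypothetical /api/headlines endpoint that returns a JSON array of strings (the URL, response shape, and element id are illustrative, not taken from the bookmarked article):

```javascript
// Minimal Ajax sketch: fetch JSON asynchronously and update part of an
// existing HTML page in place, without a full reload.
// The endpoint "/api/headlines" and its response shape are hypothetical.
function loadHeadlines(targetId) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/api/headlines", true); // true = asynchronous
  xhr.onreadystatechange = function () {
    if (xhr.readyState !== 4) return;      // 4 = request complete
    if (xhr.status !== 200) return;        // ignore errors in this sketch
    var items = JSON.parse(xhr.responseText); // assume ["headline", ...]
    var list = document.getElementById(targetId);
    list.innerHTML = "";                   // replace only this element
    for (var i = 0; i < items.length; i++) {
      var li = document.createElement("li");
      li.textContent = items[i];
      list.appendChild(li);
    }
  };
  xhr.send();
}

// Usage, given <ul id="headlines"></ul> somewhere in the page:
// loadHeadlines("headlines");
```

The point of the sketch is the incremental path the article emphasizes: the page stays ordinary HTML, and script enriches one element in place rather than replacing the document.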
Gary Edwards

WebKit and the Future of the Open Web

  •  
    I reformatted my response to marbux concerning HTML5 and the lack of interoperability among web applications. The original article these comments were posted to is titled "Siding with HTML over XHTML, My Decision to Switch...."
Gary Edwards

Siding with HTML over XHTML, My Decision to Switch - Monday By Noon

  • Publishing content on the Web is in no way limited to professional developers or designers; much of the reason the net is so active is because anyone can make a website. Sure, we (as knowledgeable professionals or hobbyists) all hope to make the Web a better place by doing our part in publishing documents with semantically rich, valid markup, but the reality is that those documents are rare. It’s important to keep in mind the true nature of the Internet: an open platform for information sharing.
  • XHTML2 has some very good ideas that I hope can become part of the web. However, it’s unrealistic to think that all web authors will switch to an XML-based syntax which demands that browsers stop processing the document on the first error. XML’s draconian policy was an attempt to clean up the web. This was done around 1996 when lots of invalid content entered the web. CSS took a different approach: instead of demanding that content isn’t processed, we defined rules for how to handle the undefined. It’s called “forward-compatible parsing” and means we can add new constructs without breaking the old. So, I don’t think XHTML is a realistic option for the masses. HTML 5 is it.
    • Gary Edwards
       
      Great quote from CSS expert Håkon Wium Lie. A short sketch contrasting draconian XML parsing with forgiving HTML parsing follows at the end of this entry.
  • @marbux: Of course I disagree with your interop assessment, but I wondered how it is that you’re missing the point. I think you confuse web applications with the legacy desktop client/server application model. And that confusion leads to the mistake of trying to transfer the desktop document model to one that could adequately service advancing web applications.
  •  
    A CMS expert argues for HTML over XHTML, explaining his reasons for switching. Excellent read! He nails the basics. For similar reasons, we moved from ODF to ePUB, then to CDF, and finally to the advanced WebKit document model, where wikiWORD will make its stand.
  •  
    See also my comment on the same web page that explains why HTML 5 is NOT it for document exchange between web editing applications.
  •  
    Response to marbux supporting the WebKit layout/document model. Marbux argues that HTML5 is not interoperable, and CSS2 near useless. HTML5 fails regarding the interop web applications need. I respond by arguing that the only way to look at web applications is to consider that the browser layout engine is the web application layout engine! Web applications are actually written to the browser layout/document model, OR, to take advantage of browser plug-in capabilities. The interoperability marbux seeks is tied directly to the browser layout engine. In this context, the web format is simply a reflection of that layout engine. If there's an interop problem, it comes from browser madness differentials. The good news is that there are all kinds of efforts to close the browser gap: including WHATWG - HTML5, CSS3, W3C DOM, JavaScript Libraries, Google GWT (Java to JavaScript), Yahoo YUI, and my favorite, WebKit. The bad news is that the clock is ticking. Microsoft has pulled the trigger and the great migration of MSOffice client/server systems to the MS WebStack-Mesh architecture has begun. Key to this transition are the WPF-.NET proprietary formats, protocols and interfaces such as XAML, Silverlight, LINQ, and Smart Tags. New business processes are being written, and old legacy desktop-bound processes are being transitioned to this emerging platform. The fight for the Open Web is on, with Microsoft threatening to transition their entire business desktop monopoly to a Web platform they own. ~ge~
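Lie's contrast between XML's draconian error handling and HTML's forward-compatible parsing is easy to demonstrate in any modern browser. A minimal sketch follows, using the standard DOMParser API on a deliberately malformed fragment (the fragment is invented for illustration):

```javascript
// Draconian vs. forgiving parsing of the same malformed markup.
// DOMParser is a standard browser API; the input string is invented.
var broken = "<p><b>unclosed bold</p>";

// XML mode: a single well-formedness error halts normal processing.
// DOMParser signals this with a <parsererror> document rather than throwing.
var asXml = new DOMParser().parseFromString(
  "<root>" + broken + "</root>", "application/xml");
console.log("XML result:",
  asXml.getElementsByTagName("parsererror").length > 0
    ? "fatal parse error, no usable document"
    : "parsed");

// HTML mode: the parser recovers, closing the <b> element implicitly,
// and still yields a usable DOM tree.
var asHtml = new DOMParser().parseFromString(broken, "text/html");
console.log("HTML result:", asHtml.body.innerHTML);
```

The XML path yields no usable document from one well-formedness error, while the HTML path recovers and keeps rendering, which is the behavior the quote credits for HTML's staying power.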
Paul Merrell

XForms for HTML: W3C Working Draft 19 December 2008

  • Abstract: XForms for HTML provides a set of attributes and script methods that can be used by the tags or elements of an HTML or XHTML web page to simplify the integration of data-intensive interactive processing capabilities from XForms. The semantics of the attributes are mapped to the rich XForms model-view-controller-connector architecture, thereby allowing web application authors a smoother, selective migration path to the higher-order behaviors available from the full element markup available in modules of XForms.
  • This document describes XForms for HTML, which provides a set of attributes and script methods encompassing a useful subset of XForms functionality and mapping that functionality to syntactic constructs that are familiar to authors of HTML and XHTML web pages. The intent of this module is to simplify the means by which web page authors gain access to the rich functionality available from the hybrid execution model of XForms, which combines declarative constructs with event-driven imperative processing. These attributes and script methods increase the initial consumability of XForms by allowing injection of rich semantics directly into the host language markup. In turn, the behaviors of the attributes and script methods are mapped to the XForms model-view-controller-connector architecture so that applications manifest behaviors consistent with having used XForms markup elements. This allows authors to gradually address greater application complexity as it arises in the software lifecycle by opportunistically, i.e. as the need arises, switching from the attributes and script methods of this specification to the corresponding XForms markup elements. This gradual adoption strategy is being further supported by the modularization of XForms into components that can be consumed incrementally by authors and implementers.
Paul Merrell

XHTML Modularization 1.1 Released as W3C Recommendation

  • The modularization of XHTML refers to the task of specifying well-defined sets of XHTML elements that can be combined and extended by document authors, document type architects, other XML standards specifications, and application and product designers to make it economically feasible for content developers to deliver content on a greater number and diversity of platforms. Over the last couple of years, many specialized markets have begun looking to HTML as a content language. There is a great movement toward using HTML across increasingly diverse computing platforms. Currently there is activity to move HTML onto mobile devices (hand held computers, portable phones, etc.), television devices (digital televisions, TV-based Web browsers, etc.), and appliances (fixed function devices). Each of these devices has different requirements and constraints.
  • XHTML Modularization is a decomposition of XHTML 1.0, and by reference HTML 4, into a collection of abstract modules that provide specific types of functionality. These abstract modules are implemented in this specification using the XML Schema and XML Document Type Definition languages. The rules for defining the abstract modules, and for implementing them using XML Schemas and XML DTDs, are also defined in this document. These modules may be combined with each other and with other modules to create XHTML subset and extension document types that qualify as members of the XHTML-family of document types.
  • Modularizing XHTML provides a means for product designers to specify which elements are supported by a device using standard building blocks and standard methods for specifying which building blocks are used. These modules serve as "points of conformance" for the content community. The content community can now target the installed base that supports a certain collection of modules, rather than worry about the installed base that supports this or that permutation of XHTML elements. The use of standards is critical for modularized XHTML to be successful on a large scale. It is not economically feasible for content developers to tailor content to each and every permutation of XHTML elements. By specifying a standard, either software processes can autonomously tailor content to a device, or the device can automatically load the software required to process a module. Modularization also allows for the extension of XHTML's layout and presentation capabilities, using the extensibility of XML, without breaking the XHTML standard. This development path provides a stable, useful, and implementable framework for content developers and publishers to manage the rapid pace of technological change on the Web.
Paul Merrell

The Self-Describing Web

  • Abstract: The Web is designed to support flexible exploration of information by human users and by automated agents. For such exploration to be productive, information published by many different sources and for a variety of purposes must be comprehensible to a wide range of Web client software, and to users of that software. HTTP and other Web technologies can be used to deploy resource representations that are in an important sense self-describing: information about the encodings used for each representation is provided explicitly within the representation. Starting with a URI, there is a standard algorithm that a user agent can apply to retrieve and interpret such representations. Furthermore, representations can be grounded in the Web, by ensuring that specifications required to interpret them are determined unambiguously based on the URI, and that explicit references connect the pertinent specifications to each other. Web-grounding reduces ambiguity as to what has been published in the Web, and by whom. When such self-describing, Web-grounded resources are linked together, the Web as a whole can support reliable, ad hoc discovery of information. This finding describes how document formats, markup conventions, attribute values, and other data formats can be designed to facilitate the deployment of self-describing, Web-grounded Web content.
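The finding's retrieval algorithm starts from a URI and lets the representation describe itself. Below is a minimal sketch of that first step, written against the standard fetch API and dispatching on the declared Content-Type header rather than sniffing content (the URI is a placeholder, not from the finding):

```javascript
// Follow-your-nose retrieval sketch: the representation tells the client
// how to interpret it via its Content-Type header.
// "https://example.org/resource" is a placeholder URI.
async function retrieveAndInterpret(uri) {
  const response = await fetch(uri);
  const mediaType = (response.headers.get("content-type") || "")
    .split(";")[0].trim().toLowerCase();

  // Dispatch on the declared encoding rather than guessing from content.
  switch (mediaType) {
    case "text/html":
    case "application/xhtml+xml":
      return { kind: "hypertext", body: await response.text() };
    case "application/json":
      return { kind: "data", body: await response.json() };
    default:
      // Unknown media type: the representation is not self-describing
      // to this client, so report it rather than sniffing.
      return { kind: "opaque", mediaType: mediaType };
  }
}

retrieveAndInterpret("https://example.org/resource")
  .then(function (r) { console.log(r.kind); });
```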
Gary Edwards

Coding In Paradise: Fixing the Web, Part I

  •  
    Must read: "This blog post is part of a new, semi-regular series called Fixing the Web. The goal is to highlight these issues, identify potential solutions, and have a dialogue. I don't claim to have the answers for the situation we are in. However, I do know this -- if there is any community that potentially has what it takes to solve these issues I believe it's the Ajax and JavaScript communities, which is why this is a perfect place to have these discussions. To start, I see four areas that are broken that must be fixed: ..... "
Gary Edwards

Brendan's Roadmap Updates: Open letter to Microsoft's Chris Wilson and their fight to stop ES4

  • The history of ECMAScript since its beginnings in November 1996 shows that when Microsoft was behind in the market (against Netscape in 1996-1997), it moved aggressively in the standards body to evolve standards starting with ES1 through ES3. Once Microsoft dominated the market, the last edition of the standard was left to rot -- ES3 was finished in 1999 -- and even easy-to-fix standards conformance bugs in IE JScript went unfixed for eight years (so three years to go from Edition 1 to 3, then over eight to approach Edition 4). Now that the proposed 4th edition looks like a competitive threat, the world suddenly hears in detail about all those bugs, spun as differences afflicting "JavaScript" that should inform a new standard.
  • In my opinion the notion that we need to add features so that ajax programming would be easier is plain wrong. ajax is a hack and also the notion of a webapp is a hack. the web was created in a document centric view. All w3c standards are also based on the same document notion. The heart of the web, the HTTP protocol is designed to support a web of documents and as such is stateless. the proper solution, IMO, is not to evolve ES for the benefit of ajax and webapps, but rather generalize the notion of a document browser that connects to a web of documents to a general purpose client engine that connects to a network of internet applications. thus the current web (document) browser just becomes one such internet application.
  •  
    The obvious conflict of interest between the standards-based web and proprietary platforms advanced by Microsoft, and the rationales for keeping the web's client-side programming language small while the proprietary platforms rapidly evolve support for large languages, does not help maintain the fiction that only clashing high-level philosophies are involved here. Readers may not know that Ecma has no provision for "minor releases" of its standards, so any ES3.1 that was approved by TG1 would inevitably be given a whole edition number, presumably becoming the 4th Edition of ECMAScript. This is obviously contentious given all the years that the majority of TG1, sometimes even apparently including Microsoft representatives, has worked on ES4, and the developer expectations set by this long-standing effort. A history of Microsoft's post-ES3 involvement in the ECMAScript standard group, leading up to the overt split in TG1 in March, is summarized here.
Gary Edwards

Official Google Webmaster Central Blog: Introducing Rich Snippets

  •  
    Google "Rich Snippets" is a new presentation of HTML snippets that applies Google's algorithms to highlight structured data embedded in web pages. Rich Snippets give end-users convenient summary information about their search results at a glance. Google is currently supporting a very limited subset of data about reviews and people. When searching for a product or service, users can easily see reviews and ratings, and when searching for a person, they'll get help distinguishing between people with the same name. It's a simple change to the display of search results, yet our experiments have shown that users find the new data valuable. For this to work though, both Web-masters and Web-workers have to annotate thier pages with structured data in a standard format. Google snippets supports microformats and RDFa. Existing Web data can be wrapped with some additional tags to accomplish this. Notice that Google avoids mention of RDF and the W3C's vision of a "Semantic Web" where Web objects are fully described in machine readable semantics. Over at the WHATWG group, where work on HTML5 continues, Google's Ian Hickson has been fighting RDFa and the Semantic Web in what looks to be an effort to protect the infamous Google algorithms. RDFa provides a means for Web-workers, knowledge-workers, line-of-business managers and document generating end-users to enrich their HTML+ with machine semantics. The idea being that the document experts creating Web content can best describe to search engine and content management machines the objects-of-information used. The google algorithms provide a proprietary semantics of this same content. The best solution to the tsunami of conten the Web has wrought would be to combine end-user semantic expertise with Google algorithms. Let's hope Google stays the RDFa course and comes around to recognize the full potential of organizing the world's information with the input of content providers. One thing the world desperatel