Group items tagged W3C

Paul Merrell

Save Firefox! | Electronic Frontier Foundation - 0 views

  • The World Wide Web Consortium (W3C), once the force for open standards that kept browsers from locking publishers to their proprietary capabilities, has changed its mission. Since 2013, the organization has provided a forum where today's dominant browser companies and the dominant entertainment companies can collaborate on a system to let our browsers control our behavior, rather than the other way around.

    This system, "Encrypted Media Extensions" (EME), uses standards-defined code to funnel video into a proprietary container called a "Content Decryption Module." For a new browser to support this new video streaming standard -- which major studios and cable operators are pushing for -- it would have to convince those entertainment companies or one of their partners to let it have a CDM, or this part of the "open" Web would not display in the new browser.

    This is the opposite of every W3C standard to date: once, all you needed to do to render content sent by a server was follow the standard, not get permission. If browsers had needed permission to render a page at the launch of Mozilla, the publishers would have frozen out this new, pop-up-blocking upstart. Kiss Firefox goodbye, in other words.

  • The W3C didn't have to do this. No copyright law says that making a video gives you the right to tell people who legally watch it how they must configure their equipment. But because of the design of EME, copyright holders will be able to use the law to shut down any new browser that tries to render the video without their permission.

    That's because EME is designed to trigger liability under section 1201 of the Digital Millennium Copyright Act (DMCA), which says that removing a digital lock that controls access to a copyrighted work without permission is an offense, even if the person removing the lock has the right to the content it restricts. In other words, once a video is sent with EME, a new company that unlocks it for its users can be sued, even if the users do nothing illegal with that video.

    We proposed that the W3C could protect new browsers by having its members promise not to use the DMCA to attack new entrants in the market, an idea supported by a diverse group of W3C members, but the W3C executive overruled us, saying the work would go forward with no safeguards for future competition.

    It's even worse than it appears at first glance. The DMCA isn't limited to the USA: the US Trade Representative has spread DMCA-like rules to virtually every country that does business with America. Worse still: the DMCA is also routinely used by companies to threaten and silence security researchers who reveal embarrassing defects in their products. The W3C also declined to require its members to protect security researchers who discover flaws in EME, leaving every Web user exposed to vulnerabilities whose disclosure can only safely take place if the affected company decides to permit it.

  • The W3C needs credibility with people who care about the open Web and innovation in order to be viable. They are sensitive to this kind of criticism. We empathize. There are lots of good people working there, people who genuinely, passionately want the Web to stay open to everyone, and to be safe for its users. But the organization made a terrible decision when it opted to provide a home for EME, and an even worse one when it overruled its own members and declined protection for security research and new competitors.

    It needs to hear from you now. Please share this post, and spread the word. Help the W3C be the organization it is meant to be.

Paul Merrell

The US is Losing Control of the Internet…Oh, Really? | Global Research - 0 views

  • All of the major internet organisations have pledged, at a summit in Uruguay, to free themselves of the influence of the US government.

    The directors of ICANN, the Internet Engineering Task Force, the Internet Architecture Board, the World Wide Web Consortium, the Internet Society and all five of the regional Internet address registries have vowed to break their associations with the US government.

    In a statement, the group called for “accelerating the globalization of ICANN and IANA functions, towards an environment in which all stakeholders, including all governments, participate on an equal footing”.

    That’s a distinct change from the current situation, where the US department of commerce has oversight of ICANN.

    In another part of the statement, the group “expressed strong concern over the undermining of the trust and confidence of Internet users globally due to recent revelations of pervasive monitoring and surveillance”.

    Meanwhile, it was announced that the next Internet Governance Summit would be held in Brazil, whose president has been extremely critical of the US over web surveillance.

    In a statement announcing the location of the summit, Brazilian president Dilma Rousseff said: “The United States and its allies must urgently end their spying activities once and for all.”

Paul Merrell

Tim Berners-Lee, W3C Approve Work On DRM For HTML 5.1 - Slashdot - 0 views

  • "Danny O'Brien from the EFF has a weblog post about how the Encrypted Media Extension (EME) proposal will continue to be part of HTML Work Group's bailiwick and may make it into a future HTML revision." From O'Brien's post: "A Web where you cannot cut and paste text; where your browser can't 'Save As...' an image; where the 'allowed' uses of saved files are monitored beyond the browser; where JavaScript is sealed away in opaque tombs; and maybe even where we can no longer effectively 'View Source' on some sites, is a very different Web from the one we have today. It's a Web where user agents—browsers—must navigate a nest of enforced duties every time they visit a page. It's a place where the next Tim Berners-Lee or Mozilla, if they were building a new browser from scratch, couldn't just look up the details of all the 'Web' technologies. They'd have to negotiate and sign compliance agreements with a raft of DRM providers just to be fully standards-compliant and interoperable."
    From the Dept. of YouGottaBeKiddingMe. 
Paul Merrell

Web video accessibility from EmbedPlus on 2011-08-11 ( from July to Se... - 0 views

    For those who care about Web accessibility, here is an opportunity to provide feedback on some accessibility tools for one of the most widely-used web services. The message deserves wide distribution. The contact email address is on the linked page. 

    The linked tool set should also be of interest to those doing mashups or embedding YouTube videos in web pages.

    Hi all,

    I'm the co-developer of a YouTube third-party tool called EmbedPlus. It enhances the standard YouTube player with many features that aren't inherently supported. We've been getting lots of feedback regarding the accessibility benefits of some of these features like movable zoom, slow motion, and even third-party annotations.

    As the tool continues to grow in popularity, the importance of its accessibility rises. I decided to do some research and found the WAI Interest group to be a major proponent of accessibility on the web. If anyone has time to take a look at EmbedPlus and share feedback that could help improve the tool, please do.

    Here's the link:

    Thank you in advance,

Paul Merrell

W3C Issues Report on Web and Television Convergence - 0 views

    • 28 March 2011 -- The Web and television convergence story was the focus of W3C's Second Web and TV Workshop, which took place in Berlin in February. Today, W3C publishes a report that summarizes the discussion among the 77 organizations that participated, including broadcasters, telecom companies, cable operators, OTT (over the top) companies, content providers, device vendors, software vendors, Web application providers, researchers, governments, and standardization organizations active in the TV space. Convergence priorities identified in the report include:

      • Adaptive streaming over HTTP
      • Home networking and second-screen scenarios
      • The role of metadata and relation to Semantic Web technology
      • Ensuring that convergent solutions are accessible
      • Profiling and testing
      • Possible extensions to HTML5 for Television
Gary Edwards

CSS Advanced Layout Module | W3C CSS3 Specification - 0 views

  • The properties in this specification work by associating a layout policy with an element.
    • Gary Edwards
      The CSS3 "Layout Policy" is one of the primary differentials between HTML5-CSS3-SVG and the XML alternatives ODF and OOXML. Neither ODF nor OOXML provides a complete (semantic) description of the underlying document layout model.
  • these policies give an element an invisible grid for aligning descendant elements
    CSS is a simple, declarative language for creating style sheets that specify the rendering of HTML and other structured documents. This specification is part of level 3 of CSS ("CSS3") and contains features to describe layouts at a high level, meant for tasks such as the positioning and alignment of "widgets" in a graphical user interface or the layout grid for a page or a window, in particular when the desired visual order is different from the order of the elements in the source document. Other CSS3 modules contain properties to specify fonts, colors, text alignment, list numbering, tables, etc.
    The features in this module are described together for easier reading, but are usually not implemented as a group. CSS3 modules often depend on other modules or contain features for several media types. Implementers should look at the various "profiles" of CSS, which list consistent sets of features for each type of media.
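The Advanced Layout draft itself was never widely implemented, but its central idea — an invisible, named grid that descendant elements slot into regardless of source order — survives in modern CSS Grid's template areas. A sketch of the same concept (the element names are illustrative only):

```css
/* A named, invisible grid; children are placed into slots by name,
   so visual order can differ from source order. */
body {
  display: grid;
  grid-template-areas:
    "head head"
    "nav  main";
}
header { grid-area: head; }
nav    { grid-area: nav; }
main   { grid-area: main; }
```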
Paul Merrell

Thousands of HTML5 tests planned by Web consortium - 0 views

  • W3C is warning against drawing any conclusions based on the early tests, saying thousands more HTML5 tests are planned. The goal of the tests is not to declare one browser a winner, but rather to help vendors and Web application developers ensure interoperability across all browsers, W3C says.
  • "We do expect to have tens of thousands of tests," says Philippe Le Hegaret, who oversees HTML activities for the W3C. 
  • the purpose of the HTML5 test suite is to help vendors and developers ensure that HTML5 applications work across all browsers. For example, a developer might check the test results before enabling a certain feature in an application, just to make sure it will work across IE9, Firefox, Chrome, Safari and Opera.

    Developers can build HTML5 applications today, but they have to keep in mind that they are early adopters and act accordingly, Le Hegaret says.

    "If you think HTML5 is perfectly stable today and you can use it without worrying about interoperability issues, I think you're going to fool yourself," he says.

    Although the first round of HTML5 tests focused on desktop browsers, Le Hegaret says HTML5 compatibility is advancing more rapidly on mobile devices such as iPhones and Androids.
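Le Hegaret's advice to early adopters translates, in practice, into runtime feature detection: probe for a capability before enabling it rather than assuming every browser implements the spec. A minimal sketch (the helper name and the injected `doc` parameter are illustrative, not from the article):

```javascript
// Illustrative feature detection: probe for canvas support before using it.
// The document is passed in as a parameter so the check can also run
// outside a browser; in-page code would call supportsCanvas(document).
function supportsCanvas(doc) {
  if (!doc || typeof doc.createElement !== 'function') return false;
  const el = doc.createElement('canvas');
  // A canvas-capable browser exposes getContext and returns a 2D context.
  return !!(el.getContext && el.getContext('2d'));
}
```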

    • Paul Merrell
      Note the continuing, indeed escalating, abuse of the term "interoperability" by W3C. "Interoperability" has both a legal and (happily, coinciding) technical meaning that involves round-tripping of information. ISO/IEC JTC 1 Directives defines the term in precisely the same terms as the European Union's Court of First Instance did in the landmark Commission v. Microsoft antitrust case: "interoperability is understood to be the ability of two or more IT systems to *exchange* information at one or more standardised interfaces and to make *mutual use* of the information that has been exchanged."

      Web browsers do not do "interoperability;" there is no "exchange" and "mutual use" of the information exchanged. Web browsers do "compatibility," a one-way transfer of information that is broadcast from web servers; i.e., web browsers cannot send web pages to web servers.
Gary Edwards

Long Live the Web: Tim Berners-Lee at Scientific American - 0 views

    Lengthy article written by Tim Berners-Lee describes his concerns for the future of the Open Web.  Tim details the history of the Web, describing the principles that made the Web an Open Platform of universal access, exchange and collaborative computing.  
    Universality Is the Foundation
    Several principles are key to assuring that the Web becomes ever more valuable. The primary design principle underlying the Web's usefulness and growth is universality. When you make a link, you can link to anything. That means people must be able to put anything on the Web, no matter what computer they have, software they use or human language they speak and regardless of whether they have a wired or wireless Internet connection. The Web should be usable by people with disabilities. It must work with any form of information, be it a document or a point of data, and information of any quality, from a silly tweet to a scholarly paper. And it should be accessible from any kind of hardware that can connect to the Internet: stationary or mobile, small screen or large.
Paul Merrell

W3C News Archive: 2010 W3C - 0 views

  • Today W3C, the International Organization for Standardization (ISO), and the International Electrotechnical Commission (IEC) took steps that will encourage greater international adoption of W3C standards. W3C is now an "ISO/IEC JTC 1 PAS Submitter" (see the application), bringing "de jure" standards communities closer to the Internet ecosystem. As national bodies refer increasingly to W3C's widely deployed standards, users will benefit from an improved Web experience based on W3C's standards for an Open Web Platform. W3C expects to use this process (1) to help avoid global market fragmentation; (2) to improve deployment within government use of the specification; and (3) when there is evidence of stability/market acceptance of the specification. Web Services specifications will likely constitute the first package W3C will submit, by the end of 2010. For more information, see the W3C PAS Submission FAQ.
Paul Merrell

First official HTML5 tests topped by...Microsoft * The Register - 0 views

  • The World Wide Web Consortium has released the results of its first HTML5 conformance tests, and according to this initial rundown, the browser that most closely adheres to the latest set of web standards is...Microsoft Internet Explorer 9.

    Yes, the HTML5 spec has yet to be finalised. And yes, these tests cover only a portion of the spec. But we can still marvel at just how much Microsoft's browser philosophy has changed in recent months.

    The W3C tests — available here — put IE9 beta release 6 at the top of the HTML5 conformance table, followed by the Firefox 4 beta 6, Google Chrome 7, Opera 10.6, and Safari 5.0. The tests cover seven aspects of the spec: "attributes", "audio", "video", "canvas", "getElementsByClassName", "foreigncontent," and "xhtml5":

  • The tests do not yet cover web workers, the file API, local storage, or other aspects of the spec.
Paul Merrell

HTML+RDFa 1.1 - 16 views

  • HTML+RDFa 1.1

    Support for RDFa in HTML4 and HTML5

    W3C Working Draft 19 October 2010

  • This specification defines rules and guidelines for adapting the RDFa Core 1.1 specification for use in HTML5 and XHTML5. The rules defined in this specification not only apply to HTML5 documents in non-XML and XML mode, but also to HTML4 and XHTML documents interpreted through the HTML5 parsing rules.
  • This specification is an extension to the HTML5 language. All normative content in the HTML5 specification, unless specifically overridden by this specification, is intended to be the basis for this specification.
  • 1. Introduction

    This section is non-normative.

    Today's web is built predominantly for human consumption. Even as machine-readable data begins to permeate the web, it is typically distributed in a separate file, with a separate format, and very limited correspondence between the human and machine versions. As a result, web browsers can provide only minimal assistance to humans in parsing and processing web data: browsers only see presentation information. RDFa is intended to solve the problem of machine-readable data in HTML documents. RDFa provides a set of HTML attributes to augment visual data with machine-readable hints. Using RDFa, authors may turn their existing human-visible text and links into machine-readable data without repeating content.
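As a concrete sketch of turning human-visible text into machine-readable data with RDFa attributes (the schema.org vocabulary and the values here are chosen for illustration; the spec does not mandate them):

```html
<!-- The visible text doubles as machine-readable data: vocab, typeof,
     and property are RDFa attributes layered onto ordinary HTML. -->
<div vocab="http://schema.org/" typeof="Person">
  <span property="name">Alice Example</span> wrote
  <a property="url" href="http://example.org/post">this post</a>.
</div>
```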

Paul Merrell

Eight HTML5 Drafts Updated, W3C News Archive: 2010 W3C - 0 views

Paul Merrell

HTML5: Getting to Last Call - W3C Blog - 0 views

  • We started to work on HTML5 back in 2007 and have been going through issues since then. In November 2009, the HTML Chairs instituted a decision policy, which allowed us to close around 20 issues. We now have around 200 bugs and 25 issues on the document.

    In order to drive the Group to Last Call, the HTML Chairs, following the advice from the W3C Team, produced a timeline to get the initial Last Call for HTML5. The W3C team expresses its strong support to the chairs of the HTML Working Group in their efforts to lead the group toward an initial Last Call according to the published timeline.

    All new bugs related to the HTML5 specification received after the first of October 2010 will be treated as Last Call comments, with possible exceptions granted by the Chairs. The intention is to get to the initial Last Call and have a feature-complete document.

    The HTML Chairs will keep driving the Group forward after that date in order to resolve all the bugs received by October 1. The expectation is to issue the Last Call document at the end of May 2011.

    I encourage everyone to send bugs prior to October 1 and keep track of them in order to escalate them to the Working Group if necessary.

    Get your HTML 5 bug reports filed *before* October 1.  See for more details.
Paul Merrell

Mathematical Markup Language (MathML) Version 3.0 - 0 views

  • Mathematical Markup Language (MathML) Version 3.0

    W3C Proposed Recommendation 10 August 2010

  • This specification defines the Mathematical Markup Language, or MathML. MathML is an XML application for describing mathematical notation and capturing both its structure and content. The goal of MathML is to enable mathematics to be served, received, and processed on the World Wide Web, just as HTML has enabled this functionality for text.
    MathML 3 achieves Proposed Recommendation status. For those unfamiliar with W3C lingo, this means that it is now a proposed standard. Concurrently, W3C published a proposed recommendation for A MathML for CSS Profile.
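For readers who have not seen the markup itself, a minimal presentation-MathML fragment encoding the expression x² + 1 looks like this:

```xml
<math xmlns="http://www.w3.org/1998/Math/MathML">
  <mrow>
    <msup><mi>x</mi><mn>2</mn></msup>
    <mo>+</mo>
    <mn>1</mn>
  </mrow>
</math>
```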
Paul Merrell

Media Queries - 0 views

  • Abstract

    HTML4 and CSS2 currently support media-dependent style sheets tailored for different media types. For example, a document may use sans-serif fonts when displayed on a screen and serif fonts when printed. ‘screen’ and ‘print’ are two media types that have been defined. Media queries extend the functionality of media types by allowing more precise labeling of style sheets.

    A media query consists of a media type and zero or more expressions that check for the conditions of particular media features. Among the media features that can be used in media queries are ‘width’, ‘height’, and ‘color’. By using media queries, presentations can be tailored to a specific range of output devices without changing the content itself.
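The abstract's serif/sans-serif scenario maps directly onto a media-dependent rule plus a media query combining a media type with one feature expression (the 40em breakpoint is an arbitrary illustration):

```css
/* Default rule, used when printed and wherever no query matches. */
body { font-family: serif; }

/* Media type "screen" plus a single 'width' feature expression. */
@media screen and (min-width: 40em) {
  body { font-family: sans-serif; }
}
```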

  • There must be at least two interoperable implementations. For the purposes of this criterion, we define the following terms:

    interoperable
        passing the respective test case(s) in the CSS test suite, or, if the implementation is not a Web browser, an equivalent test. Every relevant test in the test suite should have an equivalent test created if such a user agent (UA) is to be used to claim interoperability. In addition, if such a UA is to be used to claim interoperability, then there must be one or more additional UAs which can also pass those equivalent tests in the same way for the purpose of interoperability. The equivalent tests must be made publicly available for the purposes of peer review.

    While the candidate Media Queries specification is interesting and a small step in the right direction, W3C continues to butcher the meaning of "interoperability." In this latest sleight of hand, we now have "interoperable" *user agents*, a term of art used by W3C for implementations that only receive and cannot return data, e.g., web browsers. But under competition law, "interoperability" requires implementations that can exchange data and *mutually* use data that has been exchanged. See e.g., European Commission v. Microsoft, European Community Court of First Instance (Grand Chamber Judgment of 17 September, 2007), para. 230, 374, 421 (rejecting Microsoft's argument that "interoperability" has a 1-way rather than 2-way meaning; "Directive 91/250 defines interoperability as 'the ability to exchange information and *mutually* to use the information which has been exchanged'") (emphasis added). W3C --- the World Wide Web Conspiracy --- continues down its rut of broadcasting information whilst denying the world the power to round-trip the data received.

    Incredibly, in its latest assault on the meaning of "interoperability", W3C no longer defines "conformance" but redefines the term "interoperability" as its substitute for "conformance." As though W3C could redefine the law?
Paul Merrell

Closing CDF WG, Publishing Specs as Notes from Doug Schepers on 2010-07-12 (public-cdf@... - 0 views

  • Hi, CDF folks- While we had hoped that more implementations might emerge that passed the CDF and WICD test suites [1], such that these specifications would meet the criteria as W3C Recommendations, it does not seem that this will happen in a reasonable timeframe. Despite good partial implementation experience, implementers have not shown sufficient interest to justify further investment of W3C resources into this group, even at a background level. In order to clarify the status of the CDF WG specifications, including Compound Document by Reference Framework 1.0 [2], Web Integration Compound Document (WICD) Core 1.0 [3], WICD Mobile 1.0 [4], and WICD Full 1.0 [5], all in Candidate Recommendation phase since July 2007, we have decided to publish them as Working Group Notes instead, and to close the Compound Document Formats Working Group.
    This event speaks loudly to how little interest browser developers have in interoperable web solutions. One-way compatibility wins and the ability of web applications to round-trip data loses. For those that did not realize it, the Compound Document by Reference Framework not only allows but requires that more featureful implementations round-trip the output of less featureful implementations without data loss. See ("A conformant user agent of a superset profile specification must process subset profile content as if it were the superset profile content").
Paul Merrell

Last Call Working Draft -- W3C Authoring Tool Accessibility Guidelines (ATAG) 2.0 - 0 views

    • This is a Working Draft of the Authoring Tool Accessibility Guidelines (ATAG) version 2.0. This document includes recommendations for assisting authoring tool developers to make the authoring tools that they develop more accessible to people with disabilities, including blindness and low vision, deafness and hearing loss, learning disabilities, cognitive limitations, motor difficulties, speech difficulties, and others.

      Accessibility, from an authoring tool perspective, includes addressing the needs of two (potentially overlapping) user groups with disabilities:

    • Examples of authoring tools: ATAG 2.0 applies to a wide variety of web content generating applications, including, but not limited to:
      • web page authoring tools (e.g., WYSIWYG HTML editors)
      • software for directly editing source code (see note below)
      • software for converting to web content technologies (e.g., "Save as HTML" features in office suites)
      • integrated development environments (e.g., for web application development)
      • software that generates web content on the basis of templates, scripts, command-line input or "wizard"-type processes
      • software for rapidly updating portions of web pages (e.g., blogging, wikis, online forums)
      • software for generating/managing entire web sites (e.g., content management systems, courseware tools, content aggregators)
      • email clients that send messages in web content technologies
      • multimedia authoring tools
      • debugging tools for web content
      • software for creating mobile web applications
    • Web-based and non-web-based: ATAG 2.0 applies equally to authoring tools of web content that are web-based, non-web-based or a combination (e.g., a non-web-based markup editor with a web-based help system, a web-based content management system with a non-web-based file uploader client).
    • Real-time publishing: ATAG 2.0 applies to authoring tools with workflows that involve real-time publishing of web content (e.g., some collaborative tools). For these authoring tools, conformance to Part B of ATAG 2.0 may involve some combination of real-time accessibility supports and additional accessibility supports available after the real-time authoring session (e.g., the ability to add captions for audio that was initially published in real-time). For more information, see the Implementing ATAG 2.0 - Appendix E: Real-time content production.
    • Text Editors: ATAG 2.0 is not intended to apply to simple text editors that can be used to edit source content, but that include no support for the production of any particular web content technology. In contrast, ATAG 2.0 can apply to more sophisticated source content editors that support the production of specific web content technologies (e.g., with syntax checking, markup prediction, etc.).
    The link is the latest-version link, so the page should update when this specification graduates to a W3C Recommendation.
Paul Merrell

First working draft of W3C HTML5 - 0 views

  • HTML5

    A vocabulary and associated APIs for HTML and XHTML

  • This specification defines the 5th major revision of the core language of the World Wide Web: the Hypertext Markup Language (HTML). In this version, new features are introduced to help Web application authors, new elements are introduced based on research into prevailing authoring practices, and special attention has been given to defining clear conformance criteria for user agents in an effort to improve interoperability.
  • The W3C HTML Working Group is the W3C working group responsible for this specification's progress along the W3C Recommendation track. This specification is the 24 June 2010 Working Draft snapshot.

    Work on this specification is also done at the WHATWG. The W3C HTML working group actively pursues convergence with the WHATWG, as required by the W3C HTML working group charter.

Gary Edwards

Growing pains afflict HTML5 standardization | Deep Tech - CNET News - 0 views

    The World Wide Web Consortium's return to HTML standardization after years of absence has produced tensions with the more informal Web Hypertext Application Working Group (WHATWG) that shouldered the HTML burden during that absence.
    Some examples of language that's cropped up this month on the W3C's HTML Working Group mailing list: "childish," "intolerable," "ridiculous," "shenanigans." And there's a concrete manifestation of the divisiveness: The WHATWG and W3C versions of the HTML5 specification, though both stemming from the same source material, have diverged in some areas.

    Some of the differences are relatively minor, and there are strong incentives to converge the two drafts of the HTML5 specification so that browser makers and Web developers aren't faced with the prospect of incompatibilities. In the meantime, though, the overseers of the Web are clashing during a time when their important new standard is just arriving in the spotlight.
Paul Merrell

RDFa API - 0 views

  • RDFa API

    An API for extracting structured data from Web documents

    W3C Working Draft 08 June 2010

  • RDFa [RDFA-CORE] enables authors to publish structured information that is both human- and machine-readable. Concepts that have traditionally been difficult for machines to detect, like people, places, events, music, movies, and recipes, are now easily marked up in Web documents. While publishing this data is vital to the growth of Linked Data, using the information to improve the collective utility of the Web for humankind is the true goal. To accomplish this goal, it must be simple for Web developers to extract and utilize structured information from a Web document. This document details such a mechanism; an RDFa Document Object Model Application Programming Interface (RDFa DOM API) that allows simple extraction and usage of structured information from a Web document.
    • This document is a detailed specification for an RDFa DOM API. The document is primarily intended for the following audiences:

      • User Agent developers that are providing a mechanism to programmatically extract RDF Triples from RDFa in a host language such as XHTML+RDFa [XHTML-RDFA], HTML+RDFa [HTML-RDFA] or SVG Tiny 1.2 [SVGTINY12],
      • DOM tool developers that want to provide a mechanism for extracting RDFa content via programming languages such as JavaScript, Python, Ruby, or Perl, and
      • Developers that want to understand the inner workings and design criteria for the RDFa DOM API.
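The draft's actual API surface is not reproduced here; as a rough illustration of the kind of extraction it aims to make simple, the following hypothetical helper scans an RDFa-annotated string for property/value pairs. The function name and regex approach are assumptions for the sketch only; a real RDFa processor walks the DOM and honors the full attribute model (about, typeof, vocab, etc.).

```javascript
// Naive, illustrative extractor: finds property="..." attributes and the
// element text that immediately follows. Not the draft RDFa DOM API.
function extractProperties(html) {
  const pairs = [];
  const re = /property="([^"]+)"[^>]*>([^<]*)</g;
  let match;
  while ((match = re.exec(html)) !== null) {
    pairs.push({ property: match[1], value: match[2] });
  }
  return pairs;
}
```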