
Future of the Web: Group items tagged "proposal Berners-Lee"


Paul Merrell

Tim Berners-Lee, W3C Approve Work On DRM For HTML 5.1 - Slashdot - 1 views

  • "Danny O'Brien from the EFF has a weblog post about how the Encrypted Media Extension (EME) proposal will continue to be part of HTML Work Group's bailiwick and may make it into a future HTML revision." From O'Brien's post: "A Web where you cannot cut and paste text; where your browser can't 'Save As...' an image; where the 'allowed' uses of saved files are monitored beyond the browser; where JavaScript is sealed away in opaque tombs; and maybe even where we can no longer effectively 'View Source' on some sites, is a very different Web from the one we have today. It's a Web where user agents—browsers—must navigate a nest of enforced duties every time they visit a page. It's a place where the next Tim Berners-Lee or Mozilla, if they were building a new browser from scratch, couldn't just look up the details of all the 'Web' technologies. They'd have to negotiate and sign compliance agreements with a raft of DRM providers just to be fully standards-compliant and interoperable."
  •  
    From the Dept. of YouGottaBeKiddingMe. 
Gary Edwards

RDFa, Drupal and a Practical Semantic Web - 1 views

  •  
    CMSWire has a brief explanation of RDFa and why it matters. RDFa is also finding its way into the Drupal CMS, which could be a game changer. Tim Berners-Lee's vision of a "Semantic Web," where the meaning of content is understood by both humans and machines, depends on the emergence of capable information systems that make it transparently easy to add semantic markup. I'm not surprised that Drupal is jumping in with both feet.

    "... In the march toward creating the semantic web, web content management systems such as Drupal (news, site) and many proprietary vendors struggle with the goal of emitting structured information that other sites and tools can usefully consume. There's a balance to be struck between human and machine utility, not to mention simplicity of instrumentation.

    With RDFa (see W3C proposal),  software and web developers have the specification they need to know how to structure data in order to lend meaning both to machines and to humans, all in a single file. And from what we've seen recently, the Drupal community is making the best of it.
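
    For readers unfamiliar with the markup itself, here is a minimal, hypothetical RDFa Lite sketch (not taken from the article; the post title, author, and date are invented for illustration). The vocab, typeof, and property attributes annotate ordinary HTML so the same file carries both human-readable text and machine-readable statements:

        <!-- Hypothetical RDFa Lite example: schema.org terms embedded in plain HTML. -->
        <article vocab="http://schema.org/" typeof="BlogPosting">
          <h1 property="headline">A Practical Semantic Web</h1>
          <p>By <span property="author">Jane Doe</span>,
             published <time property="datePublished" datetime="2009-10-01">1 October 2009</time>.</p>
          <p property="articleBody">A CMS such as Drupal could emit attributes like these automatically for each node.</p>
        </article>

    A machine consumer can extract statements such as "this BlogPosting has the headline 'A Practical Semantic Web'", while a human reader sees only the ordinary page text.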
Gonzalo San Gil, PhD.

The original proposal of the WWW, HTMLized - 1 views

  •  
    [... The problems of information loss may be particularly acute at CERN, but in this case (as in certain others), CERN is a model in miniature of the rest of world in a few years time. CERN meets now some problems which the rest of the world will have to face soon. In 10 years, there may be many commercial solutions to the problems above, while today we need something to allow us to continue. ...]
Paul Merrell

Social Media Giants Choking Independent News Site Traffic to a Trickle - 0 views

  • Several prominent figures, including Web inventor Tim Berners-Lee, warned the EU Parliament that its proposed censorship measure would begin transforming the Internet from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users.
  • For much of the year, independent media has felt the sting of increased social media censorship, as the “revolving door” between U.S. intelligence agencies and social-media companies has manifested in a crackdown on news that challenges official government narratives. With many notable independent news websites having shut down since then as a result, those that remain afloat are being censored like never before, with social media traffic from Facebook and Twitter completely cut off in some cases. Among such websites, social media censorship by the most popular social networks is now widely regarded to be the worst it has ever been – a chilling reality for any who seek fact-based perspectives on major world events that differ from those to be found on well-known corporate-media outlets that consistently toe the government line.

    Last August, MintPress reported that a new Google algorithm targeting “fake news” had quashed traffic to many independent news and advocacy sites, with sites such as the American Civil Liberties Union, Democracy Now, and WikiLeaks seeing their returns from Google searches experience massive drops. The World Socialist Website, one of the affected pages, reported a 67 percent decrease in Google returns while MintPress experienced an even larger decrease of 76 percent in Google search returns. The new algorithm targeted online publications on both sides of the political spectrum critical of U.S. imperialism, foreign wars, and other long-standing government policies.

    Now, less than a year later, the situation has become even more dire. Several independent media pages have reported that their social media traffic has sharply declined since March and – in some cases – stopped almost entirely since June began. For instance, independent media website Antimedia – a page with over 2 million likes and follows – saw its traffic drop from around 150,000 page views per day earlier this month to around 12,000 as of this week. As a reference, this time last year Antimedia’s traffic stood at nearly 300,000 a day.