Future of the Web: Group items tagged Web

Gonzalo San Gil, PhD.

No, Department of Justice, 80 Percent of Tor Traffic Is Not Child Porn | WIRED - 0 views

  • The debate over online anonymity, and all the whistleblowers, trolls, anarchists, journalists and political dissidents it enables, is messy enough. It doesn’t need the US government making up bogus statistics about how much that anonymity facilitates child pornography. At the State of the Net conference in Washington on Tuesday, US assistant attorney general Leslie Caldwell discussed what she described as the dangers of encryption and cryptographic anonymity tools like Tor, and how those tools can hamper law enforcement. Her statements are the latest in a growing drumbeat of federal criticism of tech companies and software projects that provide privacy and anonymity at the expense of surveillance. And as an example of the grave risks presented by that privacy, she cited a study she said claimed an overwhelming majority of Tor’s anonymous traffic relates to pedophilia. “Tor obviously was created with good intentions, but it’s a huge problem for law enforcement,” Caldwell said in comments reported by Motherboard and confirmed to me by others who attended the conference. “We understand 80 percent of traffic on the Tor network involves child pornography.” That statistic is horrifying. It’s also baloney.
  • In a series of tweets that followed Caldwell’s statement, a Department of Justice flack said Caldwell was citing a University of Portsmouth study WIRED covered in December. He included a link to our story. But I made clear at the time that the study claimed 80 percent of traffic to Tor hidden services related to child pornography, not 80 percent of all Tor traffic. That is a huge, and important, distinction. The vast majority of Tor’s users run the free anonymity software while visiting conventional websites, using it to route their traffic through encrypted hops around the globe to avoid censorship and surveillance. But Tor also allows websites to run Tor, something known as a Tor hidden service. This collection of hidden sites, which comprise what’s often referred to as the “dark web,” use Tor to obscure the physical location of the servers that run them. Visits to those dark web sites account for only 1.5 percent of all Tor traffic, according to the software’s creators at the non-profit Tor Project. The University of Portsmouth study dealt exclusively with visits to hidden services. In contrast to Caldwell’s 80 percent claim, the Tor Project’s director Roger Dingledine pointed out last month that the study’s pedophilia findings refer to something closer to a single percent of Tor’s overall traffic.
  • So to whoever at the Department of Justice is preparing these talking points for public consumption: Thanks for citing my story. Next time, please try reading it.
  •  
    [# Via Paul Merrell's Diigo...] "That is a huge, and important, distinction. The vast majority of Tor's users run the free anonymity software while visiting conventional websites, using it to route their traffic through encrypted hops around the globe to avoid censorship and surveillance. But Tor also allows websites to run Tor, something known as a Tor hidden service. This collection of hidden sites, which comprise what's often referred to as the "dark web," use Tor to obscure the physical location of the servers that run them. Visits to those dark web sites account for only 1.5 percent of all Tor traffic, according to the software's creators at the non-profit Tor Project."
Gonzalo San Gil, PhD.

Download a complete website with wget even if there are restrictions - 0 views

  •  
    "GNU Wget is a free software tool that makes it simple to download content from web servers. Its name derives from World Wide Web (w) and from "get"; in other words: to get from the WWW." [# ! The #Web # ! … in #your #hands… # ! … with #GNU #Wget]
Gonzalo San Gil, PhD.

How to Access Linux Server Terminal in Web Browser Using 'Wetty (Web + tty)' Tool - 0 views

  •  
    "Wouldn't it be fantastic if there was a way to access a remote Linux server directly from the web browser? Luckily for us all, there is a tool called Wetty (Web + tty) that allows us to do just that - without the need to switch programs and all from the same web browser window."
Paul Merrell

Common Crawl Founder Gil Elbaz Speaks About New Relationship With Amazon, Semantic Web ... - 0 views

  • The Common Crawl Foundation’s repository of openly and freely accessible web crawl data is about to go live as a Public Data Set on Amazon Web Services.
  • Elbaz’ goal in developing the repository: “You can’t access, let alone download, the Google or the Bing crawl data. So certainly we’re differentiated in being very open and transparent about what we’re crawling and actually making it available to developers,” he says. “You might ask why is it going to be revolutionary to allow many more engineers and researchers and developers and students access to this data, whereas historically you have to work for one of the big search engines…. The question is, the world has the largest-ever corpus of knowledge out there on the web, and is there more that one can do with it than Google and Microsoft and a handful of other search engines are already doing? And the answer is unquestionably yes. ”
  • Common Crawl’s data already is stored on Amazon’s S3 service, but now Amazon will be providing the storage space for free through the Public Data Set program. Not only does that remove from Common Crawl the storage burden and costs for hosting its crawl of 5 billion web pages – some 50 or 60 terabytes large – but it should make it easier for users to access the data, and remove the bandwidth-related costs they might incur for downloads. Users won’t have to deal with setting up accounts, being responsible for bandwidth bills incurred, and more complex authentication processes.
Paul Merrell

W3C Standards Make Mobile Web Experience More Inviting - 0 views

  • W3C today announced new standards that will make it easier for people to browse the Web on mobile devices. Mobile Web Best Practices 1.0, published as a W3C Recommendation, condenses the experience of many mobile Web stakeholders into practical advice on creating mobile-friendly content.
  • Until today, content developers faced an additional challenge: a variety of mobile markup languages to choose from. With the publication of the XHTML Basic 1.1 Recommendation today, the preferred format specification of the Best Practices, there is now a full convergence in mobile markup languages, including those developed by the Open Mobile Alliance (OMA). The W3C mobileOK checker (beta), when used with the familiar W3C validator, helps developers test mobile-friendly Web content.
  • W3C is also developing resources to help authors understand how to create content that is both mobile-friendly and accessible to people with disabilities. A draft of Relationship between Mobile Web Best Practices (MWBP) and Web Content Accessibility Guidelines (WCAG) is jointly published by the Mobile Web Best Practices Working Group and WAI's Education & Outreach Working Group (EOWG).
Gary Edwards

How the Web was almost won ... Tim O'Reilly 1998 | Salon - 0 views

  •  
    The Justice Department's antitrust suit and Judge Jackson's finding of fact have focused on how Microsoft used its operating system dominance to wrest control of the Web browser market from Netscape. Perhaps even more significant is the untold story of Microsoft's attempts to corner the Web server market. As someone whose company competes directly with Microsoft, (we sell a Web server called WebSite that runs on Windows NT, and we are active in promoting Perl, Linux and other open-source technologies), I've been privy to some of the not-so-small details that have guided the course of this recent history. And, it seems to me that if it weren't for the work of a small group of independent open-source software developers, the Justice Department intervention might have come too late not just for Netscape but the Web as a whole.
Gary Edwards

Wolfram Alpha is Coming -- and It Could be as Important as Google | Twine - 0 views

  • The first question was whether Wolfram Alpha could (or even should) be built using the Semantic Web in some manner, rather than (or as well as) the Mathematica engine it is currently built on. Is anything missed by not building it with the Semantic Web's languages (RDF, OWL, SPARQL, etc.)? The answer is that there is no reason that one MUST use the Semantic Web stack to build something like Wolfram Alpha. In fact, in my opinion it would be far too difficult to try to explicitly represent everything Wolfram Alpha knows and can compute using OWL ontologies. It is too wide a range of human knowledge and giant OWL ontologies are just too difficult to build and curate.
  • However, for the internal knowledge representation and reasoning that takes place in the system, it appears Wolfram has found a pragmatic and efficient representation of his own, and I don't think he needs the Semantic Web at that level. It seems to be doing just fine without it. Wolfram Alpha is built on hand-curated knowledge and expertise. Wolfram and his team have somehow figured out a way to make that practical where all others who have tried this have failed to achieve their goals. The task is gargantuan -- there is just so much diverse knowledge in the world. Representing even a small segment of it formally turns out to be extremely difficult and time-consuming.
  • It has generally not been considered feasible for any one group to hand-curate all knowledge about every subject. This is why the Semantic Web was invented -- by enabling everyone to curate their own knowledge about their own documents and topics in parallel, in principle at least, more knowledge could be represented and shared in less time by more people -- in an interoperable manner. At least that is the vision of the Semantic Web.
  • Where Google is a system for FINDING things that we as a civilization collectively publish, Wolfram Alpha is for ANSWERING questions about what we as a civilization collectively know. It's the next step in the distribution of knowledge and intelligence around the world -- a new leap in the intelligence of our collective "Global Brain." And like any big next-step, Wolfram Alpha works in a new way -- it computes answers instead of just looking them up.
  •  
    A Computational Knowledge Engine for the Web In a nutshell, Wolfram and his team have built what he calls a "computational knowledge engine" for the Web. OK, so what does that really mean? Basically it means that you can ask it factual questions and it computes answers for you. It doesn't simply return documents that (might) contain the answers, like Google does, and it isn't just a giant database of knowledge, like the Wikipedia. It doesn't simply parse natural language and then use that to retrieve documents, like Powerset, for example. Instead, Wolfram Alpha actually computes the answers to a wide range of questions -- like questions that have factual answers such as "What country is Timbuktu in?" or "How many protons are in a hydrogen atom?" or "What is the average rainfall in Seattle this month?," "What is the 300th digit of Pi?," "where is the ISS?" or "When was GOOG worth more than $300?" Think about that for a minute. It computes the answers. Wolfram Alpha doesn't simply contain huge amounts of manually entered pairs of questions and answers, nor does it search for answers in a database of facts. Instead, it understands and then computes answers to certain kinds of questions.
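To make "it computes the answers" concrete, here is a toy illustration (not Wolfram's actual machinery): answering one of the sample questions above, "What is the 300th digit of Pi?", by on-demand computation rather than lookup. It assumes the mpmath arbitrary-precision library.

```python
# Illustrative only: computing an answer instead of retrieving it from a
# stored document or database. Assumes mpmath (pip install mpmath).
from mpmath import mp, nstr

mp.dps = 310                     # working precision: guard digits beyond 300
pi_str = nstr(+mp.pi, 305)       # "3.14159..." to 305 significant figures
# Index 0 is '3', index 1 is '.', so the Nth decimal digit sits at 2 + (N - 1).
print(pi_str[2 + 299])           # the 300th digit after the decimal point
```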
Paul Merrell

It's the business processes that are bound to MSOffice - Windows' dominance stifles dem... - 0 views

  • 15 years of workgroup oriented business process automation based on the MSOffice productivity environment has had an impact. Microsoft pretty much owns the "client" in "client/server" because so many of these day-to-day business processes are bound to the MSOffice productivity environment in some way.
  • The good news is that there is a great transition underway. The world is slowly but inexorably moving from "client/server" systems to an emerging architecture one might describe as "client / WebStack-Cloud-RiA / server". The reason for the great transition is simple; the productivity advantages of putting the Web in the center of information systems and workflows are extraordinary.
  • Now the bad news. Microsoft fully understands this and has spent years preparing for a very controlled transition. They are ready. The pieces are finally falling into place for a controlled transition connecting legacy MSOffice bound business processes to the Microsoft WebStack-Cloud-RiA model (Exchange-SharePoint-SQL Server-Mesh-Silverlight).
  • Anyone with a pulse knows that the Web is the future. Yet, look at how much time and effort has been spent on formats, protocols and interfaces that at best would "break" the Web, and at worst seem determined to refight the 1995 office desktop wars. In Massachusetts, while the war between ODF and OOXML raged, Exchange and SharePoint servers were showing up everywhere. It was as if the outcome of the desktop office format decision didn't matter to the Web future.
  • And if we don't successfully re-purpose MSOffice to the Open Web? (And for that matter, OpenOffice). The Web will break. The great transition will be directed to the MS WebStack-Cloud-RiA model. Web enhanced business processes will be entangled with proprietary formats, protocols and interfaces. The barriers to this emerging desktop-Web-device platform of business processes and systems will prove even more impenetrable than the 1995 desktop productivity environment. Linux will not penetrate the business desktop arena. And we will all wonder what it was that we were doing as this unfolded before our eyes.
Gary Edwards

Microsoft, Google Search and the Future of the Open Web - Google Docs - 0 views

  •  
    Response to the InformationWeek article "Remaking Microsoft: Get Out of Web Search!". Covers "The Myth of Google Enterprise Search", and the refusal of Google to implement or recognize W3C Semantic Web technologies. This refusal protects Google's proprietary search and categorization algorithms, but it opens the door wide for Microsoft Office editors to totally exploit the end-user semantic interface opportunities. If Microsoft can pull this off, they will take "search" to the Enterprise and beyond into every high end discipline using MSOffice to edit Web ready documents (private and public use). Also a bit about WebKit as the most disruptive technology Microsoft has faced since the advent of the Web.
Gary Edwards

The Grand Convergence: Web + RIA + Widgets + Client/Server - 0 views

  • The architecture of the Widget engine divides the client technology into two parts, the engine and the widgets. The widget engine is usually a pretty large download.
  • The widget engine is really a wonderful architecture that gives you the power of the desktop (via the widget engine) and the management of the Web (via widget downloads).  Widget engines can out-perform RIA solutions and they can store larger data sets. 
  • Fit Client applications can be centrally managed, yet remain resident on the desktop. They can offer access to standard web content (e.g. HTML) without the need of a browser. Fit Clients can leverage the processing power and disc space of the client machine, but they can also offer more restrictive and secure environments than client/server platforms.
  •  
    Excellent overview of where applications are going. Richard Monson-Haefel (whom I met at the 2008 Web 2.0 Conference) explains the convergence of four emerging application models: Web Clients (Browsers), RiA Clients, Client/Server, and Widget Engines. He comes up with a convergence point called "Fit Client", offering Adobe Air as the leading example. Richard walks through each application model, discussing limitations and advantages. Good stuff, especially this comment: "The widget engine is really a wonderful architecture that gives you the power of the desktop (via the widget engine) and the management of the Web (via widget downloads). Widget engines can out-perform RIA solutions and they can store larger data sets. The limitation of Widget engines is not in their architecture, it is that they have been designed for applications with fairly weak capabilities compared to client/server. Widgets tend to be single-purpose applications with limited access to the native operating system. That said, the widget architecture itself - the separation of the platform from the applications - is important. It makes it possible to create applications (widgets) that are portable across operating systems and are packaged for easy download and installation."
Gary Edwards

How HTML 5 Is Already Changing the Web - Webmonkey - 0 views

  •  
    HTML 5 represents the biggest leap forward in web standards in almost a decade. Unlike the specifications that came before it, HTML 5 is not merely intended to present content to a web browser. Its goal is to bring the web into maturity as a full-fledged application platform - a level playing field where video, sound, images, animations, and full interactivity with your computer are all standardized. And it may be a long way off still, but elements of HTML 5 are already reshaping the way we use the web.
Fabien Cadet

The Evolution of Web Design (infographic) | kissmetrics.com, 2011-04? by Sean Work - 3 views

  •  
    « Can you believe that the first published website is already 20 years old? Web design has come a long way since the first website was published by Tim Berners-Lee in 1991. This infographic is a peek at the evolutionary tale of web design, which is ironically still in its infant stages. Enjoy the infographic below and let your imagination wander. You might find yourself asking, "Where will web design be in the next 20 years?" »
Paul Merrell

Web video accessibility from EmbedPlus on 2011-08-11 (w3c-wai-ig@w3.org from July to Se... - 0 views

  •  
    For those who care about Web accessibility, here is an opportunity to provide feedback on some accessibility tools for one of the most widely-used web services. The message deserves wide distribution. The contact email address is on the linked page. The linked tool set should also be of interest to those doing mashups or embedding YouTube videos in web pages. Hi all, I'm the co-developer of a YouTube third-party tool called EmbedPlus. It enhances the standard YouTube player with many features that aren't inherently supported. We've been getting lots of feedback regarding the accessibility benefits of some of these features like movable zoom, slow motion, and even third-party annotations. As the tool continues to grow in popularity, the importance of its accessibility rises. I decided to do some research and found the WAI Interest group to be a major proponent of accessibility on the web. If anyone has time to take a look at EmbedPlus and share feedback that could help improve the tool, please do. Here's the link: http://www.embedplus.com/ Thank you in advance, Tay
Gonzalo San Gil, PhD.

Verizon claims common carrier rules would require Web services to pay ISPs | Ars Technica - 0 views

  •  
    " Verizon is making an alarmist argument in its response to the Federal Communications Commission's network neutrality proposal. Classification of broadband as a common carrier service-a step called for by public interest groups who want to prevent ISPs from charging Web services for faster access to consumers-would instead require ISPs to charge Netflix, YouTube, and other Web services for network access, Verizon claims."
Paul Merrell

How to Encrypt the Entire Web for Free - The Intercept - 0 views

  • If we’ve learned one thing from the Snowden revelations, it’s that what can be spied on will be spied on. Since the advent of what used to be known as the World Wide Web, it has been a relatively simple matter for network attackers—whether it’s the NSA, Chinese intelligence, your employer, your university, abusive partners, or teenage hackers on the same public WiFi as you—to spy on almost everything you do online. HTTPS, the technology that encrypts traffic between browsers and websites, fixes this problem—anyone listening in on that stream of data between you and, say, your Gmail window or bank’s web site would get nothing but useless random characters—but is woefully under-used. The ambitious new non-profit Let’s Encrypt aims to make the process of deploying HTTPS not only fast, simple, and free, but completely automatic. If it succeeds, the project will render vast regions of the internet invisible to prying eyes.
  • Encryption also prevents attackers from tampering with or impersonating legitimate websites. For example, the Chinese government censors specific pages on Wikipedia, the FBI impersonated The Seattle Times to get a suspect to click on a malicious link, and Verizon and AT&T injected tracking tokens into mobile traffic without user consent. HTTPS goes a long way in preventing these sorts of attacks. And of course there’s the NSA, which relies on the limited adoption of HTTPS to continue to spy on the entire internet with impunity. If companies want to do one thing to meaningfully protect their customers from surveillance, it should be enabling encryption on their websites by default.
  • Let’s Encrypt, which was announced this week but won’t be ready to use until the second quarter of 2015, describes itself as “a free, automated, and open certificate authority (CA), run for the public’s benefit.” It’s the product of years of work from engineers at Mozilla, Cisco, Akamai, Electronic Frontier Foundation, IdenTrust, and researchers at the University of Michigan. (Disclosure: I used to work for the Electronic Frontier Foundation, and I was aware of Let’s Encrypt while it was being developed.) If Let’s Encrypt works as advertised, deploying HTTPS correctly and using all of the best practices will be one of the simplest parts of running a website. All it will take is running a command. Currently, HTTPS requires jumping through a variety of complicated hoops that certificate authorities insist on in order to prove ownership of domain names. Let’s Encrypt automates this task in seconds, without requiring any human intervention, and at no cost.
  • The benefits of using HTTPS are obvious when you think about protecting secret information you send over the internet, like passwords and credit card numbers. It also helps protect information like what you search for in Google, what articles you read, what prescription medicine you take, and messages you send to colleagues, friends, and family from being monitored by hackers or authorities. But there are less obvious benefits as well. Websites that don’t use HTTPS are vulnerable to “session hijacking,” where attackers can take over your account even if they don’t know your password. When you download software without encryption, sophisticated attackers can secretly replace the download with malware that hacks your computer as soon as you try installing it.
  • The transition to a fully encrypted web won’t be immediate. After Let’s Encrypt is available to the public in 2015, each website will have to actually use it to switch over. And major web hosting companies also need to hop on board for their customers to be able to take advantage of it. If hosting companies start work now to integrate Let’s Encrypt into their services, they could offer HTTPS hosting by default at no extra cost to all their customers by the time it launches.
  •  
    Don't miss the video. And if you have a web site, urge your host service to begin preparing for Let's Encrypt. (See video on why it's good for them.)
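For context on how that "one command" turned out: below is a hedged sketch using certbot, the client the Let's Encrypt project eventually shipped (it postdates this article). The domain and email are placeholders, and the --nginx plugin assumes the site is served by nginx.

```python
# A sketch of requesting and installing a Let's Encrypt certificate with
# certbot. Assumes certbot is installed; example.com is a placeholder.
import subprocess

subprocess.run([
    "certbot", "--nginx",        # obtain a cert and configure nginx for HTTPS
    "-d", "example.com",         # domain to certify
    "-m", "admin@example.com",   # contact email for expiry notices
    "--agree-tos",               # accept the CA's terms of service
    "--non-interactive",         # run without prompts (suitable for scripts)
], check=True)
```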
Gary Edwards

Introduction to OpenCalais | OpenCalais - 0 views

  •  
    "The free OpenCalais service and open API is the fastest way to tag the people, places, facts and events in your content.  It can help you improve your SEO, increase your reader engagement, create search-engine-friendly 'topic hubs' and streamline content operations - saving you time and money. OpenCalais is free to use in both commercial and non-commercial settings, but can only be used on public content (don't run your confidential or competitive company information through it!). OpenCalais does not keep a copy of your content, but it does keep a copy of the metadata it extracts there from. To repeat, OpenCalais is not a private service, and there is no secure, enterprise version that you can buy to operate behind a firewall. It is your responsibility to police the content that you submit, so make sure you are comfortable with our Terms of Service (TOS) before you jump in. You can process up to 50,000 documents per day (blog posts, news stories, Web pages, etc.) free of charge.  If you need to process more than that - say you are an aggregator or a media monitoring service - then see this page to learn about Calais Professional. We offer a very affordable license. OpenCalais' early adopters include CBS Interactive / CNET, Huffington Post, Slate, Al Jazeera, The New Republic, The White House and more. Already more than 30,000 developers have signed up, and more than 50 publishers and 75 entrepreneurs are using the free service to help build their businesses. You can read about the pioneering work of these publishers, entrepreneurs and developers here. To get started, scroll to the bottom section of this page. To build OpenCalais into an existing site or publishing platform (CMS), you will need to work with your developers.  Why OpenCalais Matters The reason OpenCalais - and so-called "Web 3.0" in general (concepts like the Semantic Web, Linked Data, etc.) - are important is that these technologies make it easy to automatically conne
Gonzalo San Gil, PhD.

10 Search Engines to Explore the Invisible Web - 6 views

  •  
    [The Invisible Web refers to the part of the WWW that's not indexed by the search engines. Most of us think that search powerhouses like Google and Bing are like the Great Oracle… they see everything. Unfortunately, they can't, because they aren't divine at all; they are just web spiders who index pages by following one hyperlink after the other.]
Paul Merrell

Google's web app plans collide with Apple's iPhone, Safari rules - CNET - 0 views

  • Google and Apple, which already battle over mobile operating systems, are opening a new front in their fight. How that plays out may determine the future of the web. Google was born on the web, and its business reflects its origin. The company depends on the web for search and advertising revenue. So it isn't a surprise that Google sees the web as key to the future of software. Front and center are web apps, interactive websites with the same power as conventional apps that run natively on operating systems like Windows, Android, MacOS and iOS.  Apple has a different vision of the future, one that plays to its strengths. The company revolutionized mobile computing with its iPhone line. Its profits depend on those products and the millions of apps that run on them. Apple, unsurprisingly, appears less excited about developments, like web apps, that could cut into its earnings.
Paul Merrell

Haavard - 300 million users strong, Opera moves to WebKit - 1 views

  • Today, we announced that Opera has reached 300 million active users. At the same time, we made the official announcement that Opera will move from Presto to WebKit as the engine at the core of the browser.
  • It was always a goal to be compatible with the real web while also supporting and promoting open standards. That turns out to be a bit of a challenge when you are faced with a web that is not as open as one might have wanted. Add to that the fact that it is constantly changing and that you don't get site compatibility for free (which some browsers are fortunate enough to do), and it ends up taking up a lot of resources - resources that could have been spent on innovation and polish instead.
  • Although I was skeptical at first when I started hearing about the switch, I am now fully convinced that it is the right thing to do. Not only will it free up significant engineering resources at Opera and allow us to do more innovation instead of constantly trying to adapt to the web, but our users should benefit from better site compatibility and more innovative features and polish. This move allows us to focus even more on the actual user experience.
  • If switching to WebKit allows us to accelerate our growth and become an important contributor to the project (we will contribute back to WebKit, and have already submitted our first patch (bug)), we may finally have a direct impact on the way web sites are coded. We want sites to be coded for open standards rather than specific browsers.
  • WebKit has matured enough that it is actually possible to make the switch, and we can help it mature even further. In return, we get to spend more resources on a better user experience, and less on chasing an ever-changing web. This move allows us to create a platform for future growth because it allows us to focus our resources on things that can actually differentiate Opera from the competition, and could help the web move in the right direction.
  •  
    And so there will be only three major web page rendering engines: WebKit, Mozilla's Gecko, and MSIE, with only WebKit in the ascendancy.
Gonzalo San Gil, PhD.

From the Web to the streets: protesting DRM at the World Wide Web Consortium | Defectiv... - 1 views

  •  
    "Submitted by Zak Rogoff on March 22, 2016 - 12:19pm Protesters marching outside the W3C office. On Sunday, we led a protest at the World Wide Web Consortium (W3C) against the attempt by Netflix, Hollywood and other technology and media companies to weave Digital Restrictions Management (DRM) into the HTML standard that undergirds the Web."