Future of the Web: Group items tagged "url"


Paul Merrell

Dare Obasanjo aka Carnage4Life - Not Turtles, AtomPub All the Way Down - 0 views

  • I don't think the Atom Publishing Protocol can be considered the universal protocol for talking to remote databases, given that cloud storage vendors like Amazon and database vendors like Oracle don't support it yet. That said, this is definitely a positive trend. Back in the RSS vs. Atom days I used to get frustrated that people were spending so much time reinventing the wheel with an RSS clone when the real gaping hole in the infrastructure was a standard editing protocol. It took a little longer than I expected (Sam Ruby started talking about it in 2003), but the effort has succeeded way beyond my wildest dreams. All I wanted was a standard editing protocol for blogs and content management systems, and we've gotten so much more.
  • Microsoft is using AtomPub as the interface to a wide breadth of services and products, as George Moore points out in his post "A Unified Standards-Based Protocols and Tooling Platform for Storage from Microsoft."
  • And a few weeks after George's post, even more was revealed in posts such as this one about FeedSync and Live Mesh, where we find out: "Congratulations to the Live Mesh team, who announced their Live Mesh Technology Preview release earlier this evening! Amit Mital gives a detailed overview in this post on http://dev.live.com. You can read all about it in the usual places... so why do I mention it here? FeedSync is one of the core parts of the Live Mesh platform. One of the key values of Live Mesh is that your data flows to all of your devices. And rather than being hidden away in a single service, any properly authenticated user has full bidirectional sync capability. As I discussed in the Introduction to FeedSync, this really makes 'your stuff yours.'" Okay, FeedSync isn't really AtomPub, but it does use the Atom syndication format, so I count that as a win for Atom+APP as well. As time goes on, I hope we'll see even more products and services that support Atom and AtomPub from Microsoft. Standardization at the protocol layer means we can move innovation up the stack. (A minimal sketch of an AtomPub interaction follows below.)
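  •  
    As a concrete illustration of the "standard editing protocol" being celebrated here, below is a minimal sketch of an AtomPub create operation per RFC 5023: the client POSTs an Atom entry document to a collection URI, and the server replies "201 Created" with the new member's address in a Location header. The host name and collection path are hypothetical.

      POST /myblog/entries HTTP/1.1
      Host: example.org
      Content-Type: application/atom+xml;type=entry

      <?xml version="1.0" encoding="utf-8"?>
      <!-- A minimal Atom entry; the server assigns id, dates, and edit links -->
      <entry xmlns="http://www.w3.org/2005/Atom">
        <title>First Post</title>
        <author><name>J. Doe</name></author>
        <content type="text">Hello, AtomPub.</content>
      </entry>

    The created entry can afterwards be fetched with GET, edited with PUT, and removed with DELETE at the URI returned in the Location header, which is what makes AtomPub a general-purpose editing protocol rather than a one-way publishing format.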
Paul Merrell

NSA Director Finally Admits Encryption Is Needed to Protect Public's Privacy - 0 views

  • NSA Director Finally Admits Encryption Is Needed to Protect Public’s Privacy. The new stance denotes a growing awareness within the government that Americans are not comfortable with the State’s grip on their data. By Carey Wedler | AntiMedia | January 22, 2016.
  • Rogers cited the recent Office of Personnel Management hack, which affected over 20 million people, as a reason to increase encryption rather than scale it back. “What you saw at OPM, you’re going to see a whole lot more of,” he said, referring to the massive breach that compromised the personal data of some 20 million people who had obtained background checks. Rogers’ comments, while forward-thinking, signify an about-face in his stance on encryption. In February 2015, he said he “shares [FBI] Director [James] Comey’s concern” about cell phone companies’ decision to add encryption features to their products. Comey has been one of the loudest critics of encryption. However, Rogers’ comments on Thursday now directly conflict with Comey’s stated position. The FBI director has publicly chastised encryption, as well as the companies that provide it. In 2014, he claimed Apple’s then-new encryption feature could lead the world to “a very dark place.” At a Department of Justice hearing in November, Comey testified that “Increasingly, the shadow that is ‘going dark’ is falling across more and more of our work.” Though he claimed, “We support encryption,” he insisted “we have a problem that encryption is crashing into public safety and we have to figure out, as people who care about both, to resolve it. So, I think the conversation’s in a healthier place.”
  • At the same hearing, Comey and Attorney General Loretta Lynch declined to comment on whether they had proof the Paris attackers used encryption. Even so, Comey recently lobbied for tech companies to do away with end-to-end encryption. However, his crusade has fallen on unsympathetic ears, both from the private companies he seeks to control and from the NSA. Prior to Rogers’ statements in support of encryption Thursday, former NSA chief Michael Hayden said, “I disagree with Jim Comey. I actually think end-to-end encryption is good for America.” Still another former NSA chief has criticized calls for backdoor access to information. In October, Mike McConnell told a panel at an encryption summit that the United States is “better served by stronger encryption, rather than baking in weaker encryption.” Former Department of Homeland Security chief Michael Chertoff has also spoken out against government being able to bypass encryption.
  • ...2 more annotations...
  • Regardless of these individual defenses of encryption, the Intercept explained why these statements may be irrelevant: “Left unsaid is the fact that the FBI and NSA have the ability to circumvent encryption and get to the content too — by hacking. Hacking allows law enforcement to plant malicious code on someone’s computer in order to gain access to the photos, messages, and text before they were ever encrypted in the first place, and after they’ve been decrypted. The NSA has an entire team of advanced hackers, possibly as many as 600, camped out at Fort Meade.”
  • Rogers’ statements, of course, are not a full-fledged endorsement of privacy, nor can the NSA be expected to make it a priority. Even so, his new stance denotes a growing awareness within the government that Americans are not comfortable with the State’s grip on their data. “So spending time arguing about ‘hey, encryption is bad and we ought to do away with it’ … that’s a waste of time to me,” Rogers said Thursday. “So what we’ve got to ask ourselves is, with that foundation, what’s the best way for us to deal with it? And how do we meet those very legitimate concerns from multiple perspectives?”
Paul Merrell

Anti link-rot SaaS for web publishers -- WebCite - 0 views

  • The Problem Authors increasingly cite webpages and other digital objects on the Internet, which can "disappear" overnight. In one study published in the journal Science, 13% of Internet references in scholarly articles were inactive after only 27 months. Another problem is that cited webpages may change, so that readers see something different than what the citing author saw. The problem of unstable webcitations and the lack of routine digital preservation of cited digital objects has been referred to as an issue "calling for an immediate response" by publishers and authors [1]. An increasing number of editors and publishers ask that authors, when they cite a webpage, make a local copy of the cited webpage/webmaterial, and archive the cited URL in a system like WebCite®, to enable readers permanent access to the cited material.
  • What is WebCite®? WebCite®, a member of the International Internet Preservation Consortium, is an on-demand archiving system for webreferences (cited webpages and websites, or other kinds of Internet-accessible digital objects), which can be used by authors, editors, and publishers of scholarly papers and books, to ensure that cited webmaterial will remain available to readers in the future. If cited webreferences in journal articles, books etc. are not archived, future readers may encounter a "404 File Not Found" error when clicking on a cited URL. Try it! Archive a URL here. It's free and takes only 30 seconds. A WebCite®-enhanced reference is a reference which contains - in addition to the original live URL (which can and probably will disappear in the future, or its content may change) - a link to an archived copy of the material, exactly as the citing author saw it when he accessed the cited material.
  •  
    Free service spun off from the University of Toronto's University Health Network. Automagic archiving of cited internet content, and generation of citations that include the URL for the archived copy. Now if Google would just make it easier to use its search-cache copies for the same purpose... (A sketch of what a WebCite-enhanced reference looks like follows below.)
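  •  
    For illustration, a minimal sketch of what a WebCite®-enhanced reference might look like in a document's HTML: the live URL and the archived snapshot travel together, so the citation survives even if the original page disappears or changes. Both URLs and the snapshot ID here are hypothetical.

      <p>
        Doe J. Example findings. Example Journal, 2008.
        Live URL: <a href="http://www.example.com/article">www.example.com/article</a>.
        <!-- The archived copy preserves the page exactly as the citing author saw it -->
        Archived at: <a href="http://www.webcitation.org/5Kt3PxfFl">www.webcitation.org/5Kt3PxfFl</a>
        (archived 2008-06-09).
      </p>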
Gonzalo San Gil, PhD.

Google Highlights DMCA Abuse in New Copyright Transparency Report - TorrentFreak - 0 views

  •  
    " Ernesto on September 12, 2016 C: 3 News Google has released a new and improved version of its Copyright Transparency Report. The revamped report makes it easier to get insights into over a billion reported URLs. Among other things, Google now specifies how many URLs it does not remove and why, highlighting various cases of DMCA abuse"
Gonzalo San Gil, PhD.

Invisible Web: What it is, Why it exists, How to find it, and Its inherent ambiguity - 1 views

  •  
    [What is the "Invisible Web", a.k.a. the "Deep Web"? The "visible web" is what you can find using general web search engines. It's also what you see in almost all subject directories. The "invisible web" is what you cannot find using these types of tools. The first version of this web page was written in 2000, when this topic was new and baffling to many web searchers. Since then, search engines' crawlers and indexing programs have overcome many of the technical barriers that made it impossible for them to find "invisible" web pages. These types of pages used to be invisible but can now be found in most search engine results: Pages in non-HTML formats (pdf, Word, Excel, PowerPoint), now converted into HTML. Script-based pages, whose URLs contain a ? or other script coding. Pages generated dynamically by other types of database software (e.g., Active Server Pages, Cold Fusion). These can be indexed if there is a stable URL somewhere that search engine crawlers can find. ]
Paul Merrell

Offline Web Apps, Dumb Idea or Really Dumb Idea? - 0 views

  • The amount of work it takes to "offline enable" a Web application is roughly similar to the amount of work it takes to "online enable" a desktop application.
  • I suspect this is the bitter truth that answers the questions asked in articles like The Frustratingly Unfulfilled Promise of Google Gears, where the author laments the lack of proliferation of offline Web applications built on Google Gears. When it first shipped, I was looking forward to a platform like Google Gears, but after I thought about the problem for a while, I realized that such a platform would be just as useful for "online enabling" desktop applications as it would be for "offline enabling" Web applications. Additionally, I came to the conclusion that the former is a lot more enabling to users than the latter. This is when I started becoming interested in Live Mesh as a platform; this is one area where I think Microsoft's hearts and minds are in the right place. I want to see more applications like Outlook + RPC over HTTP, not "offline enabled" versions of Outlook Web Access.
Gary Edwards

Wary of Upsetting Mighty Microsoft, Acer Limits Use of Android to Phones, Not Netbooks - 0 views

  •  
    "For a netbook, you really need to be able to view a full Web for the total Internet experience, and Android is not that yet," Jim Wong, head of Acer's IT products, said Tuesday while introducing a new line of computers."

    Right. Android runs the WebKit/Chromium browser, based on the same WebKit code base used by Apple's iPhone/Safari, Google Chrome, Palm Pre, Nokia S60, the Qt IDE, the 280 Atlas WebKit IDE, the SproutCore-Cocoa project, KOffice, Sun's JavaFX, Adobe AIR, Eclipse "Blinki" and Eclipse SWT, Linux Midori, and the Windows CE Iris browser, to name but a few. Other Open Web browsers, Opera and Mozilla Firefox, have embraced the highly interactive and very visual WebKit document and application model. Add to this WebKit tsunami the many web sites, applications, and services that have adopted the WebKit document model to become iPhone-ready.

    Finally there is this: any browser, application, or web server seeking to pass the Acid3 test is in effect making an effort to become fully WebKit-compliant.

    Maybe Mr. Wong is talking about the 1998 Internet experience supported by IE8? Or maybe there is a secret OEM agreement lurking in the background here. The kind that was used by Microsoft to stop Netscape and Java way back when.

    The problem for Microsoft is that, when it comes to smartphones, countertops, and netbooks at the edge of the Web, they are not competing against individual companies pushing device- and/or platform-specific services. This time they are competing against the next-generation Open Web: a very visual and interactive Open Web defined by the surge that the WebKit, Firefox, and the many JavaScript communities are leading.

    ge
  •  
    The InformationWeek page bookmarked says "NON-WORKING URL! The URL (Web address) that has been entered is directing to a non-existent page." Try this instead: http://www.informationweek.com/news/hardware/handheld/showArticle.jhtml?articleID=216403510 "Acer To Use Android For Phones, Not Netbooks," April 8, 2009.
  •  
    Microsoft conspiracies have happened in the past and we should watch for them. However, another explanation is that Android does not (yet) support many browser plugins. No doubt that is what the Microsoft drones remind Acer of each time they meet, along with a pitch for Silverlight 2! For me, Silverlight 2 is so rare that I would not, personally, make it a requirement for a "full web". A non-Android Linux distribution on a netbook that ran Adobe Flash, Acrobat Reader, OpenOffice.org, and AIR when necessary would suit me fine. One day Android may do all these things too, but for now Google has bigger fish to fry!
Paul Merrell

Superiority in Cyberspace Will Remain Elusive - Federation Of American Scientists - 0 views

  • Military planners should not anticipate that the United States will ever dominate cyberspace, the Joint Chiefs of Staff said in a new doctrinal publication. The kind of supremacy that might be achievable in other domains is not a realistic option in cyber operations. “Permanent global cyberspace superiority is not possible due to the complexity of cyberspace,” the DoD publication said. In fact, “Even local superiority may be impractical due to the way IT [information technology] is implemented; the fact US and other national governments do not directly control large, privately owned portions of cyberspace; the broad array of state and non-state actors; the low cost of entry; and the rapid and unpredictable proliferation of technology.” Nevertheless, the military has to make do under all circumstances. “Commanders should be prepared to conduct operations under degraded conditions in cyberspace.” This sober assessment appeared in a new edition of Joint Publication 3-12, Cyberspace Operations, dated June 8, 2018. (The 100-page document updates and replaces a 70-page version from 2013.) The updated DoD doctrine presents a cyber concept of operations, describes the organization of cyber forces, outlines areas of responsibility, and defines limits on military action in cyberspace, including legal limits.
  • The new cyber doctrine reiterates the importance and the difficulty of properly attributing cyber attacks against the US to their source. “The ability to hide the sponsor and/or the threat behind a particular malicious effect in cyberspace makes it difficult to determine how, when, and where to respond,” the document said. “The design of the Internet lends itself to anonymity and, combined with applications intended to hide the identity of users, attribution will continue to be a challenge for the foreseeable future.”
Paul Merrell

Gmail blows up e-mail marketing by caching all images on Google servers | Ars Technica - 1 views

  • Ever wonder why most e-mail clients hide images by default? The reason for the "display images" button is that images in an e-mail must be loaded from a third-party server. For promotional e-mails and spam, usually this server is operated by the entity that sent the e-mail. So when you load these images, you aren't just receiving an image—you're also sending a ton of data about yourself to the e-mail marketer. Loading images from these promotional e-mails reveals a lot about you. Marketers get a rough idea of your location via your IP address. They can see the HTTP referrer, meaning the URL of the page that requested the image. With the referral data, marketers can see not only what client you are using (desktop app, Web, mobile, etc.) but also what folder you were viewing the e-mail in. For instance, if you had a Gmail folder named "Ars Technica" and loaded e-mail images, the referral URL would be "https://mail.google.com/mail/u/0/#label/Ars+Technica"—the folder is right there in the URL. The same goes for the inbox, spam, and any other location. It's even possible to uniquely identify each e-mail, so marketers can tell which e-mail address requested the images—they know that you've read the e-mail. And if it was spam, this will often earn you more spam, since the spammers can tell you've read their last e-mail. (A schematic example of such a tracking image appears at the end of this entry.)
  • But Google has just announced a move that will shut most of these tactics down: it will cache all images for Gmail users. Embedded images will now be saved by Google, and the e-mail content will be modified to display those images from Google's cache, instead of from a third-party server. E-mail marketers will no longer be able to get any information from images—they will see a single request from Google, which will then be used to send the image out to all Gmail users. Unless you click on a link, marketers will have no idea the e-mail has been seen. While this means improved privacy from e-mail marketers, Google will now be digging deeper than ever into your e-mails and literally modifying the contents. If you were worried about e-mail scanning, this may take things a step further. However, if you don't like the idea of cached images, you can turn it off in the settings. This move will allow Google to automatically display images, killing the "display all images" button in Gmail. Google's servers should also be faster than the usual third-party image host. Hosting all images sent to all Gmail users sounds like a huge bandwidth and storage undertaking, but if anyone can do it, it's Google. The new image handling will roll out to desktop users today, and it should hit mobile apps sometime in early 2014. There's also a bonus side effect for Google: e-mail marketing is advertising. Google exists because of advertising dollars, but they don't do e-mail marketing. They've just made a competitive form of advertising much less appealing and informative to advertisers. No doubt Google hopes this move pushes marketers to spend less on e-mail and more on AdSense.
  •  
    There's an antitrust angle to this; it could be viewed by a court as anti-competitive. But given the prevailing winds on digital privacy, my guess would be that Google would slide by.
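  •  
    To make the tracking mechanism concrete, here is a schematic sketch of the two situations described above. The marketer's domain and query parameters are hypothetical, and the rewritten form is only indicative of how a caching proxy changes the picture; it is not the exact URL format Gmail uses.

      <!-- As sent by the marketer: the token in the URL identifies this recipient,
           so the image request itself reports the open, the IP address, and the referrer -->
      <img src="http://track.marketer-example.com/open.gif?campaign=1234&user=abc123"
           width="1" height="1" alt="">

      <!-- As served after Gmail's change (schematic): the reference is rewritten to
           Google's image cache, so the marketer's server sees Google, not the reader -->
      <img src="https://ci3.googleusercontent.com/proxy/ENCODED-ORIGINAL-URL"
           width="1" height="1" alt="">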
Gonzalo San Gil, PhD.

IDG Connect - Friday Rant: The Internet Is Broken - 0 views

  •  
    " Posted by Alex Cruickshank on May 30 2014 The internet has come a long way in 20 years. The infrastructure is older than that, of course, but it was 1994 when the amazing new World Wide Web started to make a serious impression on me and my colleagues and friends. Firing up Netscape Navigator 1.0 on Windows 3.11 with a wobbly TCP/IP stack and a 14.4kbps modem, typing in a cryptic URL and seeing information from the other side of the world, instantly! It was an incredible experience."
Gonzalo San Gil, PhD.

Google Asked to Remove Half a Billion "Pirate" Search Results | TorrentFreak - 0 views

  •  
    " Ernesto on October 2, 2014 C: 0 Breaking Google has been asked to remove half a billion copyright-infringing URLs since it started counting three years ago. The listing of pirate sites in Google's search results has turned into a heated conflict, which the search engine and copyright holders have yet to resolve."
Gonzalo San Gil, PhD.

Court Lifts Overbroad "Piracy" Blockade of Mega and Other Sites | TorrentFreak | # The ... - 0 views

  •  
    " Ernesto on October 9, 2014 C: 0 News Mega and several other file-hosting services are accessible in Italy once again after a negotiated settlement with local law enforcement. Another unnamed site had to appeal its blockade in court but won its case after the court ruled that partial blocking of a specific URL is preferred over site-wide bans."
Gonzalo San Gil, PhD.

Google Refuses MPAA Request to Blacklist 'Pirate Site' Homepages | TorrentFreak - 0 views

  •  
    "The MPAA recently asked Google to remove the homepages of dozens of sites that offer links to pirated content. Google, however, refused to take down most of the URLs, likely because the takedown notices are seen as too broad. "
Alexandra IcecreamApps

How to Convert Video to MP3 - Icecream Tech Digest - 0 views

  •  
    We noticed that because of the popularity of YouTube, the need for URL-to-MP3 converters has grown, and it’s much easier to find such a converter than a way to convert a video file to an audio one.
Gonzalo San Gil, PhD.

UK Piracy Blocklist Silently Expands With Hundreds of Domains - TorrentFreak [# ! Note] - 0 views

  •  
    " By Ernesto on November 20, 2016 C: 71 News UK Internet providers have added close to 500 URLs to the national pirate site blocklist. The expansion follows a request from copyright holders who frequently add new proxies for sites that have previously been barred. Despite this mass-update, the ongoing blocking whack-a-mole is far from over."
Paul Merrell

Privacy Shield Program Overview | Privacy Shield - 0 views

  • EU-U.S. Privacy Shield Program Overview. The EU-U.S. Privacy Shield Framework was designed by the U.S. Department of Commerce and the European Commission to provide companies on both sides of the Atlantic with a mechanism to comply with EU data protection requirements when transferring personal data from the European Union to the United States in support of transatlantic commerce. On July 12, the European Commission deemed the Privacy Shield Framework adequate to enable data transfers under EU law (see the adequacy determination). The Privacy Shield program, which is administered by the International Trade Administration (ITA) within the U.S. Department of Commerce, enables U.S.-based organizations to join the Privacy Shield Framework in order to benefit from the adequacy determination. To join the Privacy Shield Framework, a U.S.-based organization will be required to self-certify to the Department of Commerce (via this website) and publicly commit to comply with the Framework’s requirements. While joining the Privacy Shield Framework is voluntary, once an eligible organization makes the public commitment to comply with the Framework’s requirements, the commitment will become enforceable under U.S. law. All organizations interested in joining the Privacy Shield Framework should review its requirements in their entirety. To assist in that effort, Commerce’s Privacy Shield Team has compiled resources and addressed frequently asked questions: Key New Requirements for Participating Organizations; How to Join the Privacy Shield; Privacy Policy FAQs; Frequently Asked Questions.
  •  
    I got a notice from Dropbox tonight that it is now certified under this program. This program is fallout from an E.U. Court of Justice decision following the Snowden disclosures, holding that the then-existing U.S.-E.U. framework for protecting the rights of E.U. citizens' data was invalid because that framework did not adequately protect digital privacy rights. This new framework is intended to comply with the court's decision, but one need only look at section 5 of the agreement to see that it does not. Expect follow-on litigation. The agreement is at https://www.privacyshield.gov/servlet/servlet.FileDownload?file=015t00000004qAg Section 5 lets the NSA continue to intercept and read data from E.U. citizens and also allows their data to be disclosed to U.S. law enforcement. And the agreement adds nothing to U.S. citizens' digital privacy rights. In my view, this framework is a stopgap measure that will only last as long as it takes for another case to reach the Court of Justice and be ruled upon. The ox that got gored by the Court of Justice ruling was U.S. companies' ability to store E.U. citizens' data outside the E.U. and to allow internet traffic from the E.U. to pass through the U.S. Microsoft's leadership set up new server farms in Europe under the control of a business entity beyond the jurisdiction of U.S. courts. Other U.S. internet biggies didn't follow suit. This framework is their lifeline until the next ruling by the Court of Justice.
Gonzalo San Gil, PhD.

DailyDirt: Publishing Digitally (For Free!) | Techdirt - 0 views

  •  
    "from the urls-we-dig-up dept Publishing content digitally is a topic that comes up around here fairly regularly. If you're a longtime Techdirt reader, you'll know that we generally think digital publishing drives down the price of content to free (but that doesn't mean your work is worthless!) and giving away content is often a very effective promotional tactic for selling other things that can't be freely copied. Here are just a few interesting examples of free content you can peruse at your leisure. "
Paul Merrell

This project aims to make '404 not found' pages a thing of the past - 0 views

  • The Internet is always changing. Sites are rising and falling, content is deleted, and bad URLs can lead to '404 Not Found' errors that are as helpful as a brick wall. A new project proposes to do away with dead 404 errors by implementing new HTML code that will help access prior versions of hyperlinked content. With any luck, that means that you’ll never have to run into a dead link again. The “404-No-More” project is backed by a formidable coalition including members from organizations like the Harvard Library Innovation Lab, Los Alamos National Laboratory, Old Dominion University, and the Berkman Center for Internet & Society. Part of the Knight News Challenge, which seeks to strengthen the Internet for free expression and innovation through a variety of initiatives, 404-No-More recently reached the semifinal stage. The project aims to cure so-called link rot, the process by which hyperlinks become useless over time because they point to addresses that are no longer available. If implemented, websites such as Wikipedia and other reference documents would be vastly improved. The new feature would also give Web authors a way to provide links that contain both archived copies of content and specific dates of reference, the sort of information that diligent readers have to hunt down on a website like Archive.org.
  • While it may sound trivial, link rot can actually have real ramifications. Nearly 50 percent of the hyperlinks in Supreme Court decisions no longer work, a 2013 study revealed. Losing footnotes and citations in landmark legal decisions can mean losing crucial information and context about the laws that govern us. The same study found that 70 percent of URLs within the Harvard Law Review and similar journals didn’t link to the originally cited information, considered a serious loss surrounding the discussion of our laws. The project’s proponents have come up with more potential uses as well. Activists fighting censorship will have an easier time combatting government takedowns, for instance. Journalists will be much more capable of researching dynamic Web pages. “If every hyperlink was annotated with a publication date, you could automatically view an archived version of the content as the author intended for you to see it,” the project’s authors explain. The ephemeral nature of the Web could no longer be used as a weapon. Roger Macdonald, a director at the Internet Archive, called the 404-No-More project “an important contribution to preservation of knowledge.”
  • The new feature would come in the form of introducing the mset attribute to the <a> element in HTML, which would allow users of the code to specify multiple dates and copies of content as an external resource. For instance, if both the date of reference and the location of a copy of the targeted content are known by an author, the new code would look like the sketch shown below. The 404-No-More project’s goals are numerous, but the ultimate goal is to have mset become a new HTML standard for hyperlinks. “An HTML standard that incorporates archives for hyperlinks will loop in these efforts and make the Web better for everyone,” project leaders wrote, “activists, journalists, and regular ol’ everyday web users.”
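  •  
    A hedged reconstruction of the proposal's general idea, with a hypothetical archive URL and date, and with the caveat that the exact mset syntax was a proposal under discussion rather than a settled standard:

      <!-- href points at the live page; mset pairs a reference date with an archived copy -->
      <a href="http://example.com/page"
         mset="2014-04-01 http://archive.example.org/20140401/http://example.com/page">
        the cited page
      </a>

    A browser or archive-aware tool could then fall back to the dated archived copy whenever the live href returns a 404.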
Gonzalo San Gil, PhD.

BPI Hits Record Breaking 100 Million Google Takedowns | TorrentFreak - 0 views

  •  
    " Ernesto on September 22, 2014 C: 40 Breaking The BPI has reached a new milestone in its ongoing efforts to have pirated content removed from the Internet. This week the music industry group reported its 100 millionth URL to Google. Although the takedown notices are processed quickly, the music industry group believes that Google should do more to prevent piracy." [# ! #Music # ! ...doesn't #thrive this way -and everybody knows # ! it-, so # ! guess what's The Aim of this #politics....]