Future of the Web - Group items tagged "enough"


Gonzalo San Gil, PhD.

#irespectmusic says 100 Years is Long Enough: The Danger of Pie-ism for All Creators - 0 views

  •  
    [# ! Isn't copyright supposed to promote creation? If so, shouldn't it be enough for the 'creators' to hold the rights during the author's life? Everything beyond that is a swindle...] Baiting the Pie. Last year's hearings on music licensing at the House Judiciary Committee's IP Subcommittee revealed an old argument from broadcasters and a new twist on that argument adopted by webcasters. We already pay for music; you people go fight over that pie.
Gary Edwards

XML Production Workflows? Start with the Web and XHTML - 0 views

  • Challenges: Some Ugly Truths. The challenges of building—and living with—an XML workflow are clear enough. The return on investment is a long-term proposition. Regardless of the benefits XML may provide, the starting reality is that it represents a very different way of doing things than the one we are familiar with. The Word Processing and Desktop Publishing paradigm, based on the promise of onscreen, WYSIWYG layout, is so dominant as to be practically inescapable. It has proven really hard to get from here to there, no matter how attractive XML might be on paper. A considerable amount of organizational effort and labour must be expended up front in order to realize the benefits. This is why XML is often referred to as an “investment”: you sink a bunch of time and money up front, and realize the benefits—greater flexibility, multiple output options, searching and indexing, and general futureproofing—later, over the long haul. It is not a short-term return proposition. And, of course, the returns you are able to realize from your XML investment are commensurate with what you put in up front: fine-grained, semantically rich tagging is going to give you more potential for searchability and recombination than a looser, more general-purpose approach, but it sure costs more. For instance, the Text Encoding Initiative (TEI) is the grand example of pouring enormous amounts of energy into the up-front tagging, with a very open-ended set of possibilities down the line. TEI helpfully defines a level to which most of us do not have to aspire.[5] But understanding this on a theoretical level is only part of the challenge. There are many practical issues that must be addressed. Software and labour are two of the most critical. How do you get the content into XML in the first place? Unfortunately, despite two decades of people doing SGML and XML, this remains an ugly question.
  • Practical Challenges. In 2009, there is still no truly likeable—let alone standard—editing and authoring software for XML. For many (myself included), the high-water mark here was Adobe’s FrameMaker, substantially developed by the late 1990s. With no substantial market for it, it is relegated today mostly to the tech writing industry, unavailable for the Mac, and just far enough afield from the kinds of tools we use today that its adoption represents a significant hurdle. And FrameMaker was the best of the breed; most of the other software in decent circulation are programmers’ tools—the sort of things that, as Michael Tamblyn pointed out, encourage editors to drink at their desks. The labour question represents a stumbling block as well. The skill-sets and mind-sets that effective XML editors need have limited overlap with those needed by literary and more traditional production editors. The need to think of documents as machine-readable databases is not something that comes naturally to folks steeped in literary culture. In combination with the sheer time and effort that rich tagging requires, many publishers simply outsource the tagging to India, drawing a division of labour that spans oceans, to put it mildly. Once you have XML content, then what do you do with it? How do you produce books from it? Presumably, you need to be able to produce print output as well as digital formats. But while the latter are new enough to be generally XML-friendly (e-book formats being largely XML based, for instance), there aren’t any straightforward, standard ways of moving XML content into the kind of print production environments we are used to seeing. This isn’t to say that there aren’t ways of getting print—even very high-quality print—output from XML, just that most of them involve replacing your prepress staff with Java programmers.
  • Why does this have to be so hard? It’s not that XML is new, or immature, or untested. Remember that the basics have been around, and in production, since the early 1980s at least. But we have to take account of a substantial and long-running cultural disconnect between traditional editorial and production processes (the ones most of us know intimately) and the ways computing people have approached things. Interestingly, this cultural divide looked rather different in the 1970s, when publishers were looking at how to move to digital typesetting. Back then, printers and software developers could speak the same language. But that was before the ascendancy of the Desktop Publishing paradigm, which computerized the publishing industry while at the same time isolating it culturally. Those of us who learned how to do things the Quark way or the Adobe way had little in common with people who programmed databases or document-management systems. Desktop publishing technology isolated us in a smooth, self-contained universe of toolbars, grid lines, and laser proofs. So, now that the reasons to get with this program, XML, loom large, how can we bridge this long-standing divide?
  • ...44 more annotations...
  • Using the Web as a Production Platform. The answer, I think, is right in front of you. The bridge is the Web, a technology and platform that is fundamentally based on XML, and which many publishers are by now comfortably familiar with. Perhaps not entirely comfortably, but at least most publishers are already working with the Web; they already either know or have on staff people who understand it and can work with it. The foundation of our argument is this: rather than looking at jumping to XML in its full, industrial complexity, which seems to be what the O'Reilly-backed StartWithXML initiative[6] is suggesting, publishers instead leverage existing tools and technologies—starting with the Web—as a means of getting XML workflows in place. This means making small investments and working with known tools rather than spending tens of thousands of dollars on XML software and rarefied consultants. It means re-thinking how the existing pieces of the production toolchain fit together; re-thinking the existing roles of software components already in use. It means, fundamentally, taking the Web seriously as a content platform, rather than thinking of it as something you need to get content out to, somehow. If nothing else, the Web represents an opportunity to think about editorial and production from outside the shrink-wrapped Desktop Publishing paradigm.
  • Is the Web made of Real XML? At this point some predictable objections can be heard: wait a moment, the Web isn’t really made out of XML; the HTML that makes up most of the Web is at best the bastard child of SGML, and it is far too flaky/unstructured/underpowered to be taken seriously. We counter by arguing that although HTML on the Web exists in a staggering array of different incarnations, and that the majority of it is indeed an unstructured mess, this does not undermine the general principle that basic, ubiquitous Web technologies can make a solid platform for content management, editorial process, and production workflow.
  • With the advent of a published XML standard in the late 1990s came the W3C’s adoption of XHTML: the realization of the Web’s native content markup as a proper XML document type. Today, its acceptance is almost ubiquitous, even while the majority of actual content out there may not be strictly conforming. The more important point is that most contemporary Web software, from browsers to authoring tools to content management systems (from blogs to enterprise systems), are capable of working with clean, valid XHTML. Or, to put the argument the other way around, clean, valid XHTML content plays absolutely seamlessly with everything else on the Web.[7]
  • The objection which follows, then, will be that even if we grant that XHTML is a real XML document type, that it is underpowered for “serious” content because it is almost entirely presentation (formatting) oriented; it lacks any semantic depth. In XHTML, a paragraph is a paragraph is a paragraph, as opposed to a section or an epigraph or a summary.
  • In contrast, more “serious” XML document types like DocBook[8] or DITA-derived schemas[9] are capable of making semantic distinctions about content chunks at a fine level of granularity and with a high degree of specificity.
  • So there is an argument for recalling the 80:20 rule here. If XHTML can provide 80% of the value with just 20% of the investment, then what exactly is the business case for spending the other 80% to achieve that last 20% of value? We suspect the ratio is actually quite a bit steeper than 80:20 for most publishers.
  • Furthermore, just to get technical for a moment, XHTML is extensible in a fairly straightforward way, through the common “class” attribute on each element. Web developers have long leveraged this kind of extensibility in the elaboration of “microformats” for semantic-web applications.[10] There is no reason why publishers shouldn’t think to use XHTML’s simple extensibility in a similar way for their own ends.
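  •  
    To make the preceding point concrete, here is a minimal sketch (ours, not the article's) of a publisher layering its own semantics onto XHTML's "class" attribute and recovering them with an ordinary XML toolkit; it assumes Python with lxml, and the class names "epigraph" and "summary" are hypothetical.

        # Hypothetical publisher-defined classes on plain XHTML, queried with XPath.
        from lxml import etree

        xhtml = """<div>
          <p class="epigraph">All the world's a stage.</p>
          <p>Ordinary body text.</p>
          <p class="summary">This chapter surveys XML workflows.</p>
        </div>"""

        tree = etree.fromstring(xhtml)
        # Downstream tools can recover the publisher's semantics with a simple query.
        for p in tree.xpath("//p[@class='epigraph' or @class='summary']"):
            print(p.get("class"), ":", p.text)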
  • XHTML, on the other hand, is supported by a vast array of quotidian software, starting with the ubiquitous Web browser. For this very reason, XHTML is in fact employed as a component part of several more specialized document types (ONIX and ePub among them).
  • Why re-invent a general-purpose prose representation when XHTML already does the job?
  • It is worth pausing for a moment to consider the role of XHTML in the ePub standard for ebook content. An ePub file is, anatomically, a simply disguised zip archive. Inside the zip archive are a few standard component parts: there are specialized files that declare metadata about the book, and about the format of the book. And then there is the book’s content, represented in XHTML. An ePub book is a Web page in a wrapper.
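  •  
    A quick sketch of that anatomy, assuming Python's standard zipfile module; the filename and entry names are illustrative, not prescribed by the article.

        # Peek inside an ePub: it is just a zip archive wrapping XHTML content.
        import zipfile

        with zipfile.ZipFile("book.epub") as epub:   # placeholder filename
            for name in epub.namelist():
                # Typically: mimetype, META-INF/container.xml, and the XHTML content files
                print(name)
            # The mimetype entry identifies the archive as an ePub.
            print(epub.read("mimetype").decode())    # application/epub+zip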
  • To sum up the general argument: the Web as it already exists presents incredible value to publishers, as a platform for doing XML content management with existing (and often free) tools, and without having to go blindly into the unknown. At this point, we can offer a few design guidelines: prefer existing and/or ubiquitous tools over specialized ones wherever possible; prefer free software over proprietary systems where possible; prefer simple tools controlled and coordinated by human beings over fully automated (and therefore complex) systems; play to our strengths: use Web software for storing and managing content, use layout software for layout, and keep editors and production people in charge of their own domains.
  • Putting the Pieces Together: A Prototype
  • At the SFU Master of Publishing Program, we have been chipping away at this general line of thinking for a few years. Over that time, Web content management systems have been getting more and more sophisticated, all the while getting more streamlined and easier to use. (NB: if you have a blog, you have a Web content management system.) The Web is beginning to be recognized as a writing and editing environment used by millions of people. And the ways in which content is represented, stored, and exchanged online have become increasingly robust and standardized.
  • The missing piece of the puzzle has been print production: how can we move content from its malleable, fluid form online into the kind of high-quality print production environments we’ve come to expect after two decades of Desktop Publishing?
  • Anyone who has tried to print Web content knows that the existing methods leave much to be desired (hyphenation and justification, for starters). In the absence of decent tools for this, most publishers quite naturally think of producing the print content first, and then think about how to get material onto the Web for various purposes. So we tend to export from Word, or from Adobe, as something of an afterthought.
  • While this sort of works, it isn’t elegant, and it completely ignores the considerable advantages of Web-based content management.
  • Content managed online is stored in one central location, accessible simultaneously to everyone in your firm, available anywhere you have an Internet connection, and usually exists in a much more fluid format than Word files. If only we could manage the editorial flow online, and then go to print formats at the end, instead of the other way around. At SFU, we made several attempts to make this work by way of the supposed “XML import” capabilities of various Desktop Publishing tools, without much success.[12]
  • In the winter of 2009, Adobe solved this part of the problem for us with the introduction of its Creative Suite 4. What CS4 offers is the option of a complete XML representation of an InDesign document: what Adobe calls IDML (InDesign Markup Language).
  • The IDML file format is—like ePub—a simply disguised zip archive that, when unpacked, reveals a cluster of XML files that represent all the different facets of an InDesign document: layout spreads, master pages, defined styles, colours, and of course, the content.
  • IDML is a well thought-out XML standard that achieves two very different goals simultaneously: it preserves all of the information that InDesign needs to do what it does; and it is broken up in a way that makes it possible for mere mortals (or at least our Master of Publishing students) to work with it.
  • What this represented to us in concrete terms was the ability to take Web-based content and move it into InDesign in a straightforward way, thus bridging Web and print production environments using existing tools and skillsets, with a little added help from free software.
  • We would take clean XHTML content, transform it to IDML-marked content, and merge that with nicely designed templates in InDesign.
  • The result is an almost push-button publication workflow, which results in a nice, familiar InDesign document that fits straight into the way publishers actually do production.
  • Tracing the steps. To begin with, we worked backwards, moving the book content back to clean XHTML.
  • The simplest method for this conversion—and if you want to create Web content, this is an excellent route—was to use Adobe’s “Export to Digital Editions” option, which creates an ePub file.
  • Recall that ePub is just XHTML in a wrapper, so within the ePub file was a relatively clean XHTML document. It was somewhat cleaner (that is, the XHTML tagging was simpler and less cluttered) than InDesign’s other Web-oriented exports, possibly because Digital Editions is a well understood target, compared with somebody’s website.
  • In order to achieve our target of clean XHTML, we needed to do some editing; the XHTML produced by InDesign’s “Digital Editions” export was presentation-oriented. For instance, bulleted list items were tagged as paragraphs, with a class attribute identifying them as list items. Using the search-and-replace function, we converted such structures to proper XHTML list and list-item elements. Our guiding principle was to make the XHTML as straightforward as possible, not dependent on any particular software to interpret it.
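  •  
    The article describes doing that cleanup by hand with search-and-replace; a scripted equivalent might look like the sketch below. The exported class name "list-item" is an assumption made for illustration.

        # Convert presentation-oriented pseudo-list paragraphs into a real XHTML list.
        import re

        exported = ('<p class="list-item">First point</p>\n'
                    '<p class="list-item">Second point</p>')

        # Each pseudo-list paragraph becomes a genuine list item...
        items = re.sub(r'<p class="list-item">(.*?)</p>', r'  <li>\1</li>', exported)

        # ...and the run of items gets wrapped in a proper list element.
        print("<ul>\n" + items + "\n</ul>")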
  • We broke the book’s content into individual chapter files; each chapter could then carry its own basic metadata, and the pages conveniently fit our Web content management system (which is actually just a wiki). We assembled a dynamically generated table of contents for the 12 chapters, and created a cover page. Essentially, the book was entirely Web-based at this point.
  • When the book chapters are viewed online, they are formatted via a CSS2 stylesheet that defines a main column for content as well as dedicating screen real estate for navigational elements. We then created a second template to render the content for exporting; this was essentially a bare-bones version of the book with no navigation and minimal styling. Pages (or even the entire book) can be exported (via the “Save As...” function in a Web browser) for use in either print production or ebook conversion. At this point, we required no skills beyond those of any decent Web designer.
  • Integrating with CS4 for Print. Adobe’s IDML language defines elements specific to InDesign; there is nothing in the language that looks remotely like XHTML. So a mechanical transformation step is needed to convert the XHTML content into something InDesign can use. This is not as hard as it might seem.
  • Both XHTML and IDML are composed of straightforward, well-documented structures, and so transformation from one to the other is, as they say, “trivial.” We chose to use XSLT (Extensible Stylesheet Language Transforms) to do the work. XSLT is part of the overall XML specification, and thus is very well supported in a wide variety of tools. Our prototype used a scripting engine called xsltproc, a nearly ubiquitous piece of software that we found already installed as part of Mac OS X (contemporary Linux distributions also have this as a standard tool), though any XSLT processor would work.
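  •  
    As a toy illustration of that transformation step, the sketch below runs a tiny XSLT stylesheet through Python's lxml, which wraps the same libxslt engine that xsltproc exposes on the command line. The output vocabulary here ("Story", "Para") is a simplified placeholder, not the real IDML/ICML element set.

        # A miniature XHTML-to-layout transform in the spirit of the prototype.
        from lxml import etree

        stylesheet = etree.fromstring("""\
        <xsl:stylesheet version="1.0"
            xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
          <xsl:template match="/">
            <Story>
              <xsl:apply-templates select="//p"/>
            </Story>
          </xsl:template>
          <xsl:template match="p">
            <Para style="BodyText"><xsl:value-of select="."/></Para>
          </xsl:template>
        </xsl:stylesheet>
        """)

        transform = etree.XSLT(stylesheet)
        source = etree.fromstring("<body><p>One paragraph.</p><p>Another.</p></body>")
        print(etree.tostring(transform(source), pretty_print=True).decode())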
  • In other words, we don’t need to buy InCopy, because we just replaced it with the Web. Our wiki is now plugged directly into our InDesign layout. It even automatically updates the InDesign document when the content changes. Credit is due at this point to Adobe: this integration is possible because of the open file format in the Creative Suite 4.
  • We wrote an XSLT transformation script[18] that converted the XHTML content from the Web into an InCopy ICML file. The script itself is less than 500 lines long, and was written and debugged over a period of about a week by amateurs (again, the people named at the start of this article). The script runs in a couple of seconds, and the resulting .icml file can then be “placed” directly into an InDesign template. The ICML file references an InDesign stylesheet, so the template file can be set up with a house-styled layout, master pages, and stylesheet definitions for paragraphs and character ranges.
  • The result is very simple and easy to use. Our demonstration requires that a production editor run the XSLT transformation script manually, but there is no reason why this couldn’t be built directly into the Web content management system so that exporting the content to print ran the transformation automatically. The resulting file would then be “placed” in InDesign and proofed.
  • It should be noted that the Book Publishing 1 proof-of-concept was artificially complex; we began with a book laid out in InDesign and ended up with a look-alike book laid out in InDesign. But next time—for instance, when we publish Book Publishing 2—we can begin the process with the content on the Web, and keep it there throughout the editorial process. The book’s content could potentially be written and edited entirely online, as Web content, and then automatically poured into an InDesign template at proof time. “Just in time,” as they say. This represents an entirely new way of thinking of book production. With a Web-first orientation, it makes little sense to think of the book as “in print” or “out of print”—the book is simply available, in the first place online; in the second place in derivative digital formats; and third, but really not much more difficult, in print-ready format, via the usual InDesign CS print production system publishers are already familiar with.
  • Creating Ebook Files. Creating electronic versions from XHTML source is vastly simpler than trying to generate these out of the existing print process. The ePub version is extremely easy to generate; so is online marketing copy or excerpts for the Web, since the content begins life Web-native.
  • Since an ePub file is essentially XHTML content in a special wrapper, all that is required is that we properly “wrap” our XHTML content. Ideally, the content in an ePub file is broken into chapters (as ours was) and a table of contents file is generated in order to allow easy navigation within an ebook reader. We used Julian Smart’s free tool eCub[19] to simply and automatically generate the ePub wrapper and the table of contents. The only custom development we did was to create a CSS stylesheet for the ebook so that headings and paragraph indents looked the way we wanted. Starting with XHTML content, creating ePub is almost too easy.
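  •  
    Here is a condensed sketch of the "wrapping" step that eCub automates, again assuming Python's standard zipfile module. A complete ePub also needs a package (OPF) file and a table of contents, omitted here; the filenames are illustrative.

        # Wrap XHTML content as a minimal (incomplete) ePub archive.
        import zipfile

        CONTAINER = """<?xml version="1.0"?>
        <container version="1.0"
            xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
          <rootfiles>
            <rootfile full-path="content.opf"
                media-type="application/oebps-package+xml"/>
          </rootfiles>
        </container>"""

        with zipfile.ZipFile("book.epub", "w") as epub:
            # The mimetype entry must come first and be stored uncompressed.
            epub.writestr("mimetype", "application/epub+zip",
                          compress_type=zipfile.ZIP_STORED)
            epub.writestr("META-INF/container.xml", CONTAINER)
            epub.writestr("chapter1.xhtml",
                          "<html><body><p>Chapter one.</p></body></html>")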
  • Such a workflow—beginning with the Web and exporting to print—is surely more in line with the way we will do business in the 21st century, where the Web is the default platform for reaching audiences, developing content, and putting the pieces together. It is time, we suggest, for publishers to re-orient their operations and start with the Web.
  • Our project demonstrates that Web technologies are indeed good enough to use in an XML-oriented workflow; more specialized and expensive options are not necessarily required. For massive-scale enterprise publishing, this approach may not offer enough flexibility, and the challenge of adding and extracting extra semantic richness may prove more trouble than it's worth.
  • But for smaller firms who are looking at the straightforward benefits of XML-based processes—single source publishing, online content and workflow management, open and accessible archive formats, greater online discoverability—here is a way forward.
  • Rather than a public-facing website, our system relies on the Web as a content management platform—of course a public face could easily be added.
  • The final piece of our puzzle, the ability to integrate print production, was made possible by Adobe's release of InDesign with an open XML file format. Since the Web's XHTML is also XML, it can be easily and confidently transformed to the InDesign format.
  • Today, we are able to put the process together using nothing but standard, relatively ubiquitous Web tools: the Web itself as an editing and content management environment, standard Web scripting tools for the conversion process, and the well-documented IDML file format to integrate the layout tool.
  • Using the Web as a Production Platform
  •  
    I was looking for an answer to a problem Marbux had presented, and found this interesting article. The issue was the upcoming conversion of the Note Case Pro (NCP) layout engine to the WebKit layout engine, and what to do about the NCP document format. My initial reaction was to encode the legacy NCP document format in XML and run an XSLT to a universal pivot format like TEI-XML. From there, the TEI-XML community would provide all the XSLT transformation routines for conversion to ODF, OOXML, XHTML, ePub and HTML/CSS. Researching the problems one might encounter with this approach, I found this article. Fascinating stuff. My takeaway is that TEI-XML would not be as effective a "universal pivot point" as XHTML. Or perhaps, if NCP really wants to get aggressive, IDML - InDesign Markup Language. The important point, though, is that XHTML is a browser-oriented XML vocabulary, compatible with the WebKit layout engine Miro wants to move NCP to. The concept of encoding an existing application-specific format in XML has been around since 1998, when XML was first introduced as a W3C standard, a "structured" subset of SGML. (HTML is also an application of SGML.) The multiplatform StarOffice productivity suite became "OpenOffice" when Sun purchased the company in 1999 and open-sourced the code base. The OpenOffice developer team came out with an XML encoding of their existing document formats in 2000. That application-specific encoding became an OASIS document format standard proposal in 2002 - also known as ODF. Microsoft followed OpenOffice with an XML encoding of its application-specific binary document formats, known as OOXML. Encoding the existing NCP format in XML, specifically targeting XHTML as a "universal pivot point", would put the NCP Outliner in the Web editor category without breaking backwards compatibility. The trick is in the XSLT conversion process. But I think that is something much easier to handle than trying to…
Gonzalo San Gil, PhD.

Regulating Google's Results? Law Prof Calls 'Search Neutrality' Incoherent | ... - 2 views

  •  
    [Regulating Google's Results? Law Prof Calls 'Search Neutrality' Incoherent. By Nate Anderson, Ars Technica, January 22, 2011. "Neutrality" - if it's good enough for the core of the internet, isn't it good enough for the edge? The biggest internet providers say it is, and they would love to have the government slap a few neutrality rules on Google, just to see how the advertising giant likes the taste of the regulatory bridle.]
Gary Edwards

Mashups turn into an industry as offerings mature | Hinchcliffe Enterprise Web 2.0 | Z... - 0 views

  •  
    Dion has lots to say about the recent Web 2.0 Conference. In this article he covers nine significant announcements from companies specializing in Web-based mashups and the related tools for building ad hoc Web applications. This year's Web 2.0 was filled with Web-developer-oriented services, but my favorite was MindTouch, perhaps because their focus was that of directly engaging end users in the customization of business processes. Yes, the creation of data objects is clearly in the realm of trained developers, and for sure many tools were announced at Web 2.0 to further the much-needed wiring of data objects. But once wired and available, services like MindTouch, I think, will become the way end users interact and create new business productivity methods. Great coverage.

    "...... For awareness and understanding of the fast-growing world of mashups are significant challenges as IT practitioners, business strategists, and software vendors attempt to grapple with what's facing up to be the biggest challenge of all: The habits and expectations of the larger part of a generation of workers who don't yet realize mashups are poised to change many things about the software landscape on the Web and in the workplace. Generational changes can be difficult for businesses to embrace successfully, and while evidence that mashups are remaking the business world are still very much emerging, they certainly hold the promise..."

    ".... while the life of the average Web developer has been greatly improved by the availability of a wide variety of useful open APIs, the average user of the Web hasn't been a direct beneficiary except through the increase in Web apps that are built on the mashup model. And that's because the tools that empower users to weave together existing Web parts and open APIs into the exact solutions they need are just now becoming easy enough and robust enough to readily enable these scenarios. And that doesn't include the variety of
Gonzalo San Gil, PhD.

GNU's Framework for Secure Peer-to-Peer Networking GNU's Framework for Secure Peer-to-P... - 0 views

  •  
    "Philosophy The foremost goal of the GNUnet project is to become a widely used, reliable, open, non-discriminating, egalitarian, unfettered and censorship-resistant system of free information exchange. We value free speech above state secrets, law-enforcement or intellectual property. GNUnet is supposed to be an anarchistic network, where the only limitation for peers is that they must contribute enough back to the network such that their resource consumption does not have a significant impact on other users. GNUnet should be more than just another file-sharing network. The plan is to offer many other services and in particular to serve as a development platform for the next generation of decentralized Internet protocols."
Gonzalo San Gil, PhD.

10 Tips to Push Your Git Skills to the Next Level - 1 views

  •  
    " Published June 17, 2014 Tweet Subscribe Recently we published a couple of tutorials to get you familiar with Git basics and using Git in a team environment. The commands that we discussed were about enough to help a developer survive in the Git world. In this post, we will try to explore how to manage your time effectively and make full use of the features that Git provides."
Gonzalo San Gil, PhD.

Music Industry Reports 200 Millionth Pirate Link to Google | TorrentFreak - 1 views

  •  
    [# ! This is a movement against Google itself, and for other causes, since 'real pirates' do not need Google to find the sites; they go there directly, as download/sharing sites are discovered through other channels...] [..."Google, however, believes that it has done enough and repeatedly argues that the entertainment industries can themselves do more. "Piracy often arises when consumer demand goes unmet by legitimate supply," the company noted earlier. "The right combination of price, convenience, and inventory will do far more to reduce piracy than enforcement can." ...]
Gonzalo San Gil, PhD.

Get involved with the Open Source Hardware Association | Opensource.com - 0 views

  •  
    "Back in October of 2014, I was lucky enough to be elected to the Open Source Hardware Association (OSHWA) board. Because the association received its nonprofit status, the board is finally able to begin increasing its reach in the community."
Gonzalo San Gil, PhD.

As TPP Supporters Whine About Failure Of Fast Track, Why Is No One Suggesting Increased... - 0 views

  •  
    "from the time-to-get-it-right dept As we just mentioned, it looks like there aren't enough votes in Congress to give the President and the US Trade Rep the "fast track" authority they want to cram massive trade agreements down the throats of the American public. Nancy Pelosi, whose statement last week helped signal that it was a real possibility that support for fast track would no longer be likely, has now penned an op-ed for USA Today claiming that fast track is on its last legs, highlighting that Congress (not the executive branch) has the power to regulate commerce with foreign countries. Meanwhile, supporters of trade have put into motion an attempt to salvage fast track, which may lead to a vote as soon as tomorrow -- but seems like a risky gambit that may not succeed. "
Gonzalo San Gil, PhD.

Top 3 Online Resources For Learning The Command Line - 1 views

  •  
    "If you've been using Linux or OS X for a couple of years, you must have run into the command line (or the terminal) at some point. It could have been a command to fix a problem or enable a feature. These days it's easy to just ignore the command line. And for most users, GUI is really enough."
Paul Merrell

EU considers spending €1 billion for satellite broadband technology - Interna... - 0 views

  • The €200 billion economic rescue plan being considered this week by European Union leaders includes a proposal to spend €1 billion on bringing high-speed Internet access to rural areas. The proposal is likely to pit the Continent's telecommunications operators against satellite companies, which say they are uniquely suited to expand the broadband, or high-speed, network to underserved parts of Eastern Europe and the Alps by the end of 2010.
  • But support for the plan by EU government leaders, who begin a two-day meeting to consider the rescue plan Thursday, is not assured. The money would come from unspent funds in the current EU budget, which under EU rules normally revert back to member countries. Germany, which contributes the most to the EU budget and stands to get the largest refund if the project is rejected, opposes the expenditure.
  • Across the EU, 21.7 percent of residents had broadband Internet access in July, according to the commission; 107.6 million received service from a telephone DSL line or a cable television connection and 130,592 via satellite. Only 6 percent of EU residents on average received broadband via mobile phones.
  • ...1 more annotation...
  • Until now, Baugh said, satellite broadband had been hindered by the relatively high cost of the hardware consumers needed to gain access to the service. But recent advances have lowered the cost to roughly €400, including installation, from several thousand euros a few years ago. At about €30 a month, service packages are comparable to those of DSL and cable.
  •  
    A billion euros is chicken feed compared to other portions of the E.U. economic stimulus initiatives in the works that respond to the major recession under way. Still, this could be a significant foot in the door for satellite broadband in the E.U., perhaps enough to build out the infrastructure for a more serious challenge to cable and telephony broadband. But I wonder whether only a billion euros would buy enough redundancy to gracefully handle the death of a satellite once it serves far more broadband users.
Gonzalo San Gil, PhD.

Sub Pop artist creates music-streaming site to mock Pandora, Spotify | Ars Technica [# ... - 0 views

  •  
    "On Tuesday, Josh Tillman, the lead singer and songwriter of the band Father John Misty, announced a phony, satirical music-streaming service called Streamline Audio Protocol, or, better put, SAP. ... On the site, Tillman calls his music-delivery system "a new signal-to-audio process by which popular albums are 'sapped' of their performances, original vocal, atmosphere, and other distracting affectations so the consumer can decide quickly and efficiently whether they like a musical composition, based strictly on its formal attributes, enough to spend money on it. ..."
Gonzalo San Gil, PhD.

Record Biz Wants To Tax Brits For Copying Their Own Music | TorrentFreak - 0 views

    • Gonzalo San Gil, PhD.
       
      # ! And when will some demands be made of the recording industry... like lowering prices and raising both the quality of the works and the respect shown to the public...?
    • Gonzalo San Gil, PhD.
       
      # ! as if there weren't already enough taxes...
  •  
    [Several music industry organizations in the UK have launched an application for a judicial review after the government passed legislation allowing citizens to copy their own music for personal use. The group says that in order for the system to be fair, the public must pay a new tax. ...] # ! Definitely... the 'music watchmen', those who persecute aficionados just for sharing, are 'watching' for everything BUT the music... # ! Let sharing effectively protect the culture... (# ! Perhaps 'someone' thinks we don't pay enough taxes yet... while billions 'disappear' yearly from the public coffers...)
Gary Edwards

Is Linux dead for the desktop? - 1 views

  • Linux never had the apps
  • Charles King, an IT analyst who follows enterprise trends, says the big change is in IT. At one time, executives in charge of computing services were mostly concerned with operating systems and applications for massive throng of traditional business users. Those users have now flocked to mobile computing devices, but they still have a Windows PC sitting on their desk.
  • "Today, Microsoft's lock (on the desktop, anyway) remains secure, even in the face of Apple's surge," King says. "Ironically enough, though, the open source model remains alive and well but mostly in the development of new standards and development platforms."
  • ...5 more annotations...
  • David Johnson
  • "What corporate end users really need is familiarity, consistency and compatibility - something Apple, Microsoft and Google seem more adept at offering."
  • Can desktop Linux OS be saved? Johnson says the best example of how to save Linux OS is the Chrome OS, an all-in-one laptop and desktop offering available through major consumer electronics companies such as LG (with their Chromebase all-in-one) and the Samsung Chromebook 2
  • The problem is that Chrome OS and Android aren't the same as Linux OS on the desktop. It's a complete reinvention. There are few Windows-like productivity apps and no knowledge worker apps designed for keyboard and mouse.
  • All of the experts agree - Windows won every battle for the business user.
  •  
    "For executives in charge of desktop deployments in a large company, Linux OS was once hailed as a saviour for corporate end users. With incredibly low pricing - free, with fee-based support plans, for example - distributions such as Ubuntu Desktop and SUSE Linux Enterprise offered a "good enough" user interface, along with plenty of powerful apps and a rich browser. A few years ago, both Dell and HP jumped on the bandwagon; today, they still offer "developer" and "workstation" models that come pre-loaded with a Linux install. Plus, anyone who follows the Linux market knows that Google has reimagined Linux as a user-friendly tablet interface (the wildly popular Android OS) and a browser-only desktop variant (Chrome OS). Linux also shows up on countless connected home gadgets, fitness trackers, watches and other low-cost devices, mostly because OS costs are so low. The desktop computing OS for end users has failed to capture any attention lately, though. Al Gillen, the programme vice president for servers and system software at IDC, says the Linux OS as a computing platform for end users is at least comatose - and probably dead. Yes, it has reemerged on Android and other devices, but it has gone almost completely silent as a competitor to Windows for mass deployment. As they say, you can hear the crickets chirping."
Gonzalo San Gil, PhD.

Strike Becomes Totally Dynamic With No Torrents to Takedown | TorrentFreak - 1 views

  •  
    " Andy on April 4, 2015 C: 0 Breaking In response to being overwhelmed by DMCA takedown notices, new torrent site Strike now stores no data whatsoever - no torrents, no magnets, no categories and no indexing. The site has become totally dynamic and only fetches data requested by the user. But will these drastic steps be enough?"
Gonzalo San Gil, PhD.

Judge: IP-Address Doesn't Identify a Movie Pirate | TorrentFreak - 1 views

  •  
    "In a prominent ruling Florida District Court Judge Ursula Ungaro refused to issue a subpoena, arguing that IP-address evidence is not enough to show who has downloaded a pirated movie."
Gonzalo San Gil, PhD.

How to speed up your internet connection on Linux | HowtoForge - 0 views

  •  
    "...there isn't a way to transform a slow internet connection into a lighting-speed one if your provider is just not giving you enough bandwidth, no matter what you do. This post is only aiming to provide generic advice on how to make things a little bit better if possible, and if applicable to each case."
Gonzalo San Gil, PhD.

Judge: Vague IP-Address Evidence is Not Enough to Expose BitTorrent 'Pirates' - Torrent... - 1 views

  •  
    " By Ernesto on October 4, 2016 C: 47 News A California federal court has thrown up a roadblock for filmmakers who want to obtain the personal details of an alleged BitTorrent pirate. The judge refused to issue a subpoena, twice, because it's not clear if the rightsholder obtained the geolocation details at the time of the infringement or after the fact."
Gonzalo San Gil, PhD.

Make a Donation to the Internet Archive - 0 views

  •  
    [3-for-1 Match for All Donations! A generous supporter has offered to match every dollar we raise 3-to-1 through December 31st. We are trying to raise $150,000 in donations by the end of the year - with the match, that will give us $600,000, enough to buy 4 more petabytes of storage. Help us keep the library free for millions of people by making a tax-deductible donation today.]
Paul Merrell

Lawrence, KS To Get Gigabit Fiber - But Not From Google - Slashdot - 0 views

  • "Just 40 miles west on the Kansas Turnpike from Kansas City Kansas sits Lawrence, KS. With the slow rollout of Google fiber in their neighbor city, it was looking like their 89,000 people were not going to get the gigabit fiber to the home for quite some time. Up steps Wicked Broadband, a local ISP. With a plan remarkably similar to Google's they look to build out fiber to the home, business, and so on with gigabit speed and similar rates, symmetric bandwidth and no caps. Wicked Fiber's offer is different than Google Fiber's, with more tiers — with cute names. The "Flying Monkey" gigabit plan is $100/month, "Tinman" at 100Mbps is $70/month. They offer TV as well but strangely put Internet streaming and Roku to the fore. They are even using Google's method of installing first in the neighborhoods with the most pre-registration to optimize efficiency, and installing only where there is enough demand. It seems Google's scheme to inspire competition in broadband access is working — if Wicked Fiber gets enough subscribers to make it pay. If this succeeds it may inspire similar ISPs near us to step up to gigabit fiber so let's root for them."
  •  
    It shouldn't take a lot of similar initiatives from companies other than Google to force major ISPs to begin rolling out gigabit ISP services in the U.S. in order to protect their market share from predation. To be followed by lower charges, hopefully. 