
Future of the Web / Group items tagged "preservation"


Paul Merrell

F.C.C. Backs Opening Net Rules for Debate - NYTimes.com - 0 views

  • On Thursday, the Federal Communications Commission voted 3-2 to open for public debate new rules meant to guarantee an open Internet. Before the plan becomes final, though, the chairman of the commission, Tom Wheeler, will need to convince his colleagues and an array of powerful lobbying groups that the plan follows the principle of net neutrality, the idea that all content running through the Internet’s pipes is treated equally. While the rules are meant to prevent Internet providers from knowingly slowing data, they would allow content providers to pay for a guaranteed fast lane of service. Some opponents of the plan, those considered net neutrality purists, argue that allowing some content to be sent along a fast lane would essentially discriminate against other content.
  • “We are dedicated to protecting and preserving an open Internet,” Mr. Wheeler said immediately before the commission vote. “What we’re dealing with today is a proposal, not a final rule. We are asking for specific comment on different approaches to accomplish the same goal, an open Internet.”
  • Mr. Wheeler argued on Thursday that the proposal did not allow a fast lane. But the proposed rules do not address the connection between an Internet service provider, which sells a connection to consumers, and the operators of backbone transport networks that connect various parts of the Internet’s central plumbing. That essentially means that as long as an Internet service provider like Comcast or Verizon does not slow the service that a consumer buys, the provider can give faster service to a company that pays to get its content to consumers unimpeded.
  • The plan will be open for comment for four months, beginning immediately.
  • The public will have until July 15 to submit initial comments on the proposal to the commission, and until Sept. 10 to file comments replying to the initial discussions.
  •  
    I'll need to read the proposed rule, but this doesn't sound good. The FCC majority tries to spin this as options still being open, but I don't recall ever seeing formal regulations changed substantially from their proposed form. If there were to be a substantial change, another proposal and comment period would be likely. The public cannot comment on what has not been proposed, so substantial departure from the proposal, absent a new proposal and comment period, would offend basic principles of public notice and comment rulemaking under the Administrative Procedure Act. The proverbial elephant in the room that the press hasn't picked up on yet is the fight that is going on behind the scenes in the Dept. of Justice. If the Antitrust Division gets its way, DoJ's public comments on the proposed rule could blow this show out of the water. The ISPs are regulated utility monopolies in vast areas of the U.S., with market consolidation at or near the limits of what the antitrust folk will tolerate. And leveraging one monopoly (service to subscribers) to impose another (fees for internet-based businesses to gain high-speed access) is directly counter to the Sherman Act's section 2. http://www.law.cornell.edu/uscode/text/15/2
Paul Merrell

Internet Giants Erect Barriers to Spy Agencies - NYTimes.com - 0 views

  • As fast as it can, Google is sealing up cracks in its systems that Edward J. Snowden revealed the N.S.A. had brilliantly exploited. It is encrypting more data as it moves among its servers and helping customers encode their own emails. Facebook, Microsoft and Yahoo are taking similar steps.
  • After years of cooperating with the government, the immediate goal now is to thwart Washington — as well as Beijing and Moscow. The strategy is also intended to preserve business overseas in places like Brazil and Germany that have threatened to entrust data only to local providers. Google, for example, is laying its own fiber optic cable under the world’s oceans, a project that began as an effort to cut costs and extend its influence, but now has an added purpose: to assure that the company will have more control over the movement of its customer data.
  • A year after Mr. Snowden’s revelations, the era of quiet cooperation is over. Telecommunications companies say they are denying requests to volunteer data not covered by existing law. A.T.&T., Verizon and others say that compared with a year ago, they are far more reluctant to cooperate with the United States government in “gray areas” where there is no explicit requirement for a legal warrant.
  • Eric Grosse, Google’s security chief, suggested in an interview that the N.S.A.'s own behavior invited the new arms race. “I am willing to help on the purely defensive side of things,” he said, referring to Washington’s efforts to enlist Silicon Valley in cybersecurity efforts. “But signals intercept is totally off the table,” he said, referring to national intelligence gathering. “No hard feelings, but my job is to make their job hard,” he added.
  • In Washington, officials acknowledge that covert programs are now far harder to execute because American technology companies, fearful of losing international business, are hardening their networks and saying no to requests for the kind of help they once quietly provided. Robert S. Litt, the general counsel of the Office of the Director of National Intelligence, which oversees all 17 American spy agencies, said on Wednesday that it was “an unquestionable loss for our nation that companies are losing the willingness to cooperate legally and voluntarily” with American spy agencies.
  • Many point to an episode in 2012, when Russian security researchers uncovered a state espionage tool, Flame, on Iranian computers. Flame, like the Stuxnet worm, is believed to have been produced at least in part by American intelligence agencies. It was created by exploiting a previously unknown flaw in Microsoft’s operating systems. Companies argue that others could have later taken advantage of this defect. Worried that such an episode undercuts confidence in its wares, Microsoft is now fully encrypting all its products, including Hotmail and Outlook.com, by the end of this year with 2,048-bit encryption, a stronger protection that would take a government far longer to crack. The software is protected by encryption both when it is in data centers and when data is being sent over the Internet, said Bradford L. Smith, the company’s general counsel.
  • Mr. Smith also said the company was setting up “transparency centers” abroad so that technical experts of foreign governments could come in and inspect Microsoft’s proprietary source code. That will allow foreign governments to check to make sure there are no “back doors” that would permit snooping by United States intelligence agencies. The first such center is being set up in Brussels. Microsoft has also pushed back harder in court. In a Seattle case, the government issued a “national security letter” to compel Microsoft to turn over data about a customer, along with a gag order to prevent Microsoft from telling the customer it had been compelled to provide its communications to government officials. Microsoft challenged the gag order as violating the First Amendment. The government backed down.
  • Hardware firms like Cisco, which makes routers and switches, have found their products a frequent subject of Mr. Snowden’s disclosures, and their business has declined steadily in places like Asia, Brazil and Europe over the last year. The company is still struggling to convince foreign customers that their networks are safe from hackers — and free of “back doors” installed by the N.S.A. The frustration, companies here say, is that it is nearly impossible to prove that their systems are N.S.A.-proof.
  • In one slide from the disclosures, N.S.A. analysts pointed to a sweet spot inside Google’s data centers, where they could catch traffic in unencrypted form. Next to a quickly drawn smiley face, an N.S.A. analyst, referring to an acronym for a common layer of protection, had noted, “SSL added and removed here!”
  • Facebook and Yahoo have also been encrypting traffic among their internal servers. And Facebook, Google and Microsoft have been moving to more strongly encrypt consumer traffic with so-called Perfect Forward Secrecy, specifically devised to make it more labor intensive for the N.S.A. or anyone to read stored encrypted communications. One of the biggest indirect consequences from the Snowden revelations, technology executives say, has been the surge in demands from foreign governments that saw what kind of access to user information the N.S.A. received — voluntarily or surreptitiously. Now they want the same.
  • The latest move in the war between intelligence agencies and technology companies arrived this week, in the form of a new Google encryption tool. The company released a user-friendly, email encryption method to replace the clunky and often mistake-prone encryption schemes the N.S.A. has readily exploited. But the best part of the tool was buried in Google’s code, which included a jab at the N.S.A.'s smiley-face slide. The code included the phrase: “ssl-added-and-removed-here-; - )”
Gonzalo San Gil, PhD.

Killing Spotify's Free Version Will Boost Piracy | TorrentFreak [# ! Note] - 0 views

    • Gonzalo San Gil, PhD.
       
      # ! The Truth is That Industry DOESN'T WANT any competitive, affordable and convenient online music service, as '#They' want to keep on manipulating the market and, this way, try to control the '#FreeThinking' that a wide supply of music content provides... # ! And, of course, preserve its de facto monopoly in the (music-related) market panorama...
  •  
    [Ernesto, May 10, 2015 (Opinion): Spotify is generally hailed as a piracy killer, with music file-sharing traffic dropping in virtually every country where the service launches. However, much of this effect may be lost if recent calls to end Spotify's free tier are honored. ...]
Gonzalo San Gil, PhD.

Guest Post: Five Reasons Why The Major Labels Didn't Blow It With Napster by @thetrickn... - 1 views

    • Gonzalo San Gil, PhD.
       
      # ! #Industry (#Politics) just doesn't want to share its business (of culture/thinking/VALUES Manipulation) with third parties...
  •  
    [May 30, 2015, Editor Charlie. [Editor Charlie sez: We're pleased to get a chance to repost this must-read piece by industry veteran Jim McDermott, who brings great insights into the Napster history and the flaws in the narrative that the tech press has so eagerly promoted. You can also read Chris's 2008 interview about Napster with Andrew Orlowski in The Register, The Music Wars from 30,000 Feet.] ...]
Gary Edwards

XML Production Workflows? Start with the Web and XHTML - 0 views

  • Challenges: Some Ugly Truths
    The challenges of building—and living with—an XML workflow are clear enough. The return on investment is a long-term proposition. Regardless of the benefits XML may provide, the starting reality is that it represents a very different way of doing things than the one we are familiar with. The Word Processing and Desktop Publishing paradigm, based on the promise of onscreen, WYSIWYG layout, is so dominant as to be practically inescapable. It has proven really hard to get from here to there, no matter how attractive XML might be on paper. A considerable amount of organizational effort and labour must be expended up front in order to realize the benefits. This is why XML is often referred to as an “investment”: you sink a bunch of time and money up front, and realize the benefits—greater flexibility, multiple output options, searching and indexing, and general futureproofing—later, over the long haul. It is not a short-term return proposition. And, of course, the returns you are able to realize from your XML investment are commensurate with what you put in up front: fine-grained, semantically rich tagging is going to give you more potential for searchability and recombination than a looser, more general-purpose approach, but it sure costs more. For instance, the Text Encoding Initiative (TEI) is the grand example of pouring enormous amounts of energy into the up-front tagging, with a very open-ended set of possibilities down the line. TEI helpfully defines a level to which most of us do not have to aspire.[5]
    But understanding this on a theoretical level is only part of the challenge. There are many practical issues that must be addressed. Software and labour are two of the most critical. How do you get the content into XML in the first place? Unfortunately, despite two decades of people doing SGML and XML, this remains an ugly question.
  • Practical Challenges
    In 2009, there is still no truly likeable—let alone standard—editing and authoring software for XML. For many (myself included), the high-water mark here was Adobe’s FrameMaker, substantially developed by the late 1990s. With no substantial market for it, it is relegated today mostly to the tech writing industry, unavailable for the Mac, and just far enough afield from the kinds of tools we use today that its adoption represents a significant hurdle. And FrameMaker was the best of the breed; most of the other software in decent circulation are programmers’ tools—the sort of things that, as Michael Tamblyn pointed out, encourage editors to drink at their desks.
    The labour question represents a stumbling block as well. The skill-sets and mind-sets that effective XML editors need have limited overlap with those needed by literary and more traditional production editors. The need to think of documents as machine-readable databases is not something that comes naturally to folks steeped in literary culture. In combination with the sheer time and effort that rich tagging requires, many publishers simply outsource the tagging to India, drawing a division of labour that spans oceans, to put it mildly.
    Once you have XML content, then what do you do with it? How do you produce books from it? Presumably, you need to be able to produce print output as well as digital formats. But while the latter are new enough to be generally XML-friendly (e-book formats being largely XML based, for instance), there aren’t any straightforward, standard ways of moving XML content into the kind of print production environments we are used to seeing. This isn’t to say that there aren’t ways of getting print—even very high-quality print—output from XML, just that most of them involve replacing your prepress staff with Java programmers.
  • Why does this have to be so hard? It’s not that XML is new, or immature, or untested. Remember that the basics have been around, and in production, since the early 1980s at least. But we have to take account of a substantial and long-running cultural disconnect between traditional editorial and production processes (the ones most of us know intimately) and the ways computing people have approached things. Interestingly, this cultural divide looked rather different in the 1970s, when publishers were looking at how to move to digital typesetting. Back then, printers and software developers could speak the same language. But that was before the ascendancy of the Desktop Publishing paradigm, which computerized the publishing industry while at the same time isolating it culturally. Those of us who learned how to do things the Quark way or the Adobe way had little in common with people who programmed databases or document-management systems. Desktop publishing technology isolated us in a smooth, self-contained universe of toolbars, grid lines, and laser proofs. So, now that the reasons to get with this program, XML, loom large, how can we bridge this long-standing divide?
  • Using the Web as a Production Platform
    The answer, I think, is right in front of you. The bridge is the Web, a technology and platform that is fundamentally based on XML, and which many publishers are by now comfortably familiar with. Perhaps not entirely comfortably, but at least most publishers are already working with the Web; they already either know or have on staff people who understand it and can work with it. The foundation of our argument is this: rather than looking at jumping to XML in its full, industrial complexity, which seems to be what the O'Reilly-backed StartWithXML initiative[6] is suggesting, publishers instead leverage existing tools and technologies—starting with the Web—as a means of getting XML workflows in place. This means making small investments and working with known tools rather than spending tens of thousands of dollars on XML software and rarefied consultants. It means re-thinking how the existing pieces of the production toolchain fit together; re-thinking the existing roles of software components already in use. It means, fundamentally, taking the Web seriously as a content platform, rather than thinking of it as something you need to get content out to, somehow. If nothing else, the Web represents an opportunity to think about editorial and production from outside the shrink-wrapped Desktop Publishing paradigm.
  • Is the Web made of Real XML? At this point some predictable objections can be heard: wait a moment, the Web isn’t really made out of XML; the HTML that makes up most of the Web is at best the bastard child of SGML, and it is far too flaky/unstructured/underpowered to be taken seriously. We counter by arguing that although HTML on the Web exists in a staggering array of different incarnations, and that the majority of it is indeed an unstructured mess, this does not undermine the general principle that basic, ubiquitous Web technologies can make a solid platform for content management, editorial process, and production workflow.
  • With the advent of a published XML standard in the late 1990s came the W3C’s adoption of XHTML: the realization of the Web’s native content markup as a proper XML document type. Today, its acceptance is almost ubiquitous, even while the majority of actual content out there may not be strictly conforming. The more important point is that most contemporary Web software, from browsers to authoring tools to content management systems (from blogs to enterprise systems), are capable of working with clean, valid XHTML. Or, to put the argument the other way around, clean, valid XHTML content plays absolutely seamlessly with everything else on the Web.[7]
  • The objection which follows, then, will be that even if we grant that XHTML is a real XML document type, that it is underpowered for “serious” content because it is almost entirely presentation (formatting) oriented; it lacks any semantic depth. In XHTML, a paragraph is a paragraph is a paragraph, as opposed to a section or an epigraph or a summary.
  • In contrast, more “serious” XML document types like DocBook[8] or DITA-derived schemas[9] are capable of making semantic distinctions about content chunks at a fine level of granularity and with a high degree of specificity.
  • So there is an argument for recalling the 80:20 rule here. If XHTML can provide 80% of the value with just 20% of the investment, then what exactly is the business case for spending the other 80% to achieve that last 20% of value? We suspect the ratio is actually quite a bit steeper than 80:20 for most publishers.
  • Furthermore, just to get technical for a moment, XHTML is extensible in a fairly straightforward way, through the common “class” attribute on each element. Web developers have long leveraged this kind of extensibility in the elaboration of “microformats” for semantic-web applications.[10] There is no reason why publishers shouldn’t think to use XHTML’s simple extensibility in a similar way for their own ends. [A small sketch of this class-based tagging appears at the end of this entry.]
  • XHTML, on the other hand, is supported by a vast array of quotidian software, starting with the ubiquitous Web browser. For this very reason, XHTML is in fact employed as a component part of several more specialized document types (ONIX and ePub among them).
  • Why re-invent a general-purpose prose representation when XHTML already does the job?
  • It is worth pausing for a moment to consider the role of XHTML in the ePub standard for ebook content. An ePub file is, anatomically, a simply disguised zip archive. Inside the zip archive are a few standard component parts: there are specialized files that declare metadata about the book, and about the format of the book. And then there is the book’s content, represented in XHTML. An ePub book is a Web page in a wrapper. [A short sketch at the end of this entry peeks inside an ePub's zip anatomy.]
  • To sum up the general argument: the Web as it already exists presents incredible value to publishers, as a platform for doing XML content management with existing (and often free) tools, and without having to go blindly into the unknown. At this point, we can offer a few design guidelines: prefer existing and/or ubiquitous tools over specialized ones wherever possible; prefer free software over proprietary systems where possible; prefer simple tools controlled and coordinated by human beings over fully automated (and therefore complex) systems; play to our strengths: use Web software for storing and managing content, use layout software for layout, and keep editors and production people in charge of their own domains.
  • Putting the Pieces Together: A Prototype
  • At the SFU Master of Publishing Program, we have been chipping away at this general line of thinking for a few years. Over that time, Web content management systems have been getting more and more sophisticated, all the while getting more streamlined and easier to use. (NB: if you have a blog, you have a Web content management system.) The Web is beginning to be recognized as a writing and editing environment used by millions of people. And the ways in which content is represented, stored, and exchanged online have become increasingly robust and standardized.
  • The missing piece of the puzzle has been print production: how can we move content from its malleable, fluid form on line into the kind of high-quality print production environments we’ve come to expect after two decades of Desktop Publishing?
  • Anyone who has tried to print Web content knows that the existing methods leave much to be desired (hyphenation and justification, for starters). In the absence of decent tools for this, most publishers quite naturally think of producing the print content first, and then think about how to get material onto the Web for various purposes. So we tend to export from Word, or from Adobe, as something of an afterthought.
  • While this sort of works, it isn’t elegant, and it completely ignores the considerable advantages of Web-based content management.
  • Content managed online is stored in one central location, accessible simultaneously to everyone in your firm, available anywhere you have an Internet connection, and usually exists in a much more fluid format than Word files. If only we could manage the editorial flow online, and then go to print formats at the end, instead of the other way around. At SFU, we made several attempts to make this work by way of the supposed “XML import” capabilities of various Desktop Publishing tools, without much success.[12]
  • In the winter of 2009, Adobe solved this part of the problem for us with the introduction of its Creative Suite 4. What CS4 offers is the option of a complete XML representation of an InDesign document: what Adobe calls IDML (InDesign Markup Language).
  • The IDML file format is—like ePub—a simply disguised zip archive that, when unpacked, reveals a cluster of XML files that represent all the different facets of an InDesign document: layout spreads, master pages, defined styles, colours, and of course, the content.
  • IDML is a well thought-out XML standard that achieves two very different goals simultaneously: it preserves all of the information that InDesign needs to do what it does; and it is broken up in a way that makes it possible for mere mortals (or at least our Master of Publishing students) to work with it.
  • What this represented to us in concrete terms was the ability to take Web-based content and move it into InDesign in a straightforward way, thus bridging Web and print production environments using existing tools and skillsets, with a little added help from free software.
  • We would take clean XHTML content, transform it to IDML-marked content, and merge that with nicely designed templates in InDesign.
  • The result is an almost push-button publication workflow, which results in a nice, familiar InDesign document that fits straight into the way publishers actually do production.
  • Tracing the steps
    To begin with, we worked backwards, moving the book content back to clean XHTML.
  • The simplest method for this conversion—and if you want to create Web content, this is an excellent route—was to use Adobe’s “Export to Digital Editions” option, which creates an ePub file.
  • Recall that ePub is just XHTML in a wrapper, so within the ePub file was a relatively clean XHTML document. It was somewhat cleaner (that is, the XHTML tagging was simpler and less cluttered) than InDesign’s other Web-oriented exports, possibly because Digital Editions is a well understood target, compared with somebody’s website.
  • In order to achieve our target of clean XHTML, we needed to do some editing; the XHTML produced by InDesign’s “Digital Editions” export was presentation-oriented. For instance, bulleted list items were tagged as paragraphs, with a class attribute identifying them as list items. Using the search-and-replace function, we converted such structures to proper XHTML list and list-item elements. Our guiding principle was to make the XHTML as straightforward as possible, not dependent on any particular software to interpret it. [A rough sketch of this clean-up step appears at the end of this entry.]
  • We broke the book’s content into individual chapter files; each chapter could then carry its own basic metadata, and the pages conveniently fit our Web content management system (which is actually just a wiki). We assembled a dynamically generated table of contents for the 12 chapters, and created a cover page. Essentially, the book was entirely Web-based at this point.
  • When the book chapters are viewed online, they are formatted via a CSS2 stylesheet that defines a main column for content as well as dedicating screen real estate for navigational elements. We then created a second template to render the content for exporting; this was essentially a bare-bones version of the book with no navigation and minimal styling. Pages (or even the entire book) can be exported (via the “Save As...” function in a Web browser) for use in either print production or ebook conversion. At this point, we required no skills beyond those of any decent Web designer.
  • Integrating with CS4 for Print
    Adobe’s IDML language defines elements specific to InDesign; there is nothing in the language that looks remotely like XHTML. So a mechanical transformation step is needed to convert the XHTML content into something InDesign can use. This is not as hard as it might seem.
  • Both XHTML and IDML are composed of straightforward, well-documented structures, and so transformation from one to the other is, as they say, “trivial.” We chose to use XSLT (Extensible Stylesheet Language Transforms) to do the work. XSLT is part of the overall XML specification, and thus is very well supported in a wide variety of tools. Our prototype used a scripting engine called xsltproc, a nearly ubiquitous piece of software that we found already installed as part of Mac OS X (contemporary Linux distributions also have this as a standard tool), though any XSLT processor would work. [A sketch of scripting the xsltproc step appears at the end of this entry.]
  • In other words, we don’t need to buy InCopy, because we just replaced it with the Web. Our wiki is now plugged directly into our InDesign layout. It even automatically updates the InDesign document when the content changes. Credit is due at this point to Adobe: this integration is possible because of the open file format in the Creative Suite 4.
  • We wrote an XSLT transformation script[18] that converted the XHTML content from the Web into an InCopy ICML file. The script itself is less than 500 lines long, and was written and debugged over a period of about a week by amateurs (again, the people named at the start of this article). The script runs in a couple of seconds, and the resulting .icml file can then be “placed” directly into an InDesign template. The ICML file references an InDesign stylesheet, so the template file can be set up with a house-styled layout, master pages, and stylesheet definitions for paragraphs and character ranges.
  • The result is very simple and easy to use. Our demonstration requires that a production editor run the XSLT transformation script manually, but there is no reason why this couldn’t be built directly into the Web content management system so that exporting the content to print ran the transformation automatically. The resulting file would then be “placed” in InDesign and proofed.
  • It should be noted that the Book Publishing 1 proof-of-concept was artificially complex; we began with a book laid out in InDesign and ended up with a look-alike book laid out in InDesign. But next time—for instance, when we publish Book Publishing 2—we can begin the process with the content on the Web, and keep it there throughout the editorial process. The book’s content could potentially be written and edited entirely online, as Web content, and then automatically poured into an InDesign template at proof time. “Just in time,” as they say. This represents an entirely new way of thinking of book production. With a Web-first orientation, it makes little sense to think of the book as “in print” or “out of print”—the book is simply available, in the first place online; in the second place in derivative digital formats; and third, but really not much more difficult, in print-ready format, via the usual InDesign CS print production system publishers are already familiar with.
  • Creating Ebook Files
    Creating electronic versions from XHTML source is vastly simpler than trying to generate these out of the existing print process. The ePub version is extremely easy to generate; so is online marketing copy or excerpts for the Web, since the content begins life Web-native.
  • Since an ePub file is essentially XHTML content in a special wrapper, all that is required is that we properly “wrap” our XHTML content. Ideally, the content in an ePub file is broken into chapters (as ours was) and a table of contents file is generated in order to allow easy navigation within an ebook reader. We used Julian Smart’s free tool eCub[19] to simply and automatically generate the ePub wrapper and the table of contents. The only custom development we did was to create a CSS stylesheet for the ebook so that headings and paragraph indents looked the way we wanted. Starting with XHTML content, creating ePub is almost too easy.
  • Such a workflow—beginning with the Web and exporting to print—is surely more in line with the way we will do business in the 21st century, where the Web is the default platform for reaching audiences, developing content, and putting the pieces together. It is time, we suggest, for publishers to re-orient their operations and start with the Web.
  • Our project demonstrates that Web technologies are indeed good enough to use in an XML-oriented workflow; more specialized and expensive options are not necessarily required. For massive-scale enterprise publishing, this approach may not offer enough flexibility, and the challenge of adding and extracting extra semantic richness may prove more trouble than it's worth.
  • But for smaller firms who are looking at the straightforward benefits of XML-based processes—single source publishing, online content and workflow management, open and accessible archive formats, greater online discoverability—here is a way forward.
  • Rather than a public-facing website, our system relies on the Web as a content management platform—of course a public face could easily be added.
  • The final piece of our puzzle, the ability to integrate print production, was made possible by Adobe's release of InDesign with an open XML file format. Since the Web's XHTML is also XML, it can be easily and confidently transformed to the InDesign format.
  • Today, we are able to put the process together using nothing but standard, relatively ubiquitous Web tools: the Web itself as an editing and content management environment, standard Web scripting tools for the conversion process, and the well-documented IDML file format to integrate the layout tool.
  • Using the Web as a Production Platform
  •  
    I was looking for an answer to a problem Marbux had presented, and found this interesting article. The issue was that of the upcoming conversion of the Note Case Pro (NCP) layout engine to the WebKit layout engine, and what to do about the NCP document format. My initial reaction was to encode the legacy NCP document format in XML, and run an XSLT to a universal pivot format like TEI-XML. From there, the TEI-XML community would provide all the XSLT transformation routines for conversion to ODF, OOXML, XHTML, ePUB and HTML/CSS. Researching the problems one might encounter with this approach, I found this article. Fascinating stuff. My takeaway is that TEI-XML would not be as effective a "universal pivot point" as XHTML. Or perhaps, if NCP really wants to get aggressive: IDML - InDesign Markup Language. The important point though is that XHTML is a browser-ready application of XML, and compatible with the WebKit layout engine Miro wants to move NCP to. The concept of encoding an existing application-specific format in XML has been around since 1998, when XML was first introduced as a W3C standard, a "structured" subset of SGML. (HTML is also an application of SGML.) The multiplatform StarOffice productivity suite became "OpenOffice" when Sun purchased the company in 1998, and open sourced the code base. The OpenOffice developer team came out with an XML encoding of their existing document formats in 2000. The application-specific encoding became an OASIS document format standard proposal in 2002 - also known as ODF. Microsoft followed OpenOffice with an XML encoding of their application-specific binary document formats, known as OOXML. Encoding the existing NCP format in XML, specifically targeting XHTML as a "universal pivot point", would put the NCP Outliner in the Web editor category, without breaking backwards compatibility. The trick is in the XSLT conversion process. But I think that is something much easier to handle than trying to
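    A minimal sketch of the class-attribute extensibility described in the annotations above. This is an illustration, not code from the article's prototype: the role names (epigraph, summary) are invented, and the only point is that such labels ride along in plain, valid XHTML and can be read back out by any ordinary XML tool.

```python
import xml.etree.ElementTree as ET

# Plain XHTML paragraphs carrying publisher-defined roles in their class
# attributes, microformat-style. The role names are invented for illustration.
fragment = ET.fromstring("""\
<div xmlns="http://www.w3.org/1999/xhtml">
  <p class="epigraph">All the world's a stage.</p>
  <p>Body text continues here.</p>
  <p class="summary">This chapter introduced the Web-first workflow.</p>
</div>""")

ns = {"x": "http://www.w3.org/1999/xhtml"}
for p in fragment.findall("x:p", ns):
    role = p.get("class", "plain paragraph")
    print(f"{role}: {p.text}")
```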
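    Next, the "zip archive with XHTML inside" anatomy the annotations attribute to both ePub and IDML, sketched with only Python's standard library. The file name book.epub and the entry names mentioned in the comments are assumptions for illustration; the same few lines work on an IDML file, which follows the same zip-of-XML pattern.

```python
import zipfile

# An ePub (like an IDML file) is an ordinary zip archive; the book's content
# lives inside it as XHTML. "book.epub" is a placeholder file name.
with zipfile.ZipFile("book.epub") as epub:
    # List the component parts: typically a mimetype file, package metadata
    # (e.g. content.opf, toc.ncx) and the XHTML chapters.
    for name in epub.namelist():
        print(name)

    # Pull the first XHTML chapter out as plain text, ready for editing or
    # for an XSLT transformation step.
    chapters = [n for n in epub.namelist() if n.endswith((".xhtml", ".html"))]
    if chapters:
        xhtml = epub.read(chapters[0]).decode("utf-8")
        print(xhtml[:300])
```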
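    A rough Python equivalent of the search-and-replace clean-up the authors describe, turning paragraphs that merely carry a list-item class into real XHTML lists. The class name "bullet" and the sample markup are guesses; the actual class names depend on the InDesign export being cleaned up.

```python
import re

# Hypothetical fragment of a "Digital Editions" style export: bulleted items
# tagged as paragraphs with a class attribute (the class name is a guess).
xhtml = """<p>Ordinary paragraph.</p>
<p class="bullet">First item</p>
<p class="bullet">Second item</p>
<p>Another ordinary paragraph.</p>"""

# Step 1: turn each classed paragraph into a list item.
xhtml = re.sub(r'<p class="bullet">(.*?)</p>', r'<li>\1</li>', xhtml)

# Step 2: wrap every consecutive run of <li> elements in a single <ul>.
xhtml = re.sub(r'(?:<li>.*?</li>\n?)+',
               lambda run: "<ul>\n" + run.group(0).rstrip("\n") + "\n</ul>\n",
               xhtml)

print(xhtml)
```

    For anything more tangled than this, a real HTML parser would be the safer choice; the point here is only the shape of the transformation.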
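    Finally, a sketch of scripting the XHTML-to-ICML transformation step with xsltproc, the processor the prototype relied on. The stylesheet name xhtml-to-icml.xsl and the chapter file names are placeholders, not the actual script described in the article.

```python
import subprocess
import sys

# Apply an XSLT stylesheet mapping XHTML to InCopy's ICML format, using the
# xsltproc command-line tool. All file names here are placeholders.
result = subprocess.run(
    ["xsltproc", "-o", "chapter01.icml", "xhtml-to-icml.xsl", "chapter01.xhtml"],
    capture_output=True,
    text=True,
)

if result.returncode != 0:
    sys.exit(f"xsltproc failed:\n{result.stderr}")

print("Wrote chapter01.icml; it can now be placed into the InDesign template.")
```

    Wiring a call like this into the content management system's export step would give roughly the push-button behaviour the annotations describe: edit on the Web, export, and place the resulting .icml file into the InDesign template.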
Paul Merrell

Rapid - Press Releases - EUROPA - 0 views

  • The Commission has found that Intel excluded its competitor in two ways: through illegal loyalty rebates and by paying manufacturers and retailers to restrict the commercialisation of competitors' products. These illegal actions were designed to preserve Intel's market share at a time when its only significant rival - AMD - was a growing threat to Intel's position. This threat was widely recognised by both computer manufacturers and in Intel's own internal documents seen by the Commission. The computer manufacturers involved are Acer, Dell, HP, Lenovo and NEC. The retailer involved is Media Saturn Holdings, the parent company of Media Markt.
  • Naturally, the Commission favours strong, vigorous price competition, including by dominant firms. However, Intel went beyond normal price competition by giving rebates to computer manufacturers on the condition that they bought all, or almost all, of their CPUs from Intel. Intel also made direct payments to a major retailer – Media Markt - on the condition that it stocked only computers with Intel CPUs.
  • Just to give you one example: in one case, a computer manufacturer took up only a small part of an offer by AMD of free CPUs because acceptance of all the free CPUs offered would have led that computer manufacturer to breach the conditions of its agreement with Intel and to lose rebates on all its much more numerous Intel purchases.
  • Intel made direct payments to computer manufacturers to halt or delay the launch of products using their rival's chips, and to limit their distribution once available. The Commission has specific, documented examples, of Intel paying other manufacturers to, for example, delay the launch of an AMD-based PC by six months, and to restrict the sales of AMD-based products to certain customers.
  • The Commission Decision contains evidence that Intel went to great lengths to cover-up many of its anti-competitive actions. Many of the conditions mentioned above were not to be found in Intel’s official contracts. However, the Commission was able to gather a broad range of evidence demonstrating Intel's illegal conduct through statements from companies, on-site inspections, and formal requests for information.
  • Finally, I would like to draw your attention to Intel's latest global advertising campaign which proposes Intel as the "Sponsors of Tomorrow." Their website invites visitors to add their 'vision of tomorrow'. Well, I can give my vision of tomorrow for Intel here and now: "obey the law".
Paul Merrell

FCC Chairman Moves Toward Real Net Neutrality Protections | Free Press - 0 views

  • In an appearance at the Consumer Electronics Show in Las Vegas today, FCC Chairman Tom Wheeler indicated that he will move to protect Net Neutrality by reclassifying Internet access under Title II of the Communications Act. The chairman plans to circulate a new rule in early February. The agency is expected to vote on it during its Feb. 26 open meeting. Free Press President and CEO Craig Aaron made the following statement: “Chairman Wheeler appears to have heard the demands of the millions of Internet users who have called for real Net Neutrality protections. The FCC’s past decisions to put its oversight authority on ice resulted in Net Neutrality being under constant threat. Wheeler now realizes that it’s best to simply follow the law Congress wrote and ignore the bogus claims of the biggest phone and cable companies and their well-financed front groups. “Of course the devil will be in the details, and we await publication of the agency's final decision. But it’s refreshing to see the chairman firmly reject the industry’s lies and scare tactics. As we’ve said all along, Title II is a very flexible, deregulatory framework that ensures investment and innovation while also preserving the important public interest principles of nondiscrimination, universal service, interconnection and competition.”
  •  
    Title II is for "common carriers." See http://transition.fcc.gov/Reports/1934new.pdf pg. 35. Under Section 202: "(a) It shall be unlawful for any common carrier to make any unjust or unreasonable discrimination in charges, practices, classifications, regulations, facilities, or services for or in connection with like communication service, directly or indirectly, by any means or device, or to make or give any undue or unreasonable preference or advantage to any particular person, class of persons, or locality, or to subject any particular person, class of persons, or locality to any undue or unreasonable prejudice or disadvantage. (b) Charges or services, whenever referred to in this Act, include charges for, or services in connection with, the use of common carrier lines of communication, whether derived from wire or radio facilities, in chain broadcasting or incidental to radio communication of any kind. (c) Any carrier who knowingly violates the provisions of this section shall forfeit to the United States the sum of $6,000 for each such offense and $300 for each and every day of the continuance of such offense."
Paul Merrell

FCC Chairman Tom Wheeler: This Is How We Will Ensure Net Neutrality | WIRED - 0 views

  • That is why I am proposing that the FCC use its Title II authority to implement and enforce open internet protections. Using this authority, I am submitting to my colleagues the strongest open internet protections ever proposed by the FCC. These enforceable, bright-line rules will ban paid prioritization, and the blocking and throttling of lawful content and services. I propose to fully apply—for the first time ever—those bright-line rules to mobile broadband. My proposal assures the rights of internet users to go where they want, when they want, and the rights of innovators to introduce new products without asking anyone’s permission. All of this can be accomplished while encouraging investment in broadband networks. To preserve incentives for broadband operators to invest in their networks, my proposal will modernize Title II, tailoring it for the 21st century, in order to provide returns necessary to construct competitive networks. For example, there will be no rate regulation, no tariffs, no last-mile unbundling. Over the last 21 years, the wireless industry has invested almost $300 billion under similar rules, proving that modernized Title II regulation can encourage investment and competition.
  •  
    Victory on Net Neutrality in sight. The FCC Chairman is circulating a draft rule that designates both cable and wireless ISPs as "common carriers" under Title II.  
Paul Merrell

BBC News - GCHQ's Robert Hannigan says tech firms 'in denial' on extremism - 0 views

  • Web giants such as Twitter, Facebook and WhatsApp have become "command-and-control networks... for terrorists and criminals", GCHQ's new head has said. Islamic State extremists had "embraced" the web but some companies remained "in denial" over the problem, Robert Hannigan wrote in the Financial Times. He called for them to do more to co-operate with security services. However, civil liberties campaigners said the companies were already working with the intelligence agencies. None of the major tech firms has yet responded to Mr Hannigan's comments.
  • Mr Hannigan said IS had "embraced the web as a noisy channel in which to promote itself, intimidate people, and radicalise new recruits." The "security of its communications" added another challenge to agencies such as GCHQ, he said - adding that techniques for encrypting - or digitally scrambling - messages "which were once the preserve of the most sophisticated criminals or nation states now come as standard". GCHQ and its sister agencies, MI5 and the Secret Intelligence Service, could not tackle these challenges "at scale" without greater support from the private sector, including the largest US technology companies which dominate the web, he wrote.
  •  
    What I want to know is what we're going to do with that NSA data center at Bluffdale, Utah, after the NSA is abolished? Maybe give it to the Internet Archive?
Paul Merrell

The Attack on Net Neutrality Begins | The Fifth Column - 0 views

  •  The United States Telecom Association has filed a lawsuit to overturn the net neutrality rules set by the Federal Communications Commission this past February. In its Monday morning press release, USTelecom, which represents Verizon and AT&T among others, said it filed a lawsuit in the US Court of Appeals for the District of Columbia, joining a similar lawsuit filed by Alamo Broadband Inc.
  • The Federal Communications Commission (FCC) published its net neutrality rules in the Federal Register on Monday and, according to procedure, that began a 60-day countdown until they go into effect (June 12). Their publication also opened a 30-day window for Internet service providers to appeal.  USTelecom and Alamo Broadband wasted no time.  USTelecom filed a previous action preserving the issue according to local court rule prior to the formal petition in March.
  • The rules, which were voted on in February, reclassify broadband under Title II of the 1934 Communications Act and require that ISPs transmit all Web traffic at the same speed. The rules run over 400 pages, and USTelecom filed a CD of them as an exhibit with its action. This suit is predicted to be the first of many, as opponents ranging from broadband groups like AT&T to congressional Republicans have signaled that they plan to fight the decision.
Paul Merrell

Tech firms and privacy groups press for curbs on NSA surveillance powers - The Washingt... - 0 views

  • The nation’s top technology firms and a coalition of privacy groups are urging Congress to place curbs on government surveillance in the face of a fast-approaching deadline for legislative action. A set of key Patriot Act surveillance authorities expire June 1, but the effective date is May 21 — the last day before Congress breaks for a Memorial Day recess. In a letter to be sent Wednesday to the Obama administration and senior lawmakers, the coalition vowed to oppose any legislation that, among other things, does not ban the “bulk collection” of Americans’ phone records and other data.
  • “We know that there are some in Congress who think that they can get away with reauthorizing the expiring provisions of the Patriot Act without any reforms at all,” said Kevin Bankston, policy director of New America Foundation’s Open Technology Institute, a privacy group that organized the effort. “This letter draws a line in the sand that makes clear that the privacy community and the Internet industry do not intend to let that happen without a fight.” At issue is the bulk collection of Americans’ data by intelligence agencies such as the National Security Agency. The NSA’s daily gathering of millions of records logging phone call times, lengths and other “metadata” stirred controversy when it was revealed in June 2013 by former NSA contractor Edward Snowden. The records are placed in a database that can, with a judge’s permission, be searched for links to foreign terrorists. They do not include the content of conversations.
  • That program, placed under federal surveillance court oversight in 2006, was authorized by the court in secret under Section 215 of the Patriot Act — one of the expiring provisions. The public outcry that ensued after the program was disclosed forced President Obama in January 2014 to call for an end to the NSA’s storage of the data. He also appealed to Congress to find a way to preserve the agency’s access to the data for counterterrorism information.
  • Despite growing opposition in some quarters to ending the NSA’s program, a “clean” authorization — one that would enable its continuation without any changes — is unlikely, lawmakers from both parties say. Sen. Ron Wyden (D-Ore.), a leading opponent of the NSA’s program in its current format, said he would be “surprised if there are 60 votes” in the Senate for that. In the House, where there is bipartisan support for reining in surveillance, it’s a longer shot still. “It’s a toxic vote back in your district to reauthorize the Patriot Act, if you don’t get some reforms” with it, said Rep. Thomas Massie (R-Ky.). The House last fall passed the USA Freedom Act, which would have ended the NSA program, but the Senate failed to advance its own version. The House and Senate judiciary committees are working to come up with new bipartisan legislation to be introduced soon.
  • The tech firms and privacy groups’ demands are a baseline, they say. Besides ending bulk collection, they want companies to have the right to be more transparent in reporting on national security requests and greater declassification of opinions by the Foreign Intelligence Surveillance Court.
  • Some legal experts have pointed to a little-noticed clause in the Patriot Act that would appear to allow bulk collection to continue even if the authority is not renewed. Administration officials have conceded privately that a legal case probably could be made for that, but politically it would be a tough sell. On Tuesday, a White House spokesman indicated the administration would not seek to exploit that clause. “If Section 215 sunsets, we will not continue the bulk telephony metadata program,” National Security Council spokesman Edward Price said in a statement first reported by Reuters. Price added that allowing Section 215 to expire would result in the loss of a “critical national security tool” used in investigations that do not involve the bulk collection of data. “That is why we have underscored the imperative of Congressional action in the coming weeks, and we welcome the opportunity to work with lawmakers on such legislation,” he said.
  •  
    I omitted some stuff about opposition to sunsetting the provisions. They seem to forget, as does Obama, that the proponents of the FISA Court's expansive reading of section 215 have not yet come up with a single instance where 215-derived data caught a single terrorist or prevented a single act of terrorism. Which means that if that data is of some use, it ain't in fighting terrorism, the purpose of the section. Patriot Act § 215 is codified as 50 USCS § 1861, https://www.law.cornell.edu/uscode/text/50/1861 That section authorizes the FBI to obtain an order from the FISA Court "requiring the production of *any tangible things* (including books, records, papers, documents, and other items)." Specific examples (a non-exclusive list) include "the production of library circulation records, library patron lists, book sales records, book customer lists, firearms sales records, tax return records, educational records, or medical records containing information that would identify a person." The Court can order that the recipient of the order tell no one of its receipt of the order or its response to it. In other words, this is about way more than your telephone metadata. Do you trust the NSA with your medical records?
Paul Merrell

With rules repealed, what's next for net neutrality? | TheHill - 0 views

  • The battle over the Federal Communications Commission’s (FCC) repeal of net neutrality rules is entering a new phase, with opponents of the move launching efforts to preserve the Obama-era consumer protections. The net neutrality rules had required internet service providers to treat all web traffic equally. Republicans on the commission decried the regulatory structure as a gross overreach, and quickly moved to reverse them once the Trump administration came to power. The reversal of the rules was published in the Federal Register Thursday, and even though the order is months away from implementation, net neutrality supporters are now free to mount legal challenges to the action. A coalition of Democratic state attorneys general, public interest groups and internet companies have vowed to fight in the courts. Twenty-three states, led by New York and its attorney general, Eric Schneiderman (D), have already filed a lawsuit.
  • The emerging court battle over net neutrality could keep the issue in limbo for years. Meanwhile, a separate battle over the rules is brewing in Congress. Senate Democrats have secured enough support to force a vote on a bill that would undo the FCC’s December vote and leave the net neutrality rules in place. The bill, which is being pushed by Sen. Ed Markey (D-Mass.), would use a legislative tool called the Congressional Review Act (CRA) to roll back the FCC’s repeal of net neutrality. The entry of the FCC’s repeal order in the Federal Register Thursday means that the Senate has 60 legislative days to move on the CRA bill. Democrats have secured support from one Republican, Sen. Susan Collins (Maine), and need just one more to cross the aisle for the bill to pass the chamber.
  • Even if Democrats do manage to find the tie-breaking vote in the Senate, the bill is almost certain to die in the House. But Democrats see a roll call vote as an opportunity to make GOP members stake out a position on an issue that they think could resonate in the midterm elections. On yet another front, Democratic states around the country have already launched their own attack on the FCC’s rules. Five governors (from Montana, Hawaii, New Jersey, Vermont and New York) have in recent weeks signed executive orders forbidding their states from doing business with internet service providers who violate net neutrality principles. And, according to the pro-net neutrality group Free Press, legislatures in 26 states are weighing bills that would codify their own open internet protections. The local efforts could ignite a separate legal battle over whether states have the authority to counteract the FCC’s order, which included a provision preempting them from replacing the rules.
  • For their part, Republicans who applauded the FCC repeal are calling for legislation that would codify some net neutrality principles. They say doing so would allow for less heavy-handed protections that provide certainty to businesses. But most net neutrality supporters reject that course, at least while the repeal is tied up in court and Republicans control majorities in both the House and Senate. They argue that such a bill would amount to little more than watered-down protections that would be unable to keep internet service providers in check. For now, Democrats seem content to let the battles in the courts and Congress play out.
Paul Merrell

Washington becomes first state to pass law protecting net neutrality - Mar. 6, 2018 - 0 views

  • In a bipartisan effort, the state's legislators passed House Bill 2282, which was signed into law Monday by Gov. Jay Inslee. "Washington will be the first state in the nation to preserve the open internet," Inslee said at the bill signing. The state law, approved by the legislature last month, is meant to safeguard net neutrality protections, which have been repealed by the Federal Communications Commission and are scheduled to officially end April 23. Net neutrality requires internet service providers to treat all online content the same, meaning they can't deliberately speed up or slow down traffic from specific websites to put their own content at an advantage over rivals. The FCC's decision to overturn net neutrality has been championed by the telecom industry, but widely criticized by technology companies and consumer advocacy groups. Attorneys general from more than 20 red and blue states filed a lawsuit in January to stop the repeal. Inslee said the new measure would protect an open internet in Washington, which he described as having "allowed the free flow of information and ideas in one of the greatest demonstrations of free speech in our history." HB2282 bars internet service providers in the state from blocking content, applications, or services, or slowing down traffic on the basis of content or whether they got paid to favor certain traffic. The law goes into effect June 6.