
Open Web: Group items tagged Maine


Gary Edwards

Microsoft has failed | SemiAccurate - 0 views

  • Their product lines have stagnated,
  •  
    Charlie nails it in this analysis of Microsoft's future.  Most interesting is Charlie's explanation of how Microsoft so totally missed the needs of their corporate business user base - the heart of the monopoly!  Windows 8 is a disaster for productivity.  Great analysis! excerpt: Microsoft is in deep trouble, their two main product lines are failing, and the blame game is intensifying. Steve Sinofsky gets the blame this time for the failure of Windows 8, but the real problem is the patterns that are so clearly illustrated by these actions. Microsoft is largely irrelevant to computing of late, the only markets they still play in are evaporating with stunning rapidity. Their long history of circling the wagons tighter and tighter works decently as long as there is not a credible alternative, and that strategy has been the entirety of the Microsoft playbook for so long that there is nothing else now. It works, and as the walls grow higher, customer enmity builds while the value of an alternative grows. This cycle repeats as long as there is no alternative. If there is, everything unravels with frightening rapidity.
Gary Edwards

Cloud file-sharing for enterprise users - 1 views

  •  
    Quick review of different sync-share-store services, starting with DropBox and ending with three Open Source services. Very interesting. Things have progressed since I last worked on the SurDocs project for Sursen. No mention in this review of file formats, conversion or viewing issues. I do know that CrocoDoc is used by nearly every sync-share-store service to convert documents to either PDF or HTML formats for viewing. No service, however, has been able to hit the "native document" sweet spot. Not even SurDocs - which was the whole purpose behind the project!!! "Native Documents" means that the document is in its native/original application format. That format is needed for the round-tripping and reloading of the document. Although most sync-share-store services work with MSOffice OXML formatted documents, only Microsoft provides a true "native" format viewer (Office 365). Office 365 enables direct edit, view and collaboration on native documents. Which is an enormous advantage, given that conversion of any sort is guaranteed to "break" a native document and disrupt any related business processes or round-tripping needs. It was here that SurDoc was to provide a break-through technology. Sadly, we're still waiting :( excerpt: The availability of cheap, easy-to-use and accessible cloud file-sharing services means users have more freedom and choice than ever before. Dropbox pioneered simplicity and ease of use, and so quickly picked up users inside the enterprise. Similar services have followed Dropbox's lead and now there are dozens, including well-known ones such as Google Drive, SkyDrive and Ubuntu One. Valdis Filks, research director at analyst firm Gartner, explained the appeal of cloud file-sharing services. Filks said: "Enterprise employees use Dropbox and Google because they are consumer products that are simple to use, can be purchased without officially requesting new infrastructure or budget expenditure, and can be installed qu
  •  
    Odd that the reporter mentions the importance of security near the top of the article but gives that topic such short shrift in his evaluation of the services. For example, "secured by 256-bit AES encryption" is meaningless without discussing other factors such as: [i] who creates the encryption keys and on which side of the server/client divide; and [ii] the service's ability to decrypt the customer's content. Encryption/decryption must be done on the client side using unique keys that are unknown to the service, else security is broken, and if the service does business in the U.S. or any of its territories or possessions, it is subject to gag orders to turn over the decrypted customer information. My wisdom so far is to avoid file sync services to the extent you can, boycott U.S. services until the spy agencies are encaged, and reward services that provide good security from nations with more respect for digital privacy, to give U.S.-based services an incentive to lobby *effectively* on behalf of their customers' privacy in Congress. The proof that they are not doing so is the complete absence of bills in Congress that would deal effectively with the abuse by U.S. spy agencies. From that standpoint, the Switzerland-based http://wuala.com/ file sync service is looking pretty good so far. I'm using it.
Paul Merrell

Secret Trans-Pacific Partnership Agreement (TPP) - 0 views

  • Today, 13 November 2013, WikiLeaks released the secret negotiated draft text for the entire TPP (Trans-Pacific Partnership) Intellectual Property Rights Chapter. The TPP is the largest-ever economic treaty, encompassing nations representing more than 40 per cent of the world’s GDP. The WikiLeaks release of the text comes ahead of the decisive TPP Chief Negotiators summit in Salt Lake City, Utah, on 19-24 November 2013. The chapter published by WikiLeaks is perhaps the most controversial chapter of the TPP due to its wide-ranging effects on medicines, publishers, internet services, civil liberties and biological patents. Significantly, the released text includes the negotiation positions and disagreements between all 12 prospective member states.
  • The TPP is the forerunner to the equally secret US-EU pact TTIP (Transatlantic Trade and Investment Partnership), for which President Obama initiated US-EU negotiations in January 2013. Together, the TPP and TTIP will cover more than 60 per cent of global GDP. Read full press release here Download the full secret TPP treaty IP chapter as a PDF here WikiLeaks Release of Secret Trans-Pacific Partnership Agreement (TPP) Advanced Intellectual Property Chapter for All 12 Nations with Negotiating Positions (August 30 2013 consolidated bracketed negotiating text)
  •  
    This is the leaked text of the latest secretly negotiated atrocity against the Open Web and FOSS, and against much more. Note that in the U.S., treaties bypass review by the House of Representatives, needing approval only of the Senate for ratification.
Gary Edwards

XML Production Workflows? Start with the Web and XHTML - 1 views

  • Challenges: Some Ugly Truths The challenges of building—and living with—an XML workflow are clear enough. The return on investment is a long-term proposition. Regardless of the benefits XML may provide, the starting reality is that it represents a very different way of doing things than the one we are familiar with. The Word Processing and Desktop Publishing paradigm, based on the promise of onscreen, WYSIWYG layout, is so dominant as to be practically inescapable. It has proven really hard to get from here to there, no matter how attractive XML might be on paper. A considerable amount of organizational effort and labour must be expended up front in order to realize the benefits. This is why XML is often referred to as an “investment”: you sink a bunch of time and money up front, and realize the benefits—greater flexibility, multiple output options, searching and indexing, and general futureproofing—later, over the long haul. It is not a short-term return proposition. And, of course, the returns you are able to realize from your XML investment are commensurate with what you put in up front: fine-grained, semantically rich tagging is going to give you more potential for searchability and recombination than a looser, more general-purpose approach, but it sure costs more. For instance, the Text Encoding Initiative (TEI) is the grand example of pouring enormous amounts of energy into the up-front tagging, with a very open-ended set of possibilities down the line. TEI helpfully defines a level to which most of us do not have to aspire.[5] But understanding this on a theoretical level is only part of the challenge. There are many practical issues that must be addressed. Software and labour are two of the most critical. How do you get the content into XML in the first place? Unfortunately, despite two decades of people doing SGML and XML, this remains an ugly question.
  • Practical Challenges In 2009, there is still no truly likeable—let alone standard—editing and authoring software for XML. For many (myself included), the high-water mark here was Adobe’s FrameMaker, substantially developed by the late 1990s. With no substantial market for it, it is relegated today mostly to the tech writing industry, unavailable for the Mac, and just far enough afield from the kinds of tools we use today that its adoption represents a significant hurdle. And FrameMaker was the best of the breed; most of the other software in decent circulation are programmers’ tools—the sort of things that, as Michael Tamblyn pointed out, encourage editors to drink at their desks. The labour question represents a stumbling block as well. The skill-sets and mind-sets that effective XML editors need have limited overlap with those needed by literary and more traditional production editors. The need to think of documents as machine-readable databases is not something that comes naturally to folks steeped in literary culture. In combination with the sheer time and effort that rich tagging requires, many publishers simply outsource the tagging to India, drawing a division of labour that spans oceans, to put it mildly. Once you have XML content, then what do you do with it? How do you produce books from it? Presumably, you need to be able to produce print output as well as digital formats. But while the latter are new enough to be generally XML-friendly (e-book formats being largely XML based, for instance), there aren’t any straightforward, standard ways of moving XML content into the kind of print production environments we are used to seeing. This isn’t to say that there aren’t ways of getting print—even very high-quality print—output from XML, just that most of them involve replacing your prepress staff with Java programmers.
  • Why does this have to be so hard? It’s not that XML is new, or immature, or untested. Remember that the basics have been around, and in production, since the early 1980s at least. But we have to take account of a substantial and long-running cultural disconnect between traditional editorial and production processes (the ones most of us know intimately) and the ways computing people have approached things. Interestingly, this cultural divide looked rather different in the 1970s, when publishers were looking at how to move to digital typesetting. Back then, printers and software developers could speak the same language. But that was before the ascendancy of the Desktop Publishing paradigm, which computerized the publishing industry while at the same time isolating it culturally. Those of us who learned how to do things the Quark way or the Adobe way had little in common with people who programmed databases or document-management systems. Desktop publishing technology isolated us in a smooth, self-contained universe of toolbars, grid lines, and laser proofs. So, now that the reasons to get with this program, XML, loom large, how can we bridge this long-standing divide?
  • ...44 more annotations...
  • Using the Web as a Production Platform The answer, I think, is right in front of you. The bridge is the Web, a technology and platform that is fundamentally based on XML, and which many publishers are by now comfortably familiar with. Perhaps not entirely comfortably, but at least most publishers are already working with the Web; they already either know or have on staff people who understand it and can work with it. The foundation of our argument is this: rather than looking at jumping to XML in its full, industrial complexity, which seems to be what the O'Reilly-backed StartWithXML initiative[6] is suggesting, publishers instead leverage existing tools and technologies—starting with the Web—as a means of getting XML workflows in place. This means making small investments and working with known tools rather than spending tens of thousands of dollars on XML software and rarefied consultants. It means re-thinking how the existing pieces of the production toolchain fit together; re-thinking the existing roles of software components already in use. It means, fundamentally, taking the Web seriously as a content platform, rather than thinking of it as something you need to get content out to, somehow. If nothing else, the Web represents an opportunity to think about editorial and production from outside the shrink-wrapped Desktop Publishing paradigm.
  • Is the Web made of Real XML? At this point some predictable objections can be heard: wait a moment, the Web isn’t really made out of XML; the HTML that makes up most of the Web is at best the bastard child of SGML, and it is far too flaky/unstructured/underpowered to be taken seriously. We counter by arguing that although HTML on the Web exists in a staggering array of different incarnations, and that the majority of it is indeed an unstructured mess, this does not undermine the general principle that basic, ubiquitous Web technologies can make a solid platform for content management, editorial process, and production workflow.
  • With the advent of a published XML standard in the late 1990s came the W3C’s adoption of XHTML: the realization of the Web’s native content markup as a proper XML document type. Today, its acceptance is almost ubiquitous, even while the majority of actual content out there may not be strictly conforming. The more important point is that most contemporary Web software, from browsers to authoring tools to content management systems (from blogs to enterprise systems), are capable of working with clean, valid XHTML. Or, to put the argument the other way around, clean, valid XHTML content plays absolutely seamlessly with everything else on the Web.[7]
  • The objection which follows, then, will be that even if we grant that XHTML is a real XML document type, that it is underpowered for “serious” content because it is almost entirely presentation (formatting) oriented; it lacks any semantic depth. In XHTML, a paragraph is a paragraph is a paragraph, as opposed to a section or an epigraph or a summary.
  • In contrast, more “serious” XML document types like DocBook[8] or DITA-derived schemas[9] are capable of making semantic distinctions about content chunks at a fine level of granularity and with a high degree of specificity.
  • So there is an argument for recalling the 80:20 rule here. If XHTML can provide 80% of the value with just 20% of the investment, then what exactly is the business case for spending the other 80% to achieve that last 20% of value? We suspect the ratio is actually quite a bit steeper than 80:20 for most publishers.
  • Furthermore, just to get technical for a moment, XHTML is extensible in a fairly straightforward way, through the common “class” attribute on each element. Web developers have long leveraged this kind of extensibility in the elaboration of “microformats” for semantic-web applications.[10] There is no reason why publishers shouldn’t think to use XHTML’s simple extensibility in a similar way for their own ends.
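    A minimal sketch of what such class-based extension might look like; the class names below (epigraph, pull-quote) are illustrative assumptions, not a vocabulary defined in the article or in any microformat standard:

        <!-- Illustrative only: layering publisher-specific semantics onto plain XHTML
             via the class attribute, in the spirit of microformats. -->
        <div class="chapter">
          <p class="epigraph">Tell all the truth but tell it slant.</p>
          <p class="epigraph-attribution">Emily Dickinson</p>
          <p>An ordinary body paragraph needs no class at all.</p>
          <p class="pull-quote">Clean, valid XHTML plays seamlessly with the rest of the Web.</p>
        </div>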
  • XHTML, on the other hand, is supported by a vast array of quotidian software, starting with the ubiquitous Web browser. For this very reason, XHTML is in fact employed as a component part of several more specialized document types (ONIX and ePub among them).
  • Why re-invent a general-purpose prose representation when XHTML already does the job?
  • It is worth pausing for a moment to consider the role of XHTML in the ePub standard for ebook content. An ePub file is, anatomically, a simply disguised zip archive. Inside the zip archive are a few standard component parts: there are specialized files that declare metadata about the book, and about the format of the book. And then there is the book’s content, represented in XHTML. An ePub book is a Web page in a wrapper.
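    As a rough sketch of that anatomy (only mimetype and META-INF/container.xml are fixed by the ePub spec; the other paths and file names here are an assumed, typical layout):

        mimetype                   contains the string application/epub+zip
        META-INF/container.xml     points to the package document below
        OEBPS/content.opf          package document: metadata, manifest, spine
        OEBPS/toc.ncx              navigation / table of contents
        OEBPS/chapter01.xhtml      the book content itself, as XHTML
        OEBPS/styles.css           optional stylesheet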
  • To sum up the general argument: the Web as it already exists presents incredible value to publishers, as a platform for doing XML content management with existing (and often free) tools, and without having to go blindly into the unknown. At this point, we can offer a few design guidelines: prefer existing and/or ubiquitous tools over specialized ones wherever possible; prefer free software over proprietary systems where possible; prefer simple tools controlled and coordinated by human beings over fully automated (and therefore complex) systems; play to our strengths: use Web software for storing and managing content, use layout software for layout, and keep editors and production people in charge of their own domains.
  • Putting the Pieces Together: A Prototype
  • At the SFU Master of Publishing Program, we have been chipping away at this general line of thinking for a few years. Over that time, Web content management systems have been getting more and more sophisticated, all the while getting more streamlined and easier to use. (NB: if you have a blog, you have a Web content management system.) The Web is beginning to be recognized as a writing and editing environment used by millions of people. And the ways in which content is represented, stored, and exchanged online have become increasingly robust and standardized.
  • The missing piece of the puzzle has been print production: how can we move content from its malleable, fluid form on line into the kind of high-quality print production environments we’ve come to expect after two decades of Desktop Publishing?
  • Anyone who has tried to print Web content knows that the existing methods leave much to be desired (hyphenation and justification, for starters). In the absence of decent tools for this, most publishers quite naturally think of producing the print content first, and then think about how to get material onto the Web for various purposes. So we tend to export from Word, or from Adobe, as something of an afterthought.
  • While this sort of works, it isn’t elegant, and it completely ignores the considerable advantages of Web-based content management.
  • Content managed online is stored in one central location, accessible simultaneously to everyone in your firm, available anywhere you have an Internet connection, and usually exists in a much more fluid format than Word files. If only we could manage the editorial flow online, and then go to print formats at the end, instead of the other way around. At SFU, we made several attempts to make this work by way of the supposed “XML import” capabilities of various Desktop Publishing tools, without much success.[12]
  • In the winter of 2009, Adobe solved this part of the problem for us with the introduction of its Creative Suite 4. What CS4 offers is the option of a complete XML representation of an InDesign document: what Adobe calls IDML (InDesign Markup Language).
  • The IDML file format is—like ePub—a simply disguised zip archive that, when unpacked, reveals a cluster of XML files that represent all the different facets of an InDesign document: layout spreads, master pages, defined styles, colours, and of course, the content.
  • IDML is a well thought-out XML standard that achieves two very different goals simultaneously: it preserves all of the information that InDesign needs to do what it does; and it is broken up in a way that makes it possible for mere mortals (or at least our Master of Publishing students) to work with it.
  • What this represented to us in concrete terms was the ability to take Web-based content and move it into InDesign in a straightforward way, thus bridging Web and print production environments using existing tools and skillsets, with a little added help from free software.
  • We would take clean XHTML content, transform it to IDML-marked content, and merge that with nicely designed templates in InDesign.
  • The result is an almost push-button publication workflow, which results in a nice, familiar InDesign document that fits straight into the way publishers actually do production.
  • Tracing the steps To begin with, we worked backwards, moving the book content back to clean XHTML.
  • The simplest method for this conversion—and if you want to create Web content, this is an excellent route—was to use Adobe’s “Export to Digital Editions” option, which creates an ePub file.
  • Recall that ePub is just XHTML in a wrapper, so within the ePub file was a relatively clean XHTML document. It was somewhat cleaner (that is, the XHTML tagging was simpler and less cluttered) than InDesign’s other Web-oriented exports, possibly because Digital Editions is a well understood target, compared with somebody’s website.
  • In order to achieve our target of clean XHTML, we needed to do some editing; the XHTML produced by InDesign’s “Digital Editions” export was presentation-oriented. For instance, bulleted list items were tagged as paragraphs, with a class attribute identifying them as list items. Using the search-and-replace function, we converted such structures to proper XHTML list and list-item elements. Our guiding principle was to make the XHTML as straightforward as possible, not dependent on any particular software to interpret it.
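    A hypothetical before-and-after of that kind of cleanup (the class name bullet-item is invented here; InDesign's actual export classes vary):

        <!-- As exported: list items tagged as paragraphs with a class attribute -->
        <p class="bullet-item">First point</p>
        <p class="bullet-item">Second point</p>

        <!-- After cleanup: proper XHTML list structure, independent of any one tool -->
        <ul>
          <li>First point</li>
          <li>Second point</li>
        </ul>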
  • We broke the book’s content into individual chapter files; each chapter could then carry its own basic metadata, and the pages conveniently fit our Web content management system (which is actually just a wiki). We assembled a dynamically generated table of contents for the 12 chapters, and created a cover page. Essentially, the book was entirely Web-based at this point.
  • When the book chapters are viewed online, they are formatted via a CSS2 stylesheet that defines a main column for content as well as dedicating screen real estate for navigational elements. We then created a second template to render the content for exporting; this was essentially a bare-bones version of the book with no navigation and minimal styling. Pages (or even the entire book) can be exported (via the “Save As...” function in a Web browser) for use in either print production or ebook conversion. At this point, we required no skills beyond those of any decent Web designer.
  • Integrating with CS4 for Print Adobe’s IDML language defines elements specific to InDesign; there is nothing in the language that looks remotely like XHTML. So a mechanical transformation step is needed to convert the XHTML content into something InDesign can use. This is not as hard as it might seem.
  • Both XHTML and IDML are composed of straightforward, well-documented structures, and so transformation from one to the other is, as they say, “trivial.” We chose to use XSLT (Extensible Stylesheet Language Transformations) to do the work. XSLT is part of the overall XML family of specifications, and thus is very well supported in a wide variety of tools. Our prototype used a scripting engine called xsltproc, a nearly ubiquitous piece of software that we found already installed as part of Mac OS X (contemporary Linux distributions also have this as a standard tool), though any XSLT processor would work.
  • In other words, we don’t need to buy InCopy, because we just replaced it with the Web. Our wiki is now plugged directly into our InDesign layout. It even automatically updates the InDesign document when the content changes. Credit is due at this point to Adobe: this integration is possible because of the open file format in the Creative Suite 4.
  • We wrote an XSLT transformation script[18] that converted the XHTML content from the Web into an InCopy ICML file. The script itself is less than 500 lines long, and was written and debugged over a period of about a week by amateurs (again, the people named at the start of this article). The script runs in a couple of seconds, and the resulting .icml file can then be “placed” directly into an InDesign template. The ICML file references an InDesign stylesheet, so the template file can be set up with a house-styled layout, master pages, and stylesheet definitions for paragraphs and character ranges.
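    The authors' 500-line script is not reproduced in the article, but a heavily simplified sketch of the approach might map XHTML paragraphs to ICML paragraph ranges along these lines (the ICML wrapper elements and style names shown are assumptions; a real ICML file needs considerably more boilerplate):

        <?xml version="1.0" encoding="UTF-8"?>
        <!-- Simplified sketch only: one template per XHTML element of interest. -->
        <xsl:stylesheet version="1.0"
            xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
            xmlns:x="http://www.w3.org/1999/xhtml">

          <xsl:template match="/x:html">
            <Document DOMVersion="7.0">
              <xsl:apply-templates select="x:body//x:p"/>
            </Document>
          </xsl:template>

          <xsl:template match="x:p">
            <!-- The applied style must exist in the receiving InDesign template -->
            <ParagraphStyleRange AppliedParagraphStyle="ParagraphStyle/Body">
              <CharacterStyleRange>
                <Content><xsl:value-of select="."/></Content>
                <Br/>
              </CharacterStyleRange>
            </ParagraphStyleRange>
          </xsl:template>

        </xsl:stylesheet>

    With something like this in place, the conversion is a one-liner of the form xsltproc xhtml2icml.xsl chapter.xhtml > chapter.icml (file names invented for illustration), and the resulting .icml file is then placed into the InDesign template as described.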
  • Rather than a public-facing website, our system relies on the Web as a content management platform—of course a public face could easily be added.
  • It should be noted that the Book Publishing 1 proof-of-concept was artificially complex; we began with a book laid out in InDesign and ended up with a look-alike book laid out in InDesign. But next time—for instance, when we publish Book Publishing 2—we can begin the process with the content on the Web, and keep it there throughout the editorial process. The book’s content could potentially be written and edited entirely online, as Web content, and then automatically poured into an InDesign template at proof time. “Just in time,” as they say. This represents an entirely new way of thinking of book production. With a Web-first orientation, it makes little sense to think of the book as “in print” or “out of print”—the book is simply available, in the first place online; in the second place in derivative digital formats; and third, but really not much more difficult, in print-ready format, via the usual InDesign CS print production system publishers are already familiar with.
  • Creating Ebook Files Creating electronic versions from XHTML source is vastly simpler than trying to generate these out of the existing print process. The ePub version is extremely easy to generate; so is online marketing copy or excerpts for the Web, since the content begins life Web-native.
  • Since an ePub file is essentially XHTML content in a special wrapper, all that is required is that we properly “wrap” our XHTML content. Ideally, the content in an ePub file is broken into chapters (as ours was) and a table of contents file is generated in order to allow easy navigation within an ebook reader. We used Julian Smart’s free tool eCub[19] to simply and automatically generate the ePub wrapper and the table of contents. The only custom development we did was to create a CSS stylesheet for the ebook so that headings and paragraph indents looked the way we wanted. Starting with XHTML content, creating ePub is almost too easy.
  • today, we are able to put the process together using nothing but standard, relatively ubiquitous Web tools: the Web itself as an editing and content management environment, standard Web scripting tools for the conversion process, and the well-documented IDML file format to integrate the layout tool.
  • Our project demonstrates that Web technologies are indeed good enough to use in an XML-oriented workflow; more specialized and expensive options are not necessarily required. For massive-scale enterprise publishing, this approach may not offer enough flexibility, and the challenge of adding and extracting extra semantic richness may prove more trouble than it's worth.
  • But for smaller firms who are looking at the straightforward benefits of XML-based processes—single source publishing, online content and workflow management, open and accessible archive formats, greater online discoverability—here is a way forward.
  • The result is very simple and easy to use. Our demonstration requires that a production editor run the XSLT transformation script manually, but there is no reason why this couldn’t be built directly into the Web content management system so that exporting the content to print ran the transformation automatically. The resulting file would then be “placed” in InDesign and proofed.
  • The final piece of our puzzle, the ability to integrate print production, was made possible by Adobe's release of InDesign with an open XML file format. Since the Web's XHTML is also XML, it can be easily and confidently transformed to the InDesign format.
  • Such a workflow—beginning with the Web and exporting to print—is surely more in line with the way we will do business in the 21st century, where the Web is the default platform for reaching audiences, developing content, and putting the pieces together. It is time, we suggest, for publishers to re-orient their operations and start with the Web.
  • Using the Web as a Production Platform
  •  
    I was looking for an answer to a problem Marbux had presented, and found this interesting article.  The issue was that of the upcoming conversion of the Note Case Pro (NCP) layout engine to the WebKit layout engine, and what to do about the NCP document format. My initial reaction was to encode the legacy NCP document format in XML, and run an XSLT to a universal pivot format like TEI-XML.  From there, the TEI-XML community would provide all the XSLT transformation routines for conversion to ODF, OOXML, XHTML, ePUB and HTML/CSS. Researching the problems one might encounter with this approach, I found this article.  Fascinating stuff. My takeaway is that TEI-XML would not be as effective a "universal pivot point" as XHTML.  Or perhaps, if NCP really wants to get aggressive: IDML, the InDesign Markup Language. As an afterthought, I was thinking that an alternative title to this article might have been "Working with the Web as the Center of Everything".
Gary Edwards

HOW TO: Optimize Your Mobile Site Across Multiple Platforms - 0 views

  •  
    Great links to HTML5-CSS tools and tricks excerpt: 3. Use Multiple Stylesheets for Device Support Including a mobile-specific stylesheet on your main site with certain parameters that add or subtract features, based on what device is being used, can be an elegant and effective way to serve content across multiple devices. Dominique Hazael-Massieux wrote a great article for A List Apart last year that covers some of the basics and also links to some of the most common parameters for handheld support. Dave Shea included his own solution back in 2008 that is still pretty usable for lots of devices. More recently, Chris Coyier at CSS-Tricks discussed how to add in screen size and browser support via CSS or jQuery, and he includes his own downloadable examples. Dave Calhoun has some excellent suggestions in his series on mobile web development.
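    The mechanics behind the multiple-stylesheet suggestion are simple; a minimal sketch, with placeholder file names and breakpoints chosen only for illustration:

        <!-- Serve different CSS depending on the device and viewport -->
        <link rel="stylesheet" href="base.css" media="all" />
        <link rel="stylesheet" href="desktop.css" media="screen and (min-width: 641px)" />
        <link rel="stylesheet" href="mobile.css" media="only screen and (max-width: 640px)" />
        <!-- Older devices that only honor the legacy handheld media type -->
        <link rel="stylesheet" href="legacy-mobile.css" media="handheld" />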
Gary Edwards

Is Oracle Quietly Killing OpenOffice? | Revelations From An Unwashed Brain - 1 views

  •  
    Bingo!  Took five years, but finally someone gets it: excerpt:  Great question. After 10 years, OpenOffice hasn't had much traction in the enterprise - supported by under 10% of firms, and today it's facing more competition from online apps from Google and Zoho. I'm not counting OpenOffice completely out yet, however, since IBM has been making good progress on features with Symphony and Oracle is positioning OpenOffice for the web, desktop and mobile - a first. But barriers to OpenOffice and Web-based tools persist, and not just on a feature/function basis. Common barriers include: Third-party integration requirements. Some applications only work with Office. For example, one financial services firm I spoke with was forced to retain Office because its employees needed to work with Fiserv, a proprietary data center that is very Microsoft centric. "What was working pretty well was karate chopped." Another firm rolled out OpenOffice.org to 7,000 users and had to revert back 5,000 of them when they discovered one of the main apps they work with only supported Microsoft. User acceptance. Many firms say that they can overcome pretty much all of the technical issues but face challenges around user acceptance. One firm I spoke with went so far as to "customize" their OpenOffice solution with a Microsoft logo and told employees it was a version of Office. The implementation went smoothly. Others have said that they have met resistance from business users who didn't want Office taken off their desktop. Other strategies include providing OpenOffice to only new employees and to transition through attrition. But this can cause compatibility issues. Lack of seamless interoperability with Office. Just like third-party apps may only work with Office, many collaborative activities force use of particular versions of Office. Today's Web-based and OpenOffice solutions do not provide seamless round tripping between Office and their applications. Corel, with its
Gary Edwards

Key Google Docs changes promise faster service | Relevant Results - CNET News - 0 views

  •  
    Jonathan Rochelle and Dave Girouard: Google's long-term vision of computing is based around the notion that the Web and the browser become the primary vehicles for applications, and Google Docs is an important part of realizing that vision. The main improvement was to create a common infrastructure across the Google Docs products, all of which came into Google from separate acquisitions, Rochelle said. This has paved the way for Google to offer users a chance to do character-by-character real-time editing of a document or spreadsheet, almost the same way Google Wave lets collaborators see each other's keystrokes in a Wave. Those changes have also allowed Google to take more control of the way documents are rendered and formatted in Google Docs, instead of passing the buck to the browser to make those decisions. This allows Google to ensure that documents will look the same on the desktop or in the cloud, an important consideration for designing marketing materials or reviewing architectural blueprints, for example.
Gary Edwards

Is WiMAX or LTE the better 4G choice? - 0 views

  •  
    WiMAX (worldwide interoperability for microwave access) is a fourth-generation (4G) telecommunications technology primarily for fast broadband. Also a 4G mobile technology, LTE allows a peak download speed of 100 megabits per second (Mbps) on mobile phones, compared with 20Mbps for 3G and 40Mbps for WiMAX. "For operators, the choice of technology depends on a number of things including available spectrum, legacy inter-working, timing and business focus," says Nokia Siemens Networks head of sub region, Asia South, Lars Biese. To deploy either technology, operators will have to commit tens of billions of dollars in network upgrades for the new mobility landscape, which now includes social, video, location-based and entertainment applications and experiences. Wing K. Lee says WiMAX and LTE are more similar than different. Some argue that LTE is the next step for mobile networks like GSM, WCDMA/HSPA and CDMA in the move to future networks and services. The common belief is that the natural migration path is from 2G to GPRS, from GPRS to 3G, and from 3G to LTE. But IDC Asia/Pacific's telecom research director Bill Rojas has a differing view. To him, LTE is a totally new set-up. It has been reported that LTE's main advantage over WiMAX, in addition to speed, is that it is part of the popular GSM technology and can allow backward compatibility with both 2G and 3G networks. A point many dispute.  The new Sprint EVO is a 4G smartphone with chipsets for 2G, 3G, 3G enhanced, and 4G WiMAX.  Sprint argues that LTE is just another chipset away.
Paul Merrell

MPEG-LA Considering Patent Pool for VP8/WebM | John Paczkowski | Digital Daily | AllThi... - 0 views

  • A new era of Web video without the patent-encumbered formats that have defined the Internet to date. That seems ideal. But like many ideals, it may prove to be unattainable. As a number of observers have already noted VP8 isn’t free from patent liability. And now that Google has open-sourced it as part of WebM, that liability is likely to become an issue. And quickly, too. Indeed, Larry Horn, CEO of MPEG LA, the consortium that controls the AVC/H.264 video standard, tells me that the group is already looking at creating a patent pool license for VP8.
  • It would seem, then, that VP8 may end up subject to the same licensing issues as H.264. If MPEG LA does create a patent pool license for the standard, the free lunch Google promised yesterday may not be free after all.
Gary Edwards

WYMeditor - web-based XHTML editor - Home - 2 views

  •  
    WYMeditor is a web-based WYSIWYM (What You See Is What You Mean) XHTML editor (not WYSIWYG). WYMeditor's main concept is to leave aside the details of the document's visual layout and to concentrate on its structure and meaning, while trying to give the user as much comfort as possible (at least as much as WYSIWYG editors do). WYMeditor has been created to generate perfectly structured XHTML strict code, to conform to the W3C XHTML specifications and to facilitate further processing by modern applications. With WYMeditor, the code can't be contaminated by visual information like font styles and weights, borders, colors, and so on. The end-user defines content meaning, which will determine its appearance through the use of style sheets. The result is easy and quick maintenance of information. As the code is compliant with the W3C XHTML specifications, you can, for example, process it using XSLT (on the client or the server side), giving you a wide range of applications. ...................... Great colors on this Web site!  They have mastered the many shades of Uncle Ten's (the Chinese Brush Master, James Liu) charcoal blue
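    To make the WYSIWYM distinction concrete, here is an assumed (not captured) comparison: a WYSIWYG editor tends to record appearance, while WYMeditor-style output records structure and leaves appearance to the stylesheet. The class name is illustrative.

        <!-- Appearance-oriented markup a WYSIWYG editor might emit -->
        <p><font size="5"><b>Quarterly results</b></font></p>

        <!-- Structure-oriented XHTML of the kind WYMeditor aims for; styling lives in CSS -->
        <h2 class="report-heading">Quarterly results</h2>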
Gary Edwards

EU Cyber Agency ENISA Issues Governmental Cloud Report | WHIR Web Hosting Industry News - 0 views

  •  
    The EU's cyber security agency ENISA (www.enisa.europa.eu) announced this week it has released a new report on governmental cloud computing. The report, which can be downloaded now on the ENISA website, is targeted at senior managers of public bodies who have to make a security and resilience decision about migrating to the cloud, if at all. The main goal of the report is to support governmental bodies in taking informed, risk-based decisions relating to the security of data, resilience of service and legal compliance on moving to the cloud. ENISA concludes that private and community clouds appear to be the solutions that best meet the needs of public administrations if they need to achieve the highest level of data governance. The report makes several recommendations to governments and public bodies, including that national governments and the EU institutions should investigate the concept of an EU governmental cloud. The report also argues that cloud computing will soon serve a significant portion of EU citizens, SMEs and public administrations, and therefore national governments should prepare a cloud computing strategy and study the role that cloud computing will play for critical information infrastructure protection. Finally, the report states that a national cloud computing strategy should address the effects of national/supra-national interoperability and interdependencies, cascading failures, and include cloud providers in the reporting schemes of articles 4 and 13 of the new Telecom Framework Directive. Download report:  http://www.enisa.europa.eu/act/rm/emerging-and-future-risk/deliverables/security-and-resilience-in-governmental-clouds/
Gary Edwards

Google's Enterprise Vision: Mobile First, In the Cloud - 0 views

  •  
    Google "Innovation Nation" Conference in Washington, DC had an interesting conversation thread; that the move to Cloud Computing embraces a move for individual productivity to group productivity.  Not sure i agree with that.  The Windows Desktop-WorkGroup Productivity environment has dominated business since 1992.  Maybe Google might instead focus on the limited access of desktop workgroups and the fact that productivity was horribly crippled by the the PC's lack of communication.  The Web/Cloud magically combines and integrates communication with content and computation.  This is what makes cloud collaboration a genuine leap in productivity - no matter what the discipline.  Here's a question for Google: What's the productivity difference between desktop collaboration and cloud-collaboration? excerpt:  The meeting is the staple of corporate life. The whole day revolves around when a meeting will be, who will be there and what needs to be discussed. Yet, is this rote practice may have become counter-productive in today's world of the always on, always connected workplace. Google's enterprise vision is to leverage mobility and the cloud to change the fundamental way people work. Workforce productivity used to be about how you can optimize individual output. Take all those individuals, put their output together and have a meeting to sort it all out. Google thinks that by putting all that functionality into a cloud environment, workers can use whatever device they want and always be working as a group towards on the mission. A faster, more secure, more cost efficient workplace will be the result. "The main message is that to be an effective [enterprise], we need to change from individual productivity to group productivity,"
Gary Edwards

Shine on Silverlight and Windows with XAML * The Register : Tim Anderson - 0 views

  •  
    Excellent explanation and review from Tim Anderson. I wonder how I missed this? Here is the summary statement: "..... You can also extend XAML with custom objects. The main requirement is that classes used in XAML must have a parameterless constructor. The procedure is straightforward. Define a class; make sure your application has a reference to the assembly containing the class; then add a namespace declaration for the assembly. You can then define elements in XAML that map to your class, and at runtime these will become object instances. XAML has a curious story when it comes to formatted text, especially in Silverlight. In one sense it is rather limited. XAML has no understanding of common formats such as HTML, CSS or RTF, let alone the fancy new OOXML. Silverlight developers have to interact with the browser DOM in order to display HTML." "... No escaping it: Silverlight .XAP bundle preserves the original XAML. That said, XAML with WPF actually is a document format. The full WPF has an element called FlowDocument and rich formatting capabilities. Silverlight lacks FlowDocument, but does have a TextBlock with basic formatting options via the inline object. It also supports the Glyph element. This is interesting because it is the core element in XPS, Microsoft's invented-here alternative to Adobe's PDF." ".... XPS uses a subset of XAML to describe fixed layouts. In consequence, and with some compromises, you can use Silverlight to display XPS." "..... The bottom line is that XAML is a way of programming .NET declaratively. Its more intricate features improve the mapping between XAML and .NET. The result is we have design tools like Microsoft's Expression Blend and a clean separation between UI objects and program code, which is a considerable achievement." ".... As ever there's a downside, and with Microsoft it's the classic: this is thoroughly proprietary, and the schema issues make it difficult to validate with standard XML tools." No
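    A compact sketch of the extension mechanism the review describes, for Silverlight-style XAML; the assembly, class and property names (MyApp, RatingStar, Points) are invented for illustration, and the mapped class would need a parameterless constructor:

        <!-- Map a CLR namespace/assembly into XAML, then use a custom class as an element -->
        <UserControl x:Class="MyApp.Page"
            xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
            xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
            xmlns:local="clr-namespace:MyApp.Controls;assembly=MyApp">
          <StackPanel>
            <TextBlock Text="Basic formatted text goes through TextBlock, as the review notes"/>
            <!-- Becomes an instance of MyApp.Controls.RatingStar at runtime -->
            <local:RatingStar Points="5"/>
          </StackPanel>
        </UserControl>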
Paul Merrell

Gov. Mills signs nation's strictest internet privacy protection bill - Portland Press H... - 0 views

  • Maine internet service providers will face the strictest consumer privacy protections in the nation under a bill signed Thursday by Gov. Janet Mills, but the new law will almost certainly be challenged in court. Several technology and communication trade groups warned in testimony before the Legislature that the measure may be in conflict with federal law and would likely be the subject of legal action.
  • The new law, which goes into effect on July 1, 2020, would require providers to ask for permission before they sell or share any of their customers’ data to a third party. The law would also apply to telecommunications companies that provide access to the internet via their cellular networks.
  • The law is modeled on a Federal Communications Commission rule, adopted under the administration of President Obama but overturned by the administration of President Trump in 2017. The rule blocked an ISP from selling a customer’s personal data, which is not prohibited under federal law.
  • ...1 more annotation...
  • The law is unlike any in the nation, as it requires an ISP to obtain consent from a consumer before sharing any data. Only California has a similar law on the books, but it requires consumers to “opt out”  by asking their ISP to protect their data. Maine’s new law does not allow an ISP to offer a discounted rate to customers who agree to share or sell their data.
Paul Merrell

European Commission publishes guidance on new data protection rules - nsnbc internation... - 0 views

  • The European Commission, on January 24, published its guidance aimed to facilitate a direct and smooth application of the European Union’s new data protection rules across the EU as of 25 May. The Commission also launches a new online tool dedicated to SMEs.
  • With just over 100 days left before the application of the new law, the guidance outlines what the European Commission, national data protection authorities and national administrations, according to the Commission, should still do to bring the preparation to a successful completion. The Commission notes that while the new regulation provides for a single set of rules directly applicable in all Member States, it will still require significant adjustments in certain aspects, like amending existing laws by EU governments or setting up the European Data Protection Board by data protection authorities. The Commission states that the guidance recalls the main innovations and opportunities opened up by the new rules, takes stock of the preparatory work already undertaken and outlines the work still ahead of the European Commission, national data protection authorities and national administrations. Andrus Ansip, European Commission Vice-President for the Digital Single Market, said: “Our digital future can only be built on trust. Everyone’s privacy has to be protected. Strengthened EU data protection rules will become a reality on 25 May. It is a major step forward and we are committed to making it a success for everyone.” Vĕra Jourová, Commissioner for Justice, Consumers and Gender Equality, added: “In today’s world, the way we handle data will determine to a large extent our economic future and personal safety. We need modern rules to respond to new risks, so we call on EU governments, authorities and businesses to use the remaining time efficiently and fulfil their roles in the preparations for the big day.”
  • The guidance recalls the main elements of the new data protection rules: One set of rules across the continent, guaranteeing legal certainty for businesses and the same data protection level across the EU for citizens. Same rules apply to all companies offering services in the EU, even if these companies are based outside the EU. Stronger and new rights for citizens: the right to information, access and the right to be forgotten are strengthened. A new right to data portability allows citizens to move their data from one company to the other. This will give companies new business opportunities. Stronger protection against data breaches: a company experiencing a data breach, which put individuals at risk, has to notify the data protection authority within 72 hours. Rules with teeth and deterrent fines: all data protection authorities will have the power to impose fines for up to EUR 20 million or, in the case of a company, 4% of the worldwide annual turnover.
Paul Merrell

The Little-Known Company That Enables Worldwide Mass Surveillance - 0 views

  • It was a powerful piece of technology created for an important customer. The Medusa system, named after the mythical Greek monster with snakes instead of hair, had one main purpose: to vacuum up vast quantities of internet data at an astonishing speed. The technology was designed by Endace, a little-known New Zealand company. And the important customer was the British electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. Dozens of internal documents and emails from Endace, obtained by The Intercept and reported in cooperation with Television New Zealand, reveal the firm’s key role helping governments across the world harvest vast amounts of information on people’s private emails, online chats, social media conversations, and internet browsing histories.
Paul Merrell

Venezuelan Intelligence Services Arrest Credicard Directors - nsnbc international | nsn... - 0 views

  • Venezuelan President Nicolas Maduro confirmed Saturday that the state intelligence service SEBIN arrested several directors from the Credicard financial transaction company on Friday night. 
  • The financial consortium is accused of having deliberately taken advantage of a series of cyber attacks on state internet provider CANTV Friday to paralyse its online payment platform–responsible for the majority of the country’s accredited financial transactions, according to its website. “We have proof that it was a deliberate act what Credicard did yesterday. Right now the main people responsible for Credicard are under arrest,” confirmed the president. The government says that millions of attempted purchases using in-store credit and debit card payment machines provided by the company were interrupted after its platform went down for the most part of the day. Authorities also maintain that the company waited longer than the established protocol of one hour before responding to the issues.
  • According to CANTV President Manuel Fernandez, Venezuela’s internet platform suffered at least three attacks from an external source on Friday, one of which was aimed at state oil company PDVSA. CANTV was notified of the attacks by international provider LANautilus, which belongs to Telecom Italia. Nonetheless, Fernandez denied that Credicard’s platform was affected by the interferences to CANTV’s service, underscoring that other financial transaction companies that rely on the state enterprise continued to be operative.
  • ...1 more annotation...
  • On Friday SEBIN Director Gustavo Gonzalez Lopez also openly accused members of the rightwing coalition, the Democratic Unity Roundtable (MUD), of being implicated in the incident. “Members of the MUD involved in the attack on electronic banking service,” he tweeted. “The financial war continues inside and outside the country, internally they are damaging banking operability,” he added. Venezuelan news source La Iguana has reported that the server administrator of Credicard is the company Dayco Host, which belongs to the D’Agostino family. Diana D’Agostino is married to veteran opposition politician Henry Ramos Allup, president of the National Assembly. On Saturday, the government-promoted Productive Economy Council held an extraordinary meeting of political and business representatives to reject the attack on the country’s financial system.
Paul Merrell

YouTube gets the yuck out in comments cleanup | Internet & Media - CNET News - 0 views

  • Laugh all you want, fuzzball, but Google is changing how YouTube uploaders manage comments on their videos. The new system, which began rolling out to a limited number of uploaders on Tuesday, favors relevancy over recency and introduces enhanced moderation tools. The new commenting system, which is powered by Google+ and was developed in collaboration between the YouTube and Google+ teams, provides several new tools for moderation, said Nundu Janakiram, product manager at YouTube. It will default to showing YouTube viewers the most relevant comments first, such as those by the video uploader or channel owner. "Currently, you see comments from the last random person to stop by," Janakiram said. "The new system tries to surface the most meaningful conversation to you. We're trying to shift from comments to meaningful conversations," he said.
  • He explained that three main factors determine which comments are more relevant: community engagement by the commenter, up-votes for a particular comment, and commenter reputation. If you've been flagged for spam or abuse, don't be surprised to find your comments buried, but that also means that celebrities who have strong Google+ reputations will be boosted above others. There's more to the system than just relevancy, though. Because the system is powered by Google+, comments made on posts with YouTube links in the social network will show up on YouTube itself. So, you'll see comments from people in your Google+ Circles higher up, too. Just because it's powered by Google+ doesn't mean that you'll lose your YouTube identity, though. "You are still allowed to use pseudonyms," said Janakiram, whether you're "a Syrian dissident or SoulPancake". Another feature, and one that speaks directly to YouTube's goal of fostering conversations, is that you'll be able to comment publicly or privately to people in your Circles. Replies will be threaded like Gmail. The hope is that new moderation tools will make it easier for video owners to guide the conversation, Janakiram explained. "There have been challenges in the past with certain comments and what's been shown there."
Paul Merrell

Exclusive: Google mulling Wi-Fi for cities with Google Fiber - Network World - 0 views

  • Google is considering deploying Wi-Fi networks in towns and cities covered by its Google Fiber high-speed Internet service. The disclosure is made in a document Google is circulating to 34 cities that are the next candidates to receive Google Fiber in 2015.
  • Specific details of the Wi-Fi plan are not included in the document, which was seen by IDG News Service, but Google says it will be "discussing our Wi-Fi plans and related requirements with your city as we move forward with your city during this planning process."
  • Google Fiber is already available in Provo, Utah, and Kansas City, and is promised soon in Austin, Texas. It delivers a "basic speed" service for no charge, a gigabit-per-second service for US$70 per month and a $120 package that includes a bundle of more than 200 TV channels. Installation costs between nothing and $300. Google has sent the 34 cities that are next in line for Google Fiber a detailed request for information and they have until May 1 to reply.
  • ...1 more annotation...
  • Google is also asking cities to identify locations it would be able to install utility huts. Each 12-foot-by-30-foot (3.6-meter-by-9.1-meter) windowless hut needs to allow 24-hour access and be on land Google could lease for about 20 years. The huts, of which there will be between one and a handful in each city, would house the main networking equipment. From the hut, fiber cables would run along utility poles -- or in underground fiber ducts if they exist -- and terminate at neighborhood boxes, each serving up to 288 or 587 homes. The neighborhood boxes are around the same size or smaller than current utility cabinets often found on city streets.
Paul Merrell

Internet Giants Erect Barriers to Spy Agencies - NYTimes.com - 0 views

  • As fast as it can, Google is sealing up cracks in its systems that Edward J. Snowden revealed the N.S.A. had brilliantly exploited. It is encrypting more data as it moves among its servers and helping customers encode their own emails. Facebook, Microsoft and Yahoo are taking similar steps.
  • After years of cooperating with the government, the immediate goal now is to thwart Washington — as well as Beijing and Moscow. The strategy is also intended to preserve business overseas in places like Brazil and Germany that have threatened to entrust data only to local providers. Google, for example, is laying its own fiber optic cable under the world’s oceans, a project that began as an effort to cut costs and extend its influence, but now has an added purpose: to assure that the company will have more control over the movement of its customer data.
  • A year after Mr. Snowden’s revelations, the era of quiet cooperation is over. Telecommunications companies say they are denying requests to volunteer data not covered by existing law. A.T.&T., Verizon and others say that compared with a year ago, they are far more reluctant to cooperate with the United States government in “gray areas” where there is no explicit requirement for a legal warrant.
  • ...8 more annotations...
  • Eric Grosse, Google’s security chief, suggested in an interview that the N.S.A.'s own behavior invited the new arms race. “I am willing to help on the purely defensive side of things,” he said, referring to Washington’s efforts to enlist Silicon Valley in cybersecurity efforts. “But signals intercept is totally off the table,” he said, referring to national intelligence gathering. “No hard feelings, but my job is to make their job hard,” he added.
  • In Washington, officials acknowledge that covert programs are now far harder to execute because American technology companies, fearful of losing international business, are hardening their networks and saying no to requests for the kind of help they once quietly provided. Robert S. Litt, the general counsel of the Office of the Director of National Intelligence, which oversees all 17 American spy agencies, said on Wednesday that it was “an unquestionable loss for our nation that companies are losing the willingness to cooperate legally and voluntarily” with American spy agencies.
  • Many point to an episode in 2012, when Russian security researchers uncovered a state espionage tool, Flame, on Iranian computers. Flame, like the Stuxnet worm, is believed to have been produced at least in part by American intelligence agencies. It was created by exploiting a previously unknown flaw in Microsoft’s operating systems. Companies argue that others could have later taken advantage of this defect. Worried that such an episode undercuts confidence in its wares, Microsoft is now fully encrypting all its products, including Hotmail and Outlook.com, by the end of this year with 2,048-bit encryption, a stronger protection that would take a government far longer to crack. The software is protected by encryption both when it is in data centers and when data is being sent over the Internet, said Bradford L. Smith, the company’s general counsel.
  • Mr. Smith also said the company was setting up “transparency centers” abroad so that technical experts of foreign governments could come in and inspect Microsoft’s proprietary source code. That will allow foreign governments to check to make sure there are no “back doors” that would permit snooping by United States intelligence agencies. The first such center is being set up in Brussels. Microsoft has also pushed back harder in court. In a Seattle case, the government issued a “national security letter” to compel Microsoft to turn over data about a customer, along with a gag order to prevent Microsoft from telling the customer it had been compelled to provide its communications to government officials. Microsoft challenged the gag order as violating the First Amendment. The government backed down.
  • Hardware firms like Cisco, which makes routers and switches, have found their products a frequent subject of Mr. Snowden’s disclosures, and their business has declined steadily in places like Asia, Brazil and Europe over the last year. The company is still struggling to convince foreign customers that their networks are safe from hackers — and free of “back doors” installed by the N.S.A. The frustration, companies here say, is that it is nearly impossible to prove that their systems are N.S.A.-proof.
  • In one slide from the disclosures, N.S.A. analysts pointed to a sweet spot inside Google’s data centers, where they could catch traffic in unencrypted form. Next to a quickly drawn smiley face, an N.S.A. analyst, referring to an acronym for a common layer of protection, had noted, “SSL added and removed here!”
  • Facebook and Yahoo have also been encrypting traffic among their internal servers. And Facebook, Google and Microsoft have been moving to more strongly encrypt consumer traffic with so-called Perfect Forward Secrecy, specifically devised to make it more labor intensive for the N.S.A. or anyone to read stored encrypted communications. One of the biggest indirect consequences from the Snowden revelations, technology executives say, has been the surge in demands from foreign governments that saw what kind of access to user information the N.S.A. received — voluntarily or surreptitiously. Now they want the same.
  • The latest move in the war between intelligence agencies and technology companies arrived this week, in the form of a new Google encryption tool. The company released a user-friendly, email encryption method to replace the clunky and often mistake-prone encryption schemes the N.S.A. has readily exploited.But the best part of the tool was buried in Google’s code, which included a jab at the N.S.A.'s smiley-face slide. The code included the phrase: “ssl-added-and-removed-here-; - )”