

Protocloud Technologies

16 Major Benefits of Custom Magento Development for eCommerce Store - 1 views

  •  
    Custom Magento development has several advantages that set it apart from other eCommerce platforms. The top 16 benefits of Magento eCommerce website development are listed here.
Gary Edwards

XML Production Workflows? Start with the Web and XHTML - 1 views

  • Challenges: Some Ugly Truths The challenges of building—and living with—an XML workflow are clear enough. The return on investment is a long-term proposition. Regardless of the benefits XML may provide, the starting reality is that it represents a very different way of doing things than the one we are familiar with. The Word Processing and Desktop Publishing paradigm, based on the promise of onscreen, WYSIWYG layout, is so dominant as to be practically inescapable. It has proven really hard to get from here to there, no matter how attractive XML might be on paper. A considerable amount of organizational effort and labour must be expended up front in order to realize the benefits. This is why XML is often referred to as an “investment”: you sink a bunch of time and money up front, and realize the benefits—greater flexibility, multiple output options, searching and indexing, and general futureproofing—later, over the long haul. It is not a short-term return proposition. And, of course, the returns you are able to realize from your XML investment are commensurate with what you put in up front: fine-grained, semantically rich tagging is going to give you more potential for searchability and recombination than a looser, more general-purpose approach, but it sure costs more. For instance, the Text Encoding Initiative (TEI) is the grand example of pouring enormous amounts of energy into the up-front tagging, with a very open-ended set of possibilities down the line. TEI helpfully defines a level to which most of us do not have to aspire.[5] But understanding this on a theoretical level is only part of the challenge. There are many practical issues that must be addressed. Software and labour are two of the most critical. How do you get the content into XML in the first place? Unfortunately, despite two decades of people doing SGML and XML, this remains an ugly question.
  • Practical Challenges In 2009, there is still no truly likeable—let alone standard—editing and authoring software for XML. For many (myself included), the high-water mark here was Adobe’s FrameMaker, substantially developed by the late 1990s. With no substantial market for it, it is relegated today mostly to the tech writing industry, unavailable for the Mac, and just far enough afield from the kinds of tools we use today that its adoption represents a significant hurdle. And FrameMaker was the best of the breed; most of the other software in decent circulation are programmers’ tools—the sort of things that, as Michael Tamblyn pointed out, encourage editors to drink at their desks. The labour question represents a stumbling block as well. The skill-sets and mind-sets that effective XML editors need have limited overlap with those needed by literary and more traditional production editors. The need to think of documents as machine-readable databases is not something that comes naturally to folks steeped in literary culture. In combination with the sheer time and effort that rich tagging requires, many publishers simply outsource the tagging to India, drawing a division of labour that spans oceans, to put it mildly. Once you have XML content, then what do you do with it? How do you produce books from it? Presumably, you need to be able to produce print output as well as digital formats. But while the latter are new enough to be generally XML-friendly (e-book formats being largely XML based, for instance), there aren’t any straightforward, standard ways of moving XML content into the kind of print production environments we are used to seeing. This isn’t to say that there aren’t ways of getting print—even very high-quality print—output from XML, just that most of them involve replacing your prepress staff with Java programmers.
  • Why does this have to be so hard? It’s not that XML is new, or immature, or untested. Remember that the basics have been around, and in production, since the early 1980s at least. But we have to take account of a substantial and long-running cultural disconnect between traditional editorial and production processes (the ones most of us know intimately) and the ways computing people have approached things. Interestingly, this cultural divide looked rather different in the 1970s, when publishers were looking at how to move to digital typesetting. Back then, printers and software developers could speak the same language. But that was before the ascendancy of the Desktop Publishing paradigm, which computerized the publishing industry while at the same time isolating it culturally. Those of us who learned how to do things the Quark way or the Adobe way had little in common with people who programmed databases or document-management systems. Desktop publishing technology isolated us in a smooth, self-contained universe of toolbars, grid lines, and laser proofs. So, now that the reasons to get with this program, XML, loom large, how can we bridge this long-standing divide?
  • ...44 more annotations...
  • Using the Web as a Production Platform The answer, I think, is right in front of you. The bridge is the Web, a technology and platform that is fundamentally based on XML, and which many publishers are by now comfortably familiar with. Perhaps not entirely comfortably, but at least most publishers are already working with the Web; they already either know or have on staff people who understand it and can work with it. The foundation of our argument is this: rather than looking at jumping to XML in its full, industrial complexity, which seems to be what the O'Reilly-backed StartWithXML initiative[6] is suggesting, publishers instead leverage existing tools and technologies—starting with the Web—as a means of getting XML workflows in place. This means making small investments and working with known tools rather than spending tens of thousands of dollars on XML software and rarefied consultants. It means re-thinking how the existing pieces of the production toolchain fit together; re-thinking the existing roles of software components already in use. It means, fundamentally, taking the Web seriously as a content platform, rather than thinking of it as something you need to get content out to, somehow. If nothing else, the Web represents an opportunity to think about editorial and production from outside the shrink-wrapped Desktop Publishing paradigm.
  • Is the Web made of Real XML? At this point some predictable objections can be heard: wait a moment, the Web isn’t really made out of XML; the HTML that makes up most of the Web is at best the bastard child of SGML, and it is far too flaky/unstructured/underpowered to be taken seriously. We counter by arguing that although HTML on the Web exists in a staggering array of different incarnations, and that the majority of it is indeed an unstructured mess, this does not undermine the general principle that basic, ubiquitous Web technologies can make a solid platform for content management, editorial process, and production workflow.
  • With the advent of a published XML standard in the late 1990s came the W3C’s adoption of XHTML: the realization of the Web’s native content markup as a proper XML document type. Today, its acceptance is almost ubiquitous, even while the majority of actual content out there may not be strictly conforming. The more important point is that most contemporary Web software, from browsers to authoring tools to content management systems (from blogs to enterprise systems), is capable of working with clean, valid XHTML. Or, to put the argument the other way around, clean, valid XHTML content plays absolutely seamlessly with everything else on the Web.[7]
  • The objection which follows, then, will be that even if we grant that XHTML is a real XML document type, it is underpowered for “serious” content because it is almost entirely presentation (formatting) oriented; it lacks any semantic depth. In XHTML, a paragraph is a paragraph is a paragraph, as opposed to a section or an epigraph or a summary.
  • In contrast, more “serious” XML document types like DocBook[8] or DITA-derived schemas[9] are capable of making semantic distinctions about content chunks at a fine level of granularity and with a high degree of specificity.
  • So there is an argument for recalling the 80:20 rule here. If XHTML can provide 80% of the value with just 20% of the investment, then what exactly is the business case for spending the other 80% to achieve that last 20% of value? We suspect the ratio is actually quite a bit steeper than 80:20 for most publishers.
  • Furthermore, just to get technical for a moment, XHTML is extensible in a fairly straightforward way, through the common “class” attribute on each element. Web developers have long leveraged this kind of extensibility in the elaboration of “microformats” for semantic-web applications.[10] There is no reason why publishers shouldn’t think to use XHTML’s simple extensibility in a similar way for their own ends.
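The class-attribute extensibility described above can be sketched in a few lines of Python. This is an illustrative example, not part of the article: the class names (epigraph, summary) and the markup are hypothetical stand-ins for whatever semantics a publisher might define.

```python
import xml.etree.ElementTree as ET

# Hypothetical snippet: ordinary XHTML paragraphs, with the "class"
# attribute carrying publisher-defined semantics (epigraph, summary).
xhtml = """<div xmlns="http://www.w3.org/1999/xhtml">
  <p class="epigraph">All the world's a stage.</p>
  <p>Ordinary body text.</p>
  <p class="summary">This chapter argues that...</p>
</div>"""

NS = "{http://www.w3.org/1999/xhtml}"
root = ET.fromstring(xhtml)

# Pull out only the semantically tagged chunks, keyed by class name.
semantic = {p.get("class"): p.text
            for p in root.iter(NS + "p") if p.get("class")}
print(semantic)
```

Any downstream tool that understands XHTML can still render these paragraphs normally; only tools that care about the extra semantics need to look at the class attribute.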
  • XHTML, on the other hand, is supported by a vast array of quotidian software, starting with the ubiquitous Web browser. For this very reason, XHTML is in fact employed as a component part of several more specialized document types (ONIX and ePub among them).
  • Why re-invent a general-purpose prose representation when XHTML already does the job?
  • It is worth pausing for a moment to consider the role of XHTML in the ePub standard for ebook content. An ePub file is, anatomically, a simply disguised zip archive. Inside the zip archive are a few standard component parts: there are specialized files that declare metadata about the book, and about the format of the book. And then there is the book’s content, represented in XHTML. An ePub book is a Web page in a wrapper.
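The anatomy described above is easy to verify with nothing but a zip library. The following sketch builds a toy archive with the kinds of component parts an ePub contains; the member names and contents are illustrative placeholders, not a conforming ePub.

```python
import io
import zipfile

# Build a toy archive in memory with the component parts an ePub
# contains; file names and contents here are illustrative only.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("mimetype", "application/epub+zip")          # format declaration
    z.writestr("META-INF/container.xml", "<container/>")    # points to the package file
    z.writestr("OEBPS/content.opf", "<package/>")           # book metadata and manifest
    z.writestr("OEBPS/chapter1.xhtml", "<html/>")           # the content itself: XHTML

# "Unwrapping" the book is just listing the zip members.
print(zipfile.ZipFile(buf).namelist())
```

Renaming a real .epub file to .zip and opening it shows the same picture: a little metadata, and the book itself as XHTML.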
  • To sum up the general argument: the Web as it already exists presents incredible value to publishers, as a platform for doing XML content management with existing (and often free) tools, and without having to go blindly into the unknown. At this point, we can offer a few design guidelines: prefer existing and/or ubiquitous tools over specialized ones wherever possible; prefer free software over proprietary systems where possible; prefer simple tools controlled and coordinated by human beings over fully automated (and therefore complex) systems; play to our strengths: use Web software for storing and managing content, use layout software for layout, and keep editors and production people in charge of their own domains.
  • Putting the Pieces Together: A Prototype
  • At the SFU Master of Publishing Program, we have been chipping away at this general line of thinking for a few years. Over that time, Web content management systems have been getting more and more sophisticated, all the while getting more streamlined and easier to use. (NB: if you have a blog, you have a Web content management system.) The Web is beginning to be recognized as a writing and editing environment used by millions of people. And the ways in which content is represented, stored, and exchanged online have become increasingly robust and standardized.
  • The missing piece of the puzzle has been print production: how can we move content from its malleable, fluid form on line into the kind of high-quality print production environments we’ve come to expect after two decades of Desktop Publishing?
  • Anyone who has tried to print Web content knows that the existing methods leave much to be desired (hyphenation and justification, for starters). In the absence of decent tools for this, most publishers quite naturally think of producing the print content first, and then think about how to get material onto the Web for various purposes. So we tend to export from Word, or from Adobe, as something of an afterthought.
  • While this sort of works, it isn’t elegant, and it completely ignores the considerable advantages of Web-based content management.
  • Content managed online is stored in one central location, accessible simultaneously to everyone in your firm, available anywhere you have an Internet connection, and usually exists in a much more fluid format than Word files. If only we could manage the editorial flow online, and then go to print formats at the end, instead of the other way around. At SFU, we made several attempts to make this work by way of the supposed “XML import” capabilities of various Desktop Publishing tools, without much success.[12]
  • In the winter of 2009, Adobe solved this part of the problem for us with the introduction of its Creative Suite 4. What CS4 offers is the option of a complete XML representation of an InDesign document: what Adobe calls IDML (InDesign Markup Language).
  • The IDML file format is—like ePub—a simply disguised zip archive that, when unpacked, reveals a cluster of XML files that represent all the different facets of an InDesign document: layout spreads, master pages, defined styles, colours, and of course, the content.
  • IDML is a well thought-out XML standard that achieves two very different goals simultaneously: it preserves all of the information that InDesign needs to do what it does; and it is broken up in a way that makes it possible for mere mortals (or at least our Master of Publishing students) to work with it.
  • What this represented to us in concrete terms was the ability to take Web-based content and move it into InDesign in a straightforward way, thus bridging Web and print production environments using existing tools and skillsets, with a little added help from free software.
  • We would take clean XHTML content, transform it to IDML-marked content, and merge that with nicely designed templates in InDesign.
  • The result is an almost push-button publication workflow, which results in a nice, familiar InDesign document that fits straight into the way publishers actually do production.
  • Tracing the steps To begin with, we worked backwards, moving the book content back to clean XHTML.
  • The simplest method for this conversion—and if you want to create Web content, this is an excellent route—was to use Adobe’s “Export to Digital Editions” option, which creates an ePub file.
  • Recall that ePub is just XHTML in a wrapper, so within the ePub file was a relatively clean XHTML document. It was somewhat cleaner (that is, the XHTML tagging was simpler and less cluttered) than InDesign’s other Web-oriented exports, possibly because Digital Editions is a well understood target, compared with somebody’s website.
  • In order to achieve our target of clean XHTML, we needed to do some editing; the XHTML produced by InDesign’s “Digital Editions” export was presentation-oriented. For instance, bulleted list items were tagged as paragraphs, with a class attribute identifying them as list items. Using the search-and-replace function, we converted such structures to proper XHTML list and list-item elements. Our guiding principle was to make the XHTML as straightforward as possible, not dependent on any particular software to interpret it.
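The kind of search-and-replace cleanup described above can be sketched as follows. The class name "bullet-list" is a hypothetical example; the actual class names emitted by the Digital Editions export would differ.

```python
import re

# Hypothetical fragment of a presentation-oriented export: list items
# tagged as paragraphs, with a class attribute identifying them.
exported = (
    '<p class="bullet-list">First point</p>\n'
    '<p class="bullet-list">Second point</p>'
)

# Convert each such paragraph into a proper XHTML list item...
items = re.sub(r'<p class="bullet-list">(.*?)</p>', r'<li>\1</li>', exported)
# ...and wrap the run of items in a <ul> element.
clean = '<ul>\n' + items + '\n</ul>'
print(clean)
```

The result is structural markup that any XHTML-aware tool can interpret, rather than markup that depends on one program's styling conventions.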
  • We broke the book’s content into individual chapter files; each chapter could then carry its own basic metadata, and the pages conveniently fit our Web content management system (which is actually just a wiki). We assembled a dynamically generated table of contents for the 12 chapters, and created a cover page. Essentially, the book was entirely Web-based at this point.
  • When the book chapters are viewed online, they are formatted via a CSS2 stylesheet that defines a main column for content as well as dedicating screen real estate for navigational elements. We then created a second template to render the content for exporting; this was essentially a bare-bones version of the book with no navigation and minimal styling. Pages (or even the entire book) can be exported (via the “Save As...” function in a Web browser) for use in either print production or ebook conversion. At this point, we required no skills beyond those of any decent Web designer.
  • Integrating with CS4 for Print Adobe’s IDML language defines elements specific to InDesign; there is nothing in the language that looks remotely like XHTML. So a mechanical transformation step is needed to convert the XHTML content into something InDesign can use. This is not as hard as it might seem.
  • Both XHTML and IDML are composed of straightforward, well-documented structures, and so transformation from one to the other is, as they say, “trivial.” We chose to use XSLT (Extensible Stylesheet Language Transforms) to do the work. XSLT is part of the overall XML specification, and thus is very well supported in a wide variety of tools. Our prototype used a scripting engine called xsltproc, a nearly ubiquitous piece of software that we found already installed as part of Mac OS X (contemporary Linux distributions also have this as a standard tool), though any XSLT processor would work.
  • In other words, we don’t need to buy InCopy, because we just replaced it with the Web. Our wiki is now plugged directly into our InDesign layout. It even automatically updates the InDesign document when the content changes. Credit is due at this point to Adobe: this integration is possible because of the open file format in the Creative Suite 4.
  • We wrote an XSLT transformation script[18] that converted the XHTML content from the Web into an InCopy ICML file. The script itself is less than 500 lines long, and was written and debugged over a period of about a week by amateurs (again, the people named at the start of this article). The script runs in a couple of seconds, and the resulting .icml file can then be “placed” directly into an InDesign template. The ICML file references an InDesign stylesheet, so the template file can be set up with a house-styled layout, master pages, and stylesheet definitions for paragraphs and character ranges.
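The shape of the mapping such a transformation performs can be sketched in Python (the actual script described above is XSLT run through xsltproc, which Python's standard library does not provide). The element and style names below are modeled loosely on ICML but are illustrative, not a faithful rendering of the format.

```python
import xml.etree.ElementTree as ET

XHTML_NS = "{http://www.w3.org/1999/xhtml}"

def xhtml_to_icmlish(xhtml):
    """Sketch of the mapping an XHTML-to-ICML stylesheet performs:
    each XHTML paragraph becomes a styled paragraph range whose style
    name InDesign can match against the template's stylesheet."""
    root = ET.fromstring(xhtml)
    story = ET.Element("Story")
    for p in root.iter(XHTML_NS + "p"):
        rng = ET.SubElement(story, "ParagraphStyleRange",
                            AppliedParagraphStyle="ParagraphStyle/Body")
        content = ET.SubElement(rng, "Content")
        content.text = p.text
    return ET.tostring(story, encoding="unicode")

print(xhtml_to_icmlish(
    '<body xmlns="http://www.w3.org/1999/xhtml"><p>Hello</p></body>'))
```

Because the style names in the output reference the template's stylesheet definitions, the house design lives in the InDesign template, not in the transformation.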
  • Rather than a public-facing website, our system relies on the Web as a content management platform—of course a public face could easily be added.
  • It should be noted that the Book Publishing 1 proof-of-concept was artificially complex; we began with a book laid out in InDesign and ended up with a look-alike book laid out in InDesign. But next time—for instance, when we publish Book Publishing 2—we can begin the process with the content on the Web, and keep it there throughout the editorial process. The book’s content could potentially be written and edited entirely online, as Web content, and then automatically poured into an InDesign template at proof time. “Just in time,” as they say. This represents an entirely new way of thinking of book production. With a Web-first orientation, it makes little sense to think of the book as “in print” or “out of print”—the book is simply available, in the first place online; in the second place in derivative digital formats; and third, but really not much more difficult, in print-ready format, via the usual InDesign CS print production system publishers are already familiar with.
  • Creating Ebook Files Creating electronic versions from XHTML source is vastly simpler than trying to generate these out of the existing print process. The ePub version is extremely easy to generate; so is online marketing copy or excerpts for the Web, since the content begins life Web-native.
  • Since an ePub file is essentially XHTML content in a special wrapper, all that is required is that we properly “wrap” our XHTML content. Ideally, the content in an ePub file is broken into chapters (as ours was) and a table of contents file is generated in order to allow easy navigation within an ebook reader. We used Julian Smart’s free tool eCub[19] to simply and automatically generate the ePub wrapper and the table of contents. The only custom development we did was to create a CSS stylesheet for the ebook so that headings and paragraph indents looked the way we wanted. Starting with XHTML content, creating ePub is almost too easy.
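The "wrapping" step described above can be sketched with the standard zipfile module. File names and metadata here are illustrative (a tool like eCub fills in the real package and navigation files); the one hard rule shown is that the mimetype entry must come first and be stored uncompressed so readers can identify the file.

```python
import zipfile

def wrap_epub(path, chapters):
    """Wrap XHTML chapter files in a minimal ePub-style container.
    chapters maps file names to XHTML strings (illustrative only)."""
    with zipfile.ZipFile(path, "w") as z:
        # The mimetype must be the first entry, stored uncompressed.
        z.writestr("mimetype", "application/epub+zip",
                   compress_type=zipfile.ZIP_STORED)
        # container.xml tells the reader where the package file lives.
        z.writestr("META-INF/container.xml",
                   '<?xml version="1.0"?>\n'
                   '<container version="1.0" xmlns="urn:oasis:names:tc:'
                   'opendocument:xmlns:container">\n'
                   ' <rootfiles><rootfile full-path="OEBPS/content.opf" '
                   'media-type="application/oebps-package+xml"/></rootfiles>\n'
                   '</container>')
        # The OPF manifest/spine and the table of contents would be
        # generated here; elided for brevity.
        for name, xhtml in chapters.items():
            z.writestr("OEBPS/" + name, xhtml)

wrap_epub("demo.epub", {"ch01.xhtml": "<html><body><p>Hi</p></body></html>"})
```

Everything else, including the stylesheet that controls headings and paragraph indents, is just more files dropped into the archive.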
  • today, we are able to put the process together using nothing but standard, relatively ubiquitous Web tools: the Web itself as an editing and content management environment, standard Web scripting tools for the conversion process, and the well-documented IDML file format to integrate the layout tool.
  • Our project demonstrates that Web technologies are indeed good enough to use in an XML-oriented workflow; more specialized and expensive options are not necessarily required. For massive-scale enterprise publishing, this approach may not offer enough flexibility, and the challenge of adding and extracting extra semantic richness may prove more trouble than it's worth.
  • But for smaller firms who are looking at the straightforward benefits of XML-based processes—single source publishing, online content and workflow management, open and accessible archive formats, greater online discoverability—here is a way forward.
  • The result is very simple and easy to use. Our demonstration requires that a production editor run the XSLT transformation script manually, but there is no reason why this couldn’t be built directly into the Web content management system so that exporting the content to print ran the transformation automatically. The resulting file would then be “placed” in InDesign and proofed.
  • The final piece of our puzzle, the ability to integrate print production, was made possible by Adobe's release of InDesign with an open XML file format. Since the Web's XHTML is also XML, it can be easily and confidently transformed to the InDesign format.
  • Such a workflow—beginning with the Web and exporting to print—is surely more in line with the way we will do business in the 21st century, where the Web is the default platform for reaching audiences, developing content, and putting the pieces together. It is time, we suggest, for publishers to re-orient their operations and start with the Web.
  • Using the Web as a Production Platform
  •  
    I was looking for an answer to a problem Marbux had presented, and found this interesting article.  The issue was that of the upcoming conversion of the Note Case Pro (NCP) layout engine to the WebKit layout engine, and what to do about the NCP document format. My initial reaction was to encode the legacy NCP document format in XML, and run an XSLT to a universal pivot format like TEI-XML.  From there, the TEI-XML community would provide all the XSLT transformation routines for conversion to ODF, OOXML, XHTML, ePub and HTML/CSS. Researching the problems one might encounter with this approach, I found this article.  Fascinating stuff. My takeaway is that TEI-XML would not be as effective a "universal pivot point" as XHTML.  Or perhaps, if NCP really wants to get aggressive: IDML, InDesign Markup Language. As an afterthought, I was thinking that an alternative title for this article might have been "Working with Web as the Center of Everything".
Gary Edwards

Cloud Computing White Papers by the Open Group - 0 views

  •  
    Cloud Computing White Papers   The Open Group Cloud Work Group exists to create a common understanding among buyers and suppliers of how enterprises of all sizes and scales of operation can include Cloud Computing technology in a safe and secure way in their architectures to realize its significant cost, scalability, and agility benefits. It includes some of the industry's leading cloud providers and end-user organizations, collaborating on standard models and frameworks aimed at eliminating vendor lock-in for enterprises looking to benefit from Cloud products and services. The White Papers on this website form the current output of the Work Group. They are also available in PDF form from The Open Group bookstore for download and printing. Further papers will be added as the Work Group progresses. The initial focus of the Work Group is on business drivers for Cloud Computing, and this is reflected in the first items to appear: The Business Scenario Workshop Report White Paper: Building Return on Investment from Cloud Computing White Paper: Strengthening your Business Case for Using Cloud White Paper: Cloud Buyers' Decision Tree White Paper: Cloud Buyers' Requirements Questionnaire Further White Papers will address other key Work Group topics, including Architecture, Infrastructure, and Security.
Gary Edwards

Cloud Computing Set to 'Skyrocket,' Driven by Economy: Survey - Forbes - 0 views

  •  
    Cloud computing is becoming more than a tactical measure adopted by managers and professionals seeking quick solutions to business and technical problems. It is increasingly being seen as a strategic initiative. Not only are most executives now planning to adopt some form of cloud computing for their organizations, they also expect these technology services to help position their organizations to succeed in today's rough-and-tumble economy. These are some of the findings of a new survey of 900 executives released by KPMG International and Forbes Insight. The majority, 81%, say their organizations have already moved at least some business activities to the cloud and expect 2012 investment "to skyrocket, with some companies planning to spend more than a fifth of their IT budget on cloud next year," the study finds. Economic factors were cited by 76% as an important driver for cloud adoption, bringing strategic benefits such as transforming their business models to gain a competitive advantage. Other considerations for moving to cloud computing include improving processes to offer more agility across the enterprise (80%), and offering technical benefits that they otherwise could not gain from their own data centers (76%). Eighty-seven percent of executives feel that the changes delivered by cloud will be "significant." This view is consistent among companies of all sizes and whether the respondents work within IT functions or business units.  KPMG summarized the transformative effects cloud is delivering:
Paul Merrell

Canadian Spies Collect Domestic Emails in Secret Security Sweep - The Intercept - 0 views

  • Canada’s electronic surveillance agency is covertly monitoring vast amounts of Canadians’ emails as part of a sweeping domestic cybersecurity operation, according to top-secret documents. The surveillance initiative, revealed Wednesday by CBC News in collaboration with The Intercept, is sifting through millions of emails sent to Canadian government agencies and departments, archiving details about them on a database for months or even years. The data mining operation is carried out by the Communications Security Establishment, or CSE, Canada’s equivalent of the National Security Agency. Its existence is disclosed in documents obtained by The Intercept from NSA whistleblower Edward Snowden. The emails are vacuumed up by the Canadian agency as part of its mandate to defend against hacking attacks and malware targeting government computers. It relies on a system codenamed PONY EXPRESS to analyze the messages in a bid to detect potential cyber threats.
  • Last year, CSE acknowledged it collected some private communications as part of cybersecurity efforts. But it refused to divulge the number of communications being stored or to explain for how long any intercepted messages would be retained. Now, the Snowden documents shine a light for the first time on the huge scope of the operation — exposing the controversial details the government withheld from the public. Under Canada’s criminal code, CSE is not allowed to eavesdrop on Canadians’ communications. But the agency can be granted special ministerial exemptions if its efforts are linked to protecting government infrastructure — a loophole that the Snowden documents show is being used to monitor the emails. The latest revelations will trigger concerns about how Canadians’ private correspondence with government employees are being archived by the spy agency and potentially shared with police or allied surveillance agencies overseas, such as the NSA. Members of the public routinely communicate with government employees when, for instance, filing tax returns, writing a letter to a member of parliament, applying for employment insurance benefits or submitting a passport application.
  • Chris Parsons, an internet security expert with the Toronto-based internet think tank Citizen Lab, told CBC News that “you should be able to communicate with your government without the fear that what you say … could come back to haunt you in unexpected ways.” Parsons said that there are legitimate cybersecurity purposes for the agency to keep tabs on communications with the government, but he added: “When we collect huge volumes, it’s not just used to track bad guys. It goes into data stores for years or months at a time and then it can be used at any point in the future.” In a top-secret CSE document on the security operation, dated from 2010, the agency says it “processes 400,000 emails per day” and admits that it is suffering from “information overload” because it is scooping up “too much data.” The document outlines how CSE built a system to handle a massive 400 terabytes of data from Internet networks each month — including Canadians’ emails — as part of the cyber operation. (A single terabyte of data can hold about a billion pages of text, or about 250,000 average-sized mp3 files.)
  • ...1 more annotation...
  • The agency notes in the document that it is storing large amounts of “passively tapped network traffic” for “days to months,” encompassing the contents of emails, attachments and other online activity. It adds that it stores some kinds of metadata — data showing who has contacted whom and when, but not the content of the message — for “months to years.” The document says that CSE has “excellent access to full take data” as part of its cyber operations and is receiving policy support on “use of intercepted private communications.” The term “full take” is surveillance-agency jargon that refers to the bulk collection of both content and metadata from Internet traffic. Another top-secret document on the surveillance dated from 2010 suggests the agency may be obtaining at least some of the data by covertly mining it directly from Canadian Internet cables. CSE notes in the document that it is “processing emails off the wire.”
  •  
    By Ryan Gallagher and Glenn Greenwald, The Intercept.
Paul Merrell

How to Encrypt the Entire Web for Free - The Intercept - 0 views

  • If we’ve learned one thing from the Snowden revelations, it’s that what can be spied on will be spied on. Since the advent of what used to be known as the World Wide Web, it has been a relatively simple matter for network attackers—whether it’s the NSA, Chinese intelligence, your employer, your university, abusive partners, or teenage hackers on the same public WiFi as you—to spy on almost everything you do online. HTTPS, the technology that encrypts traffic between browsers and websites, fixes this problem—anyone listening in on that stream of data between you and, say, your Gmail window or bank’s web site would get nothing but useless random characters—but is woefully under-used. The ambitious new non-profit Let’s Encrypt aims to make the process of deploying HTTPS not only fast, simple, and free, but completely automatic. If it succeeds, the project will render vast regions of the internet invisible to prying eyes.
  • The benefits of using HTTPS are obvious when you think about protecting secret information you send over the internet, like passwords and credit card numbers. It also helps protect information like what you search for in Google, what articles you read, what prescription medicine you take, and messages you send to colleagues, friends, and family from being monitored by hackers or authorities. But there are less obvious benefits as well. Websites that don’t use HTTPS are vulnerable to “session hijacking,” where attackers can take over your account even if they don’t know your password. When you download software without encryption, sophisticated attackers can secretly replace the download with malware that hacks your computer as soon as you try installing it.
  • Encryption also prevents attackers from tampering with or impersonating legitimate websites. For example, the Chinese government censors specific pages on Wikipedia, the FBI impersonated The Seattle Times to get a suspect to click on a malicious link, and Verizon and AT&T injected tracking tokens into mobile traffic without user consent. HTTPS goes a long way in preventing these sorts of attacks. And of course there’s the NSA, which relies on the limited adoption of HTTPS to continue to spy on the entire internet with impunity. If companies want to do one thing to meaningfully protect their customers from surveillance, it should be enabling encryption on their websites by default.
  • ...2 more annotations...
  • Let’s Encrypt, which was announced this week but won’t be ready to use until the second quarter of 2015, describes itself as “a free, automated, and open certificate authority (CA), run for the public’s benefit.” It’s the product of years of work from engineers at Mozilla, Cisco, Akamai, Electronic Frontier Foundation, IdenTrust, and researchers at the University of Michigan. (Disclosure: I used to work for the Electronic Frontier Foundation, and I was aware of Let’s Encrypt while it was being developed.) If Let’s Encrypt works as advertised, deploying HTTPS correctly and using all of the best practices will be one of the simplest parts of running a website. All it will take is running a command. Currently, HTTPS requires jumping through a variety of complicated hoops that certificate authorities insist on in order to prove ownership of domain names. Let’s Encrypt automates this task in seconds, without requiring any human intervention, and at no cost.
  • The transition to a fully encrypted web won’t be immediate. After Let’s Encrypt is available to the public in 2015, each website will have to actually use it to switch over. And major web hosting companies also need to hop on board for their customers to be able to take advantage of it. If hosting companies start work now to integrate Let’s Encrypt into their services, they could offer HTTPS hosting by default at no extra cost to all their customers by the time it launches.
  •  
    Don't miss the video. And if you have a website, urge your host service to begin preparing for Let's Encrypt. (See video on why it's good for them.)
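    A side note on what HTTPS actually buys, as a minimal Python sketch (mine, not the article's): the "useless random characters" an eavesdropper sees come from the TLS encryption, while the impersonation and tampering attacks described above are blocked by certificate verification. Python's standard-library default context enables both checks; Let's Encrypt's contribution is on the server side, automating the certificate issuance that makes those checks pass.

```python
import ssl

# HTTPS is ordinary HTTP wrapped in TLS. The encrypted channel defeats
# eavesdropping; certificate verification defeats impersonation. Python's
# stdlib default client context turns on both protections:
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED  # certificate must validate
assert ctx.check_hostname                    # and must match the server name

# Disabling either check (as ad-hoc scripts sometimes do) silently reopens
# the door to man-in-the-middle attacks:
insecure = ssl.create_default_context()
insecure.check_hostname = False              # must be cleared first...
insecure.verify_mode = ssl.CERT_NONE         # ...never do this in production
```

    (The automated client the article anticipates later shipped under the name Certbot, which reduces server-side deployment to roughly the single command the author describes.)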
Felipp Crawly

Success has a New Name; Onward Process - 1 views

When I first heard about Onward Process Solutions, I had my own doubts. As we worked together towards our common goals, that was when I truly understood the significant benefits of their back offic...

started by Felipp Crawly on 26 Nov 12 no follow-up yet
Gary Edwards

A founder-friendly term sheet - Sam Altman - 1 views

  •  
    Must read for every entrepreneur!  When your product and service can command these kind of terms, for sure your company is worth investing in. "A founder-friendly term sheet When I invest (outside of YC) I make offers with the following term sheet.  I've tried to make the terms reflect what I wanted when I was a founder.  A few people have asked me if I'd share it, so here it is.  I think it's pretty founder-friendly. If you believe the upside risk theory, then it makes sense to offer compelling terms and forgo some downside protection to get the best companies to want to work with you. What's most important is what's not in it: *No option pool.  Taking the option pool out of the pre-money valuation (ie, diluting only founders and not investors for future hires) is just a way to artificially manipulate valuation.  New hires benefit everyone and should dilute everyone. *The company doesn't have to pay any of my legal fees.  Requiring the company to pay investors' legal fees always struck me as particularly egregious-the company can probably make better use of the money than investors can, so I'll pay my own legal fees for the round (in a simple deal with no back and forth they always end up super low anyway). *No expiration.  I got burned once by an exploding offer and haven't forgotten it; the founders can take as much time as they want to think about it.  In practice, people usually decide pretty quickly. *No confidentiality.  Founder/investor relationships are long and important.  The founders should talk to whomever they want, and if they want to tell people what I offered them, I don't really care.  Investors certainly tell each other what they offer companies. (Once we shake hands on a deal, of course, I expect the founders to honor it.) *No participating preferred, non-standard liquidation preference, etc.  There is a 1x liquidation preference, but I'm willing to forgo even that and buy common shares (and sometimes
Gary Edwards

Combining the Best of Gmail and Zoho CRM Produces Amazing Results By James Kimmons of A... - 0 views

  •  
    ZOHO has demonstrated some very effective and easy to use data merging. They have also released a ZOHO Writer extension for Chrome that is awesome. The problem with "merge" is that, while full featured, the only usable data source is ZOHO CRM. Not good, but zCRM does fully integrate with ZOHO eMail, which enables the full two way transparent integration with zCRM. Easier to do than explain. Real Estate example excerpt: Zoho is smart, allowing you to integrate Gmail: The best of both worlds is available, because Zoho had the foresight to allow you to use Gmail and integrate your emails with the Zoho CRM system. Once you've set it up, you use Gmail the way you've always used it. I get to continue using all of the things I love about Gmail. But, every email, in or out of Gmail, attaches itself to the appropriate contact in the Zoho CRM system. When I send or receive an email in Gmail that is to or from one of my Zoho contacts or leads, the email automatically is picked up by Zoho and becomes a part of that contact/prospect's record, even though I never opened Zoho. If you've wondered about backing up Gmail, let Zoho do it: A bonus benefit in using Zoho mail is that you can set it up to receive all of your Gmail, sent and received, as well. It's a ready-made backup for your Gmail. So, if CRM isn't something you want to do with Zoho, at least set up the free email to copy all of your Gmail. And, if you're still using Outlook...why? The Internet is Improving Our Business at a Lower Cost: Here we have two free email systems that give you amazing flexibility and backup. Then the Zoho CRM system, with the email module installed, is only $15/month. You can do mass marketing emails, auto-responders, and take in new contacts and prospects with Web forms. Once you tie Gmail and Zoho together, your email and CRM will be top-notch, at a very low cost. Though you may wish for one, there isn't a reasonably priced "does it all" solution out there. This is an
Gary Edwards

Republic Wireless - Combining WiFi with Cellular to reduce Smartphone Costs - 0 views

  • Do I need to buy minutes from Sprint or anyone else? No. We're the first-ever wireless provider to bundle WiFi calling with access to cellular whenever you need it. Depending on the plan you choose, your Republic Wireless phone will have unlimited* access to data, talk and text when using the Sprint cellular network. Note that the $5 plan offered by Republic is WiFi only and the $10 plan includes cellular talk and text (no data). All Republic plans include unlimited data, talk and text on WiFi. 
  • Can I switch between plans? Yes! When you purchase a new Moto X phone, you’ll be able to choose whatever plan you like—and you can also switch plans up to twice per month as your needs change. For example, if you know you’ll be taking a vacation and might require more cell data one week, you can switch to a cell data plan right from your phone and then switch back to a WiFi “friendlier” plan once you return home.
  •  
    Republic Wireless provides a new kind of smartphone cellular service based on a technology that handles the roll over from WiFi to 3G or 4G cellular in the middle of a call. Very cool, but currently it only works with specially outfitted (custom ROM) Android Moto X phones. (They are working on how to port this custom ROM technology to all Android phones :) The concept is based on the fact that WiFi is cheap, very open and near universally available; while 3G and 4G Cellular is expensive, contractual and proprietary. The idea is to leverage free WiFi wherever they can, and roll over to the Sprint 3G - 4G network when needed. Very cool and the business model seems to have it right. ......................................................................... "Which Moto X plan is right for me? Here's the lowdown on our four new plan options. Depending on your needs and how you want to use your phone, you can choose the plan that's best for you. $5 WiFi only plan This is the most powerful tool in your arsenal of options. Why? You can drop your smartphone bill-at will-to $5. If you're interested in getting serious about cutting costs, you can use this tool to best leverage the WiFi in your life to reduce your phone bill. It's also the ultimate plan for home base stickers and kids who don't need a cellular plan. It's fully unlimited data, talk and text-on WiFi only. $10 WiFi + Cell Talk & Text One of our members, 10thdoctor said :  "I use WiFi for everything, except when I'm traveling and for voice at my school." Yep, this is the perfect plan for that. Our members are around WiFi about 90% of the time. During that 10% of the time where you're away from WiFi, this plan gives you cellular backup for communicating when you need to. This plan both cuts costs and accommodates what's quickly becoming the norm: a day filled with WiFi. $25 WiFi + Cell (3G) Talk, Text & Data Lots of people are on 3G plans today and are paying upwards of $100 a month on
Paul Merrell

Joint - Dear Colleague Letter: Electronic Book Readers - 1 views

  • U.S. Department of Justice Civil Rights Division U.S. Department of Education Office for Civil Rights
  •  
    June 29, 2010 Dear College or University President: We write to express concern on the part of the Department of Justice and the Department of Education that colleges and universities are using electronic book readers that are not accessible to students who are blind or have low vision and to seek your help in ensuring that this emerging technology is used in classroom settings in a manner that is permissible under federal law. A serious problem with some of these devices is that they lack an accessible text-to-speech function. Requiring use of an emerging technology in a classroom environment when the technology is inaccessible to an entire population of individuals with disabilities - individuals with visual disabilities - is discrimination prohibited by the Americans with Disabilities Act of 1990 (ADA) and Section 504 of the Rehabilitation Act of 1973 (Section 504) unless those individuals are provided accommodations or modifications that permit them to receive all the educational benefits provided by the technology in an equally effective and equally integrated manner. ... The Department of Justice recently entered into settlement agreements with colleges and universities that used the Kindle DX, an inaccessible, electronic book reader, in the classroom as part of a pilot study with Amazon.com, Inc. In summary, the universities agreed not to purchase, require, or recommend use of the Kindle DX, or any other dedicated electronic book reader, unless or until the device is fully accessible to individuals who are blind or have low vision, or the universities provide reasonable accommodation or modification so that a student can acquire the same information, engage in the same interactions, and enjoy the same services as sighted students with substantially equivalent ease of use. The texts of these agreements may be viewed on the Department of Justice's ADA Web site, www.ada.gov. (To find these settlemen
Gary Edwards

Changing technology - How cloud computing is transforming business - and why you should... - 0 views

  •  
    If someone told you that you could drop your operating costs by 40 percent, would you listen? If that same person said you could save between $70 and $150 per user per year in energy savings alone if you tried something new, would you try it?  A lot of companies are listening, and those same businesses are trying something new - cloud computing and software as a service (SaaS) - and reaping the many benefits, which start with the aforementioned cost savings. "It's about saving money, and there's a tremendous amount of money to be saved, because if you look at IT budgets, nearly 80 percent of that budget, in many cases, is spent just to keep the lights on, which means the other 20 percent is the only money that's actually able to be used to implement new technologies into the model," says Jeff McNaught, chief marketing officer at Wyse Technology Inc. 
Gary Edwards

Businesses Looking to Cloud Computing to Enhance Productivity: Report - Midmarket - New... - 0 views

  •  
    "Based on the results of this survey, it's clear that SMBs see the value of cloud-based solutions and are eager to benefit from a productivity and ROI perspective," said Fonality president and CEO Dean Mansfield. "Cloud-based communications tools in particular can be leveraged by companies to drive competitive differentiation while maximizing working capital." Minimizing total cost of ownership is the "ultimate goal" of adopting service-based offerings, according to survey results, while mobility and UC were recognized by a strong majority of those surveyed as key technologies to increase efficiency and profitability. Most respondents saw their current communications solutions as being "good," but 78 percent also seek to improve their capabilities. There was strong interest in cloud-based solutions and an "excellent prognosis" for cloud-based AaaS (Anything as a Service), with market opportunities still emerging. "The needs of small and mid-size businesses differ significantly from large enterprises," said Steve Taylor, editor-in-chief and publisher for Webtorials, "This study shows that SMBs have a notable disposition to leveraging cloud-based technology to enhance their operations and their communications capabilities in particular."
Gary Edwards

Kaazing | Kaazing WebSocket Gateway - 0 views

  •  
    Kaazing WebSocket Gateway is the world's only enterprise solution for full-duplex, high-performance communication over the Web using the HTML5 WebSocket standard. Designed to be the next evolutionary step in web communication, HTML5 WebSocket addresses the problems inherent with traditional Ajax and Comet solutions today. True real-time connectivity in the browser and on mobile devices is now a reality thanks to this exciting new standard. Kaazing WebSocket Gateway delivers these features and benefits with the performance, scalability, robustness, and security that enterprises demand.
Gary Edwards

Official Google Blog: New ways to experience better collaboration with Google Apps - 0 views

  •  
    If this doesn't make Florian weep, nothing can! Google Cloud Connect for Microsoft Office is now available worldwide. This plugin for Microsoft Office is available to anyone with a Google Account, and brings multi-person collaboration to the Microsoft Word, Excel and PowerPoint applications that you may still need from time to time. The plugin syncs your work through Google's cloud, so everyone can contribute to the same version of a file at the same time. Learning the benefits of web-powered collaboration will help more people make a faster transition to 100% web collaboration tools.
Gary Edwards

Dropbox Could Generate $100 Million In Revenue This Year - 0 views

  •  
    DropBox, the startup that makes cloud backup and syncing incredibly easy, is cash-flow positive, on track to generate $100 million in revenue this year and could be worth $1-2 billion, Fortune reports. Dropbox has a good freemium business model. The first 2 gigabytes of data are free, and after that you pay a monthly fee. If you've used Dropbox and gotten the benefits for months and have hit your 2 gig limit, are you going to take all your files off Dropbox? More likely you'll pay up. Importantly, Dropbox's margins should improve over time since it is based in the cloud, where costs are going down all the time. Add in its smart marketing (if you refer someone, both you and your friend get free space) and Dropbox has all the ingredients of a rocketship company. According to Fortune, Dropbox, founded in 2007, has had 10x year-over-year growth. Naturally, since Dropbox is doing very well and is in a hot sector--cloud computing--there is speculation that someone like Google or Amazon could snap it up.
Gary Edwards

Pugpig: iPhone, iPad HTML Reader That Feels Like a Native App - 0 views

  •  
    Open Source framework for building visually-immersive mobile ready magazines in HTML5-CSS3-JavaScript. excerpt:  Pugpig is an open source framework that enables you to publish HTML5 content in the form of a magazine, book or newspaper to iPhone and iPad devices. It's slick and feels like you are using a native app (we tested it on the iPad). Pugpig is an HTML reader for iOS. It's basically a hybrid - part native application, part web app, designed to prove that you can have an HTML-based app that feels like it's native. Your app sits on top of the Pugpig framework. It can be customized and extended. For example, you can link to your own data source, change the navigation and look and feel. It can also be multi-lingual - for example, the sample app I tested leverages the AJAX API for the Microsoft Translator. Additional Pugpig benefits are its low memory footprint and ability to store a lot of magazine/newspaper editions within the device, for easy offline viewing. You can offer your app in either the App Store or the new iOS 5 Newsstand (integration with the framework is in progress now).
Gary Edwards

The State of Cloud Computing in 2011 (Infographic) - ReadWriteCloud - 0 views

  •  
    Incredible Graphic charting the survey responses: excerpt:  BitNami, Cloud.com and Zenoss have released the results of their 2011 Cloud Computing Outlook survey. You can request a copy of the report here. Only 20% of respondents have no plans to develop a cloud computing strategy, but there was a clear preference for using dedicated hardware instead of public cloud infrastructure. Virtualization is very popular, and the biggest benefit respondents perceived in cloud computing was hardware savings.
Gary Edwards

Petabytes on a budget: How to build cheap cloud storage | Backblaze Blog - 0 views

  •  
    Amazing must read!  BackBlaze offers unlimited cloud storage/backup for $5 per month.  Now they are releasing the "storage" aspect of their service as an open source design.  The discussion introducing the design is simple to read and follow - which in itself is an achievement.   They held back on open sourcing the BackBlaze Cloud software system, which is understandable.  But they do disclose a Debian Linux OS running Tomcat over Apache Server 5.4 with JFS and HTTPS access.  This is exciting stuff.  I hope the CAR MLS-Cloud guys take notice.  Intro: At Backblaze, we provide unlimited storage to our customers for only $5 per month, so we had to figure out how to store hundreds of petabytes of customer data in a reliable, scalable way-and keep our costs low. After looking at several overpriced commercial solutions, we decided to build our own custom Backblaze Storage Pods: 67 terabyte 4U servers for $7,867. In this post, we'll share how to make one of these storage pods, and you're welcome to use this design. Our hope is that by sharing, others can benefit and, ultimately, refine this concept and send improvements back to us. Evolving and lowering costs is critical to our continuing success at Backblaze.
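    The economics in the excerpt can be sanity-checked with a little arithmetic. The figures below are the ones quoted above ($7,867 per 67-terabyte pod, $5/month plans); the calculation is a rough sketch that ignores power, bandwidth, redundancy, drive replacement, and staff, so it is a lower bound on Backblaze's real cost.

```python
# Figures quoted in the excerpt above.
pod_cost_usd = 7_867          # one Backblaze storage pod
pod_capacity_tb = 67          # raw capacity per 4U pod
plan_usd_per_month = 5        # unlimited backup plan

# Raw hardware cost per gigabyte (1 TB = 1000 GB here) -- a lower bound,
# since operating costs are excluded.
usd_per_gb = pod_cost_usd / (pod_capacity_tb * 1000)
print(f"${usd_per_gb:.3f} per GB")    # → $0.117 per GB

# How much raw pod capacity one month of a $5 plan pays for:
gb_per_plan_month = plan_usd_per_month / usd_per_gb
print(f"{gb_per_plan_month:.0f} GB")  # → 43 GB
```

    At roughly $0.12/GB of raw hardware, a few months of subscription fees cover a typical customer's backup set, which is why the custom pod design (rather than commercial storage at several times that price) is central to the business model.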
Gary Edwards

AppleInsider | Inside Mac OS X Snow Leopard: Exchange Support - 0 views

  •  
    Apple desktop and iPhone support of Microsoft Exchange is not support for Microsoft, as some think.  It's actually a strategy to erode Microsoft's desktop monopoly.  It's also part of a longer term plan to thwart Microsoft's hopes of leveraging their desktop monopoly into a Web Server monopoly. Excerpt: Apple is reducing its dependence upon Microsoft's client software, weakening Microsoft's ability to hold back and dumb down its Mac offerings at Apple's expense. More importantly, Apple is providing its users with additional options that benefit both Mac users and the open source community. In the software business, Microsoft has long known the importance of owning the client end. It worked hard to displace Netscape's web browser in the late 90s, not because there was any money to be made in giving away browser clients, but because it knew that whoever controlled the client could set up proprietary demands for a specific web server. That's what Netscape had worked to do as it gave away its web browser in hopes that it could make money selling Netscape web servers; Microsoft first took control of the client with Internet Explorer and then began tying its IE client to its own IIS on the server side with features that gave companies reasons to buy all of their server software from Microsoft. As Apple takes over the client end of Exchange, it similarly gains market leverage. First and foremost, the move allows Apple to improve the Exchange experience of Mac users so that business users have no reason not to buy Macs. Secondly, it gives Apple a client audience to market its own server solutions, including MobileMe to individual users and Snow Leopard Server to organizations. In concert with providing Exchange Server support, Apple is also delivering integrated support for its own Exchange alternatives in both MobileMe and with Snow Leopard Server's improved Dovecot email services, Address Book Server, iCal Server, the new Mobile Access secure gateway, and its include