
Open Web: Group items tagged "integration"


Gary Edwards

PDF Viewer Module for Drupal - Embed Documents to Your Web-Pages - 1 views

  • Great news for all Drupal CMS users! We have released a PDF viewer module for Drupal. The module allows you to seamlessly embed PDF documents, as well as PowerPoint presentations, Excel spreadsheets, word processing documents and images into web-pages on your Drupal website.
  • The PDF document viewer module for Drupal utilizes our GroupDocs Viewer's functionality and provides you with the following benefits:
  • GroupDocs Viewer converts PDF and other business documents to HTML5, meaning that your website visitors don't need any browser plug-ins or Flash to view documents hosted with our document viewer. You just put a document on your Drupal web-page and visitors can view it right away.
  • ...7 more annotations...
  • While viewing documents, users can quickly turn pages with the Go Forward/Backward buttons, just like in a slideshow. Also users can jump straight to a certain page and preview pages with thumbnails.
  • High-fidelity rendering. Thanks to utilizing HTML5 technology, embedded documents look just like the originals. Layout, formatting and fonts are retained and text looks sharp.
  • Finally, thanks to the newly released module, you can easily integrate the GroupDocs Viewer's functionality into your Drupal website and start hosting PDF and Office documents on your web-pages in minutes.
  • Users can zoom in or out of documents, as well as print and download the original file right from your Drupal web-pages.
  • Options like text copying, document printing and downloading can be disabled so that users can't copy the document.
  • GroupDocs Viewer doesn't convert documents to images, but renders them as real text documents. Your visitors will be able to copy text right from the embedded documents or search for a particular text within the document.
  • Supported Document Formats: the GroupDocs document viewer module for Drupal supports almost all common business formats. Documents in the following formats can be embedded in your web-pages: PDF documents; word processing documents (DOC, DOCX, TXT, RTF, ODT, etc.); PowerPoint presentations (PPT, PPTX); and image files (JPG, BMP, GIF, TIFF). (A minimal sketch of this format check follows the list.)
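The module's first job, per the description above, is a gate: only files whose format the viewer can convert to HTML5 get embedded. Here is a minimal sketch of that check, in Python for illustration only (the real module is Drupal/PHP, and the function and constant names here are hypothetical):

```python
# Hypothetical sketch of the format gate described above; not GroupDocs code.
from pathlib import Path

SUPPORTED_EXTENSIONS = {
    ".pdf",                                   # PDF documents
    ".doc", ".docx", ".txt", ".rtf", ".odt",  # word processing documents
    ".ppt", ".pptx",                          # PowerPoint presentations
    ".jpg", ".bmp", ".gif", ".tiff",          # image files
}

def is_embeddable(filename: str) -> bool:
    """True if the viewer could render this file as HTML5 in the page."""
    return Path(filename).suffix.lower() in SUPPORTED_EXTENSIONS

if __name__ == "__main__":
    for name in ("report.pdf", "slides.pptx", "archive.zip"):
        print(name, "->", is_embeddable(name))
```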
Paul Merrell

We finally gave Congress email addresses - Sunlight Foundation Blog - 0 views

  • On OpenCongress, you can now email your representatives and senators just as easily as you would a friend or colleague. We've added a new feature to OpenCongress. It's not flashy. It doesn't use D3 or integrate with social media. But we still think it's pretty cool. You might've already heard of it. Email. This may not sound like a big deal, but it's been a long time coming. A lot of people are surprised to learn that Congress doesn't have publicly available email addresses. It's the number one feature request that we hear from users of our APIs. Until recently, we didn't have a good response. That's because members of Congress typically put their feedback mechanisms behind captchas and zip code requirements. Sometimes these forms break; sometimes their requirements improperly lock out actual constituents. And they always make it harder to email your congressional delegation than it should be.
  • This is a real problem. According to the Congressional Management Foundation, 88% of Capitol Hill staffers agree that electronic messages from constituents influence their bosses' decisions. We think that it's inappropriate to erect technical barriers around such an essential democratic mechanism. Congress itself is addressing the problem. That effort has just entered its second decade, and people are feeling optimistic that a launch to a closed set of partners might be coming soon. But we weren't content to wait. So when the Electronic Frontier Foundation (EFF) approached us about this problem, we were excited to really make some progress. Building on groundwork first done by the Participatory Politics Foundation and more recent work within Sunlight, a network of 150 volunteers collected the data we needed from congressional websites in just two days. That information is now on Github, available to all who want to build the next generation of constituent communication tools. The EFF is already working on some exciting things to that end.
  • But we just wanted to be able to email our representatives like normal people. So now, if you visit a legislator's page on OpenCongress, you'll see an email address in the right-hand sidebar that looks like Sen.Reid@opencongress.org or Rep.Boehner@opencongress.org. You can also email myreps@opencongress.org to email both of your senators and your House representatives at once. (A toy sketch of this address convention follows these notes.) The first time we get an email from you, we'll send one back asking for some additional details. This is necessary because our code submits your message by navigating those aforementioned congressional webforms, and we don't want to enter incorrect information. But for emails after the first one, all you'll have to do is click a link that says, "Yes, I meant to send that email."
  • ...1 more annotation...
  • One more thing: For now, our system will only let you email your own representatives. A lot of people dislike this. We do, too. In an age of increasing polarization, party discipline means that congressional leaders must be accountable to citizens outside their districts. But the unfortunate truth is that Congress typically won't bother reading messages from non-constituents — that's why those zip code requirements exist in the first place. Until that changes, we don't want our users to waste their time. So that's it. If it seems simple, it's because it is. But we think that unbreaking how Congress connects to the Internet is important. You should be able to send a call to action in a tweet, easily forward a listserv message to your representative and interact with your government using the tools you use to interact with everyone else.
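The addressing convention described above is simple enough to sketch. A toy Python rendering, assuming only the Sen./Rep. prefixes and domain quoted in the post (the helper function itself is hypothetical, not OpenCongress code):

```python
# Toy model of the OpenCongress forwarding addresses described in the post.
DOMAIN = "opencongress.org"

def legislator_address(chamber: str, last_name: str) -> str:
    """Build an address like Sen.Reid@opencongress.org (hypothetical helper)."""
    prefix = {"senate": "Sen", "house": "Rep"}[chamber.lower()]
    return f"{prefix}.{last_name}@{DOMAIN}"

# Per the post, myreps@ reaches both senators and your House member at once.
ALL_MY_REPS = f"myreps@{DOMAIN}"

print(legislator_address("senate", "Reid"))    # Sen.Reid@opencongress.org
print(legislator_address("house", "Boehner"))  # Rep.Boehner@opencongress.org
```

Behind such an address, per the post, the system replays your message into each member's official webform, which is why the first email triggers a request for the extra details those forms require.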
Paul Merrell

Own Your Own Devices You Will, Under Rep. Farenthold's YODA Bill | Bloomberg BNA - 0 views

  • A bill introduced Sept. 18 would make clear that consumers actually own the electronic devices they purchase, along with any accompanying software on those devices, according to sponsor Rep. Blake Farenthold (R-Texas). The You Own Devices Act (H.R. 5586) would amend the Copyright Act “to provide that the first sale doctrine applies to any computer program that enables a machine or other product to operate.” The bill, which is unlikely to receive attention during Congress's lame-duck legislative session, was well-received by consumers' rights groups.
  • Section 109(a) of the Copyright Act, 17 U.S.C. §109(a), serves as the foundation for the first sale doctrine. H.R. 5586 would amend Section 109(a) by adding a provision covering “transfer of computer programs.” That provision would state: “if a computer program enables any part of a machine or other product to operate, the owner of the machine or other product is entitled to transfer an authorized copy of the computer program, or the right to obtain such copy, when the owner sells, leases, or otherwise transfers the machine or other product to another person. The right to transfer provided under this subsection may not be waived by any agreement.”
  • ‘Things’ Versus Software: Farenthold had expressed concern during a Sept. 17 hearing on Section 1201 of the Digital Millennium Copyright Act over what he perceived was a muddling between patents and copyrights when it comes to consumer products. “Traditionally patent law has protected things and copyright law has protected artistic-type works,” he said. “But now more and more things have software in them and you are licensing that software when you purchase a thing.” Farenthold asked the witnesses if there was a way to draw a distinction in copyright “between software that is an integral part of a thing as opposed to an add-on app that you would put on your telephone.”
  • ...1 more annotation...
  • H.R. 5586 seeks to draw that distinction. “YODA would simply state that if you want to sell, lease, or give away your device, the software that enables it to work is transferred along with it, and that any right you have to security and bug fixing of that software is transferred as well,” Farenthold said in a statement issued Sept. 19.
Paul Merrell

WikiLeaks' Julian Assange warns: Google is not what it seems - 0 views

  • Back in 2011, Julian Assange met up with Eric Schmidt for an interview that he considers the best he’s ever given. That doesn’t change, however, the opinion he now has about Schmidt and the company he represents, Google. In fact, the WikiLeaks leader doesn’t believe the famous “Don’t Be Evil” mantra that Google has been preaching for years; Assange thinks both Schmidt and Google sit at the opposite end of the spectrum. “Nobody wants to acknowledge that Google has grown big and bad. But it has. Schmidt’s tenure as CEO saw Google integrate with the shadiest of US power structures as it expanded into a geographically invasive megacorporation. But Google has always been comfortable with this proximity,” Assange writes in an opinion piece for Newsweek.
  • “Long before company founders Larry Page and Sergey Brin hired Schmidt in 2001, their initial research upon which Google was based had been partly funded by the Defense Advanced Research Projects Agency (DARPA). And even as Schmidt’s Google developed an image as the overly friendly giant of global tech, it was building a close relationship with the intelligence community,” Assange continues. Throughout the lengthy article, Assange goes on to explain how the 2011 meeting came to be and talks about the people the Google executive chairman brought along: Lisa Shields, then vice president of the Council on Foreign Relations; Jared Cohen, who would later become the director of Google Ideas; and Scott Malcomson, the book’s editor, who would later become speechwriter and principal advisor to Susan Rice. “At this point, the delegation was one part Google, three parts US foreign-policy establishment, but I was still none the wiser.” Assange goes on to explain the work Cohen was doing for the government prior to his appointment at Google and just how Schmidt himself plays a bigger role than previously thought. In fact, he says that his original image of Schmidt, as a politically unambitious Silicon Valley engineer, “a relic of the good old days of computer science graduate culture on the West Coast,” was wrong.
  • However, Assange concedes that such a person is not the sort who attends Bilderberg conferences, regularly visits the White House, and delivers speeches at the Davos Economic Forum. He claims that Schmidt’s emergence as Google’s “foreign minister” did not come out of nowhere, but was “presaged by years of assimilation within US establishment networks of reputation and influence.” Assange makes further accusations that, well before Prism had even been dreamed of, the NSA was already systematically violating the Foreign Intelligence Surveillance Act under its director at the time, Michael Hayden. He states that during the same period, namely around 2003, Google was accepting NSA money to provide the agency with search tools for its rapidly growing database of information. Assange continues by saying that in 2008, Google helped launch the NGA spy satellite GeoEye-1 into space, and that the search giant shares the photographs from the satellite with the US military and intelligence communities. Later on, in 2010, after the Chinese government was accused of hacking Google, the company entered into a “formal information-sharing” relationship with the NSA, which would allow the NSA’s experts to evaluate the vulnerabilities in Google’s hardware and software.
  • ...1 more annotation...
  • “Around the same time, Google was becoming involved in a program known as the ‘Enduring Security Framework’ (ESF), which entailed the sharing of information between Silicon Valley tech companies and Pentagon-affiliated agencies at network speed.” “Emails obtained in 2014 under Freedom of Information requests show Schmidt and his fellow Googler Sergey Brin corresponding on first-name terms with NSA chief General Keith Alexander about ESF,” Assange writes. Assange backs his statements with links throughout the piece, which readers can check for themselves.
  •  
    The "opinion piece for Newsweek" is an excerpt from Assange's new book, When Google met Wikileaks.  The chapter is well worth the read. http://www.newsweek.com/assange-google-not-what-it-seems-279447
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 0 views

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant.
  • ...17 more annotations...
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead-up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH, a system for sifting through intercepted cookie data, in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying from three to 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated from January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata (a toy illustration of this split follows these annotations). In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explains in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
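The content/metadata distinction described above (full address versus host name) is mechanical, which is part of what makes the loophole so broad. Here is a toy Python illustration of the split; this is my own sketch, not GCHQ tooling:

```python
# Toy illustration of the rule described above: the full address visited is
# treated as content, while the host name alone is treated as metadata and,
# under the external-warrant loophole, receives weaker protection.
from urllib.parse import urlsplit

def classify_visit(url: str) -> dict:
    return {
        "metadata": urlsplit(url).netloc,  # e.g. www.gchq.gov.uk
        "content": url,                    # e.g. www.gchq.gov.uk/what_we_do
    }

record = classify_visit("https://www.gchq.gov.uk/what_we_do")
print("metadata:", record["metadata"])
print("content: ", record["content"])
```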
Paul Merrell

Closing CDF WG, Publishing Specs as Notes from Doug Schepers on 2010-07-12 (public-cdf@... - 0 views

  • Hi, CDF folks- While we had hoped that more implementations might emerge that passed the CDF and WICD test suites [1], such that these specifications would meet the criteria as W3C Recommendations, it does not seem that this will happen in a reasonable timeframe. Despite good partial implementation experience, implementers have not shown sufficient interest to justify further investment of W3C resources into this group, even at a background level. In order to clarify the status of the CDF WG specifications, including Compound Document by Reference Framework 1.0 [2], Web Integration Compound Document (WICD) Core 1.0 [3], WICD Mobile 1.0 [4], and WICD Full 1.0 [5], all in Candidate Recommendation phase since July 2007, we have decided to publish them as Working Group Notes instead, and to close the Compound Document Formats Working Group.
  •  
    This event speaks loudly to how little interest browser developers have in interoperable web solutions. One-way compatibility wins and the ability of web applications to round-trip data loses. For those that did not realize it, the Compound Document by Reference Framework not only allows but requires that more featureful implementations round-trip the output of less featureful implementations without data loss. See http://www.w3.org/TR/2007/CR-CDR-20070718/#conformance ("A conformant user agent of a superset profile specification must process subset profile content as if it were the superset profile content"). (A toy sketch of this round-trip rule follows.)
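The conformance clause quoted above is easy to miss but strong: a superset-profile implementation must treat subset content as its own, so nothing is dropped on a round trip. A toy Python sketch of the behavior the clause demands (the feature and field names are hypothetical, chosen only to make the rule concrete):

```python
# Toy sketch of the CDR round-trip requirement quoted above: an editor for a
# "superset" profile must carry through fields it does not itself understand,
# so subset content survives an edit cycle without data loss.
def edit_in_superset_app(doc: dict) -> dict:
    edited = dict(doc)  # copy every field, recognized or not
    edited["text"] = doc.get("text", "") + " [edited]"
    return edited

subset_doc = {"text": "hello", "subset_only_field": "must survive"}
roundtripped = edit_in_superset_app(subset_doc)
assert roundtripped["subset_only_field"] == "must survive"
print(roundtripped)
```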
Paul Merrell

Last Call Working Draft -- W3C Authoring Tool Accessibility Guidelines (ATAG) 2.0 - 0 views

  • This is a Working Draft of the Authoring Tool Accessibility Guidelines (ATAG) version 2.0. This document includes recommendations for assisting authoring tool developers to make the authoring tools that they develop more accessible to people with disabilities, including blindness and low vision, deafness and hearing loss, learning disabilities, cognitive limitations, motor difficulties, speech difficulties, and others. Accessibility, from an authoring tool perspective, includes addressing the needs of two (potentially overlapping) user groups with disabilities: authors of web content, whose needs are met by ensuring that the authoring tool user interface itself is accessible (addressed by Part A of the guidelines), and end users of web content, whose needs are met by ensuring that all authors are enabled, supported, and guided towards producing accessible web content (addressed by Part B of the guidelines).
  • Examples of authoring tools: ATAG 2.0 applies to a wide variety of web content generating applications, including, but not limited to: web page authoring tools (e.g., WYSIWYG HTML editors); software for directly editing source code (see note below); software for converting to web content technologies (e.g., "Save as HTML" features in office suites); integrated development environments (e.g., for web application development); software that generates web content on the basis of templates, scripts, command-line input or "wizard"-type processes; software for rapidly updating portions of web pages (e.g., blogging, wikis, online forums); software for generating/managing entire web sites (e.g., content management systems, courseware tools, content aggregators); email clients that send messages in web content technologies; multimedia authoring tools; and debugging tools for web content and software for creating mobile web applications.
  • Web-based and non-web-based: ATAG 2.0 applies equally to authoring tools of web content that are web-based, non-web-based or a combination (e.g., a non-web-based markup editor with a web-based help system, a web-based content management system with a non-web-based file uploader client). Real-time publishing: ATAG 2.0 applies to authoring tools with workflows that involve real-time publishing of web content (e.g., some collaborative tools). For these authoring tools, conformance to Part B of ATAG 2.0 may involve some combination of real-time accessibility supports and additional accessibility supports available after the real-time authoring session (e.g., the ability to add captions for audio that was initially published in real-time). For more information, see the Implementing ATAG 2.0 - Appendix E: Real-time content production. Text Editors: ATAG 2.0 is not intended to apply to simple text editors that can be used to edit source content, but that include no support for the production of any particular web content technology. In contrast, ATAG 2.0 can apply to more sophisticated source content editors that support the production of specific web content technologies (e.g., with syntax checking, markup prediction, etc.).
  •  
    The link is the latest-version link, so the page should update when this specification graduates to a W3C Recommendation.
Paul Merrell

FCC approves changes to CableCARD rules - The Hill's Hillicon Valley - 0 views

  • The Federal Communications Commission moved Thursday to open up the retail market for companies that provide cable set-top boxes and digital video recorders.At Thursday's open meeting, the FCC issued an order that would promote competition in the marketplace for set-top boxes by ensuring retail devices such as TiVo have the same access to prescheduled programming as cable providers. The order would also make CableCARD pricing and billing more transparent, streamline the installation process, and ease requirements on manufacturers and operators upgrading their equipment.
  • A trade group representing the cable industry also praised the FCC's action and pledged to work with TiVo and other retail cable box providers to create a new video device capable of seamlessly integrating content from multiple sources.
Paul Merrell

InfoQ: WS-I closes its doors. What does this mean for WS-*? - 0 views

  • The Web Services Interoperability Organization (WS-I) has just announced that it has completed its mission and will be transitioning all further efforts to OASIS. As their recent press release states: The release of WS-I member-approved final materials for Basic Profile (BP) 1.2 and 2.0, and Reliable Secure Profile (RSP) 1.0 fulfills WS-I’s last milestone as an organization. By publishing the final three profiles, WS-I marks the completion of its work. Stewardship over WS-I’s assets, operations and mission will transition to OASIS (Organization for the Advancement of Structured Information Standards), a group of technology vendors and customers that drive development and adoption of open standards. Now at any other time this kind of statement from a standards organization might pass without much comment. However, with the rise of REST, a range of non-WS approaches to SOA, and the fact that most of the WS-* standards have not been covered by WS-I, is this a reflection of the new position Web Services finds itself in, over a decade after it began? Perhaps this was inevitable given that over the past few years there has been a lot more emphasis on interoperability within the various WS-* working groups? Or are the days of interactions across heterogeneous SOAP implementations in the past?
  • So the question remains: has interoperability pretty much been achieved for WS-* through WS-I and the improvements made with the way in which the specifications and standards are developed today, or has the real interoperability challenge moved elsewhere, still to be addressed?
Gary Edwards

WAN governance and network unification make or break successful cloud and hybrid comput... - 0 views

  • As soon as you start using multiple networks, you’re in the cloud, because now you’re making use of resources that are outside the control of your own IT organization and your service provider. Whether people think about it or not, just by adding a second network, they’re taking their first steps into the cloud. Anybody who carries a smartphone is experiencing the personal, private, public boundary of operations themselves. But what seems natural to somebody carrying an iPhone or Blackberry is a tremendous challenge to the traditional models of IT.
  •  
    With the increased interest in cloud, software as a service (SaaS), and mobile computing, applications are jockeying across multiple networks, both in terms of how services are assembled, as well in how users in different environments access and respond to these critical applications. Indeed, cloud computing forces a collapse in the gaps between the former silos of private, public, and personal networking domains. Since the network management and governance tasks have changed and continue to evolve rapidly, so too must the ways in which solutions and technologies address the tangled networks environment we all now live and work in.
Gary Edwards

Is the Apps Marketplace just playing catchup to Microsoft? | Googling Google | ZDNet.com - 0 views

shared by Gary Edwards on 12 Mar 10
  • Take the basic communication, calendaring, and documentation enabled for free by Apps Standard Edition, add a few slick applications from the Marketplace and the sky was the limit. Or at least the clouds were.
    • Gary Edwards
       
      Google Apps have all the basic elements of a productivity environment, but lack the internal application messaging, data connectivity and exchange that made the Windows desktop productivity platform so powerful. gAPPS are great. They even have copy/paste! But they lack the basics needed for a simple "merge" of client and contact data into a word-processor letter/report/form/research paper. Things like DDE, OLE, ODBC, MAPI, COM, and DCOM have to be reinvented for the Open Web (a toy sketch of such a merge message follows these comments). gAPPS is a good place to start. But the focus has got to shift to Wave technologies like OT, XMPP and JSON. Then there are the lower-level innovations such as Web Sockets, Native Client, HTML5, and the Cairo-Skia graphics layer (thanks Florian).
  • Whether you (or your business) choose a Microsoft-centered solution that now has well-implemented cloud integration and tightly coupled productivity and collaboration software (think Office Live Web Apps, Office 2010, and Sharepoint 2010) or you build a business around the web-based collaboration inherent in Google Apps and extend its core functions with cool and useful applications, you win.
    • Gary Edwards
       
      Not true!!! The Microsoft Cloud is based on proprietary technologies, with the Silverlight-OOXML runtime/plug-in at the core of a WPF-.NET driven "Business Productivity Platform". The Google Cloud is based on the Open Web, and not the "Open Web" that's tied up in corporate "standards" consortia like the W3C, OASIS and Ecma. One of the reasons i really like WebKit is that they push HTML5 technologies to the edge, submitting new enhancements back into the knuckle-dragging W3C HTML5 workgroups as "proposals". They don't, however, wait for the entangled corporate politics of the W3C to "approve and include" these proposals. Google and Apple submit and go live simultaneously. This of course negates the heavy influence platform rivals like Microsoft have over the activities of corporate standards orgs. Which has to be done if the WebKit-HTML5-JavaScript-XMPP-OT-Web Sockets-Native Client family of technologies is ever to challenge the interactive and graphical richness of proprietary Microsoft technologies (Silverlight, OOXML, DrawingML, C#). The important hedge here is that Google is open-sourcing their enhancements and innovations. Without that open sourcing, i think there would be reasons to fear any platform player pushing beyond the corporate standards consortia approval process. For me, OSS balances out the incredible influence of Google, and the ownership they have over core Open Web productivity application components. Which is to say: i don't want to find myself tomorrow in the same position with a Google Open Web Productivity Platform that i found myself in with the 1994 Windows desktop productivity environment - where Microsoft owned every opportunity, and could take the marketshare of any Windows-developed application with a simple announcement that they too will enter that application category (e.g., the entire independent contact/project-management category was wiped out by the mere announcement of MS Outlook).
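The first comment above names a concrete gap: the Open Web has no DDE/OLE-style channel for, say, merging contact data from one application into a word-processor letter in another. A toy Python sketch of what such a cross-application merge message might look like; the JSON envelope and application names are entirely hypothetical:

```python
# Hypothetical cross-application "merge" message of the kind the comment says
# the Open Web lacks: contact data flowing into a word-processor template.
import json
from string import Template

merge_message = json.dumps({
    "type": "merge-request",
    "source": "contacts-app",       # hypothetical sending application
    "target": "wordprocessor-app",  # hypothetical receiving application
    "fields": {"name": "Ada Lovelace", "company": "Analytical Engines Ltd."},
})

template = Template("Dear $name,\nThank you for your order from $company.")
payload = json.loads(merge_message)
print(template.substitute(payload["fields"]))
```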
Paul Merrell

Chromium Blog: Bringing improved support for Adobe Flash Player to Google Chrome - 0 views

  • The traditional browser plug-in model has enabled tremendous innovation on the web, but it also presents challenges for both plug-ins and browsers. The browser plug-in interface is loosely specified, limited in capability and varies across browsers and operating systems. This can lead to incompatibilities, reduction in performance and some security headaches. That's why we are working with Adobe, Mozilla and the broader community to help define the next generation browser plug-in API. This new API aims to address the shortcomings of the current browser plug-in model. There is much to do and we're eager to get started.
  • As a first step, we've begun collaborating with Adobe to improve the Flash Player experience in Google Chrome. Today, we're making available an initial integration of Flash Player with Chrome in the developer channel. We plan to bring this functionality to all Chrome users as quickly as we can. We believe this initiative will help our users in the following ways: When users download Chrome, they will also receive the latest version of Adobe Flash Player. There will be no need to install Flash Player separately. Users will automatically receive updates related to Flash Player using Google Chrome's auto-update mechanism. This eliminates the need to manually download separate updates and reduces the security risk of using outdated versions. With Adobe's help, we plan to further protect users by extending Chrome's “sandbox” to web pages with Flash content.
Gary Edwards

Google Apps vs. Microsoft Office - 0 views

  • That's certainly one reason Microsoft still holds a giant lead in market share.
  • An IDC survey in July 2009 shows that nearly 97% of businesses were using Microsoft Office, and 77% were using only Microsoft Office.
  • About 4% of businesses use Google Apps as their primary e-mail and productivity platform, but the overwhelming majority of these are small and midsize organizations, according to a separate survey by ITIC. This puts Google well behind the open source OpenOffice, which has 19% market share, ITIC has found.
  • ...2 more annotations...
  • The ubiquity of Windows and the popularity of Windows 7 also work against Google, as Microsoft's Office tools are likely to have better integration with Windows than Google Apps does. And since most businesses already use the desktop version of Microsoft Office, customers interested in cloud computing may find it easier to switch to the Web-based versions of Office than to the Google suite.
  • According to IDC, nearly 20% of businesses reported extensive use of Google Docs, mainly in addition to Microsoft Office rather than as a replacement. In October 2007, only 6% of businesses were using Google Docs extensively, so adoption is growing quickly.
  •  
    What a dumb ass statement: "That's certainly one reason Microsoft still holds a giant lead in market share." The SFGate article compares Google Apps' lack of service to Microsoft's productivity monopoly, suggesting that Microsoft provides better service? That's idiocy. Microsoft's service is nonexistent. Third-party MSDN developers and service businesses provide nearly 100% of MS productivity support, and always have. Where Microsoft does provide outstanding support is to their MSDN network of developers and service providers. Google will have to match that support if Google Apps is to make a credible run at Microsoft. But there is no doubt that the monopolist's iron grip on the desktop productivity platform is an almost impossible barrier for Google to climb over. Service excellence or not.
Gary Edwards

Stoneware, Inc. WebNetwork of Integrated Applications - 0 views

  •  
    Good review of Stoneware's private / public cloud computing system.
Paul Merrell

Inside Google Desktop: Google Desktop Update - 0 views

  • In 2004, Google launched Google Desktop, a program designed to make it easy for users to search their own PCs for emails, files, music, photos, Web pages and more. Desktop has been used by tens of millions of people and we’ve been humbled by its usage and great user feedback. However, over the past seven years we’ve also witnessed some big changes in how users store and access their own data, with many moving to web-based applications. There has been a significant shift from local to cloud-based storage and computing, as well as integration of Google Desktop functionality (like local search) into most modern operating systems. This is a positive development for users and we’re excited that most people now have instant access to their personal information. As such, we’ll be discontinuing support for Google Desktop, including all of the associated APIs, services, plugins and gadgets. As of September 14, Google Desktop will no longer be available for download, and existing installations will not be updated to include new features or fixes.
  •  
    Google throws in the towel on desktop search, just as Microsoft somehow reached into my WinXP Pro (which never runs with automatic updates turned on) and killed the file search functionality, replaced by a message that file search is no longer supported in Explorer 6, with an invitation to upgrade MSIE or use Bing. As though I would ever let MSIE outside my firewall! 
Maluvia Haseltine

Apatar - Open Source Data Integration & ETL - 0 views

  •  
    Join your on-premises data sources with the web without coding. Feed data from/to APIs, mashups, and mashup building tools.
Paul Merrell

Google adds bookmark sync to Chrome browser - 0 views

  • Google upgraded the beta version of its Chrome browser yesterday, adding integrated bookmark synchronization and boasting of a 30% speed improvement over the current production edition.
  • Bookmark sync requires that all the machines being kept in step run the Chrome beta, and that the user has a Google account, such as a Gmail username and password. The browser syncs bookmarks using Google Docs, the company's Web-based application suite.
Gary Edwards

Quickoffice » Quickoffice Pro for Android - 0 views

  •  
    Good stuff.  Florian needs to see this!
Gary Edwards

Major SugarSync for iOS update adds desktop-like features | ZDNet - 2 views

  •  
    Good review of SugarSync. 5 GB free; a reliable and stable competitor to DropBox. Excellent array of mobile platforms. Better mobile file and folder management than DropBox. Also blocked from China! Thank you Sursen. Amazon enters the cloud sync-share-store space today with 5 GB free.