Open Web / Group items tagged: libraries

Gary Edwards

TEI: Text Encoding Initiative - 1 views

  •  
    The Text Encoding Initiative (TEI) is a consortium which collectively develops and maintains a standard for the representation of texts in digital form. Its chief deliverable is a set of Guidelines which specify encoding methods for machine-readable texts, chiefly in the humanities, social sciences and linguistics. Since 1994, the TEI Guidelines have been widely used by libraries, museums, publishers, and individual scholars to present texts for online research, teaching, and preservation. In addition to the Guidelines themselves, the Consortium provides a variety of supporting resources, including resources for learning TEI, information on projects using the TEI, TEI-related publications, and software developed for or adapted to the TEI. The TEI Consortium is a non-profit membership organization composed of academic institutions, research projects, and individual scholars from around the world. Members contribute financially to the Consortium and elect representatives to its Council and Board of Directors.
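
    For flavor, a minimal TEI document following the Guidelines' required skeleton looks roughly like this (title and text are placeholders):

      <TEI xmlns="http://www.tei-c.org/ns/1.0">
        <teiHeader>
          <fileDesc>
            <titleStmt>
              <title>A Sample Encoded Text</title>
            </titleStmt>
            <publicationStmt>
              <p>Unpublished sample for illustration.</p>
            </publicationStmt>
            <sourceDesc>
              <p>Born digital.</p>
            </sourceDesc>
          </fileDesc>
        </teiHeader>
        <text>
          <body>
            <p>The encoded text itself goes here.</p>
          </body>
        </text>
      </TEI>

    The teiHeader/fileDesc block is the Guidelines' mandatory bibliographic record; everything under the text element is markup of the source itself.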
Paul Merrell

Web Browser Supports Time Travel (Library of Congress) - 0 views

  • September 24, 2010 -- For those who use the Mozilla Firefox browser, you now have the option to time travel through the web. MementoFox is a free extension that users can add to their browsers. The extension implements the Memento protocol, which the Los Alamos National Laboratory and Old Dominion University are developing to enable capture of and access to older versions of websites.
  • Memento allows a user to link resources, or web pages, with their previous versions automatically. The term Memento refers to an archival record of a resource as well as the technological framework that supports the ability to discover and browse older versions of Web resources. One of the challenges of researching older versions of Internet resources is searching for them in web archives. With the Firefox add-on, users can easily view older versions in the same browser without having to search across archives. After a URL is specified in the browser, the extension lets users set a target date using its slider bar; the protocol then searches archives across the web for previous versions of the URL. As long as those archives are available on a server accessible across the web, MementoFox will return the targeted previous version (a sketch of the underlying negotiation follows these notes). MementoFox is available for download, and developers interested in working with the protocol can join the development group.
  • The Memento project receives support from the Library of Congress National Digital Information Infrastructure and Preservation Program.
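
    To make the mechanics concrete: under the Memento protocol (later published as RFC 7089), a client asks a "TimeGate" for a URL while sending an Accept-Datetime header, and is redirected to the archived copy nearest that date. A minimal sketch in JavaScript; the aggregator address below is an assumption, and any Memento-compliant TimeGate works the same way:

      // Datetime negotiation against a Memento TimeGate (sketch).
      const TIMEGATE = "http://timetravel.mementoweb.org/timegate/";

      async function findMemento(url, date) {
        // The TimeGate redirects to the memento closest to the requested date.
        const res = await fetch(TIMEGATE + url, {
          headers: { "Accept-Datetime": date.toUTCString() },
        });
        // Memento-Datetime reports when this archived copy was captured.
        console.log("Archived copy:", res.url);
        console.log("Captured at:", res.headers.get("Memento-Datetime"));
      }

      findMemento("http://www.example.com/", new Date("2010-09-24"));

    MementoFox's slider is essentially a UI over this negotiation, with the extension fanning the request out across multiple archives.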
Gary Edwards

Target Survey - the Open Siddur Project Development Wiki - 0 views

  •  
    The ultimate goals are to have a computer-viewable display format (XHTML) and at least one printable format. We may also want a post-processing editable format. Our farthest target as yet is XHTML, styled by CSS. For a printed format, one expects a complete target to be able to produce a document with the features one would expect of any Siddur: page numbers, table of contents, footnotes, side notes, header/page title, etc. XHTML originated as a computer-display format, not a publishing format. Even when combined with CSS 2.1, it does not support some of the features above (with some hacking, side notes, a static header/footer, and page numbers are possible, but vital features are still missing). CSS3 is more publishing-friendly and, once implemented, will make life much easier. Until then, we will have to be a bit more creative. The following is a list of software libraries and formats that can help us increase the range of formats we can target. XSLT or Java are the preferred languages, since the rest of our chain is in XSLT and driven by Saxon, which is written in Java, allowing us to bundle the entire chain in a portable program that can be distributed (with the added bonus of being able to run within a web browser as an applet).
Paul Merrell

The Strongest Link: Libraries and Linked Data - 0 views

  •  
    See also Wikipedia on Linked Data: http://en.wikipedia.org/wiki/Linked_Data
Gary Edwards

Amazon SDKs Boost Support for Mobile Cloud « Data Center Knowledge - 0 views

  •  
    Amazon Releases Developer SDKs: One interesting and important exception is Amazon's recent release of its Software Development Kits (SDKs) for Google's Android and Apple's iOS. With these kits, developers are provided with tools that simplify development of cloud applications backed by the Amazon Web Services (AWS) platform. Developing apps that can use many of the already popular AWS cloud services offers many new opportunities for the developer community, especially given AWS's low barrier to entry and affordability, enabling more developers with limited resources to build and provision new mobile cloud services. The new SDKs include libraries that simplify handling of HTTP connections, request retries, and error handling, which used to be complex and arduous. Integrating applications with several AWS cloud services, like the Simple Storage Service (S3), SimpleDB database, Simple Notification Service (SNS), and Simple Queue Service (SQS), will be much more accessible than before. For example, it's going to be interesting to see whether developers will build a viable messaging solution atop the AWS SNS service that can actually compete with mobile SMS services, which have long been a major cash cow for many mobile network operators.
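
    The bookmarked SDKs are Java (Android) and Objective-C (iOS), but the shape of the calls is similar across Amazon's kits. As a rough sketch of the same idea using the AWS SDK for JavaScript (the bucket, key, and region are placeholders, and credentials are assumed to be configured in the environment):

      // Store an object in S3: a sketch of the pattern, not the mobile SDKs themselves.
      const AWS = require("aws-sdk");
      const s3 = new AWS.S3({ region: "us-east-1" }); // placeholder region

      async function saveNote(text) {
        await s3.putObject({
          Bucket: "example-app-bucket", // placeholder bucket
          Key: "notes/note-1.txt",
          Body: text,
          ContentType: "text/plain",
        }).promise();
        console.log("note stored in the cloud");
      }

      saveNote("hello from a mobile cloud app");

    The point of the SDKs is that the connection handling, retries, and error handling behind that one putObject call no longer have to be hand-rolled.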
Gary Edwards

GMailr: An Unofficial Javascript API for GMail - ReadWriteCloud - 1 views

  •  
    Google has pretty much given up on developing a JavaScript API for GMail. There was once a Greasemonkey script Google developed for GMail, but it broke and Google shows no sign of fixing it. James Yu is now trying to fix that scenario with GMailr, a JavaScript API for GMail. It is built from the code he wrote for 0Boxer, an extension for GMail that turns organizing your inbox into a game. Yu is also a lead developer at Scribd. Yu said developing the API took him down a path fraught with frustrations and dead ends. He writes that there is no supported official JavaScript API for GMail: the Greasemonkey script is broken, and no one has yet released a frontend API for GMail. He needed access to the various user actions in the UI, as the backend APIs were not going to work as he wished, so he decided to write his own library from scratch.
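
    The excerpt doesn't show GMailr's calls, and I haven't verified its surface, so treat the following as a purely hypothetical sketch of what a frontend Gmail library of this kind exposes: event hooks over user actions in the UI (all names invented for illustration).

      // Hypothetical event-hook API in the spirit of GMailr; not its documented surface.
      Gmailr.init(function (gmail) {
        // Fire when the user archives threads, e.g. to award points in a game like 0Boxer.
        gmail.observe("archive", function (threads) {
          console.log("archived", threads.length, "threads");
        });
      });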
Gary Edwards

The Web Fights Back Against Flipboard - 0 views

  •  
    This is the Dec 2010 interview that totally changed my view of the future of documents. Separating content and layout, and then reconstituting them, is the essence of preparing a publication. Are documents Web pages? Are Web sites magazines? Visually-immersive apps like Treesaver and Flipboard change everything, as this video demonstrates. Treesaver is Open Web HTML+; Flipboard is iOS platform-specific. Filipe argues why the Open Web will win. Great interview. Life-changing stuff. Excerpt: The problem with Flipboard is that it's an app, not the Web, and I keep hoping someone will show me a really well-designed Web app that proves the Web can still win. Yesterday Treesaver's Filipe Fortes took me up on my "can the Web be saved" challenge and visited my house to show me what he's been working on for publishers: an open-source JavaScript/HTML5/CSS library of design templates that will help developers at content companies compete with the design aesthetic that Flipboard showed us.
Gary Edwards

Download CSS Regions Prototype - Adobe Labs - 1 views

  •  
    Yes! Finally we have CSS layout enabling professional typography. The download includes a modified version of WebKit and a number of open source libraries. CSS Regions bring new properties to CSS (Cascading Style Sheets) that provide:
    * text containers with custom shapes
    * exclusion shapes which text will wrap around
    * text that flows from one area into another
    Demos showcase some of the concepts Adobe proposed to the W3C with CSS Regions: content threads, content shapes, and text exclusions. The samples require a mini-browser using a specially modified version of WebKit.
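
    For flavor, the named-flow idea as later drafted at the W3C looks like this; the prototype build used vendor-prefixed property names, so treat this as a sketch of the concept rather than the exact syntax the download accepts:

      /* Pour the article's content into a named flow... */
      #article-source { flow-into: article-thread; }

      /* ...and let it fill a chain of region boxes, in document order. */
      .region-1, .region-2, .region-3 { flow-from: article-thread; }

      /* Exclusions (a related proposal): text wraps around a custom shape. */
      .pull-quote { float: left; shape-outside: circle(50%); }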
Paul Merrell

Gtk+ HTML backend update « Alexander Larsson - 0 views

  •  
    Still at the experimental stage, but here's a screencast of GTK+ desktop apps and widgets running in a web browser, via HTML5 magic. Lots of collaboration and remote operation potential. The disruption potential here is huge. GTK+ is one of only three major multi-platform desktop widget toolkits that have accessibility baked in (the ATK library). Thousands of desktop apps have been developed with it. Coming to a browser near you?
Paul Merrell

Google to slip SVG into Internet Explorer * The Register - 0 views

  • Microsoft might be hesitating on Scalable Vector Graphics (SVG) in Internet Explorer 8, but Google's pressing on. The search giant's engineers are building a JavaScript library to render static and dynamic SVG in Microsoft's browser. Google promised that the library, a JavaScript shim, will simply drop into IE (a usage sketch follows these notes).
  • SVG has a huge presence on the web. This facet of the World Wide Web Consortium's HTML 5 spec is supported in Firefox, Safari, Opera, Chrome, and Apple's iPhone, and is used in Google Maps and Google Docs. It also topped a list of features wanted by developers in an OpenAJAX browser wish list last year.
  • There's suspicion, though, that Microsoft's hesitation has more to do with internal politics, with the company wanting graphics and drawing in IE done using Silverlight instead. SVG Web is more than an answer to Microsoft's foot-dragging, however. Google has declared for HTML 5 on the web, proclaiming last week that the web programming model has "won". Support for graphics capabilities in HTML 5 should also be seen as Google's partial answer to Adobe Systems' Flash. Google has complained that Flash is not open source and that its development is not driven by the community. Google said the benefit of SVG Web is that it sits inside the DOM, whereas Flash "sits on top of the web, it's not part of the web".
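
    A usage sketch of the shim, following the pattern the SVG Web project documented (file paths are placeholders): include svg.js first, then wrap inline SVG in a script block so IE doesn't mangle the markup before the library can render it, natively where possible and via Flash otherwise.

      <!-- The shim; data-path points at its helper files (placeholder path). -->
      <script src="svg.js" data-path="libs/svgweb/"></script>

      <!-- Inline SVG wrapped in a script block, per the library's pattern. -->
      <script type="image/svg+xml">
        <svg xmlns="http://www.w3.org/2000/svg" width="200" height="200">
          <circle cx="100" cy="100" r="80" fill="green"/>
        </svg>
      </script>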
Gary Edwards

InfoQ: How to Design a Good API & Why it Matters - 0 views

  •  
    A video with slide presentation featuring Google's Joshua Bloch. The topic is Java API design. Summary: A well-written API can be a great asset to the organization that wrote it and to all who use it. Given the importance of good API design, surprisingly little has been written on the subject. In this talk (recorded at Javapolis), Java library designer Joshua Bloch teaches how to design good APIs, with many examples of what good and bad APIs look like.
Gary Edwards

Modernizr: HTML5 and CSS3 detection | Ajaxian » - 0 views

  •  
    Modernizr is a new library that detects which HTML5 and CSS3 features the current browser supports and lets you know, so you can use them. It enables the writing of conditional CSS and conditional JavaScript. The JS tools just keep coming.
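
    A sketch of both halves, with example feature names: in JavaScript you branch on the properties Modernizr detects, and in CSS you key off the classes it adds to the html element.

      // Conditional JavaScript: branch on a detected feature.
      if (Modernizr.localstorage) {
        localStorage.setItem("draft", "saved locally");
      } else {
        document.cookie = "draft=saved-in-a-cookie"; // fallback for older browsers
      }

      // Conditional CSS works the same way via classes Modernizr puts on <html>, e.g.:
      //   .no-cssgradients .button { background: url("gradient.png"); }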
Paul Merrell

Most Agencies Falling Short on Mandate for Online Records - 0 views

  • Nearly 20 years after Congress passed the Electronic Freedom of Information Act Amendments (E-FOIA), only 40 percent of agencies have followed the law's instruction for systematic posting of records released through FOIA in their electronic reading rooms, according to a new FOIA Audit released today by the National Security Archive at www.nsarchive.org to mark Sunshine Week. The Archive team audited all federal agencies with Chief FOIA Officers as well as agency components that handle more than 500 FOIA requests a year — 165 federal offices in all — and found only 67 with online libraries populated with significant numbers of released FOIA documents and regularly updated.
  • Congress called on agencies to embrace disclosure and the digital era nearly two decades ago, with the passage of the 1996 "E-FOIA" amendments. The law mandated that agencies post key sets of records online, provide citizens with detailed guidance on making FOIA requests, and use new information technology to proactively post online records of significant public interest, including those already processed in response to FOIA requests and "likely to become the subject of subsequent requests." Congress believed then, and openness advocates know now, that this kind of proactive disclosure, publishing online the results of FOIA requests as well as agency records that might be requested in the future, is the only tenable solution to FOIA backlogs and delays. Thus the National Security Archive chose to focus on the e-reading rooms of agencies in its latest audit. Even though the majority of federal agencies have not yet embraced proactive disclosure of their FOIA releases, the Archive E-FOIA Audit did find that some real "E-Stars" exist within the federal government, serving as examples to lagging agencies that technology can be harnessed to create state-of-the-art FOIA platforms. Unfortunately, our audit also found "E-Delinquents" whose abysmal web performance recalls the teletype era.
  • E-Delinquents include the Office of Science and Technology Policy at the White House, which, despite being mandated to advise the President on technology policy, does not embrace 21st century practices by posting any frequently requested records online. Another E-Delinquent, the Drug Enforcement Administration, insults its website's viewers by claiming that it "does not maintain records appropriate for FOIA Library at this time."
  • "The presumption of openness requires the presumption of posting," said Archive director Tom Blanton. "For the new generation, if it's not online, it does not exist." The National Security Archive has conducted fourteen FOIA Audits since 2002. Modeled after the California Sunshine Survey and subsequent state "FOI Audits," the Archive's FOIA Audits use open-government laws to test whether or not agencies are obeying those same laws. Recommendations from previous Archive FOIA Audits have led directly to laws and executive orders which have: set explicit customer service guidelines, mandated FOIA backlog reduction, assigned individualized FOIA tracking numbers, forced agencies to report the average number of days needed to process requests, and revealed the (often embarrassing) ages of the oldest pending FOIA requests. The surveys include:
  • The federal government has made some progress moving into the digital era. The National Security Archive's last E-FOIA Audit in 2007, "File Not Found," reported that only one in five federal agencies had put online all of the specific requirements mentioned in the E-FOIA amendments, such as guidance on making requests, contact information, and processing regulations. The new E-FOIA Audit finds the number of agencies that have checked those boxes is now much higher — 100 out of 165 — though many (66 of 165) have posted just the bare minimum, especially when posting FOIA responses. An additional 33 agencies even now do not post these types of records at all, clearly thwarting the law's intent.
  • The FOIAonline Members (Department of Commerce, Environmental Protection Agency, Federal Labor Relations Authority, Merit Systems Protection Board, National Archives and Records Administration, Pension Benefit Guaranty Corporation, Department of the Navy, General Services Administration, Small Business Administration, U.S. Citizenship and Immigration Services, and Federal Communications Commission) won their "E-Star" by making past requests and releases searchable via FOIAonline. FOIAonline also allows users to submit their FOIA requests digitally.
  • Disabilities Compliance. Despite the E-FOIA Act, many government agencies do not embrace the idea of posting their FOIA responses online. The most common reason agencies give is that it is difficult to post documents in a format that complies with the Americans with Disabilities Act, also referred to as being "508 compliant," and the 1998 Amendments to the Rehabilitation Act that require federal agencies "to make their electronic and information technology (EIT) accessible to people with disabilities." E-Star agencies, however, have proven that 508 compliance is no barrier when the agency has a will to post. All documents posted on FOIAonline are 508 compliant, as are the documents posted by the Department of Defense and the Department of State. In fact, every document created electronically by the US government after 1998 should already be 508 compliant. Even old paper records that are scanned to be processed through FOIA can be made 508 compliant with just a few clicks in Adobe Acrobat, according to this Department of Homeland Security guide (essentially OCRing the text, and including information about where non-textual fields appear). Even if agencies are insistent it is too difficult to OCR older documents that were scanned from paper, they cannot use that excuse with digital records.
  • Key Findings
  • Excuses Agencies Give for Poor E-Performance
  • Justice Department guidance undermines the statute. Currently, the FOIA stipulates that documents "likely to become the subject of subsequent requests" must be posted by agencies somewhere in their electronic reading rooms. The Department of Justice's Office of Information Policy defines these records as "frequently requested records… or those which have been released three or more times to FOIA requesters." Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice. The National Security Archive believes the addition of this "three or more times" language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse ("not requested three times yet!") not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post "all records, regardless of form or format that have been released in response to a FOIA request."
  • THE E-DELINQUENTS: WORST OVERALL AGENCIES (in alphabetical order)
  • Privacy. Another commonly articulated concern about posting FOIA releases online is that doing so could inadvertently disclose private information from "first person" FOIA requests. This is a valid concern, and this subset of FOIA requests should not be posted online. (The Justice Department identified "first party" requester rights in 1989. Essentially, agencies cannot use the b(6) privacy exemption to redact information if a person requests it for him or herself. An example of a "first person" FOIA would be a person's request for his own immigration file.) Cost and Waste of Resources. There is also a belief that there is little public interest in the majority of FOIA requests processed, and hence it is a waste of resources to post them. This thinking runs counter to the governing principle of the Freedom of Information Act: that government information belongs to US citizens, not US agencies. As such, the reason that a person requests information is immaterial as the agency processes the request; the "interest factor" of a document should also be immaterial when an agency is required to post it online. Some think that posting FOIA releases online is not cost effective. In fact, the opposite is true. It's not cost effective to spend tens (or hundreds) of person hours to search for, review, and redact a FOIA release only to mail it to the requester and have them slip it into their desk drawer and forget about it. That is a waste of resources. The released document should be posted online for any interested party to utilize. This will only become easier as FOIA processing systems evolve to automatically post the documents they track. The State Department earned its "E-Star" status by demonstrating this very principle: it spent no new funds and hired no contractors to build its Electronic Reading Room, instead building a self-sustaining platform that will save the agency time and money going forward.
Paul Merrell

Victory for Users: Librarian of Congress Renews and Expands Protections for Fair Uses |... - 0 views

  • The new rules for exemptions to copyright's DRM-circumvention laws were issued today, and the Librarian of Congress has granted much of what EFF asked for over the course of months of extensive briefs and hearings. The exemptions we requested—ripping DVDs and Blurays for making fair use remixes and analysis; preserving video games and running multiplayer servers after publishers have abandoned them; jailbreaking cell phones, tablets, and other portable computing devices to run third party software; and security research and modification and repairs on cars—have each been accepted, subject to some important caveats.
  • The exemptions are needed thanks to a fundamentally flawed law that forbids users from breaking DRM, even if the purpose is a clearly lawful fair use. As software has become ubiquitous, so has DRM.  Users often have to circumvent that DRM to make full use of their devices, from DVDs to games to smartphones and cars. The law allows users to request exemptions for such lawful uses—but it doesn’t make it easy. Exemptions are granted through an elaborate rulemaking process that takes place every three years and places a heavy burden on EFF and the many other requesters who take part. Every exemption must be argued anew, even if it was previously granted, and even if there is no opposition. The exemptions that emerge are limited in scope. What is worse, they only apply to end users—the people who are actually doing the ripping, tinkering, jailbreaking, or research—and not to the people who make the tools that facilitate those lawful activities. The section of the law that creates these restrictions—the Digital Millennium Copyright Act's Section 1201—is fundamentally flawed, has resulted in myriad unintended consequences, and is long past due for reform or removal altogether from the statute books. Still, as long as its rulemaking process exists, we're pleased to have secured the following exemptions.
  • The new rules are long and complicated, and we'll be posting more details about each as we get a chance to analyze them. In the meantime, we hope each of these exemptions enables more exciting fair uses that educate, entertain, improve the underlying technology, and keep us safer. A better long-term solution, though, is to eliminate the need for this onerous rulemaking process. We encourage lawmakers to support efforts like the Unlocking Technology Act, which would limit the scope of Section 1201 to copyright infringements—not fair uses. And as the White House looks for the next Librarian of Congress, who is ultimately responsible for issuing the exemptions, we hope to get a candidate who acts—as a librarian should—in the interest of the public's access to information.
Paul Merrell

This project aims to make '404 not found' pages a thing of the past - 0 views

  • The Internet is always changing. Sites rise and fall, content is deleted, and bad URLs can lead to '404 Not Found' errors that are as helpful as a brick wall. A new project proposes to do away with dead-end 404 errors by implementing new HTML code that helps readers access prior versions of hyperlinked content. With any luck, that means you'll never have to run into a dead link again. The "404-No-More" project is backed by a formidable coalition including members from organizations like the Harvard Library Innovation Lab, Los Alamos National Laboratory, Old Dominion University, and the Berkman Center for Internet & Society. Part of the Knight News Challenge, which seeks to strengthen the Internet for free expression and innovation through a variety of initiatives, 404-No-More recently reached the semifinal stage. The project aims to cure so-called link rot, the process by which hyperlinks become useless over time because they point to addresses that are no longer available. If implemented, websites such as Wikipedia and other reference documents would be vastly improved. The new feature would also give Web authors a way to provide links that carry both archived copies of content and specific dates of reference, the sort of information that diligent readers currently have to hunt down on a website like Archive.org.
  • While it may sound trivial, link rot can actually have real ramifications. Nearly 50 percent of the hyperlinks in Supreme Court decisions no longer work, a 2013 study revealed. Losing footnotes and citations in landmark legal decisions can mean losing crucial information and context about the laws that govern us. The same study found that 70 percent of URLs within the Harvard Law Review and similar journals didn’t link to the originally cited information, considered a serious loss surrounding the discussion of our laws. The project’s proponents have come up with more potential uses as well. Activists fighting censorship will have an easier time combatting government takedowns, for instance. Journalists will be much more capable of researching dynamic Web pages. “If every hyperlink was annotated with a publication date, you could automatically view an archived version of the content as the author intended for you to see it,” the project’s authors explain. The ephemeral nature of the Web could no longer be used as a weapon. Roger Macdonald, a director at the Internet Archive, called the 404-No-More project “an important contribution to preservation of knowledge.”
  • The new feature would come in the form of introducing the mset attribute to the <a> element in HTML, which would allow authors to specify multiple dates and archived copies of the linked content as an external resource. For instance, if an author knows both the date of reference and the location of an archived copy of the targeted content, both can be encoded directly in the link (a hypothetical illustration follows). The 404-No-More project's goals are numerous, but the ultimate goal is to have mset become a new HTML standard for hyperlinks. "An HTML standard that incorporates archives for hyperlinks will loop in these efforts and make the Web better for everyone," project leaders wrote, "activists, journalists, and regular ol' everyday web users."
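
    The excerpt's inline example did not survive, and this is not the project's exact grammar; purely as a hypothetical illustration of the idea, a link carrying a reference date and an archived copy might read:

      <!-- Hypothetical syntax: the attribute grammar here is invented for illustration. -->
      <a href="http://example.com/report"
         mset="2014-04-01 https://web.archive.org/web/20140401000000/http://example.com/report">
        the report as cited
      </a>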
Paul Merrell

Activists send the Senate 6 million faxes to oppose cyber bill - CBS News - 0 views

  • Activists worried about online privacy are sending Congress a message with some old-school technology: they're sending faxes -- more than 6.2 million, they claim -- to express opposition to the Cybersecurity Information Sharing Act (CISA). Why faxes? "Congress is stuck in 1984 and doesn't understand modern technology," according to the campaign Fax Big Brother. The week-long campaign was organized by the nonpartisan Electronic Frontier Foundation, the group Access, and Fight for the Future, the activist group behind the major Internet protests that helped derail a pair of anti-piracy bills in 2012. It also has the backing of a dozen groups like the ACLU, the American Library Association, the National Association of Criminal Defense Lawyers, and others.
  • CISA aims to facilitate information sharing regarding cyberthreats between the government and the private sector. The bill gained more attention following the massive hack in which the records of nearly 22 million people were stolen from government computers. "The ability to easily and quickly share cyber attack information, along with ways to counter attacks, is a key method to stop them from happening in the first place," Sen. Dianne Feinstein, D-California, who helped introduce CISA, said in a statement after the hack. Senate leadership had planned to vote on CISA this week before leaving for its August recess. However, the bill may be sidelined for the time being as the Republican-led Senate gives precedence to a legislative effort to defund Planned Parenthood. Even as the bill was put on the back burner, the grassroots campaign to stop it gained steam. Fight for the Future started sending faxes to all 100 Senate offices on Monday, but the campaign really took off after it garnered attention on the website Reddit and on social media. The faxed messages are generated by Internet users who visit faxbigbrother.com or stopcyberspying.com -- or who simply send a message via Twitter with the hashtag #faxbigbrother. To send all those faxes, Fight for the Future set up a dedicated server and a dozen phone lines and modems they say are capable of sending tens of thousands of faxes a day.
  • Fight for the Future told CBS News that it has so many faxes queued up at this point, that it may take months for Senate offices to receive them all, though the group is working on scaling up its capability to send them faster. They're also limited by the speed at which Senate offices can receive them.
  •  
    From a Fight for the Future mailing: "Here's the deal: yesterday the Senate delayed its expected vote on CISA, the Cybersecurity Information Sharing Act that would let companies share your private information--like emails and medical records--with the government. "The delay is good news; but it's a delay, not a victory. "We just bought some precious extra time to fight CISA, but we need to use it to go big like we did with SOPA or this bill will still pass. Even if we stop it in September, they'll try again after that. "The truth is that right now, things are looking pretty grim. Democrats and Republicans have been holding closed-door meetings to work out a deal to pass CISA quickly when they return from recess. "Right before the expected Senate vote on CISA, the Obama Administration endorsed the bill, which means if Congress passes it, the White House will definitely sign it. "We've stalled and delayed CISA and bills like it nearly half a dozen times, but this month could be our last chance to stop it for good." See also http://tumblr.fightforthefuture.org/post/125953876003/senate-fails-to-advance-cisa-before-recess-amid ; http://www.cbsnews.com/news/activists-send-the-senate-6-million-faxes-to-oppose-cyber-bill/ ; http://www.npr.org/2015/08/04/429386027/privacy-advocates-to-senate-cyber-security-bill .
Paul Merrell

Thinking XML: The XML flavor of HTML5 - 1 views

  • 6 recommendations for developers using the next generation of the web's native language
  • In this article, I shall provide a practical guide that illustrates the state of play when it comes to XML in the HTML5 world. The article is written for what I call the desperate web hacker: someone who is not a W3C standards guru, but interested in either generating XHTML5 on the web, or consuming it in a simple way (that is, to consume information, rather than worrying about the enormous complexity of rendering). I'll admit that some of my recommendations will be painful for me to make, as a long-time advocate for processing XML the right way. Remember that HTML5 is still a W3C working draft, and it might be a while before it becomes a full recommendation. Many of its features are stable, though, and already well-implemented on the web.
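
    As a concrete anchor for the discussion, a minimal XHTML5 document looks like the following; for a browser to treat it as XML it must be served as application/xhtml+xml rather than text/html:

      <?xml version="1.0" encoding="utf-8"?>
      <!DOCTYPE html>
      <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
        <head>
          <meta charset="utf-8"/>
          <title>Minimal XHTML5</title>
        </head>
        <body>
          <p>Well-formed XML, parsed through HTML5's XML serialization.</p>
        </body>
      </html>

    The xmlns attribute and XML well-formedness rules are what make this the XML flavor; the DOCTYPE is the same short one HTML5 uses.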
Paul Merrell

Creating mobile Web applications with HTML 5 -- Five "How To" Articles - 0 views

  •  
    HTML 5 is a very hyped technology, but with good reason. It promises to be a technological tipping point for bringing desktop application capabilities to the browser. As promising as it is for traditional browsers, it has even more potential for mobile browsers. Even better, the most popular mobile browsers have already adopted and implemented many significant parts of the HTML 5 specification. In this five-part series, you will take a closer look at several of those new technologies that are part of HTML 5 and that can have a huge impact on mobile Web application development. In each part of this series you will develop a working mobile Web application showcasing an HTML 5 feature that can be used on modern mobile Web browsers, like the ones found on the iPhone and Android-based devices.
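
    For instance, the Geolocation API, a sibling spec usually lumped in with HTML 5, already worked in the iPhone and Android browsers of the day. A minimal sketch:

      // Ask the mobile browser for the device's position (sketch).
      function onPosition(pos) {
        console.log("lat:", pos.coords.latitude, "lon:", pos.coords.longitude);
      }
      function onError(err) {
        console.log("geolocation failed:", err.message);
      }
      if (navigator.geolocation) {
        navigator.geolocation.getCurrentPosition(onPosition, onError);
      } else {
        console.log("no geolocation support; fall back to asking the user");
      }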
Gary Edwards

Ex-Apple Javascript Guru: HTML5 and Native Apps Can Live Together: Tech News « - 0 views

  •  
    Good interview with Charles Jolley of SproutCore and WebKit fame (met Charles at Web 2.0). He has left Apple and started a SproutCore Web app development company called "Strobe". Looking very good, Charles! The Blended Brew: Apps have become a preferred way of accessing information on mobile devices. But developers want to provide a unified experience, and that is why Jolley believes we will soon have apps that use HTML5 inside a native app wrapper. "People are looking for an either/or solution, but it is not going to end up like that," he said. Think of Strobe's offerings as a way to create an experience that blends HTML5 and native mobile apps. How this works is that an application is developed in HTML5 instead of proprietary formats. It is wrapped in a native app wrapper for, say, the iPhone, but when accessed through a web browser on a PC or any other device, like a tablet, it offers the same user experience. This is a good way to solve a problem that is only going to be compounded many times over as multiple endpoints for content start to emerge. The co-existence of web and native apps also means content publishers need to think differently about content and how it is offered to consumers. The multiplicity of endpoints (iPhone, iPad, TV and PC) is going to force content producers to think differently about how they build user experiences for different sets of screens. Jolley argues that the best way to do so is to stop taking the document-centric view that is part of the PC era. In the touch-based mobile device era, folks need to think of ways to have a single technology stack married to the ability to create unique experiences for different devices. And if you do that, there is no doubt that HTML5 and native apps can live in harmony.
Paul Merrell

The Past Clouds the Future of Europe's New Antitrust Enforcer - Vox - 0 views

  • Joaquin Almunia left his job as the E.U.’s economics and monetary affairs commissioner this month to become antitrust chief.
  • Christine Varney, the head of the antitrust division at the United States Justice Department, warned European regulators in a speech on Monday against imposing European Union obligations on American companies that are doing business globally. Regulators in Europe are under pressure from governments, media companies and technology developers to blunt the market power that Google has amassed by running the world's most popular Internet search tools.
  • Mr. Almunia also will need to resolve whether to give greater freedom to online merchants like eBay and Amazon, which, like Google, are based in the United States. Some specialty goods and luxury goods brands, in particular LVMH of France, have lobbied hard to require that merchants have traditional shops as a precondition for selling goods over the Internet.