
Future of the Web / Group items tagged "performance"


Paul Merrell

Most Agencies Falling Short on Mandate for Online Records - 1 views

  • Nearly 20 years after Congress passed the Electronic Freedom of Information Act Amendments (E-FOIA), only 40 percent of agencies have followed the law's instruction for systematic posting of records released through FOIA in their electronic reading rooms, according to a new FOIA Audit released today by the National Security Archive at www.nsarchive.org to mark Sunshine Week. The Archive team audited all federal agencies with Chief FOIA Officers as well as agency components that handle more than 500 FOIA requests a year — 165 federal offices in all — and found only 67 with online libraries populated with significant numbers of released FOIA documents and regularly updated.
  • Congress called on agencies to embrace disclosure and the digital era nearly two decades ago, with the passage of the 1996 "E-FOIA" amendments. The law mandated that agencies post key sets of records online, provide citizens with detailed guidance on making FOIA requests, and use new information technology to proactively post online records of significant public interest, including those already processed in response to FOIA requests and "likely to become the subject of subsequent requests." Congress believed then, and openness advocates know now, that this kind of proactive disclosure, publishing online the results of FOIA requests as well as agency records that might be requested in the future, is the only tenable solution to FOIA backlogs and delays. Thus the National Security Archive chose to focus on the e-reading rooms of agencies in its latest audit. Even though the majority of federal agencies have not yet embraced proactive disclosure of their FOIA releases, the Archive E-FOIA Audit did find that some real "E-Stars" exist within the federal government, serving as examples to lagging agencies that technology can be harnessed to create state-of-the-art FOIA platforms. Unfortunately, our audit also found "E-Delinquents" whose abysmal web performance recalls the teletype era.
  • E-Delinquents include the Office of Science and Technology Policy at the White House, which, despite being mandated to advise the President on technology policy, does not embrace 21st century practices by posting any frequently requested records online. Another E-Delinquent, the Drug Enforcement Administration, insults its website's viewers by claiming that it "does not maintain records appropriate for FOIA Library at this time."
  • "The presumption of openness requires the presumption of posting," said Archive director Tom Blanton. "For the new generation, if it's not online, it does not exist." The National Security Archive has conducted fourteen FOIA Audits since 2002. Modeled after the California Sunshine Survey and subsequent state "FOI Audits," the Archive's FOIA Audits use open-government laws to test whether or not agencies are obeying those same laws. Recommendations from previous Archive FOIA Audits have led directly to laws and executive orders which have: set explicit customer service guidelines, mandated FOIA backlog reduction, assigned individualized FOIA tracking numbers, forced agencies to report the average number of days needed to process requests, and revealed the (often embarrassing) ages of the oldest pending FOIA requests. The surveys include:
  • The federal government has made some progress moving into the digital era. The National Security Archive's last E-FOIA Audit in 2007, "File Not Found," reported that only one in five federal agencies had put online all of the specific requirements mentioned in the E-FOIA amendments, such as guidance on making requests, contact information, and processing regulations. The new E-FOIA Audit finds the number of agencies that have checked those boxes is now much higher — 100 out of 165 — though many (66 of 165) have posted just the bare minimum, especially when posting FOIA responses. An additional 33 agencies even now do not post these types of records at all, clearly thwarting the law's intent.
  • The FOIAonline Members (Department of Commerce, Environmental Protection Agency, Federal Labor Relations Authority, Merit Systems Protection Board, National Archives and Records Administration, Pension Benefit Guaranty Corporation, Department of the Navy, General Services Administration, Small Business Administration, U.S. Citizenship and Immigration Services, and Federal Communications Commission) won their "E-Star" by making past requests and releases searchable via FOIAonline. FOIAonline also allows users to submit their FOIA requests digitally.
  • THE E-DELINQUENTS: WORST OVERALL AGENCIES (in alphabetical order)
  • Key Findings
  • Excuses Agencies Give for Poor E-Performance
  • Justice Department guidance undermines the statute. Currently, the FOIA stipulates that documents "likely to become the subject of subsequent requests" must be posted by agencies somewhere in their electronic reading rooms. The Department of Justice's Office of Information Policy defines these records as "frequently requested records… or those which have been released three or more times to FOIA requesters." Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice. The National Security Archive believes the addition of this "three or more times" language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse ("not requested three times yet!") not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post "all records, regardless of form or format that have been released in response to a FOIA request."
  • Disabilities Compliance. Despite the E-FOIA Act, many government agencies do not embrace the idea of posting their FOIA responses online. The most common reason agencies give is that it is difficult to post documents in a format that complies with the Americans with Disabilities Act, also referred to as being "508 compliant," and the 1998 Amendments to the Rehabilitation Act that require federal agencies "to make their electronic and information technology (EIT) accessible to people with disabilities." E-Star agencies, however, have proven that 508 compliance is no barrier when the agency has a will to post. All documents posted on FOIAonline are 508 compliant, as are the documents posted by the Department of Defense and the Department of State. In fact, every document created electronically by the US government after 1998 should already be 508 compliant. Even old paper records that are scanned to be processed through FOIA can be made 508 compliant with just a few clicks in Adobe Acrobat, according to this Department of Homeland Security guide (essentially OCRing the text, and including information about where non-textual fields appear). Even if agencies are insistent it is too difficult to OCR older documents that were scanned from paper, they cannot use that excuse with digital records.
  • Privacy. Another commonly articulated concern about posting FOIA releases online is that doing so could inadvertently disclose private information from "first person" FOIA requests. This is a valid concern, and this subset of FOIA requests should not be posted online. (The Justice Department identified "first party" requester rights in 1989. Essentially, agencies cannot use the b(6) privacy exemption to redact information if a person requests it for him or herself. An example of a "first person" FOIA would be a person's request for his own immigration file.)
  • Cost and Waste of Resources. There is also a belief that there is little public interest in the majority of FOIA requests processed, and hence that it is a waste of resources to post them. This thinking runs counter to the governing principle of the Freedom of Information Act: that government information belongs to US citizens, not US agencies. As such, the reason that a person requests information is immaterial as the agency processes the request; the "interest factor" of a document should likewise be immaterial when an agency is required to post it online. Some think that posting FOIA releases online is not cost effective. In fact, the opposite is true. It is not cost effective to spend tens (or hundreds) of person-hours searching for, reviewing, and redacting records in response to a FOIA request only to mail them to a single requester, who may slip them into a desk drawer and forget about them. That is a waste of resources. The released document should be posted online for any interested party to use, and this will only become easier as FOIA processing systems evolve to automatically post the documents they track. The State Department earned its "E-Star" status by demonstrating this very principle: it spent no new funds and hired no contractors to build its Electronic Reading Room, instead building a self-sustaining platform that will save the agency time and money going forward.
Paul Merrell

Are processors pushing up against the limits of physics? | Ars Technica - 0 views

  • When I first started reading Ars Technica, performance of a processor was measured in megahertz, and the major manufacturers were rushing to squeeze as many of them as possible into their latest silicon. Shortly thereafter, however, the energy needs and heat output of these beasts brought that race crashing to a halt. More recently, the number of processing cores rapidly scaled up, but they quickly reached the point of diminishing returns. Now, getting the most processing power for each watt seems to be the key measure of performance. None of these things happened because the companies making processors ran up against hard physical limits. Rather, computing power ended up being constrained because progress in certain areas—primarily energy efficiency—was slow compared to progress in others, such as feature size. But could we be approaching physical limits in processing power? In this week's edition of Nature, the University of Michigan's Igor Markov takes a look at the sorts of limits we might face.
Gonzalo San Gil, PhD.

Extra extra! How to use the press to promote open source | Opensource.com - 0 views

  • A recap of Steven J. Vaughan-Nichols' talk at All Things Open: "This is a report from the All Things Open conference, held this year at the Raleigh Convention Center. I attended Steven Vaughan-Nichols' session on marketing and using the press in open source; this is a recap." [# ! The Same Old #Tools... # ! ... for Brand #New #Goals: # ! #Freedom, #transparency, #performance...]
Gonzalo San Gil, PhD.

Symposium: Mass Surveillance - When Reality Exceeds The Fiction | La Quadrature du Net ... - 0 views

  • "Paris, 7 November 2014 - As part of an exceptional event, the Lisbon & Estoril Film Festival and La Quadrature du Net partner for a symposium on mass surveillance. The largest gathering of thinkers, activists and artists - since Edward Snowden's revelations - will take place in Portugal on the 14th, 15th and 16th of November 2014, in the Cultural Center of Belem." What assessment can be made eighteen months after Edward Snowden's revelations? How can people get involved, and with what tools? What is the role of the arts in this looming fight? Julian Assange, Jacob Appelbaum, Wikileaks and Edward Snowden's collaborator Laura Poitras - the journalist who filmed Snowden in Hong Kong and participated in the revelations that have changed the world - Julian Assange's lawyers Baltasar Garzon and Jennifer Robinson, the philosophers Noam Chomsky and Edgar Morin, the author of the lectures Snowden and the Future, Eben Moglen, the co-founders of La Quadrature du Net, Jérémie Zimmermann and Philippe Aigrain, and the CEO of El Pais, Juan Luis Cebrian, will be among the main guests of the first event of this scale in Europe. Many writers, artists, filmmakers and photographers, such as Nan Goldin, Céline Curiol, Dorota Masłowska and Philippe Parreno, will join them along the way for readings, performances and interventions. The symposium will be accompanied by a cycle of 15 screenings followed by discussions about mass surveillance. Press accreditations: press@leffest.com
Gonzalo San Gil, PhD.

Review: Graylog delivers open source log management for the dedicated do-it-yourselfer ... - 0 views

  • "By Joel Snyder, Network World | Nov 9, 2015 - In most big security breaches, there's a familiar thread: something funny was going on, but no one noticed. The information was in the logs, but no one was looking for it. Logs from the hundreds or thousands of network devices are the secret sauce to problem solving, security alerting, and performance and capacity management. Gathering logs together, analyzing them, "
Gonzalo San Gil, PhD.

Three ways to easily encrypt your data on Linux | howtoforge.com | [# ! Note] - 0 views

  • "Data encryption is one very solid security measure/precaution that everyone who owns data with significant personal or objective value should perform."
Gonzalo San Gil, PhD.

SXSW 2016 on BitTorrent: 10.33 GB of Free Music - TorrentFreak - 0 views

  • By Ernesto on March 20, 2016: The South by Southwest (SXSW) music festival is one of the largest and most popular in the United States. For more than a decade SXSW has been sharing DRM-free songs from the performing artists, over 69 GB worth so far. This year's release breaks a new record with 1,593 tracks totaling more than 10 gigabytes.
Gonzalo San Gil, PhD.

Do you think Accelerated Mobile Pages (AMP) are open or closed? | Opensource.com - 0 views

  • "A few months ago Google announced a new open source project called Accelerated Mobile Pages (AMP) that promised to "dramatically improve the performance of the mobile Web," and now Google features AMP content at the top of mobile search results. As the amount of AMP content continues to grow, more questions are being asked about whether or not AMP benefits the open web, and whether AMP is a closed silo."
Paul Merrell

Public transit in Beverly Hills may soon be driverless, program unanimously approved - ... - 0 views

  • An uncontested vote by the Beverly Hills City Council could guarantee a chauffeur for all residents in the near future. However, instead of a driver, the newly adopted program foresees municipally-owned driverless cars ready to order via a smartphone app. Also known as autonomous vehicles, or AV, driverless cars would appear to be the next big thing not only for people, but local governments as well – if the Beverly Hills City Council can get its AV development program past a few more hurdles, that is. The technology itself has some challenges ahead as well.
  • In the meantime, the conceptual shuttle service, which was unanimously approved at an April 5 city council meeting, is being celebrated.
  • Naming Google and Tesla in its press release, Beverly Hills must first develop a partnership with a manufacturer that can build it a fleet of unmanned cars. There will also be a need to bring in policy experts. All of these outside parties will have a chance to explore the program’s potential together at an upcoming community event. The Wallis Annenberg Center for the Performing Arts will host a summit this fall that will include expert lectures, discussions, and test drives. Er, test rides. Already in the works for Beverly Hills is a fiber optics cable network that will, in addition to providing high-speed internet access to all residents and businesses, one day be an integral part of a public transit system that runs on its users’ spontaneous desires. Obviously, Beverly Hills has some money on hand for the project, and it is also an ideal testing space as the city takes up an area of less than six square miles. Another positive factor is the quality of the city’s roads, which exceeds that of most in the greater Los Angeles area, not to mention California and the whole United States. “It can’t find the lane markings!” Volvo’s North American CEO, Lex Kerssemakers, complained to Los Angeles Mayor Eric Garcetti last month, according to Reuters. “You need to paint the bloody roads here!” Whether lanes are marked or signs are clear has made a big difference in how successfully the new technology works. Unfortunately, the US Department of Transportation considers 65 percent of US roads to be in poor condition, so AV cars may not be in the works for many Americans living outside of Beverly Hills quite as soon.
Gonzalo San Gil, PhD.

8 Reasons Why Singer/Songwriter Shows Are Boring - Digital Music News - 0 views

  • "I just played a new songwriter series billed as #songwritersundays at The Fox and Hounds in Studio City, CA. They have built up quite a supportive crowd of musicians and music lovers for this Sunday series."
Paul Merrell

Profiled From Radio to Porn, British Spies Track Web Users' Online Identities | Global ... - 0 views

  • One system builds profiles showing people’s web browsing histories. Another analyzes instant messenger communications, emails, Skype calls, text messages, cell phone locations, and social media interactions. Separate programs were built to keep tabs on “suspicious” Google searches and usage of Google Maps. The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens, all without a court order or judicial warrant.
  • The power of KARMA POLICE was illustrated in 2009, when GCHQ launched a top-secret operation to collect intelligence about people using the Internet to listen to radio shows. The agency used a sample of nearly 7 million metadata records, gathered over a period of three months, to observe the listening habits of more than 200,000 people across 185 countries, including the U.S., the U.K., Ireland, Canada, Mexico, Spain, the Netherlands, France, and Germany.
  • GCHQ’s documents indicate that the plans for KARMA POLICE were drawn up between 2007 and 2008. The system was designed to provide the agency with “either (a) a web browsing profile for every visible user on the Internet, or (b) a user profile for every visible website on the Internet.” The origin of the surveillance system’s name is not discussed in the documents. But KARMA POLICE is also the name of a popular song released in 1997 by the Grammy Award-winning British band Radiohead, suggesting the spies may have been fans. A verse repeated throughout the hit song includes the lyric, “This is what you’ll get, when you mess with us.”
  • GCHQ vacuums up the website browsing histories using “probes” that tap into the international fiber-optic cables that transport Internet traffic across the world. A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” (a term the agency uses to refer to metadata records), with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held (41 percent) was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.” There was a simple aim at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs.
  • The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
Gonzalo San Gil, PhD.

United's woes show what's hard about networking | ITworld - 0 views

  • "SDN and cloud technology may cut down on big glitches, analysts say"
Gonzalo San Gil, PhD.

5 Linux Laptops for Small Business - 0 views

  • "A Linux laptop makes all kinds of sense for a small business. Not only is Linux the most secure computing platform, it's highly efficient, which means that computing power goes toward doing actual work instead of powering a bloated operating system."
Paul Merrell

Google Open Sources Google XML Pages - O'Reilly News - 0 views

  • At OSCON 2008, Gonsalves made the announcement that, after several years of consideration, Google was releasing Google XML Pages (or GXP) under the Apache Open Source License.
  • Originally developed as a Python interpreter that produced Java source code, gxp was rewritten in 2006-7 to be a completely Java based application. The idea behind gxp is fairly simple (and is one that is used, in slightly different fashion, for Microsoft's XAML and Silverlight) - a web designer can declare a number of XML namespaces that define specific libraries on an XHTML or GXP container element, intermixing GXP and XHTML code in order to perform conditional logic, invoke server components, define state variables or create template modules. This GXP code is then parsed and used to generate the relevant Java code, which in turn is compiled into a server module invoked from within a Java servlet engine such as Tomcat or Jetty and cached on the server.
Paul Merrell

IDABC - TESTA: Trans European Services for Telematics between Admini - 0 views

  •     The need for tight security may sometimes appear to clash with the need to exchange information effectively. However, TESTA offers an appropriate solution. It constitutes the European Community's own private network, isolated from the Internet and allows officials from different Ministries to communicate at a trans-European level in a safe and prompt way.
  • What is TESTA? TESTA is the European Community's own private, IP-based network. TESTA offers a telecommunications interconnection platform that responds to the growing need for secure information exchange between European public administrations. It is a European IP network, similar to the Internet in its universal reach, but dedicated to inter-administrative requirements and providing guaranteed performance levels.
  • Note that Barack Obama's campaign platform technology plank calls for something similar in the U.S., under the direction of the nation's first National CIO, with an emphasis on open standards, interoperability, and reinvigorated antitrust enforcement. Short story: the E.U. is 12 years ahead of the U.S. in developing a regional SOA connecting all levels of government, and in the U.S., open standards-based eGovernment has achieved the status of a presidential election issue. All major economic powers either follow the E.U.'s path or get left in Europe's IT economic dust. The largest missing element of the internet, a unified internet architecture that rejects the big vendors' games with incompatible IT standards, is under way. I can't stress enough how key TESTA has been in the E.U.'s initiatives regarding document formats, embrace of open source software, and competition law intervention in the IT industry (e.g., the Microsoft case). The E.U. is very serious about restoring competition in the IT market, using both antitrust law and the government procurement power.
Gary Edwards

The Monkey On Microsoft's Back - Forbes.com - 0 views

  • The new technology, dubbed TraceMonkey, promises to speed up Firefox's ability to deliver complex applications. The move heightens the threat posed by a nascent group of online alternatives to Microsoft's most profitable software: PC applications, like Microsoft Office, that allow Microsoft to burn hundreds of millions of dollars on efforts to seize control of the online world. Microsoft's Business Division, which gets 90% of its revenues from sales of Microsoft Office, spat out $12.4 billion in operating income for the fiscal year ending June 30. Google, however, is playing a parallel game, using profits from its online advertising business to fund alternatives to Microsoft's desktop offerings. Google already says it has "millions" of users for its free, Web-based alternative to desktop staples, including Microsoft's Word, Excel and PowerPoint software. The next version of Firefox, which could debut by the end of this year, promises to speed up such applications, thanks to a new technology built into the developer's version of the software last week. Right now, rich Web applications such as Google Gmail rely on a technology known as JavaScript to turn them from lifeless Web pages into applications that respond as users mouse about a Web page. TraceMonkey aims to turn the most frequently used chunks of JavaScript code embedded into Web pages into binary form--allowing computers to hustle through the most used bits of code--without waiting around to render all of the code into binary form.
  • I did send a very lengthy comment to Brian Caulfield, the Forbes author of this article. Of course, I disagreed with his perspective. TraceMonkey is great, performing an acceleration of JavaScript in Firefox in much the same way that SquirrelFish accelerates WebKit browsers. What Brian misses, though, is the RIA war that is taking place both inside and outside the browser (RIA = fully functional Web applications that WILL replace the "client/server" apps model).
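  To make the excerpt above concrete: trace-compiling JITs such as TraceMonkey (and SquirrelFish in WebKit) get their wins on hot, type-stable loops, recording the operations a loop actually performs and compiling that trace to native code. The sketch below is illustrative only, not Mozilla code; the function and variable names are invented for the example.

    // Illustrative only: the kind of hot, type-stable loop that a tracing JIT
    // such as TraceMonkey can compile to native code after a few interpreted passes.
    function sumOfSquares(n) {
      var total = 0;
      for (var i = 0; i < n; i++) {
        total += i * i;   // the same numeric operations every pass: an ideal trace
      }
      return total;
    }

    var result = sumOfSquares(10000000);   // a large, repetitive workload is where tracing pays off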
Paul Merrell

Firefox, Google's Chrome speed past IE, Opera | The Open Road - CNET News - 0 views

  • ZDNet Australia on Tuesday released updated browser speeds, as measured by the industry-standard SunSpider JavaScript test, and the results should give pause to proprietary-browser makers Microsoft and Opera Software: Every open-source browser completely obliterated the proprietary browsers in terms of performance, and by a huge margin. The test compared Microsoft's Internet Explorer 8 Release Candidate 1, Opera 10.00 Alpha, Firefox 3.1b1, Chrome 2.0.158.0, and the WebKit r40220 developer project included in Chrome and Apple's Safari. Google Chrome and Mozilla Firefox (along with WebKit) left the proprietary competition in the dust:
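  SunSpider itself is a fixed suite of small JavaScript workloads timed in the browser, and results like those above are averages of repeated runs. Below is a rough, hedged sketch of that measurement pattern; the workload function is invented for illustration and is not part of the real suite.

    // Hedged sketch of a SunSpider-style measurement: time a small JavaScript
    // workload over several runs and report the average in milliseconds.
    function workload() {
      var s = "";
      for (var i = 0; i < 5000; i++) {
        s += i.toString(36);            // string building, a common benchmark pattern
      }
      return s.length;
    }

    function averageMilliseconds(fn, runs) {
      var total = 0;
      for (var r = 0; r < runs; r++) {
        var start = new Date().getTime();
        fn();
        total += new Date().getTime() - start;
      }
      return total / runs;
    }

    var avg = averageMilliseconds(workload, 10);   // lower is faster, as in the article's comparison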
Paul Merrell

Fight over 'forms' clouds future of Net applications | Pagalz.com - Blog - 0 views

  • As Net heavyweights vie to define the next generation of Web applications, the Web’s main standards body is facing a revolt within its own ranks over electronic forms, a cornerstone of interactive documents.
  • “The W3C is saying the answer is XForms. Microsoft is saying it’s XAML. Macromedia is saying it’s Flash MX. And Mozilla is saying it’s XUL.”
  • Though the success of one method or another might not seem to make much difference to the person filling out an order form, the fate of open standards in the process could determine whether that form can relay the data it collects to any standards-compliant database or banking system, or whether it can only operate within certain proprietary systems. The fate of a standard could also determine whether the order form could be accessed in any standards-compliant Web browser, or if it would be available only to users of a particular operating system–an outcome that has browser makers and others worried about the role of Microsoft.
  • browser makers still want a standards-based forms technology to help the Web steer clear of proprietary application platforms. They’re particularly concerned about Microsoft’s sprawling vision for Windows “Longhorn” applications built in the XML-based XAML markup language using Longhorn’s Avalon graphics system. Browsers like Mozilla Firefox, Opera and Apple’s Safari will be useless to access these Internet-based Windows applications.
  • “The WHAT approach works OK for small examples,” Pemberton said. “But actors like the Department of Defense say ‘no scripting.’”
  • The evolution versus revolution debate over forms centers on the use of scripting–specifically JavaScript–to perform important tasks in forms-based applications.
  • “I understand where WHAT is coming from, but they are browser makers, not forms experts,” Pemberton said. “It is important to build something that is future-proof and not a Band-Aid solution. Forms (technology) is the basis of the e-commerce revolution and so it is important to do it right.”
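  The scripting-versus-declarative split that Pemberton describes can be made concrete: in the script-driven approach, form behavior such as validation lives in imperative JavaScript attached to the page, whereas XForms expresses the same constraints declaratively in markup. A hedged sketch of the scripted side, with hypothetical element ids:

    // Hedged sketch of script-driven form handling (the "scripting" side of the debate).
    // The element ids "order-form" and "quantity" are hypothetical.
    var form = document.getElementById("order-form");
    form.onsubmit = function () {
      var qty = parseInt(document.getElementById("quantity").value, 10);
      if (isNaN(qty) || qty < 1) {
        alert("Quantity must be a positive whole number.");
        return false;               // block submission until the field is valid
      }
      return true;                  // otherwise let the browser submit normally
    };
    // An XForms model would instead declare the same constraint in markup
    // (a bind with a type and constraint), with no script required.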
Paul Merrell

Google Research Publication: BigTable - 0 views

  • Abstract: Bigtable is a distributed storage system for managing structured data that is designed to scale to a very large size: petabytes of data across thousands of commodity servers. Many projects at Google store data in Bigtable, including web indexing, Google Earth, and Google Finance. These applications place very different demands on Bigtable, both in terms of data size (from URLs to web pages to satellite imagery) and latency requirements (from backend bulk processing to real-time data serving). Despite these varied demands, Bigtable has successfully provided a flexible, high-performance solution for all of these Google products. In this paper we describe the simple data model provided by Bigtable, which gives clients dynamic control over data layout and format, and we describe the design and implementation of Bigtable.
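  The data model behind that abstract is, in the paper's own terms, a sparse, distributed, persistent, multidimensional sorted map: a value is addressed by a row key, a column name, and a timestamp. Below is a hedged, in-memory JavaScript sketch of that addressing scheme; the web-indexing row and the "anchor:" column family follow the paper's running example, but the code itself is illustrative and is not Bigtable's API.

    // Illustrative in-memory model only: row key -> column key -> cells, where each
    // cell is a { timestamp, value } pair and reads return the newest cell.
    var table = {};

    function put(row, column, timestamp, value) {
      table[row] = table[row] || {};
      table[row][column] = table[row][column] || [];
      table[row][column].push({ timestamp: timestamp, value: value });
      table[row][column].sort(function (a, b) { return b.timestamp - a.timestamp; });
    }

    function getLatest(row, column) {
      var cells = (table[row] || {})[column] || [];
      return cells.length > 0 ? cells[0].value : null;
    }

    // The paper's web-indexing example: one row per reversed URL, with page contents
    // and inbound anchor text stored under different column families.
    put("com.cnn.www", "contents:", 2, "<html>...</html>");
    put("com.cnn.www", "anchor:cnnsi.com", 1, "CNN");
    var latest = getLatest("com.cnn.www", "anchor:cnnsi.com");   // "CNN"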
Gary Edwards

Sun Labs Lively Kernel - 0 views

  • Main features. The main features of the Lively Kernel include: a small web programming environment and computing kernel, written entirely in JavaScript (in addition to its application execution capabilities, the platform can also function as an integrated development environment, or IDE, making the whole system self-contained and able to improve and extend itself on the fly); programmatic access to the user interface, provided from JavaScript via the Morphic user interface framework and built around an event-based programming model familiar to most web developers; and asynchronous networking, in which, as in Ajax, asynchronous HTTP performs all network operations without blocking the user interface.
  • "The Sun Labs Lively Kernel is a new web programming environment developed at Sun Microsystems Laboratories. The Lively Kernel supports desktop-style applications with rich graphics and direct manipulation capabilities, but without the installation or upgrade hassles that conventional desktop applications have. The system is written entirely in the JavaScript programming language, a language supported by all the web browsers, with the intent that the system can run in commercial web browsers without installation or any plug-in components. The system leverages the dynamic characteristics of the JavaScript language to make it possible to create, modify and deploy applications on the fly, using tools built into the system itself. In addition to its application execution capabilities, the Lively Kernel can also function as an integrated development environment (IDE), making the whole system self-sufficient and able to improve and extend itself dynamically....." Too little too late? Interestingly, Lively Kernel is 100% JavaScript. Check out this "motivation" rationale: "...The main goal of the Lively Kernel is to bring the same kind of simplicity, generality and flexibility to web programming that we have known in desktop programming for thirty years, but without the installation and upgrade hassles that conventional desktop applications have. The Lively Kernel places a special emphasis on treating web applications as real applications, as opposed to the document-oriented nature of most web applications today. In general, we want to put programming into web development, as opposed to the current weaving of HTML, XML and CSS documents that is also sometimes referred to as programming. ...." I agree with the Web document <> Web Application statement. I think the shift, though, is one where the RIA frames web documents in a new environment, blending in massive amounts of data, streaming media and graphics. The WebKit document model was designed for this p