
Home/ Future of the Web/ Group items tagged useful arts


Gonzalo San Gil, PhD.

The death of patents and what comes after: Alicia Gibb at TEDxStockholm - YouTube - 0 views

  •  
    "Published on Dec 18, 2012 Alicia Gibb got her start as a technologist from her combination in backgrounds including: informatics and library science, a belief system of freedom of information, inspiration from art and design, and a passion for hardware hacking. Alicia has worked between the crossroads of art and electronics for the past nine years, and has worked for the open source hardware community for the past three. She currently founded and is running the Open Source Hardware Association, an organization to educate and promote building and using open source hardware of all types. In her spare time, Alicia is starting an open source hardware company specific to education. Previous to becoming an advocate and an entrepreneur, Alicia was a researcher and prototyper at Bug Labs where she ran the academic research program and the Test Kitchen, an open R&D Lab. Her projects centered around developing lightweight additions to the BUG platform, as well as a sensor-based data collection modules. She is a member of NYCResistor, co-chair of the Open Hardware Summit, and a member of the advisory board for Linux Journal. She holds a degree in art education, a M.S. in Art History and a M.L.I.S. in Information Science from Pratt Institute. She is self-taught in electronics. Her electronics work has appeared in Wired magazine, IEEE Spectrum, Hackaday and the New York Times. When Alicia is not researching at the crossroads of open technology and innovation she is prototyping artwork that twitches, blinks, and might even be tasty to eat. In the spirit of ideas worth spreading, TEDx is a program of local, self-organized events that bring people together to share a TED-like experience. At a TEDx event, TEDTalks video and live speakers combine to spark deep discussion and connection in a small group. These local, self-organized events are branded TEDx, where x = independently organized TED event. 
The TED Conference provides general guidance for the TEDx program, but individual
David Corking

UK National Portrait Gallery threatens Wikipedia over scans of its public domain art - ... - 0 views

  • If you take public money to buy art, you should make that art available to the public using the best, most efficient means possible. If you believe the public wants to subsidize the creation of commercial art-books, then get out of the art-gallery business, start a publisher and hit the government up for some free tax-money.
    • David Corking
       
      Hear, hear.
  •  
    This is how I would like my taxes used.
  •  
    Analysis from the "open source" novelist
Gonzalo San Gil, PhD.

To Promote the Progress of Science and Useful Arts - 0 views

  •  
    "The Congress shall have power . . . . To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries." (U.S. Constitution, 1787)"
Paul Merrell

German Parliament Says No More Software Patents | Electronic Frontier Foundation - 0 views

  • The German Parliament recently took a huge step that would eliminate software patents (PDF) when it issued a joint motion requiring the German government to ensure that computer programs are only covered by copyright. Put differently, in Germany, software cannot be patented. The Parliament's motion follows a similar announcement made by New Zealand's government last month (PDF), in which it determined that computer programs were not inventions or a manner of manufacture and, thus, cannot be patented.
  • The crux of the German Parliament's motion rests on the fact that software is already protected by copyright, and developers are afforded "exploitation rights." These rights, however, become confused when broad, abstract patents also cover general aspects of computer programs. These two intellectual property systems are at odds. The clearest example of this clash is with free software. The motion recognizes this issue and therefore calls upon the government "to preserve the precedence of copyright law so that software developers can also publish their work under open source license terms and conditions with legal security." The free software movement relies upon the fact that software can be released under a copyright license that allows users to share it and build upon others' works. Patents, as Parliament finds, inhibit this fundamental spread.
  • Just like in the New Zealand order, the German Parliament carved out one type of software that could be patented: when "the computer program serves merely as a replaceable equivalent for a mechanical or electro-mechanical component, as is the case, for instance, when software-based washing machine controls can replace an electromechanical program control unit consisting of revolving cylinders which activate the control circuits for the specific steps of the wash cycle." This allows for software that is tied to (and controls part of) another invention to be patented. In other words, if a claimed process is purely a computer program, then it is not patentable. (New Zealand's order uses a similar washing machine example.) The motion ends by calling upon the German government to push for this approach to be standard across all of Europe. We hope policymakers in the United States will also consider fundamental reform that deals with the problems caused by low-quality software patents. Ultimately, any real reform must address this issue.
  •  
    Note that an unofficial translation of the parliamentary motion is linked from the article. This adds substantially to the pressure internationally to end software patents because Germany has been the strongest defender of software patents in Europe. The same legal grounds would not apply in the U.S. The strongest argument for non-patentability in the U.S., in my opinion, is that software patents embody both prior art and obviousness. A general purpose computer can accomplish nothing unforeseen by the prior art of the computing device. And it is impossible for software to do more than cause different sequences of bit register states to be executed. This is the province of "skilled artisans" using known methods to produce predictable results. There is a long line of Supreme Court decisions holding that an "invention" with such traits is non-patentable. I have summarized that argument with citations at .
Gonzalo San Gil, PhD.

Here's why patents are innovation's worst enemy | Vivek Wadhwa | LinkedIn - 1 views

  •  
    "The Founding Fathers of the United States considered intellectual property so important that they gave it a special place in the Constitution: "To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.""
  •  
    The quote is somewhat misleading because it is out of context. The section is preceded by: "The Congress shall have Power ..." Those are words of discretion, not commandment. Nothing in the Constitution *requires* that patent and copyright systems be established. "Stable ownership is the gift of social law, and is given late in the progress of society. It would be curious then, if an idea, the fugitive fermentation of an individual brain, could, of natural right, be claimed in exclusive and stable property. If nature has made any one thing less susceptible than all others of exclusive property, it is the action of the thinking power called an idea, which an individual may exclusively possess as long as he keeps it to himself; but the moment it is divulged, it forces itself into the possession of every one, and the receiver cannot dispossess himself of it. Its peculiar character, too, is that no one possesses the less, because every other possesses the whole of it. He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me. That ideas should freely spread from one to another over the globe, for the moral and mutual instruction of man, and improvement of his condition, seems to have been peculiarly and benevolently designed by nature, when she made them, like fire, expansible over all space, without lessening their density in any point, and like the air in which we breathe, move, and have our physical being, incapable of confinement or exclusive appropriation. Inventions then cannot, in nature, be a subject of property. Society may give an exclusive right to the profits arising from them, as an encouragement to men to pursue ideas which may produce utility, *but this may or may not be done, according to the will and convenience of the society, without claim or complaint from any body."* VI Writings of Thomas Jefferson, at 180-181 (Washington ed.).
Paul Merrell

Media Queries - 0 views

  •  
    While the candidate Media Queries specification is interesting and a step in the right direction, W3C continues to butcher the meaning of "interoperability." In this latest sleight of hand, we now have "interoperable" *user agents*, a term of art used by W3C for implementations that only receive and cannot return data, e.g., web browsers.  But under competition law, "interoperability" requires implementations that can exchange data and *mutually* use data that has been exchanged. See e.g., European Commission v. Microsoft, European Community Court of First Instance (Grand Chamber Judgment of 17 September, 2007), para. 230, 374, 421, http://tinyurl.com/23md42c (rejecting Microsoft's argument that "interoperability" has a 1-way rather than 2-way meaning; "Directive 91/250 defines interoperability as 'the ability to exchange information and *mutually* to use the information which has been exchanged'") (emphasis added). W3C, the World Wide Web Conspiracy, continues down its rut of broadcasting information whilst denying the world the power to round-trip the data received. 
Gary Edwards

Should you buy enterprise applications from a startup? - 0 views

  • The biggest advantage of startups, in Mueller's opinion? "They have no technical historical burden, and they don't care about many technical dependencies. They deliver easy-to-use technology with relatively simple but powerful integration options."
  • "The model we've used to buy on-premises software for 20-plus years is shifting," insists Laping. "There are new ways of selecting and vetting partners."
  • Part of that shift is simple: The business side sees what technology can do, and it's banging on IT's door, demanding ... what? Not new drop-down menus in the same-old ERP application, but rather state-of-the-art, cutting-edge, ain't-that-cool innovation. The landscape is wide open: Innovation can come in the form of new technologies, such as the Internet of Things, or from mobility, the cloud, virtualization -- in fact, from anywhere an enterprise vendor isn't filling a need. The easiest place to find that? Startups.
  • ...5 more annotations...
  • "The number one reason to consider a startup is that the current landscape of Magic Quadrant vendors is not serving a critical need. That's a problem."
  • Ravi Belani is managing partner at Alchemist Accelerator, a Palo Alto, Calif.-based venture-backed initiative focused on accelerating startups whose revenue comes from enterprises rather than consumers. He says, "The innovation that used to come out of big software houses isn't there anymore, while the pace of innovation in technology is accelerating."
  • He acknowledges that there has been a longtime concern with startups about the ability of their applications to scale, but given startups' ability to build their software on robust infrastructure platforms using IaaS or PaaS, and then deploy them via SaaS, "scalability isn't as big a deal as it used to be. It costs $50,000 today to do what you needed $50 million to do ten years ago. That means it takes less capital today to create the same innovation. Ten years ago, that was a moat, a barrier to entry, but software vendors don't own that moat anymore."
  • The confluence of offshore programming, open source technologies and cloud-based infrastructures has significantly lowered the barriers to entry of launching a new venture -- not to mention all those newly minted tech millionaires willing to be angel investors.
  • "In the new paradigm, [most software] implementations are so much shorter, you don't have to think about that risk. You're not talking about three years and $20 million. You're talking about 75 days and $50,000. You implement little modules and get big wins along the way."
  •  
    "The idea of buying an enterprise application from a startup company might sound like anathema to a CIO. But Chris Laping, CIO of restaurant chain Red Robin, based in Greenwood Village, Colo., disagrees. He believes we're in the middle of a significant shift that favors startups -- moving from huge applications with extensive features to task-based activities, inspired by the apps running on mobile devices. Featured Resource Presented by Scribe Software 10 Best Practices for Integrating Data Data integration is often underestimated and poorly implemented, taking time and resources. Yet it Learn More Mirco Mueller concurs. He is an IT architect for St. Gallen, Switzerland-based Helvetia Swiss Life Insurance Co., which -- having been founded in 1858 -- is about as far from a startup as possible. He recently chose a SaaS tool from an unnamed startup over what he calls "a much more powerful but much more complex alternative. Its list of features is shorter than the feature list of the big companies, but in terms of agility, flexibility, ease of use and adjustable business model, it beat" all of its competitors. The biggest advantage of startups, in Mueller's opinion? "They have no technical historical burden, and they don't care about many technical dependencies. They deliver easy-to-use technology with relatively simple but powerful integration options." There's certainly no lack of applications available from new players. At a recent conference focusing on innovation, Microsoft Ventures principal Daniel Sumner noted that every month for the last 88 months, there's been a $1 billion valuation for one startup or another. That's seven years and counting. But as Silicon Valley skeptics like to point out, those are the ones you hear about. For every successful startup, there are at least three that fail, according to 2012 research by Harvard Business School professor Shikhar Ghosh. So why, then, would CIOs in their right mind take the risk of buying enterprise applic
Paul Merrell

Prepare to Hang Up the Phone, Forever - WSJ.com - 0 views

  • At decade's end, the trusty landline telephone could be nothing more than a memory, if telecom giants AT&T Inc. and Verizon Communications Inc. get their way.
  • The two providers want to lay the crumbling POTS to rest and replace it with Internet Protocol-based systems that use the same wired and wireless broadband networks that bring Web access, cable programming and, yes, even your telephone service, into your homes. You may think you have a traditional landline because your home phone plugs into a jack, but if you have bundled your phone with Internet and cable services, you're making calls over an IP network, not twisted copper wires. California, Florida, Texas, Georgia, North Carolina, Wisconsin and Ohio are among states that agree telecom resources would be better redirected into modern telephone technologies and innovations, and will kill copper-based technologies in the next three years or so. Kentucky and Colorado are weighing similar laws, which force people to go wireless whether they want to or not. In Mantoloking, N.J., Verizon wants to replace the landline system, which Hurricane Sandy wiped out, with its wireless Voice Link. That would make it the first entire town to go landline-less, a move that isn't sitting well with all residents.
  • New Jersey's legislature, worried about losing data applications such as credit-card processing and alarm systems that wireless systems can't handle, wants a one-year moratorium to block that switch. It will vote on the measure this month. (Verizon tried a similar change in Fire Island, N.Y., when its copper lines were destroyed, but public opposition persuaded Verizon to install fiber-optic cable.) It's no surprise that landlines are unfashionable, considering many of us already have or are preparing to ditch them. More than 38% of adults and 45.5% of children live in households without a landline telephone, says the Centers for Disease Control and Prevention. That means two in every five U.S. homes, or 39%, are wireless, up from 26.6% three years ago. Moreover, a scant 8.5% of households relied only on a landline, while 2% were phoneless in 2013. Metropolitan residents have few worries about the end of landlines. High-speed wire and wireless services are abundant and work well, despite occasional dropped calls. Those living in rural areas, where cell towers are few and 4G capability limited, face different issues.
  • ...2 more annotations...
  • Safety is one of them. Call 911 from a landline and the emergency operator pinpoints your exact address, down to the apartment number. Wireless phones lack those specifics, and even with GPS navigation aren't as precise. Matters are worse in rural and even suburban areas that signals don't reach, sometimes because they're blocked by buildings or the landscape. That's of concern to the Federal Communications Commission, which oversees all forms of U.S. communications services. Universal access is a tenet of its mission, and, despite the state-by-state degradation of the mandate, it's unwilling to let telecom companies simply drop geographically undesirable customers. Telecom firms need FCC approval to ax services completely, and can't do so unless there is a viable competitor to pick up the slack. Last year AT&T asked to turn off its legacy network, which could create gaps in universal coverage and will force people off the grid to get a wireless provider.
  • AT&T and the FCC will soon begin trials to explore life without copper-wired landlines. Consumers will voluntarily test IP-connected networks and their impact on towns like Carbon Hills, Ala., population 2,071. They want to know how households will reach 911, how small businesses will connect to customers, how people with medical-monitoring devices or home alarms know they will always be connected to a reliable network, and what the costs are. "We cannot be a nation of opportunity without networks of opportunity," said FCC Chairman Tom Wheeler in unveiling the plan. "This pilot program will help us learn how fiber might be deployed where it is not now deployed…and how new forms of wireless can reach deep into the interior of rural America."
Paul Merrell

Safe Plurality: Can it be done using OOXML's Markup Compatibility and Extensions mechan... - 0 views

  • During the OOXML standardization proceedings, the ISO participants felt that there was one particular sub-technology, Markup Compatibility and Extensibility (MCE), that was of such potential usefulness to other standards that it was brought out into its own part. It is now IS29500:2009 Part 3: you can download it in its ECMA form here; it only has about 15 pages of substantive text. The particular issue that MCE addresses is this: what is an application supposed to do when it finds some markup it wasn't programmed to accept? This could be extension elements in some foreign namespace, but it could also be some elements from a known namespace: the case when a document was made against a newer version of the standard than the application.
  •  
    Rick Jelliffe posts a frank view of the OOXML compatibility framework, a document I've studied myself in the past. There is much that is laudable about the framework, but there are also aspects that are troublesome. Jelliffe identifies one red flag item, the freedom for a vendor to "proprietize" OOXML using the MustUnderstand attribute, and offers some suggestions for lessening that danger through redrafting of the spec. One issue he does not touch, however, is the Microsoft Open Specification Promise covenant not to sue, a deeply flawed document in terms of anyone implementing OOXML other than Microsoft. Still, there is so much prior art for the OOXML compatibility framework that I doubt any patent reading on it would survive judicial review. E.g., a highly similar framework has been implemented in WordPerfect since version 6.0, and the OOXML framework is remarkably similar to the compatibility framework specified by OASIS OpenDocument 1.0 but subsequently gutted at ISO. The Jelliffe article offers a good overview of factors that must be considered in designing a standard's compatibility framework. For those who go on to read the compatibility framework's specification, keep in mind that in several places the document falsely claims that it is an interoperability framework. It is not. It is a framework designed for one-way transfer of data, not interoperability, which involves round-trip, 2-way exchange of data without data loss.
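The core MCE question in the item above -- what a consumer should do with markup it was not programmed to accept -- can be pictured with a toy sketch. This is only an illustration, not the IS29500 Part 3 mechanism itself: the real spec uses attributes such as mce:Ignorable and mce:MustUnderstand with richer semantics, and the namespaces, the `ignorable` attribute, and the `consume` function here are all invented for the example.

```python
import xml.etree.ElementTree as ET

# Hypothetical document mixing a "known" namespace with a newer
# extension namespace the consumer was never programmed to accept.
# The root's "ignorable" attribute lists namespaces that older
# consumers may safely skip (the MCE idea, greatly simplified).
DOC = """\
<doc xmlns="urn:example:core"
     xmlns:v2="urn:example:v2"
     ignorable="urn:example:v2">
  <title>Report</title>
  <v2:sparkline data="1,2,3"/>
  <body>Hello</body>
</doc>
"""

def consume(xml_text):
    """Process only elements we understand, silently dropping
    anything the producer marked as ignorable."""
    root = ET.fromstring(xml_text)
    ignorable = set((root.get("ignorable") or "").split())
    kept = []
    for el in root:
        # ElementTree spells namespaced tags as "{namespace}local".
        ns = el.tag[1:].split("}")[0] if el.tag.startswith("{") else ""
        if ns in ignorable:
            continue  # unknown-but-ignorable markup: degrade gracefully
        kept.append((el.tag, el.text))
    return kept

print(consume(DOC))
# → [('{urn:example:core}title', 'Report'), ('{urn:example:core}body', 'Hello')]
```

The design goal is graceful degradation: an older consumer keeps working with the parts it knows while dropping (or, under the real spec, in some cases preserving) extension markup it cannot interpret; a MustUnderstand-style flag inverts that default and forces a hard failure instead.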
Gary Edwards

Zoho Blogs » Firefox 3.1 & Google Chrome: Javascript Wins, Flash/Silverlight ... - 0 views

  •  
    ZOHO Speaks about Chrome: "The biggest losers in Google's announcement are not really competing browsers, but competing rich client engines like Flash and Silverlight. As Javascript advances rapidly, it inevitably encroaches on the territory currently held by Flash. Native browser video is likely the last nail in the coffin - and Google needs native browser based video for its own YouTube, so we can be confident Google Chrome and Firefox will both have native video support, with Javascript-accessible VOM (video object model) APIs for web applications to manipulate video. As for Silverlight, let me just say that if Silverlight is the future of web computing, companies like us might as well find another line of work - and I suspect Google and Yahoo probably see it the same way too. More speculatively, I believe we will witness the emergence of Javascript as the dominant language of computing, as it sweeps the client side and starts encroaching on the server. The server landscape today is split between "enterprise" platforms like Java and .NET on the one side (we ourselves are in the Java camp on the server side), and "scripting" languages like PHP, Python, Ruby on the other, with Javascript firmly entrenched on the client. Languages like Ruby promise tremendous dynamism and flexibility to the developer, but their relatively weak execution environments have held them back. It is telling that both Java and .NET come with state of the art just-in-time compilers, while none of the major scripting languages do......" Interestingly, ZOHO already has a prototype running on Chrome! Solves tons of performance problems for them, as well as giving them an on-line / off-line story (Gears). The success of Chrome depends on Chrome "killer apps", not browser surfing features! And we already have a number of killer apps that will immediately take advantage of Chrome: gMail, gReader, gMaps and Google Docs! ZOHO will no doubt use Chrome to put themselves squarely i
Paul Merrell

Ohio's attorney general wants Google to be declared a public utility. - The New York Times - 2 views

  • Ohio’s attorney general, Dave Yost, filed a lawsuit on Tuesday in pursuit of a novel effort to have Google declared a public utility and subject to government regulation. The lawsuit, which was filed in a Delaware County, Ohio court, seeks to use a law that’s over a century old to regulate Google by applying a legal designation historically used for railroads, electricity and the telephone to the search engine. “When you own the railroad or the electric company or the cellphone tower, you have to treat everyone the same and give everybody access,” Mr. Yost, a Republican, said in a statement. He added that Ohio was the first state to bring such a lawsuit against Google. If Google were declared a so-called common carrier like a utility company, it would prevent the company from prioritizing its own products, services and websites in search results. Google said it had none of the attributes of a common carrier, which usually provide a standardized service for a fee using public assets, such as rights of way. The “lawsuit would make Google Search results worse and make it harder for small businesses to connect directly with customers,” José Castañeda, a Google spokesman, said in a statement. “Ohioans simply don’t want the government to run Google like a gas or electric company. This lawsuit has no basis in fact or law and we’ll defend ourselves against it in court.” Though the Ohio lawsuit is a stretch, there is a long history of government control of certain kinds of companies, said Andrew Schwartzman, a senior fellow at the nonprofit Benton Institute for Broadband & Society. “Think of ‘The Canterbury Tales.’ Travelers needed a place to stay and eat on long road treks, and innkeepers were not allowed to deny them accommodations or rip them off,” he said.
  • After a series of federal lawsuits filed against Google last year, Ohio’s lawsuit is part of a next wave of state actions aimed at regulating and curtailing the power of Big Tech. Also on Tuesday, Colorado’s legislature passed a data privacy law that would allow consumers to opt out of data collection. On Monday, New York’s Senate passed antitrust legislation that would make it easier for plaintiffs to sue dominant platforms for abuse of power. After years of inaction in Congress with tech legislation, states are beginning to fill the regulatory vacuum. Ohio was also one of 38 states that filed an antitrust lawsuit in December accusing Google of being a monopoly and using its dominant position in internet search to squeeze out smaller rivals.
Paul Merrell

Most Agencies Falling Short on Mandate for Online Records - 1 views

  • Nearly 20 years after Congress passed the Electronic Freedom of Information Act Amendments (E-FOIA), only 40 percent of agencies have followed the law's instruction for systematic posting of records released through FOIA in their electronic reading rooms, according to a new FOIA Audit released today by the National Security Archive at www.nsarchive.org to mark Sunshine Week. The Archive team audited all federal agencies with Chief FOIA Officers as well as agency components that handle more than 500 FOIA requests a year — 165 federal offices in all — and found only 67 with online libraries populated with significant numbers of released FOIA documents and regularly updated.
  • Congress called on agencies to embrace disclosure and the digital era nearly two decades ago, with the passage of the 1996 "E-FOIA" amendments. The law mandated that agencies post key sets of records online, provide citizens with detailed guidance on making FOIA requests, and use new information technology to post online proactively records of significant public interest, including those already processed in response to FOIA requests and "likely to become the subject of subsequent requests." Congress believed then, and openness advocates know now, that this kind of proactive disclosure, publishing online the results of FOIA requests as well as agency records that might be requested in the future, is the only tenable solution to FOIA backlogs and delays. Thus the National Security Archive chose to focus on the e-reading rooms of agencies in its latest audit. Even though the majority of federal agencies have not yet embraced proactive disclosure of their FOIA releases, the Archive E-FOIA Audit did find that some real "E-Stars" exist within the federal government, serving as examples to lagging agencies that technology can be harnessed to create state-of-the art FOIA platforms. Unfortunately, our audit also found "E-Delinquents" whose abysmal web performance recalls the teletype era.
  • E-Delinquents include the Office of Science and Technology Policy at the White House, which, despite being mandated to advise the President on technology policy, does not embrace 21st century practices by posting any frequently requested records online. Another E-Delinquent, the Drug Enforcement Administration, insults its website's viewers by claiming that it "does not maintain records appropriate for FOIA Library at this time."
  • ...9 more annotations...
  • "The presumption of openness requires the presumption of posting," said Archive director Tom Blanton. "For the new generation, if it's not online, it does not exist." The National Security Archive has conducted fourteen FOIA Audits since 2002. Modeled after the California Sunshine Survey and subsequent state "FOI Audits," the Archive's FOIA Audits use open-government laws to test whether or not agencies are obeying those same laws. Recommendations from previous Archive FOIA Audits have led directly to laws and executive orders which have: set explicit customer service guidelines, mandated FOIA backlog reduction, assigned individualized FOIA tracking numbers, forced agencies to report the average number of days needed to process requests, and revealed the (often embarrassing) ages of the oldest pending FOIA requests. The surveys include:
  • The federal government has made some progress moving into the digital era. The National Security Archive's last E-FOIA Audit in 2007, " File Not Found," reported that only one in five federal agencies had put online all of the specific requirements mentioned in the E-FOIA amendments, such as guidance on making requests, contact information, and processing regulations. The new E-FOIA Audit finds the number of agencies that have checked those boxes is now much higher — 100 out of 165 — though many (66 in 165) have posted just the bare minimum, especially when posting FOIA responses. An additional 33 agencies even now do not post these types of records at all, clearly thwarting the law's intent.
  • The FOIAonline Members (Department of Commerce, Environmental Protection Agency, Federal Labor Relations Authority, Merit Systems Protection Board, National Archives and Records Administration, Pension Benefit Guaranty Corporation, Department of the Navy, General Services Administration, Small Business Administration, U.S. Citizenship and Immigration Services, and Federal Communications Commission) won their "E-Star" by making past requests and releases searchable via FOIAonline. FOIAonline also allows users to submit their FOIA requests digitally.
  • THE E-DELINQUENTS: WORST OVERALL AGENCIES (in alphabetical order)
  • Key Findings
  • Excuses Agencies Give for Poor E-Performance
  • Justice Department guidance undermines the statute. Currently, the FOIA stipulates that documents "likely to become the subject of subsequent requests" must be posted by agencies somewhere in their electronic reading rooms. The Department of Justice's Office of Information Policy defines these records as "frequently requested records… or those which have been released three or more times to FOIA requesters." Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice. The National Security Archive believes the addition of this "three or more times" language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse ("not requested three times yet!") not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post "all records, regardless of form or format that have been released in response to a FOIA request."
  • Disabilities Compliance. Despite the E-FOIA Act, many government agencies do not embrace the idea of posting their FOIA responses online. The most common reason agencies give is that it is difficult to post documents in a format that is "508 compliant," a reference to Section 508 of the Rehabilitation Act, whose 1998 amendments require federal agencies "to make their electronic and information technology (EIT) accessible to people with disabilities." E-Star agencies, however, have proven that 508 compliance is no barrier when an agency has the will to post. All documents posted on FOIAonline are 508 compliant, as are the documents posted by the Department of Defense and the Department of State. In fact, every document created electronically by the US government after 1998 should already be 508 compliant. Even old paper records scanned for FOIA processing can be made 508 compliant with just a few clicks in Adobe Acrobat, according to a Department of Homeland Security guide (essentially OCRing the text and recording where non-textual fields appear). And even if agencies insist it is too difficult to OCR older documents scanned from paper, they cannot use that excuse for born-digital records.
  • Privacy. Another commonly articulated concern about posting FOIA releases online is that doing so could inadvertently disclose private information from "first person" FOIA requests. This is a valid concern, and this subset of FOIA requests should not be posted online. (The Justice Department identified "first party" requester rights in 1989. Essentially, agencies cannot use the b(6) privacy exemption to redact information if a person requests it for him or herself. An example of a "first person" FOIA would be a person's request for his own immigration file.)
  • Cost and Waste of Resources. There is also a belief that there is little public interest in the majority of FOIA requests processed, and hence that it is a waste of resources to post them. This thinking runs counter to the governing principle of the Freedom of Information Act: government information belongs to US citizens, not US agencies. Just as the reason a person requests information is immaterial to how the agency processes the request, the "interest factor" of a document should be immaterial to whether an agency posts it online. Some think that posting FOIA releases online is not cost effective. In fact, the opposite is true. It is not cost effective to spend tens (or hundreds) of person-hours searching for, reviewing, and redacting documents only to mail them to a single requester, who may slip them into a desk drawer and forget about them. That is a waste of resources. Released documents should be posted online for any interested party to use. This will only become easier as FOIA processing systems evolve to automatically post the documents they track. The State Department earned its "E-Star" status demonstrating this very principle: it spent no new funds and hired no contractors to build its Electronic Reading Room, instead building a self-sustaining platform that will save the agency time and money going forward.
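To make the record-keeping burden behind the "three or more times" threshold concrete, here is a toy sketch contrasting the Justice Department's reading with the Archive's proposed post-everything rule. Record IDs are invented and no agency system works exactly this way; this is purely illustrative.

```python
# Toy sketch of the DOJ "released three or more times" posting threshold
# versus the Archive's proposed post-everything rule. Record IDs are
# invented; no agency system works exactly this way.
from collections import Counter

release_counts = Counter()  # per-record tally agencies would have to maintain

def should_post_doj(record_id):
    """DOJ guidance: post only once a record has been released 3+ times."""
    release_counts[record_id] += 1
    return release_counts[record_id] >= 3

def should_post_archive(record_id):
    """Archive proposal: post every record released under FOIA."""
    return True

assert should_post_doj("memo-42") is False   # 1st release: not posted
assert should_post_doj("memo-42") is False   # 2nd release: still not posted
assert should_post_doj("memo-42") is True    # 3rd release: finally posted
assert should_post_archive("memo-42") is True
```

Under the DOJ reading, every FOIA office must keep per-record release counters like this just to know when posting becomes mandatory; under the proposed "post everything released" language, the counter disappears entirely.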
Gonzalo San Gil, PhD.

State of VoIP in Linux - Datamation - 0 views

  •  
    "Like most people, I find myself using the same VoIP options everyone else is using. Thankfully, these days there are far more options available than what we might think. Today, I'll look at these options and also explore up-and-coming alternatives as well."
Gary Edwards

Design for Developers: Interactivity, animations, and AJAX - 0 views

  •  
    Awesome commentary in the must-read category. JC nails it, starting with "layout"! ....... "We were both part of the same team and he was creating some UI elements that I was to wire up. As I sat there (in awe) watching him work I realized that much of his considerable skill was rooted in fundamentals not unlike the art of programming. Of course, there are design skills that are intuitive that can't be "learned." But, that can also be said of the logical clarity found in a really elegant data model or a brilliant inheritance tree. I am certainly no designer, but I have observed the more creative among us for several years and have gained some insight into their world. In this article I'll share some basic principles that can help raise your design acumen and improve the experience of your users......" Layout: "I'd like to attack my goal of imparting design wisdom by breaking the topic into four buckets. The first is layout.
Paul Merrell

Google, Facebook made secret deal to divvy up market, Texas alleges - POLITICO - 1 views

  • Google and Facebook, the No. 1 and No. 2 players in online advertising, made a secret illegal pact in 2018 to divide up the market for ads on websites and apps, according to an antitrust suit filed Wednesday against the search giant. The suit — filed by Texas and eight other states — alleges that the companies colluded to fix prices and divvy up the market for mobile advertising between them.
  • The allegation that Google teamed up with Facebook to suppress competition mirrors a major claim in a separate antitrust suit the Justice Department filed against the company in October: that Google teamed up with Apple to help ensure the continued dominance of its search engine. Such allegations provide some of the strongest ammunition yet to advocates who argue that the U.S. major tech companies have gotten too big and are using their power — sometimes in conjunction with each other — to control markets. Many of the details about the Google-Facebook agreement, including its specific language, are redacted from the complaint. But the states say it “fixes prices and allocates markets between Google and Facebook as competing bidders in the auctions for publishers’ web display and in-app advertising inventory.”
  • The complaint alleges that the agreement was prompted by Facebook’s move in 2017 to use “header bidding” — a technology popular with website publishers that helped them increase the money they made from advertising. While Facebook sells ads on its own platform, it also operates a network to let advertisers offer ads on third-party apps and mobile websites.
  • Google was concerned about the move to header bidding, the complaint alleges, because it posed an “existential threat” to its own advertising exchange and limited the ability of the search giant to use information from its ad-buying and selling tools to its advantage. Those tools let Google cherry-pick the highest-value advertising spots and ads, according to the complaint. Within months of Facebook’s announcement, Google approached it to open negotiations, the complaint alleged, and the two companies eventually cut a deal: Facebook would cut back on the use of header bidding and use Google’s ad server. In exchange, the complaint alleges that Google gave Facebook advantages in its auctions.
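The revenue logic behind header bidding, as the complaint describes publishers' motivation, can be illustrated with a hedged toy model (exchange names, bids, and floors below are all invented): a traditional "waterfall" offers the ad slot to exchanges one at a time in a fixed priority order, while header bidding solicits every exchange simultaneously and takes the best price.

```python
# Toy model (all names and numbers invented) of why header bidding raised
# publisher revenue relative to the traditional waterfall.

def waterfall(bids, priority_floors):
    """Offer the slot to exchanges in priority order; accept the first
    bid that clears that exchange's price floor."""
    for exchange, floor in priority_floors:
        if bids.get(exchange, 0.0) >= floor:
            return exchange, bids[exchange]
    return None, 0.0

def header_bidding(bids):
    """Solicit every exchange at once and take the highest bid."""
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

# Hypothetical CPM bids for one ad slot.
bids = {"ExchangeA": 2.10, "ExchangeB": 3.40, "ExchangeC": 2.80}
# Waterfall priority order with per-exchange price floors.
priority_floors = [("ExchangeA", 2.00), ("ExchangeB", 4.00), ("ExchangeC", 2.50)]

assert waterfall(bids, priority_floors) == ("ExchangeA", 2.10)  # first clearer wins
assert header_bidding(bids) == ("ExchangeB", 3.40)              # best price wins
```

In the waterfall, ExchangeB's higher bid never gets a look because ExchangeA clears its floor first; header bidding removes that ordering advantage, which is also why the complaint frames it as an "existential threat" to an incumbent exchange.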
Gonzalo San Gil, PhD.

Apple Patents Technology to Legalize P2P Sharing | TorrentFreak * - 1 views

  •  
    "This means that transferring files between devices is only possible if these support Apple's licensing scheme. That's actually a step backwards from the DRM-free music that's sold in most stores today." [* What might Apple's closed-source 'licensing scheme' hide?]
  •  
    A business-method software patent combining old elements that are all prior art, including DRM. Yech! "... a patent that makes it possible to license P2P sharing" really puts a spin on reality. If the methods were in the public domain, anyone could use them without a license. That's equivalent to saying "a government-granted monopoly with the power but no responsibility to collect money from anyone who wants to invade the monopoly's protected rights" and presenting that fact as some sort of tremendous philanthropic act by Apple. On software patent claims that are prior art and obvious, see my legal memo on that topic here. http://goo.gl/5X8Kg9
Paul Merrell

Public transit in Beverly Hills may soon be driverless, program unanimously approved - ... - 0 views

  • An uncontested vote by the Beverly Hills City Council could guarantee a chauffeur for all residents in the near future. However, instead of a driver, the newly adopted program foresees municipally-owned driverless cars ready to order via a smartphone app. Also known as autonomous vehicles, or AV, driverless cars would appear to be the next big thing not only for people, but local governments as well – if the Beverly Hills City Council can get its AV development program past a few more hurdles, that is. The technology itself has some challenges ahead as well.
  • In the meantime, the conceptual shuttle service, which was unanimously approved at an April 5 city council meeting, is being celebrated.
  • Naming Google and Tesla in its press release, Beverly Hills must first develop a partnership with a manufacturer that can build it a fleet of unmanned cars. There will also be a need to bring in policy experts. All of these outside parties will have a chance to explore the program’s potential together at an upcoming community event. The Wallis Annenberg Center for the Performing Arts will host a summit this fall that will include expert lectures, discussions, and test drives. Er, test rides. Already in the works for Beverly Hills is a fiber optics cable network that will, in addition to providing high-speed internet access to all residents and businesses, one day be an integral part of a public transit system that runs on its users’ spontaneous desires. Obviously, Beverly Hills has some money on hand for the project, and it is also an ideal testing space as the city takes up an area of less than six square miles. Another positive factor is the quality of the city’s roads, which exceeds that of most in the greater Los Angeles area, not to mention California and the whole United States. “It can’t find the lane markings!” Volvo’s North American CEO, Lex Kerssemakers, complained to Los Angeles Mayor Eric Garcetti last month, according to Reuters. “You need to paint the bloody roads here!” Whether lanes are marked or signs are clear has made a big difference in how successfully the new technology works. Unfortunately, the US Department of Transportation considers 65 percent of US roads to be in poor condition, so AV cars may not be in the works for many Americans living outside of Beverly Hills quite as soon.
Gary Edwards

Meet OX Text, a collaborative, non-destructive alternative to Google Docs - Tech News a... - 0 views

  • The German software-as-a-service firm Open-Xchange, which provides apps that telcos and other service providers can bundle with their connectivity or hosting products, is adding a cloud-based office productivity toolset called OX Documents to its OX App Suite lineup. Open-Xchange has around 70 million users through its contracts with roughly 80 providers such as 1&1 Internet and Strato. Its OX App Suite takes the form of a virtual desktop of sorts, that lets users centralize their email and file storage accounts and view all sorts of documents through a unified portal. However, as of an early April release it will also include OX Text, a non-destructive, collaborative document editor that rivals Google Docs, and that has an interesting heritage of its own.
  • The team that created the HTML5- and JavaScript-based OX Text includes some of the core developers behind OpenOffice, the free alternative to Microsoft Office that passed from Sun Microsystems to Oracle before morphing into LibreOffice. The German developers we’re talking about hived off the project before LibreOffice happened, and ended up getting hired by Open-Xchange. “To them it was a once in a lifetime event, because we allowed them to start from scratch,” Open-Xchange CEO Rafael Laguna told me. “We said we wanted a fresh office productivity suite that runs inside the browser. In terms of the architecture and principles for the product, we wanted to make it fully round-trip capable, meaning whatever file format we run into needs to be retained.”
  • This is an extremely handy formatting and version control feature. Changes made to a document in OX Text get pushed through to Open-Xchange’s backend, where a changelog is maintained. “Power” Word features such as Smart Art or Charts, which are not necessarily supported by other productivity suites, are replaced with placeholders during editing and are there, as before, when the edited document is eventually downloaded. As the OX Text blurb says, “OX Text never damages your valuable work even if it does not understand it”.
  • “[This avoids] the big disadvantage of anything other than Microsoft Office,” Laguna said. “If you use OpenOffice with a .docx file, the whole document is converted, creating artefacts, then you convert it back. That’s one of the major reasons not everyone is using OpenOffice, and the same is true for Google Apps.” OX Text will be available as an extension to OX App Suite, which also includes calendaring and other productivity tools. However, it will also come out as a standalone product under both commercial licenses – effectively support-based subscriptions for Open-Xchange’s service provider customers – and open-source licenses, namely the GNU General Public License 2 and Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License, which will allow free personal, non-commercial use. You can find a demo of App Suite, including the OX Text functionality, here.
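The round-trip principle Laguna describes (retain whatever file content the editor does not understand) can be sketched in a few lines. This is an illustrative toy, not OX Text's actual implementation; the element names and placeholder scheme are invented.

```python
# Illustrative toy (not OX Text's actual code): elements the editor does
# not understand are parked behind opaque placeholders during editing and
# restored verbatim on export, so round-tripping never destroys them.

SUPPORTED = {"p"}  # element types this toy editor can edit

def load(elements):
    """Swap unsupported elements for placeholders; remember the originals."""
    parked, editable = {}, []
    for i, (kind, payload) in enumerate(elements):
        if kind in SUPPORTED:
            editable.append((kind, payload))
        else:
            token = f"__KEEP_{i}__"
            parked[token] = (kind, payload)
            editable.append(("placeholder", token))
    return editable, parked

def export(editable, parked):
    """Reassemble the document, restoring parked elements untouched."""
    return [parked[p] if k == "placeholder" else (k, p) for k, p in editable]

doc = [("p", "Hello"), ("smartart", "<binary blob>"), ("p", "World")]
editable, parked = load(doc)
editable[0] = ("p", "Hello, edited")  # the user edits a paragraph
assert export(editable, parked) == [
    ("p", "Hello, edited"),
    ("smartart", "<binary blob>"),  # survives even though we cannot edit it
    ("p", "World"),
]
```

The design choice is the same one the OX Text blurb claims: edits flow through the parts the editor understands, while "power" features like SmartArt ride along untouched and reappear intact in the downloaded file.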
Gonzalo San Gil, PhD.

The state of open data and open hardware licensing | opensource.com - 0 views

  •  
    "Drafting and using open licenses for data and hardware presents both familiar old challenges (like license proliferation) and new challenges (like less developed legal frameworks and different production models)"
Paul Merrell

An Important Kindle request - 0 views

  • A Message from the Amazon Books Team Dear Readers, Just ahead of World War II, there was a radical invention that shook the foundations of book publishing. It was the paperback book. This was a time when movie tickets cost 10 or 20 cents, and books cost $2.50. The new paperback cost 25 cents — it was ten times cheaper. Readers loved the paperback and millions of copies were sold in just the first year. With it being so inexpensive and with so many more people able to afford to buy and read books, you would think the literary establishment of the day would have celebrated the invention of the paperback, yes? Nope. Instead, they dug in and circled the wagons. They believed low cost paperbacks would destroy literary culture and harm the industry (not to mention their own bank accounts). Many bookstores refused to stock them, and the early paperback publishers had to use unconventional methods of distribution — places like newsstands and drugstores. The famous author George Orwell came out publicly and said about the new paperback format, if "publishers had any sense, they would combine against them and suppress them." Yes, George Orwell was suggesting collusion. Well… history doesn't repeat itself, but it does rhyme.
  • Fast forward to today, and it's the e-book's turn to be opposed by the literary establishment. Amazon and Hachette — a big US publisher and part of a $10 billion media conglomerate — are in the middle of a business dispute about e-books. We want lower e-book prices. Hachette does not. Many e-books are being released at $14.99 and even $19.99. That is unjustifiably high for an e-book. With an e-book, there's no printing, no over-printing, no need to forecast, no returns, no lost sales due to out of stock, no warehousing costs, no transportation costs, and there is no secondary market — e-books cannot be resold as used books. E-books can and should be less expensive. Perhaps channeling Orwell's decades old suggestion, Hachette has already been caught illegally colluding with its competitors to raise e-book prices. So far those parties have paid $166 million in penalties and restitution. Colluding with its competitors to raise prices wasn't only illegal, it was also highly disrespectful to Hachette's readers. The fact is many established incumbents in the industry have taken the position that lower e-book prices will "devalue books" and hurt "Arts and Letters." They're wrong. Just as paperbacks did not destroy book culture despite being ten times cheaper, neither will e-books. On the contrary, paperbacks ended up rejuvenating the book industry and making it stronger. The same will happen with e-books.