Gary Edwards

With faster Chrome browser, Google offers an Android alternative - CNET - 0 views

  •  
    "On mobile devices, the Web hasn't lived up to its promise of a universal programming foundation. Google is trying to change that." Android hogged the spotlight at Google I/O, but performance improvements in Google's Chrome browser show that the company hasn't given up on trying to advance its other programming foundation -- the Web. The mobile version of Chrome has become much more responsive since 2013, said Paul Irish, a developer advocate on the Chrome team, speaking at the San Francisco conference. "We've improved the speed of animation by 75 percent and of scrolling 35 percent," Irish told developers Thursday. "We're committed to getting you 60 frames per second on the mobile Web." That performance is crucial for persuading people to use Web sites rather than native apps for things like posting on social networks, reading news, and playing games. It's also key to getting programmers to take the Web path when so many today focus on native apps written directly for Google's Android operating system and Apple's iOS competitor. The 60 frames-per-second rate refers to how fast the screen redraws when elements are in motion, either during games or when people are doing things like swiping among pages and dragging icons. The 60fps threshold is the minimum that game developers strive for, and to achieve it with no distracting stutters, a device must calculate how to update its entire screen every 16.7 milliseconds. Google, whose Android operating system initially lagged Apple's rival iOS significantly in this domain of responsiveness, has made great strides in improving its OS and its apps. But the mobile Web hasn't kept pace, and that means programmers have been more likely to aim for native apps rather than Web-based apps that can run on any device. ............................ Good review focused on the growing threat that native "platform-specific" apps are replacing Web apps as the developer's best choice. Florian thinks that native apps will win.
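The 60 fps / 16.7 ms figure cited above is simple frame-budget arithmetic; a minimal sketch (the function names are mine, not from the article):

```javascript
// Frame-budget arithmetic behind the "60 fps" target: at 60 frames per
// second the browser must produce each frame in 1000 ms / 60 ≈ 16.7 ms;
// any frame that takes longer causes a visible stutter ("jank").
function frameBudgetMs(targetFps) {
  return 1000 / targetFps;
}

// Hypothetical helper: given measured per-frame times, count how many
// frames missed the budget.
function droppedFrames(frameTimesMs, targetFps) {
  const budget = frameBudgetMs(targetFps);
  return frameTimesMs.filter((t) => t > budget).length;
}

console.log(frameBudgetMs(60).toFixed(1));            // ≈ 16.7
console.log(droppedFrames([12, 15, 30, 16, 40], 60)); // 2 frames missed budget
```

In a real page one would gather `frameTimesMs` from `requestAnimationFrame` timestamps; the sketch only shows the arithmetic.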
Gary Edwards

Two Microsofts: Mulling an alternate reality | ZDNet - 1 views

  • Judge Jackson had it right. And the Court of Appeals? Not so much
  • Judge Jackson is an American hero and news of his passing thumped me hard. His ruling against Microsoft and the subsequent overturn of that ruling resulted, IMHO, in two extraordinary directions that changed the world. Sure, the what-if game is interesting, but the reality itself is stunning enough. Of course, Judge Jackson sought to break the monopoly. The US Court of Appeals overturn resulted in the monopoly remaining intact, but the Internet remaining free and open. Judge Jackson's breakup plan had a good shot at achieving both a breakup of the monopoly and a free and open Internet. I admit though that at the time I did not favor the Judge's plan. And I actually did submit a proposal based on Microsoft having to both support the WiNE project and provide a complete port to WiNE to any software provider requesting a port. I wanted to break the monopolist's hold on the Windows Productivity Environment and the hundreds of millions of investment dollars and time that had been spent on application development forever trapped on that platform. For me, it was the productivity platform that had to be broken.
  • I assume the good Judge thought that separating the Windows OS from Microsoft Office / Applications would force the OS to open up the secret API's even as the OS continued to evolve. Maybe. But a full disclosure of the API's coupled with the community service "port to WiNE" requirement might have sped up the process. Incredibly, the "Undocumented Windows Secrets" industry continues to thrive, and the legendary Andrew Schulman's number is still at the top of Silicon Valley legal profession speed dials. http://goo.gl/0UGe8 Oh well. The Court of Appeals stopped the breakup, leaving the Windows Productivity Platform intact. Microsoft continues to own the "client" in "Client/Server" computing. Although Microsoft was temporarily stopped from leveraging their desktop monopoly to an iron-fisted control and dominance of the Internet, I think what we're watching today with the Cloud is Judge Jackson's worst nightmare. And mine too. A great transition is now underway, as businesses and enterprises begin the move from legacy client/server business systems and processes to a newly emerging Cloud Productivity Platform. In this great transition, Microsoft holds an inside straight. They have all the aces because they own the legacy desktop productivity platform, and can control the transition to the Cloud. No doubt this transition is going to happen. And it will severely disrupt and change Microsoft's profit formula. But if the Redmond reprobate can provide a "value added" transition of legacy business systems and processes, and direct these new systems to the Microsoft Cloud, the profits will be immense.
  • ...1 more annotation...
  • Judge Jackson sought to break the ability of Microsoft to "leverage" their existing monopoly into the Internet and his plan was overturned and replaced by one based on judicial oversight. Microsoft got a slap on the wrist from the Court of Appeals, but were wailed on with lawsuits from the hundreds of parties injured by their rampant criminality. Some put the price of that criminality as high as $14 Billion in settlements. Plus, the shareholders forced Chairman Bill to resign. At the end of the day though, Chairman Bill was right. Keeping the monopoly intact was worth whatever penalty Microsoft was forced to pay. He knew that even the judicial oversight would end one day. Which it did. And now his company is ready to go for it all by leveraging and controlling the great productivity transition. No business wants to be hostage to a cold-hearted monopolist. But there is a huge difference between a non-disruptive and cost effective, process-by-process value-added transition to a Cloud Productivity Platform, and, the very disruptive and costly "rip-out-and-replace" transition offered by Google, ZOHO, Box, SalesForce and other Cloud Productivity contenders. Microsoft, and only Microsoft, can offer the value-added transition path. If they get the Cloud even halfway right, they will own business productivity far into the future. Rest in Peace Judge Jackson. Your efforts were heroic and will be remembered as such. ~ge~
  •  
    Comments on the latest SVN article mulling the effects of Judge Thomas Penfield Jackson's antitrust ruling and proposed break up of Microsoft. comment: "Chinese Wall" Ummm, there was a Chinese Wall between the Microsoft OS and the MS Applications layer. At least that's what Chairman Bill promised developers at a 1990 OS/2-Windows Conference I attended. It was a developers luncheon, hosted by Microsoft, with Chairman Bill speaking to about 40 developers with applications designed to run on the then soon-to-be-released Windows 3.0. In his remarks, the Chairman described his vision of commoditizing the personal computer market through an open hardware-reference platform on the one side of the Windows OS, and provisioning an open application developers layer on the other using open and totally transparent API's. Of course the question came up concerning the obvious advantage Microsoft applications would have. Chairman Bill answered the question by describing the Chinese Wall that existed between Microsoft's OS and Apps development departments. He promised that OS API's would be developed privately and separate from the Apps department, and publicly disclosed to ALL developers at the same time. Oh yeah. There was lots of anti IBM - evil empire stuff too :) Of course we now know this was a line of crap. Microsoft Apps was discovered to have been using undocumented and secret Windows API's. http://goo.gl/0UGe8. Microsoft Apps had a distinct advantage over the competition, and eventually the entire Windows Productivity Platform became dependent on the MSOffice core. The company I worked for back then, Pyramid Data, had the first Contact Management application for Windows; PowerLeads. Every Friday night we would release bug fixes using Wildcat BBS. By Monday morning we would be slammed with calls from users complaining that they had downloaded the Friday night patch, and now some other application would not load or function properly. Eventually we tracked th
Gary Edwards

The True Story of How the Patent Bar Captured a Court and Shrank the Intellectual Commons | Cato Unbound - 1 views

  • The change in the law wrought by the Federal Circuit can also be viewed substantively through the controversy over software patents. Throughout the 1960s, the USPTO refused to award patents for software innovations. However, several of the USPTO’s decisions were overruled by the patent-friendly U.S. Court of Customs and Patent Appeals, which ordered that software patents be granted. In Gottschalk v. Benson (1972) and Parker v. Flook (1978), the U.S. Supreme Court reversed the Court of Customs and Patent Appeals, holding that mathematical algorithms (and therefore software) were not patentable subject matter. In 1981, in Diamond v. Diehr, the Supreme Court upheld a software patent on the grounds that the patent in question involved a physical process—the patent was issued for software used in the molding of rubber. While affirming their prior ruling that mathematical formulas are not patentable in the abstract, the Court held that an otherwise patentable invention did not become unpatentable simply because it utilized a computer.
  • In the hands of the newly established Federal Circuit, however, this small scope for software patents in precedent was sufficient to open the floodgates. In a series of decisions culminating in State Street Bank v. Signature Financial Group (1998), the Federal Circuit broadened the criteria for patentability of software and business methods substantially, allowing protection as long as the innovation “produces a useful, concrete and tangible result.” Those broadened criteria led to an explosion of low-quality software patents, from Amazon’s 1-Click checkout system to Twitter’s pull-to-refresh feature on smartphones. The GAO estimates that more than half of all patents granted in recent years are software-related. Meanwhile, the Supreme Court continues to hold, as in Parker v. Flook, that computer software algorithms are not patentable, and has begun to push back against the Federal Circuit. In Bilski v. Kappos (2010), the Supreme Court once again held that abstract ideas are not patentable, and in Alice v. CLS (2014), it ruled that simply applying an abstract idea on a computer does not suffice to make the idea patent-eligible. It still is not clear what portion of existing software patents Alice invalidates, but it could be a significant one.
  • Supreme Court justices also recognize the Federal Circuit’s insubordination. In oral arguments in Carlsbad Technology v. HIF Bio (2009), Chief Justice John Roberts joked openly about it:
  • ...17 more annotations...
  • The Opportunity of the Commons
  • As a result of the Federal Circuit’s pro-patent jurisprudence, our economy has been flooded with patents that would otherwise not have been granted. If more patents meant more innovation, then we would now be witnessing a spectacular economic boom. Instead, we have been living through what Tyler Cowen has called a Great Stagnation. The fact that patents have increased while growth has not is known in the literature as the “patent puzzle.” As Michele Boldrin and David Levine put it, “there is no empirical evidence that [patents] serve to increase innovation and productivity, unless productivity is identified with the number of patents awarded—which, as evidence shows, has no correlation with measured productivity.”
  • While more patents have not resulted in faster economic growth, they have resulted in more patent lawsuits.
  • Software patents have characteristics that make them particularly susceptible to litigation. Unlike, say, chemical patents, software patents are plagued by a problem of description. How does one describe a software innovation in such a way that anyone searching for it will easily find it? As Christina Mulligan and Tim Lee demonstrate, chemical formulas are indexable, meaning that as the number of chemical patents grow, it will still be easy to determine if a molecule has been patented. Since software innovations are not indexable, they estimate that “patent clearance by all firms would require many times more hours of legal research than all patent lawyers in the United States can bill in a year. The result has been an explosion of patent litigation.” Software and business method patents, estimate James Bessen and Michael Meurer, are 2 and 7 times more likely to be litigated than other patents, respectively (4 and 13 times more likely than chemical patents).
  • Software patents make excellent material for predatory litigation brought by what are often called “patent trolls.”
  • Trolls use asymmetries in the rules of litigation to legally extort millions of dollars from innocent parties. For example, one patent troll, Innovatio IP Ventures, LLP, acquired patents that implicated Wi-Fi. In 2011, it started sending demand letters to coffee shops and hotels that offered wireless Internet access, offering to settle for $2,500 per location. This amount was far in excess of the 9.56 cents per device that Innovatio was entitled to under the “Fair, Reasonable, and Non-Discriminatory” licensing promises attached to their portfolio, but it was also much less than the cost of trial, and therefore it was rational for firms to pay. Cisco stepped in and spent $13 million in legal fees on the case, and settled on behalf of their customers for 3.2 cents per device. Other manufacturers had already licensed Innovatio’s portfolio, but that didn’t stop their customers from being targeted by demand letters.
  • Litigation cost asymmetries are magnified by the fact that most patent trolls are nonpracticing entities. This means that when patent infringement trials get to the discovery phase, they will cost the troll very little—a firm that does not operate a business has very few records to produce.
  • But discovery can cost a medium or large company millions of dollars. Using an event study methodology, James Bessen and coauthors find that infringement lawsuits by nonpracticing entities cost publicly traded companies $83 billion per year in stock market capitalization, while plaintiffs gain less than 10 percent of that amount.
  • Software patents also reduce innovation by virtue of their cumulative nature and the fact that many of them are frequently inputs into a single product. Law professor Michael Heller coined the phrase “tragedy of the anticommons” to refer to a situation that mirrors the well-understood “tragedy of the commons.” Whereas in a commons, multiple parties have the right to use a resource but not to exclude others, in an anticommons, multiple parties have the right to exclude others, and no one is therefore able to make effective use of the resource. The tragedy of the commons results in overuse of the resource; the tragedy of the anticommons results in underuse.
  • In order to cope with the tragedy of the anticommons, we should carefully investigate the opportunity of the commons. The late Nobelist Elinor Ostrom made a career of studying how communities manage shared resources without property rights. With appropriate self-governance institutions, Ostrom found again and again that a commons does not inevitably lead to tragedy—indeed, open access to shared resources can provide collective benefits that are not available under other forms of property management.
  • This suggests that—litigation costs aside—patent law could be reducing the stock of ideas rather than expanding it at current margins.
  • Advocates of extensive patent protection frequently treat the commons as a kind of wasteland. But considering the problems in our patent system, it is worth looking again at the role of well-tailored limits to property rights in some contexts. Just as we all benefit from real property rights that no longer extend to the highest heavens, we would also benefit if the scope of patent protection were more narrowly drawn.
  • Reforming the Patent System
  • This analysis raises some obvious possibilities for reforming the patent system. Diane Wood, Chief Judge of the 7th Circuit, has proposed ending the Federal Circuit’s exclusive jurisdiction over patent appeals—instead, the Federal Circuit could share jurisdiction with the other circuit courts. While this is a constructive suggestion, it still leaves the door open to the Federal Circuit playing “a leading role in shaping patent law,” which is the reason for its capture by patent interests. It would be better instead simply to abolish the Federal Circuit and return to the pre-1982 system, in which patents received no special treatment in appeals. This leaves open the possibility of circuit splits, which the creation of the Federal Circuit was designed to mitigate, but there are worse problems than circuit splits, and we now have them.
  • Another helpful reform would be for Congress to limit the scope of patentable subject matter via statute. New Zealand has done just that, declaring that software is “not an invention” to get around WTO obligations to respect intellectual property. Congress should do the same with respect to both software and business methods.
  • Finally, even if the above reforms were adopted, there would still be a need to address the asymmetries in patent litigation that result in predatory “troll” lawsuits. While the holding in Alice v. CLS arguably makes a wide swath of patents invalid, those patents could still be used in troll lawsuits because a ruling of invalidity for each individual patent might not occur until late in a trial. Current legislation in Congress addresses this class of problem by mandating disclosures, shifting fees in the case of spurious lawsuits, and enabling a review of the patent’s validity before a trial commences.
  • What matters for prosperity is not just property rights in the abstract, but good property-defining institutions. Without reform, our patent system will continue to favor special interests and forestall economic growth.
  •  
    "Libertarians intuitively understand the case for patents: just as other property rights internalize the social benefits of improvements to land, automobile maintenance, or business investment, patents incentivize the creation of new inventions, which might otherwise be undersupplied. So far, so good. But it is important to recognize that the laws that govern property, intellectual or otherwise, do not arise out of thin air. Rather, our political institutions, with all their virtues and foibles, determine the contours of property -- the exact bundle of rights that property holders possess, their extent, and their limitations. Outlining efficient property laws is not a trivial problem. The optimal contours of property are neither immutable nor knowable a priori. For example, in 1946, the U.S. Supreme Court reversed the age-old common law doctrine that extended real property rights to the heavens without limit. The advent of air travel made such extensive property rights no longer practicable-airlines would have had to cobble together a patchwork of easements, acre by acre, for every corridor through which they flew, and they would have opened themselves up to lawsuits every time their planes deviated from the expected path. The Court rightly abridged property rights in light of these empirical realities. In defining the limits of patent rights, our political institutions have gotten an analogous question badly wrong. A single, politically captured circuit court with exclusive jurisdiction over patent appeals has consistently expanded the scope of patentable subject matter. This expansion has resulted in an explosion of both patents and patent litigation, with destructive consequences. "
  •  
    I added a comment to the page's article. Patents are antithetical to the precepts of Libertarianism and do not involve Natural Law rights. But I agree with the author that the Court of Appeals for the Federal Circuit should be abolished. It's a failed experiment.
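The Innovatio settlement economics described in the annotations above reduce to back-of-the-envelope arithmetic; a sketch (the helper function and the illustrative $1M trial-cost figure are my assumptions, not from the article):

```javascript
// Why targets of demand letters settle: a defendant rationally settles
// whenever the demand is below its expected litigation cost, even if the
// demand far exceeds the royalty actually owed under FRAND promises.
function rationalToSettle(demandUsd, expectedTrialCostUsd) {
  return demandUsd < expectedTrialCostUsd;
}

const demandPerLocation = 2500;     // Innovatio's per-location demand
const frandRatePerDevice = 0.0956;  // 9.56 cents per device (FRAND cap)
const settledRatePerDevice = 0.032; // 3.2 cents per device after Cisco stepped in

// The per-location demand was tens of thousands of times the FRAND
// per-device rate (locations do have multiple devices, so this is a
// rough upper bound on the markup)...
console.log(Math.round(demandPerLocation / frandRatePerDevice)); // ≈ 26151

// ...yet paying was still rational against, say, a $1M expected trial cost.
console.log(rationalToSettle(demandPerLocation, 1_000_000)); // true

// And the litigated outcome came in well under the FRAND cap.
console.log(settledRatePerDevice < frandRatePerDevice); // true
```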
Gary Edwards

ES4 and the fight for the future of the Open Web - By Haavard - 0 views

  • Here, we have no better theory to explain why Microsoft is enthusiastic to spread C# onto the web via Silverlight, but not to give C# a run for its money in the open web standards by supporting ES4 in IE. The fact is, and we've heard this over late night truth-telling meetings between Mozilla principals and friends at Microsoft, that Microsoft does not think the web needs to change much. Or as one insider said to a Mozilla figure earlier this year: "we could improve the web standards, but what's in it for us?"
  •  
    Microsoft opposes the stunning collection of EcmaScript standards improvements to JavaScript ES3 known as "ES4". Brendan Eich, author of JavaScript and lead Mozilla developer, claims that Microsoft is stalling the advance of JavaScript to protect their proprietary advantages with Silverlight - WPF technologies. Opera developer "Haavard" asks the question, "Why would Microsoft do this?" Brendan Eich explains: Indeed Microsoft does not desire serious change to ES3, and we heard this inside TG1 in April. The words were (from my notes) more like this: "Microsoft does not think the web needs to change much". Except, of course, via Silverlight and WPF, which if not matched by evolution of the open web standards, will spread far and wide on the Web, as Flash already has. And that change to the Web is apparently just fine and dandy according to Microsoft. First, Microsoft does not think the Web needs to change much, but then they give us Silverlight and WPF? An amazing contradiction if I ever saw one. It is obvious that Microsoft wants to lock the Web to their proprietary technologies again. They want Silverlight, not some new open standard which further threatens their locked-in position. They will use dirty tricks - lies and deception - to convince people that they are in the right. Excellent discussion on how Microsoft participates in open standards groups to delay, stall and dumb down the Open Web formats, protocols and interfaces their competitors use. With their applications and services, Microsoft offers users a Hobson's choice; use the stalled, limited and dumbed down Open Web standards, or, use rich, fully featured and advanced but proprietary Silverlight-WPF technologies. Some choice.
Gary Edwards

InformationWeek 500 Trends: Web 2.0, Globalization, Virtualization, And More -- InformationWeek 500 - 0 views

  • Web 2.0 is one of the trendiest ideas in tech, for instance, but there are entire industries where not one company in our survey cites it as a top productivity improver. Meantime, adoption of some more tactical technologies, such as WAN optimization, has exploded in the last year.
  • critical trends, from Web 2.0 to globalization to virtualization
  • the momentum is behind wikis, blogs, and social networking, though primarily among co-workers.
  • ...7 more annotations...
  • When it comes to using Web 2.0 collaboration tools
  • Use of hosted collaboration applications--from calendars to document sharing--hit a reasonably high 60%.
  • Asked what technologies have improved productivity the most, only 14% overall cite "encouraging the use of Web 2.0 technologies."
  • One possible bright spot in our survey is that implementing new collaboration tools, such as Microsoft SharePoint, is cited more often than any other--48%--as a technology leveraged to improve productivity.
  • at satisfied companies, business units rather than IT departments are much more likely to drive the selection of Web 2.0 technologies. At companies dissatisfied with Web 2.0, IT is more likely to take the lead.
  • There's more to Web 2.0 than collaboration tools like wikis and other employee-facing tools, and there's interesting progress on the critical back-end layer that enables Web 2.0. One is mashups; 38% of InformationWeek 500 companies are combining Web and enterprise content in new ways. The other is in Web 2.0 development tools.
  •  
    What the InformationWeek 500 data tells us about the use of emerging technologies.
Gary Edwards

Tech Execs Express Extreme Concern That NSA Surveillance Could Lead To 'Breaking' The Internet | Techdirt - 0 views

  • We need to look the world's dangers in the face. And we need to resolve that we will not allow the dangers of the world to freeze this country in its tracks. We need to recognize that antiquated laws will not keep the public safe. We need to recognize that laws that the rest of the world does not respect will ultimately undermine the fundamental ability of our own legal processes, law enforcement agencies and even the intelligence community itself. At the end of the day, we need to recognize... the one asset that the US has which is even stronger than our military might is our moral authority. And this decline in trust has not only affected people's trust in American technology products. It has affected people's willingness to trust the leadership of the United States. If we are going to win the war on terror. If we are going to keep the public safe. If we are going to improve American competitiveness, we need Congress to stay on the path it's set. We need Congress to finish in December the job the President put before Congress in January.
  •  
    "Nothing necessarily earth-shattering was said by anyone, but it did involve a series of high powered tech execs absolutely slamming the NSA and the intelligence community, and warning of the vast repercussions from that activity, up to and including potentially splintering or "breaking" the internet by causing people to so distrust the existing internet, that they set up separate networks on their own. The execs repeated the same basic points over and over again. They had been absolutely willing to work with law enforcement when and where appropriate based on actual court orders and review -- but that the government itself completely poisoned the well with its activities, including hacking into the transmission lines between overseas datacenters. Thus, as Eric Schmidt noted, if the NSA and other law enforcement folks are "upset" about Google and others suddenly ramping up their use of encryption and being less willing to cooperate with the government, they only have themselves to blame for completely obliterating any sense of trust. Microsoft's Brad Smith, towards the end, made quite an impassioned plea -- it sounded more like a politician's stump speech -- about the need for rebuilding trust in the internet. It's at about an hour and 3 minutes into the video. He points out that while people had expected Congress to pass the USA Freedom Act, the rise of ISIS and other claimed threats has some people scared, but, he notes: We need to look the world's dangers in the face. And we need to resolve that we will not allow the dangers of the world to freeze this country in its tracks. We need to recognize that antiquated laws will not keep the public safe. We need to recognize that laws that the rest of the world does not respect will ultimately undermine the fundamental ability of our own legal processes, law enforcement agencies and even the intelligence community itself. At the end of the day, we need to recognize... the one asset that the US has which is even stronger than our military might is our moral authority."
Gonzalo San Gil, PhD.

How open source is changing the pace of software | Opensource.com - 0 views

  •  
    "When we talk about the innovation that communities bring to open source software, we often focus on how open source enables contributions and collaboration within communities. More contributors, collaborating with less friction." [# ! #improvement... # ! through #collaboration. # ! From The Open Source #community to the Whole W@rld...]
Paul Merrell

The Wifi Alliance, Coming Soon to Your Neighborhood: 5G Wireless | Global Research - Centre for Research on Globalization - 0 views

  • Just as any new technology claims to offer the most advanced development, promising that its definition of progress will cure society's ills or make life easier by eliminating the drudgery of antiquated appliances, the Wifi Alliance was organized as a worldwide wireless network to connect "everyone and everything, everywhere" as it promised "improvements to nearly every aspect of daily life." The Alliance, which makes no pretense of potential health or environmental concerns, further proclaimed (and they may be correct) that there are "more wifi devices than people on earth". It is that inescapable exposure to ubiquitous wireless technologies wherein lies the problem.
  • Even prior to the 1997 introduction of commercially available wifi devices which has saturated every industrialized country, EMF wifi hot spots were everywhere.  Today with the addition of cell and cordless phones and towers, broadcast antennas, smart meters and the pervasive computer wifi, both adults and especially vulnerable children are surrounded 24-7 by an inescapable presence with little recognition that all radiation exposure is cumulative.    
  • The National Toxicology Program (NTP), a branch of the US National Institutes of Health (NIH), conducted the world's largest study on radiofrequency radiation used by the US telecommunications industry and found a "statistically significant increase in brain and heart cancers" in animals exposed to EMF (electromagnetic fields). The NTP study confirmed the connection between mobile and wireless phone use and human brain cancer risks, and its conclusions were supported by other epidemiological peer-reviewed studies. Of special note is that the exposure levels at which studies found biological risk to human health were below accepted international exposure standards.
  •  
    "'…what this means is that the current safety standards are off by a factor of about 7 million.' Pointing out that a recent FCC Chair was a former lobbyist for the telecom industry, "I know how they've attacked various people. In the U.S. … the funding for the EMF research [by the Environmental Protection Agency] was cut off starting in 1986 … The U.S. Office of Naval Research had been funding a fair amount of research in this area [in the '70s]. They [also] … stopped funding new grants in 1986 … And then the NIH a few years later followed the same path …" As if all this were not reason enough for concern or even downright panic, the next generation of wireless technology known as 5G (fifth generation), representing the innocuous-sounding Internet of Things, promises a quantum leap in power and exceedingly more damaging health impacts with mandatory exposures. The immense expansion of radiation emissions from the current wireless EMF frequency band and 5G about to be perpetrated on an unsuspecting American public should be criminal. Developed by the US military as non-lethal perimeter and crowd control, the Active Denial System emits a high-density, high-frequency wireless radiation comparable to 5G, in the neighborhood of 90 GHz. The current pre-5G frequency band used in today's commercial wireless ranges from 300 MHz to 3 GHz; 5G will become the first wireless system to utilize millimeter waves, with frequencies ranging from 30 to 300 GHz. One example of the differential is that a current LANS (local area network system) uses 2.4 GHz. Hidden behind these numbers is an utterly devastating increase in health effects of immeasurable impacts so stunning as to numb the senses. In 2017, the international Environmental Health Trust recommended an EU moratorium "on the roll-out of the fifth generation, 5G, for telecommunication until potential hazards for human health and the environment hav
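A quick wavelength check (lambda = c / f) confirms why the 30-300 GHz band cited above is called "millimeter waves"; the helper below is illustrative only:

```javascript
// Wavelength for a given frequency: lambda = c / f.
// The 30-300 GHz band gives wavelengths from 10 mm down to 1 mm,
// hence the name "millimeter waves" for 5G's upper bands.
const SPEED_OF_LIGHT = 299_792_458; // metres per second

function wavelengthMm(frequencyHz) {
  return (SPEED_OF_LIGHT / frequencyHz) * 1000; // metres -> millimetres
}

console.log(wavelengthMm(2.4e9).toFixed(0)); // ≈ 125 mm (2.4 GHz wifi LAN)
console.log(wavelengthMm(30e9).toFixed(0));  // ≈ 10 mm (bottom of the mmWave band)
console.log(wavelengthMm(300e9).toFixed(1)); // ≈ 1.0 mm (top of the mmWave band)
```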
Gary Edwards

Sun Labs Lively Kernel - 0 views

  • Main features The main features of the Lively Kernel include: Small web programming environment and computing kernel, written entirely with JavaScript. In addition to its application execution capabilities, the platform can also function as an integrated development environment (IDE), making the whole system self-contained and able to improve and extend itself on the fly. Programmatic access to the user interface. Our system provides programmatic access from JavaScript to the user interface via the Morphic user interface framework. The user interface is built around an event-based programming model familiar to most web developers. Asynchronous networking. As in Ajax, you can use asynchronous HTTP to perform all the network operations asynchronously, without blocking the user interface.
  •  
    "The Sun Labs Lively Kernel is a new web programming environment developed at Sun Microsystems Laboratories. The Lively Kernel supports desktop-style applications with rich graphics and direct manipulation capabilities, but without the installation or upgrade hassles that conventional desktop applications have. The system is written entirely in the JavaScript programming language, a language supported by all the web browsers, with the intent that the system can run in commercial web browsers without installation or any plug-in components. The system leverages the dynamic characteristics of the JavaScript language to make it possible to create, modify and deploy applications on the fly, using tools built into the system itself. In addition to its application execution capabilities, the Lively Kernel can also function as an integrated development environment (IDE), making the whole system self-sufficient and able to improve and extend itself dynamically....." Too little too late? Interestingly, Lively Kernel is 100% JavaScript. Check out this "motivation" rationale: "...The main goal of the Lively Kernel is to bring the same kind of simplicity, generality and flexibility to web programming that we have known in desktop programming for thirty years, but without the installation and upgrade hassles that conventional desktop applications have. The Lively Kernel places a special emphasis on treating web applications as real applications, as opposed to the document-oriented nature of most web applications today. In general, we want to put programming into web development, as opposed to the current weaving of HTML, XML and CSS documents that is also sometimes referred to as programming. ...." I agree with the Web document <> Web Application statement. I think the shift, though, is one where the RIA frames web documents in a new environment, blending in massive amounts of data, streaming media and graphics. The WebKit document model was designed for this p
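The "programmatic access to the user interface via an event-based programming model" described above can be sketched in a few lines of plain JavaScript. Note that the names here (Morph, addListener, fire) are illustrative stand-ins, not the actual Morphic API:

```javascript
// A minimal sketch of an event-based widget model of the kind the
// Lively Kernel description mentions: widgets hold handler lists and
// dispatch events to them. Names are invented for illustration.
class Morph {
  constructor() {
    this.listeners = {}; // event name -> array of handler functions
  }
  addListener(event, handler) {
    (this.listeners[event] = this.listeners[event] || []).push(handler);
  }
  fire(event, payload) {
    (this.listeners[event] || []).forEach((handler) => handler(payload));
  }
}

// Wire a "button" morph to a click handler, then simulate a click.
const button = new Morph();
let clicks = 0;
button.addListener('click', () => { clicks += 1; });
button.fire('click');
console.log(clicks); // 1
```

The same register-and-dispatch shape underlies the DOM's own addEventListener model, which is what makes it feel familiar to web developers.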
Gary Edwards

The Education of Gary Edwards - Rick Jelliffe on O'Reilly Broadcast - 0 views

  •  
    I wonder how I missed this? Incredibly, I have my own biographer and I didn't know it! The dateline is September 2008; I had turned off all my ODF-OOXML-OASIS searches and blog feeds back in October of 2007, when we moved the da Vinci plug-in to HTML+ using the W3C CDF model. Is it appropriate to send flowers to your secret biographer? Maybe I'll find some time and update his work. The gap between October 2007 and April of 2009 is filled with adventure and wonder. And WebKit!

    "....One of the more interesting characters in the recent standards battles has been Gary Edwards: he was a member of the original ODF TC in 2002 which oversaw the creation of ODF 1.0 in 2005, but gradually became more concerned about large vendor dominance of the ODF TC frustrating what he saw as critical improvements in the area of interoperability. This compromised the ability of ODF to act as a universal format."

    "....Edwards increasingly came to believe that the battleground had shifted, with the SharePoint threat increasingly needing to be the focus of open standards and FOSS attention, not just the standalone desktop applications: I think Edwards tends to see Office Open XML as a stalking horse for Microsoft to get its foot back in the door for back-end systems....."

    "....Edwards and some colleagues split with some acrimony from the ODF effort in 2007, and subsequently see W3C's Compound Document Formats (CDF) as holding the best promise for interoperability. Edwards' public comments are an interesting reflection of a person evolving their opinion in the light of experience, events and changing opportunities...."

    ".... I have put together some interesting quotes from him which, I hope, fairly bring out some of the themes I see. As always, read the source to get more info: ..... "

Gary Edwards

Mashups turn into an industry as offerings mature | Hinchcliffe Enterprise Web 2.0 | ZDNet.com - 0 views

  •  
    Dion has lots to say about the recent Web 2.0 Conference. In this article he covers nine significant announcements from companies specializing in Web-based mashups and the related tools for building ad hoc Web applications. This year's Web 2.0 was filled with Web-developer-oriented services, but my favorite was MindTouch. Perhaps because their focus was that of directly engaging end users in the customization of business processes. Yes, the creation of data objects is clearly in the realm of trained developers. And for sure many tools were announced at Web 2.0 to further the much-needed wiring of data objects. But once wired and available, services like MindTouch, I think, will become the way end users interact and create new business productivity methods. Great coverage.

    "...... Awareness and understanding of the fast-growing world of mashups are significant challenges as IT practitioners, business strategists, and software vendors attempt to grapple with what's shaping up to be the biggest challenge of all: The habits and expectations of the larger part of a generation of workers who don't yet realize mashups are poised to change many things about the software landscape on the Web and in the workplace. Generational changes can be difficult for businesses to embrace successfully, and while evidence that mashups are remaking the business world is still very much emerging, they certainly hold the promise..."

    ".... while the life of the average Web developer has been greatly improved by the availability of a wide variety of useful open APIs, the average user of the Web hasn't been a direct beneficiary except through the increase in Web apps that are built on the mashup model. And that's because the tools that empower users to weave together existing Web parts and open APIs into the exact solutions they need are just now becoming easy enough and robust enough to readily enable these scenarios. And that doesn't include the variety of
Gary Edwards

Introduction to OpenCalais | OpenCalais - 0 views

  •  
    "The free OpenCalais service and open API is the fastest way to tag the people, places, facts and events in your content.  It can help you improve your SEO, increase your reader engagement, create search-engine-friendly 'topic hubs' and streamline content operations - saving you time and money. OpenCalais is free to use in both commercial and non-commercial settings, but can only be used on public content (don't run your confidential or competitive company information through it!). OpenCalais does not keep a copy of your content, but it does keep a copy of the metadata it extracts there from. To repeat, OpenCalais is not a private service, and there is no secure, enterprise version that you can buy to operate behind a firewall. It is your responsibility to police the content that you submit, so make sure you are comfortable with our Terms of Service (TOS) before you jump in. You can process up to 50,000 documents per day (blog posts, news stories, Web pages, etc.) free of charge.  If you need to process more than that - say you are an aggregator or a media monitoring service - then see this page to learn about Calais Professional. We offer a very affordable license. OpenCalais' early adopters include CBS Interactive / CNET, Huffington Post, Slate, Al Jazeera, The New Republic, The White House and more. Already more than 30,000 developers have signed up, and more than 50 publishers and 75 entrepreneurs are using the free service to help build their businesses. You can read about the pioneering work of these publishers, entrepreneurs and developers here. To get started, scroll to the bottom section of this page. To build OpenCalais into an existing site or publishing platform (CMS), you will need to work with your developers.  Why OpenCalais Matters The reason OpenCalais - and so-called "Web 3.0" in general (concepts like the Semantic Web, Linked Data, etc.) - are important is that these technologies make it easy to automatically conne
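The "run your content through the service" workflow described above boils down to posting text to a tagging endpoint with an API key and getting entity metadata back. The sketch below only constructs such a request; the endpoint URL and header names are invented placeholders, not the real OpenCalais interface, so consult the service's documentation for actual details:

```javascript
// Build (but do not send) an entity-tagging request in the general
// style described above. The URL and header names are hypothetical.
function buildTaggingRequest(apiKey, content) {
  return {
    url: 'https://api.example.com/v1/tag', // placeholder, not the real endpoint
    method: 'POST',
    headers: {
      'x-api-key': apiKey,           // assumed auth header
      'Content-Type': 'text/plain',  // submit raw text
      'Accept': 'application/json',  // ask for structured metadata back
    },
    body: content,
  };
}

const req = buildTaggingRequest('MY_KEY', 'CNET covered the Google I/O keynote.');
console.log(req.method); // "POST"
```

The returned metadata (people, places, events) is what powers the "topic hubs" and SEO improvements the service promises.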
Paul Merrell

Gmail blows up e-mail marketing by caching all images on Google servers | Ars Technica - 1 views

  • Ever wonder why most e-mail clients hide images by default? The reason for the "display images" button is because images in an e-mail must be loaded from a third-party server. For promotional e-mails and spam, usually this server is operated by the entity that sent the e-mail. So when you load these images, you aren't just receiving an image—you're also sending a ton of data about yourself to the e-mail marketer. Loading images from these promotional e-mails reveals a lot about you. Marketers get a rough idea of your location via your IP address. They can see the HTTP referrer, meaning the URL of the page that requested the image. With the referral data, marketers can see not only what client you are using (desktop app, Web, mobile, etc.) but also what folder you were viewing the e-mail in. For instance, if you had a Gmail folder named "Ars Technica" and loaded e-mail images, the referral URL would be "https://mail.google.com/mail/u/0/#label/Ars+Technica"—the folder is right there in the URL. The same goes for the inbox, spam, and any other location. It's even possible to uniquely identify each e-mail, so marketers can tell which e-mail address requested the images—they know that you've read the e-mail. And if it was spam, this will often earn you more spam since the spammers can tell you've read their last e-mail.
  • But Google has just announced a move that will shut most of these tactics down: it will cache all images for Gmail users. Embedded images will now be saved by Google, and the e-mail content will be modified to display those images from Google's cache, instead of from a third-party server. E-mail marketers will no longer be able to get any information from images—they will see a single request from Google, which will then be used to send the image out to all Gmail users. Unless you click on a link, marketers will have no idea the e-mail has been seen. While this means improved privacy from e-mail marketers, Google will now be digging deeper than ever into your e-mails and literally modifying the contents. If you were worried about e-mail scanning, this may take things a step further. However, if you don't like the idea of cached images, you can turn it off in the settings. This move will allow Google to automatically display images, killing the "display all images" button in Gmail. Google servers should also be faster than the usual third-party image host. Hosting all images sent to all Gmail users sounds like a huge bandwidth and storage undertaking, but if anyone can do it, it's Google. The new image handling will roll out to desktop users today, and it should hit mobile apps sometime in early 2014. There's also a bonus side effect for Google: e-mail marketing is advertising. Google exists because of advertising dollars, but they don't do e-mail marketing. They've just made a competitive form of advertising much less appealing and informative to advertisers. No doubt Google hopes this move pushes marketers to spend less on e-mail and more on Adsense.
  •  
    There's an antitrust angle to this; it could be viewed by a court as anti-competitive. But given the prevailing winds on digital privacy, my guess would be that Google would slide by.
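The mechanics described in this item are easy to sketch: a marketer embeds a per-recipient image URL, and a caching proxy breaks the link between the image request and the individual reader. All hostnames below are made up for illustration; this is not Gmail's actual rewriting scheme:

```javascript
// Per-recipient tracking image: fetching it tells the marketer which
// address opened the mail (hostnames here are illustrative only).
function trackingPixelUrl(recipientId) {
  return 'https://mail.marketer.example/pixel.gif?uid=' +
    encodeURIComponent(recipientId);
}

// A caching proxy in the style Gmail announced: it fetches the
// original once and serves every reader from its own cache, so the
// marketer sees a single request from the proxy, not one per reader.
function proxyRewrite(imageUrl) {
  return 'https://proxy.example/cache?src=' + encodeURIComponent(imageUrl);
}

const original = trackingPixelUrl('alice@example.com');
const proxied = proxyRewrite(original);
console.log(proxied.startsWith('https://proxy.example/')); // true
```

The key point is the second step: once every reader's request terminates at the proxy, the per-recipient `uid` parameter no longer tells the marketer who (or how many) actually opened the mail.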
Gary Edwards

Box extends its enterprise playbook, but users are still at the center | CITEworld - 0 views

  • The 47,000 developers making almost two billion API calls to the Box platform per month are a good start, Levie says, but Box needs to go further and do more to customize its platform to help push this user-centric, everything-everywhere-always model at larger and larger enterprises.
  • Box for Industries comprises three parts: A Box-tailored core service offering, a selection of partner apps, and the implementation services to combine the two into something that ideally can be used by any enterprise in any vertical.
  • Box is announcing solutions for three specific industries: Retail, healthcare, and media/entertainment. For retail, that includes vendor collaboration (helping vendors work with manufacturers and distributors), digital asset management, and retail store enablement.
  • Ted Blosser, senior vice president of Box Platform, also took the stage to show off how managing digital assets benefits from a just-announced metadata template capability that lets you pre-define custom fields so a store's back office can flag, say, a new jacket as "blue" or "red." Those metadata tags can be pushed to a custom app running on a retail associate's iPad, so you can sort by color, line, or inventory level. Metadata plus Box Workflows equals a powerful content platform for retail that keeps people in sync with their content across geographies and devices, or so the company is hoping.
  • It's the same collaboration model that cloud storage vendors have been pushing, but customized for very specific verticals, which is exactly the sales pitch that Box wants you to come away with. And developers must be cheering -- Box is going to help them sell their apps to previously inaccessible markets.
  • More on the standard enterprise side, the so-named Box + Office 365 (previewed a few months back) currently only supports the Windows desktop versions of the productivity suite, but Levie promises web and Mac integrations are on the way. It's pretty basic, but potentially handy for the enterprises that Box supports.
  • The crux of the Office 365 announcement is that people expect that their data will follow them from device to device and from app to app. If people want their Box files and storage in Jive, Box needs to support Jive. And if enterprises are using Microsoft Office 365 to work with their documents -- and they are -- then Box needs to support that too. It's easier than it used to be, Levie says, thanks to Satya Nadella's push for a more open Microsoft.
  • "We are quite confident that this is the kind of future they're building towards," Levie says -- but just in case, he urged BoxWorks attendees to tweet at Nadella and encourage him to help Box speed development along.
  • Box SVP of Enterprise Annie Pearl came on stage to discuss how Box Workflow can be used to improve the ways people work with their content in the real world of business. It's worth noting that Box had a workflow tool previously, but it was relatively primitive and seems to have only existed to tick the box -- it didn't really go beyond assigning tasks and soliciting approvals.
  •  
    This will be very interesting. Looks like Box is betting their future on the success of integrating Microsoft Office 365 into the Box Productivity Cloud Service. Which competes directly with the Microsoft Office 365 - OneDrive Cloud Productivity Platform. Honestly, I don't see how this can ever work out for Box. Microsoft has them ripe for the plucking. And they have pulled it off on the eve of Box's expected IPO. "Box CEO Aaron Levie may not be able to talk about the cloud storage and collaboration company's forthcoming IPO, but he still took the stage at the company's biggest BoxWorks conference yet, with 5,000 attendees. Levie discussed the future of the business and made some announcements -- including the beta of a Box integration with the Windows version of Microsoft Office 365; the introduction of Box Workflow, a tool coming in 2015 for creating repeatable workflows on the platform; and the unveiling of Box for Industries, an initiative to tailor Box solutions for specific industry use-cases. And if that wasn't enough, Box also announced a partnership with service firm Accenture to push the platform in large enterprises. The unifying factor for the announcements made at BoxWorks, Levie said, is that users expect their data to follow them everywhere, at home and at work. That means that Box has to think about enterprise from the user outwards, putting them at the center of the appified universe -- in effect, building an ecosystem of tools that support the things employees already use."
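The metadata-template example above (tagging a jacket "blue" or "red", then sorting by field on an associate's iPad) reduces to filtering assets on predefined custom fields. A toy sketch with invented field names, not Box's actual metadata API:

```javascript
// Assets tagged with pre-defined custom metadata fields, as in the
// retail example above. Field names and values are invented.
const assets = [
  { name: 'jacket-001', metadata: { color: 'blue', line: 'fall' } },
  { name: 'jacket-002', metadata: { color: 'red',  line: 'fall' } },
  { name: 'jacket-003', metadata: { color: 'blue', line: 'spring' } },
];

// Select the assets whose metadata field matches a value -- the
// client-side half of "sort by color, line, or inventory level."
function filterByMetadata(items, field, value) {
  return items.filter((item) => item.metadata[field] === value);
}

console.log(filterByMetadata(assets, 'color', 'blue').length); // 2
```

The template part of the feature is simply that the set of allowed fields ("color", "line", etc.) is declared once per vertical, so every app and workflow can rely on the same schema.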
Paul Merrell

He Was a Hacker for the NSA and He Was Willing to Talk. I Was Willing to Listen. - 2 views

  • The message arrived at night and consisted of three words: “Good evening sir!” The sender was a hacker who had written a series of provocative memos at the National Security Agency. His secret memos had explained — with an earthy use of slang and emojis that was unusual for an operative of the largest eavesdropping organization in the world — how the NSA breaks into the digital accounts of people who manage computer networks, and how it tries to unmask people who use Tor to browse the web anonymously. Outlining some of the NSA’s most sensitive activities, the memos were leaked by Edward Snowden, and I had written about a few of them for The Intercept. There is no Miss Manners for exchanging pleasantries with a man the government has trained to be the digital equivalent of a Navy SEAL. Though I had initiated the contact, I was wary of how he might respond. The hacker had publicly expressed a visceral dislike for Snowden and had accused The Intercept of jeopardizing lives by publishing classified information. One of his memos outlined the ways the NSA reroutes (or “shapes”) the internet traffic of entire countries, and another memo was titled “I Hunt Sysadmins.” I felt sure he could hack anyone’s computer, including mine.
  • I got lucky with the hacker, because he recently left the agency for the cybersecurity industry; it would be his choice to talk, not the NSA’s. Fortunately, speaking out is his second nature.
  • He agreed to a video chat that turned into a three-hour discussion sprawling from the ethics of surveillance to the downsides of home improvements and the difficulty of securing your laptop.
  • In recent years, two developments have helped make hacking for the government a lot more attractive than hacking for yourself. First, the Department of Justice has cracked down on freelance hacking, whether it be altruistic or malignant. If the DOJ doesn’t like the way you hack, you are going to jail. Meanwhile, hackers have been warmly invited to deploy their transgressive impulses in service to the homeland, because the NSA and other federal agencies have turned themselves into licensed hives of breaking into other people’s computers. For many, it’s a techno sandbox of irresistible delights, according to Gabriella Coleman, a professor at McGill University who studies hackers. “The NSA is a very exciting place for hackers because you have unlimited resources, you have some of the best talent in the world, whether it’s cryptographers or mathematicians or hackers,” she said. “It is just too intellectually exciting not to go there.”
  • “If I turn the tables on you,” I asked the Lamb, “and say, OK, you’re a target for all kinds of people for all kinds of reasons. How do you feel about being a target and that kind of justification being used to justify getting all of your credentials and the keys to your kingdom?” The Lamb smiled. “There is no real safe, sacred ground on the internet,” he replied. “Whatever you do on the internet is an attack surface of some sort and is just something that you live with. Any time that I do something on the internet, yeah, that is on the back of my mind. Anyone from a script kiddie to some random hacker to some other foreign intelligence service, each with their different capabilities — what could they be doing to me?”
  • The Lamb’s memos on cool ways to hunt sysadmins triggered a strong reaction when I wrote about them in 2014 with my colleague Ryan Gallagher. The memos explained how the NSA tracks down the email and Facebook accounts of systems administrators who oversee computer networks. After plundering their accounts, the NSA can impersonate the admins to get into their computer networks and pilfer the data flowing through them. As the Lamb wrote, “sys admins generally are not my end target. My end target is the extremist/terrorist or government official that happens to be using the network … who better to target than the person that already has the ‘keys to the kingdom’?” Another of his NSA memos, “Network Shaping 101,” used Yemen as a theoretical case study for secretly redirecting the entirety of a country’s internet traffic to NSA servers.
  • “You know, the situation is what it is,” he said. “There are protocols that were designed years ago before anybody had any care about security, because when they were developed, nobody was foreseeing that they would be taken advantage of. … A lot of people on the internet seem to approach the problem [with the attitude of] ‘I’m just going to walk naked outside of my house and hope that nobody looks at me.’ From a security perspective, is that a good way to go about thinking? No, horrible … There are good ways to be more secure on the internet. But do most people use Tor? No. Do most people use Signal? No. Do most people use insecure things that most people can hack? Yes. Is that a bash against the intelligence community that people use stuff that’s easily exploitable? That’s a hard argument for me to make.”
  • I mentioned that lots of people, including Snowden, are now working on the problem of how to make the internet more secure, yet he seemed to do the opposite at the NSA by trying to find ways to track and identify people who use Tor and other anonymizers. Would he consider working on the other side of things? He wouldn’t rule it out, he said, but dismally suggested the game was over as far as having a liberating and safe internet, because our laptops and smartphones will betray us no matter what we do with them. “There’s the old adage that the only secure computer is one that is turned off, buried in a box ten feet underground, and never turned on,” he said. “From a user perspective, someone trying to find holes by day and then just live on the internet by night, there’s the expectation [that] if somebody wants to have access to your computer bad enough, they’re going to get it. Whether that’s an intelligence agency or a cybercrimes syndicate, whoever that is, it’s probably going to happen.”
  • There are precautions one can take, and I did that with the Lamb. When we had our video chat, I used a computer that had been wiped clean of everything except its operating system and essential applications. Afterward, it was wiped clean again. My concern was that the Lamb might use the session to obtain data from or about the computer I was using; there are a lot of things he might have tried, if he was in a scheming mood. At the end of our three hours together, I mentioned to him that I had taken these precautions—and he approved. “That’s fair,” he said. “I’m glad you have that appreciation. … from a perspective of a journalist who has access to classified information, it would be remiss to think you’re not a target of foreign intelligence services.” He was telling me the U.S. government should be the least of my worries. He was trying to help me. Documents published with this article: Tracking Targets Through Proxies & Anonymizers Network Shaping 101 Shaping Diagram I Hunt Sys Admins (first published in 2014)
Paul Merrell

Obama to propose legislation to protect firms that share cyberthreat data - The Washington Post - 0 views

  • President Obama plans to announce legislation Tuesday that would shield companies from lawsuits for sharing computer threat data with the government in an effort to prevent cyberattacks. On the heels of a destructive attack at Sony Pictures Entertainment and major breaches at JPMorgan Chase and retail chains, Obama is intent on capitalizing on the heightened sense of urgency to improve the security of the nation’s networks, officials said. “He’s been doing everything he can within his executive authority to move the ball on this,” said a senior administration official who spoke on the condition of anonymity to discuss legislation that has not yet been released. “We’ve got to get something in place that allows both industry and government to work more closely together.”
  • The legislation is part of a broader package, to be sent to Capitol Hill on Tuesday, that includes measures to help protect consumers and students against cyberattacks and to give law enforcement greater authority to combat cybercrime. The provision’s goal is to “enshrine in law liability protection for the private sector for them to share specific information — cyberthreat indicators — with the government,” the official said. Some analysts questioned the need for such legislation, saying there are adequate measures in place to enable sharing between companies and the government and among companies.
  • “We think the current information-sharing regime is adequate,” said Mark Jaycox, legislative analyst at the Electronic Frontier Foundation, a privacy group. “More companies need to use it, but the idea of broad legal immunity isn’t needed right now.” The administration official disagreed. The lack of such immunity is what prevents many companies from greater sharing of data with the government, the official said. “We have heard that time and time again,” the official said. The proposal, which builds on a 2011 administration bill, grants liability protection to companies that provide indicators of cyberattacks and threats to the Department of Homeland Security.
  • But in a provision likely to raise concerns from privacy advocates, the administration wants to require DHS to share that information “in as near real time as possible” with other government agencies that have a cybersecurity mission, the official said. Those include the National Security Agency, the Pentagon’s Cyber Command, the FBI and the Secret Service. “DHS needs to take an active lead role in ensuring that unnecessary personal information is not shared with intelligence authorities,” Jaycox said. The debates over government surveillance prompted by disclosures from former NSA contractor Edward Snowden have shown that “the agencies already have a tremendous amount of unnecessary information,” he said.
  • The administration official stressed that the legislation will require companies to remove unnecessary personal information before furnishing it to the government in order to qualify for liability protection. It also will impose limits on the use of the data for cybersecurity crimes and instances in which there is a threat of death or bodily harm, such as kidnapping, the official said. And it will require DHS and the attorney general to develop guidelines for the federal government’s use and retention of the data. It will not authorize a company to take offensive cyber-measures to defend itself, such as “hacking back” into a server or computer outside its own network to track a breach. The bill also will provide liability protection to companies that share data with private-sector-developed organizations set up specifically for that purpose. Called information sharing and analysis organizations, these groups often are set up by particular industries, such as banking, to facilitate the exchange of data and best practices.
  • Efforts to pass information-sharing legislation have stalled in the past five years, blocked primarily by privacy concerns. The package also contains provisions that would allow prosecution for the sale of botnets or access to armies of compromised computers that can be used to spread malware, would criminalize the overseas sale of stolen U.S. credit card and bank account numbers, would expand federal law enforcement authority to deter the sale of spyware used to stalk people or commit identity theft, and would give courts the authority to shut down botnets being used for criminal activity, such as denial-of-service attacks.
  • It would reaffirm that federal racketeering law applies to cybercrimes and amends the Computer Fraud and Abuse Act by ensuring that “insignificant conduct” does not fall within the scope of the statute. A third element of the package is legislation Obama proposed Monday to help protect consumers and students against cyberattacks. The theft of personal financial information “is a direct threat to the economic security of American families, and we’ve got to stop it,” Obama said. The plan, unveiled in a speech at the Federal Trade Commission, would require companies to notify customers within 30 days after the theft of personal information is discovered. Right now, data breaches are handled under a patchwork of state laws that the president said are confusing and costly to enforce. Obama’s plan would streamline those into one clear federal standard and bolster requirements for companies to notify customers. Obama is proposing closing loopholes to make it easier to track down cybercriminals overseas who steal and sell identities. “The more we do to protect consumer information and privacy, the harder it is for hackers to damage our businesses and hurt our economy,” he said.
  • In October, Obama signed an order to protect consumers from identity theft by strengthening security features in credit cards and the terminals that process them. Marc Rotenberg, executive director of the Electronic Privacy Information Center, said there is concern that a federal standard would “preempt stronger state laws” about how and when companies have to notify consumers. The Student Digital Privacy Act would ensure that data entered would be used only for educational purposes. It would prohibit companies from selling student data to third-party companies for purposes other than education. Obama also plans to introduce a Consumer Privacy Bill of Rights. And the White House will host a summit on cybersecurity and consumer protection on Feb. 13 at Stanford University.
Gary Edwards

WebKit: The 21st century mobile - desktop application foundation - Software Development Times On The Web - 0 views

  •  
    Well well well. The experts are finally starting to get it. WebKit is extraordinary, not as a browser, but as the "Open Web" foundation for web application (RIA) development that spans device and desktop.
  •  
    The vastly improved hardware and network throughput are not the primary drivers of this sea change. "The biggest jolt to the mobile Web development experience, in my view, has been the iPhone. Its implementation of mobile Safari, while imperfect, has given handsets the real Internet, rather than a hobbled, niched version that was typical in devices that preceded it." What that means for the future, according to O'Grady, is that the mobile application space "will mirror the development on the client, honestly. It will evolve into a platform barely distinguishable, in many respects, from the traditional desktop browser experience."
Gary Edwards

Microsoft's Quest for Interoperability and Open Standards - 0 views

  •  
    Interesting article discussing the many ways Microsoft is trying to improve the public perception that it is serious about interoperability and open formats, protocols and interfaces. Rocketman attended the recent ISO SC34 meeting in Prague and agrees that Microsoft has indeed put on a new public face filled with cooperation, compliance and unheard-of sincerity.

    He also says, "Don't be fooled!!!"

    There is a big difference between participation in vendor consortia and government-sponsored public standards efforts, and actual implementation at the product level. Looking at how Microsoft products implement open standards, my take is that they have decided on a policy of end-user choice. Their applications offer on the one hand the choice of aging, nearly irrelevant and often crippled open standards. And on the other, the option of very rich and feature-filled but proprietary formats, protocols and interfaces that integrate across the entire Microsoft platform of desktop, devices and servers. For instance, IE8 supports 1998 HTML-CSS, but not the advanced Acid3 "HTML+" used by WebKit, Firefox, Opera and nearly every device or smartphone operating at the edge of the Web. (HTML+ = HTML5, CSS4, SVG/Canvas, JS, JS Libs).

    But they do offer advanced .NET-WPF proprietary alternatives to Open Web HTML+: XAML, Silverlight, XPS, LINQ, Smart Tags, and OOXML. Very nice.

    "When an open source advocate, open standards advocate, or, well, pretty much anyone that competes with Microsoft (news, site) sees an extended hand from the software giant toward better interoperability, they tend to look and see if the other hand's holding a spiked club.

    Even so, the Redmond, WA company continues to push the message that it has seen the light regarding open standards and interoperability...."

Gonzalo San Gil, PhD.

BitTorrent is the New Radio, Says Counting Crows Frontman | TorrentFreak - 1 views

  •  
    [American rock band Counting Crows have sold more than 20 million albums worldwide, but this success hasn't caused them to overlook the changing landscape of the music business. Today the band releases four tracks from their new album for free on BitTorrent. Talking to TorrentFreak, Counting Crows frontman Adam Duritz says BitTorrent is the new and improved radio. ...]
Gonzalo San Gil, PhD.

Tools | La Quadrature du Net - 1 views

  •  
    [ Political Memory is a toolbox designed to help reach Members of the European Parliament (MEPs) and track their voting records. You may find the list of MEPs by alphabetical order, by country, by political group, or by committee. For each MEP, contact details and mandates are listed, as well as their votes and how they stand on subjects touched on by La Quadrature du Net. If you have telephony software installed on your computer, you can call them directly by clicking on "click to call".

    The Wiki is the collaborative part of this website, where anyone can create or modify content. This is where information on La Quadrature's campaigns (such as those about the written statement on ACTA or the IPRED Consultation), highlights of the National Assembly debates, pages relating to ongoing issues tracked by La Quadrature, as well as analyses, illustrations and more can be found.

    The Mediakit is an audio and video data bank. It contains interventions of La Quadrature's spokespeople in the media as well as reports about issues La Quadrature closely follows. All these media can be viewed and downloaded in different formats.

    The Press Review is a collection of press articles about La Quadrature du Net's issues. It is compiled by a team of volunteers and comes in two languages: English and French. Articles written in other languages appear in both press reviews. ...]