Future of the Web: Group items tagged JSON

Gonzalo San Gil, PhD.

JSON 2 XML (XBEL) - 0 views

  •  
    [This online tool allows you to convert a JSON file into an XML file. This process is not 100% accurate in that XML uses different item types that do not have an equivalent JSON representation. The following rules will be applied during the conversion process: a default root element is created; JSON array entries are converted to individual XML elements; all JSON property values will be converted to #text item types; offending characters will be XML escaped.]
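    A minimal TypeScript sketch of the four stated rules (default root element, array entries as individual elements, property values as escaped text), offered as an illustration of the described behavior rather than the tool's actual code:

```typescript
// Sketch of the conversion rules quoted above (illustrative, not the tool's code).
type Json = string | number | boolean | null | Json[] | { [key: string]: Json };

// Rule: offending characters are XML-escaped.
function escapeXml(s: string): string {
  return s
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&apos;");
}

function toXml(value: Json, tag: string): string {
  if (Array.isArray(value)) {
    // Rule: each JSON array entry becomes an individual XML element.
    return value.map((item) => toXml(item, tag)).join("");
  }
  if (value !== null && typeof value === "object") {
    const children = Object.entries(value).map(([k, v]) => toXml(v, k)).join("");
    return `<${tag}>${children}</${tag}>`;
  }
  // Rule: property values become #text items.
  return `<${tag}>${escapeXml(String(value))}</${tag}>`;
}

// Rule: a default root element wraps the result.
function jsonToXml(value: Json, root = "root"): string {
  return `<?xml version="1.0" encoding="UTF-8"?>${toXml(value, root)}`;
}

console.log(jsonToXml({ items: [{ title: "a & b" }, { title: "c" }] }));
// -> ...<root><items><title>a &amp; b</title></items><items><title>c</title></items></root>
```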
Gary Edwards

Developer: Dump JavaScript for faster Web loading | CIO - 0 views

  •  
    "A W3C (World Wide Web Consortium) mailing list post entitled "HTML6 proposal for single-page Web apps without JavaScript" details the proposal, dated March 20. "The overall purpose [of the plan] is to reduce response times when loading Web pages," said Web developer Bobby Mozumder, editor in chief of FutureClaw magazine, in an email. "This is the difference between a 300ms page load vs 10ms. The faster you are, the better people are going to feel about using your Website." The proposal cites a standard design pattern emerging via front-end JavaScript frameworks where content is loaded dynamically via JSON APIs. "This is the single-page app Web design pattern," said Mozumder. "Everyone's into it because the responsiveness is so much better than loading a full page -- 10-50ms with a clean API load vs. 300-1500ms for a full HTML page load. Since this is so common now, can we implement this directly in the browsers via HTML so users can dynamically run single-page apps without JavaScript?" Accomplishing the goal of a high-speed, responsive Web experience without loading JavaScript "could probably be done by linking anchor elements to JSON/XML (or a new definition) API endpoints [and] having the browser internally load the data into a new data structure," the proposal states. The browser "then replaces DOM elements with whatever data that was loaded as needed." The initial data and standard error responses could be in header fixtures, which could be replaced later if so desired. "The HTML body thus becomes a templating language with all the content residing in the fixtures that can be dynamically reloaded without JavaScript." JavaScript frameworks and JavaScript are leveraged for loading now, but there are issues with these, Mozumder explained. "Should we force millions of Web developers to learn JavaScript, a framework, and an associated templating language if they want a speedy, responsive Web site out-of-the-box? This is a huge barrier for beginners, and right n
Gary Edwards

Sun pitches new cloud as 'Open Platform' * - 0 views

  •  
    Sun takes on the problem of interoperability and portability of applications in a world where there will be many many clouds. At the roll out of the Sun Cloud, key executives explain Sun's implementation of Open Cloud API's and what they see as a pressing need for management tools that will allow some standardization across clouds.

    Sun's Open Cloud API plan is a clean reuse of existing Open Web API's.

    "..... The underpinning of the Open Cloud Platform that Sun will be pitching to developers is a set of cloud APIs, the creation of which is focused under Project Kenai and which has been released under a Community Commons open source license. Sun wants lots of feedback on the APIs and wants these APIs to become a standard too, hence the open license. These APIs describes how virtual elements in a cloud are created, started, stopped, and hibernated using HTTP commands such as GET, PUT, and POST...."

    "...... The upshot is that these APIs will allow programmatic access to virtual infrastructure from Java, PHP, Python, and Ruby and that means system admins can script how virtual resources are deployed. The APIs, as co-creator Tim Bray explains in his blog, are written in JavaScript Object Notation (JSON), not XML. The Q-Layer software is a graphical representation of what is going on down in the APIs, and you can moving virtual resources into the cloud with a click of a mouse using the dashboard or programmatically using the APIs from those four programming languages listed above. (PHP support is not yet available, but will be)....."
  •  
    I can see why Sun picked those four languages first. Can I assume that with a bit of work, this API will be usable from any language with a C "foreign function interface", such as Perl, Common Lisp, Bourne shell, Squeak Smalltalk, and others that your server application might be written in?
  •  
    I read a comment that largely answers my question at http://www.tbray.org/ongoing/When/200x/2009/03/16/Sun-Cloud : "So right now JSON out of a shell tool is not so good. More things like this will create pressure for development of tools to change that, but years of widespread XML/HTML deployment have only produced a few oddly maintained tools. Perhaps that's because you can scrape quite a bit of the web with a couple sed passes, and if I were to have to deal with the mentioned tools, that's probably the route I'd take." (seth w. klein) In other words, _anything_ that can talk text over HTTP can do this with a bit of work, but an object-oriented language is likely to be more at home with JSON (JavaScript Object Notation).
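    For a concrete feel of the API style the excerpt describes (virtual elements managed with GET, PUT, and POST plus JSON payloads), here is a hedged TypeScript sketch. The host, paths, and field names are invented; the actual Project Kenai specification defined its own resource model:

```typescript
// Hypothetical client for a Sun-Cloud-style API: a virtual machine is a
// resource driven by plain HTTP verbs and JSON bodies. All URLs and field
// names are invented for illustration.
const BASE = "https://cloud.example.com/api";

// Inspect a virtual machine's current state with GET.
async function getVm(id: string): Promise<{ id: string; state: string }> {
  const res = await fetch(`${BASE}/vms/${id}`);
  return res.json();
}

// Start, stop, or hibernate it by POSTing the desired state.
async function setVmState(id: string, state: "started" | "stopped" | "hibernated"): Promise<void> {
  await fetch(`${BASE}/vms/${id}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ state }),
  });
}

// The scripted deployment the article has in mind: plain code, no dashboard.
await setVmState("web-01", "started");
console.log(await getVm("web-01")); // { id: "web-01", state: "started" }
```

    Because everything is plain HTTP and JSON, the same calls work from Java, PHP, Python, Ruby, or, as klein's comment suggests, anything that can talk text over HTTP.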
Gary Edwards

Adamac Attack!: Evolution Revolution - 0 views

  • HTTP as a universal calling convention is pretty interesting. We already have tons of web services in the cloud using HTTP to communicate with one another - why not extend this to include local code talking with other components? The iPhone already supports a form of this IPC using URL handlers, basically turning your application into a web server. BugLabs exposes interfaces to its various embedded device modules through web services. It has even been suggested in the literature that every object could embed a web server. Why not use this mechanism for calling that object's methods? (A sketch after this list makes the idea concrete.)
  •  
    Given the increasing number of platforms supporting Javascript + HTTP + HTML5, it's not inconceivable that "write-once, run anywhere" might come closer to fruition with this combo than Java ever achieved. Here's how this architecture plays out in my mind. Javascript is the core programming language. Using a HTTP transport and JSON data format, components in different processes can perform RPCs to one another. HTML5 features like local storage and the application cache allow for an offline story (the latest build of Safari on iPhone supports this). And of course, HTML + CSS allows for a common UI platform.
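    A minimal sketch of the combination both annotations describe, assuming Node.js 18+ so that fetch is built in: an ordinary object exposes a method through an embedded HTTP server, and another component calls it as a JSON-over-HTTP RPC. The object, route, and port are illustrative; in a browser, HTML5 localStorage and the application cache would supply the offline story.

```typescript
// Sketch (Node.js 18+, names illustrative): an object's method exposed over
// HTTP and called by another component as a JSON RPC.
import http from "node:http";

const thermostat = {
  temperature: 21,
  setTemperature(value: number) { this.temperature = value; return this.temperature; },
};

// "Every object could embed a web server": route a URL to the object's method.
const server = http.createServer((req, res) => {
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    if (req.url === "/setTemperature") {
      const { value } = JSON.parse(body);
      res.end(JSON.stringify({ temperature: thermostat.setTemperature(value) }));
    } else {
      res.statusCode = 404;
      res.end();
    }
  });
});

// Once the server is up, another component performs the RPC with HTTP + JSON.
server.listen(8080, async () => {
  const res = await fetch("http://localhost:8080/setTemperature", {
    method: "POST",
    body: JSON.stringify({ value: 23 }),
  });
  console.log(await res.json()); // { temperature: 23 }
  server.close();
});
```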
Gary Edwards

Microsoft Office whips Google Docs: It's finally game over | Computerworld Blogs - 0 views

  •  
    "If there was ever any doubt about whether Microsoft or Google would win the war of office suites, there should be no longer. Within the last several weeks, Microsoft has pulled so far ahead that it's game over. Here's why. When it comes to which suite is more fully featured, there's never been any real debate: Microsoft Office wins hands down. Whether you're creating entire presentations, creating complicated word-processing documents, or even doing something as simple as handling text attributes, Office is a far better tool. Until the last few weeks, Google Docs had one significant advantage over Microsoft Office: It's available for Android and the iPad as well as PCs because it's Web-based. The same wasn't the case for Office. So if you wanted to use an office suite on all your mobile devices, Google Docs was the way to go. Google Docs lost that advantage when Microsoft released Office for the iPad. There's not yet a native version for Android tablets, but Microsoft is working on that, telling GeekWire, "Let me tell you conclusively: Yes, we are also building Android native applications for tablets for Word, Excel and PowerPoint." Google Docs is still superior to Office's Web-based version, but that's far less important than it used to be. There's no need to go with a Web-based office suite if a superior suite is available as a native apps on all platforms, mobile or otherwise. And Office's collaboration capabilities are quite considerable now. Of course, there's always the question of price. Google Docs is free. Microsoft Office isn't. But at $100 a year for up to five devices, or $70 a year for two, no one will be going broke paying for Microsoft Office. It's worth paying that relatively small price for a much better office suite. Google Docs won't die. It'll be around as second fiddle for a long time. But that's what it will always remain: a second fiddle to the better Microsoft Office."
  •  
    Google acquired "Writely", a small company in Portola Valley that pioneered document editing in a browser. Writely was perhaps the first cloud computing editor to go beyond simple HTML, eventually crafting some really cool CSS-JavaScript-JSON document layout and editing methods. But it can't edit native MSOffice documents. It converts them.

    There are more than a few problems with the Google Docs approach to editing advanced "compound" documents, but two stick out and are certain to give pause to anyone making the great transition from local workgroup computing to highly mobile, always-connected cloud computing.

    The first problem, certain to become a show stopper, is that Google converts documents to their native on-line format for editing and collaboration. And then they convert back. To many this isn't a problem. But if the document is part of a workflow or business process, conversion is a killer. There is an old saw affectionately known as "Reuters Law", dating back to the ODF-OXML document wars, that emphatically states: "Conversion breaks documents." The breakage includes both the visual layout of the document and the "compound" aspects and data connections that are internal to the document.

    Think of it this way. A business document that is part of a legacy Windows Workgroup workflow is opened up in gDocs. Google converts the document for editing purposes. The data and the workflow internals that bind the document to the local business system are broken on conversion. The look of the document is also visually shredded as the gDocs layout engine is applied. For all practical purposes, no matter what magic editing and collaboration value is added, a broken document means a broken business process.

    Let me say that again, with the emphasis of having witnessed this first hand during the year-long ODF transition trials the Commonwealth of Massachusetts conducted in 2005 and 2006. The business process broke every time a conversion was conducted "on a busines
Gary Edwards

A Simpler Approach To SOA -- Web-Oriented Architecture -- InformationWeek - 1 views

  •  
    "Web-oriented architectures are easier to implement and offer a similar flexibility to SOA." Lengthy discussion concerning the WOA approach for the quick implementation of Web Application and Data bound services. Think the Comcast OpenSTack model :)
Paul Merrell

Will Language Overload Force Open Enterprises? - 0 views

  • "XML doesn't have the monopoly it used to have," Bray said. "It used to be that if you wanted to send messages back and forth across the wire, XML was the only game in town." "Clearly, there is going to more heterogeneity for wire formats, too," he added. "Increasingly, given that the future will have more than two programming languages, there will be a lot more messages being passed around."
  • Bray comes to the issue from a unique perspective. As a co-author of the original XML specifications, he helped create a language that at one point had been intended to become the lingua franca for all Web communications. He now admits that he wasn't entirely accurate in his original vision of where XML would end up.
  • "I was so completely wrong on everything," Bray admitted. "We thought we were going to replace HTML and that turned out to be a silly idea. It turned out be used for syndication feeds and purchase orders and a million other things, and that's fine. Things find their own level." However, in today's world of language proliferation, Bray has another favorite approach for exchanging information. "I am a strong partisan of the REST (define) approach, which provides a Web-based approach for integration of anything to anything," Bray said. "REST isn't tied to XML or JSON but it makes it easy to use either."
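  •  
    To make Bray's last point concrete: a REST resource is format-neutral, so the same URL can answer in JSON or XML depending on ordinary HTTP content negotiation. A small sketch with a hypothetical endpoint:

```typescript
// Sketch: one REST resource, two wire formats, selected via the Accept header.
// The URL and sample responses are hypothetical; the mechanism is standard
// HTTP content negotiation.
async function fetchOrder(format: "application/json" | "application/xml"): Promise<string> {
  const res = await fetch("https://api.example.com/orders/1001", {
    headers: { Accept: format },
  });
  return res.text();
}

console.log(await fetchOrder("application/json")); // {"id":1001,"status":"shipped"}
console.log(await fetchOrder("application/xml"));  // <order><id>1001</id>...</order>
```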
Paul Merrell

NSA contractors use LinkedIn profiles to cash in on national security | Al Jazeera America - 0 views

  • NSA spies need jobs, too. And that is why many covert programs could be hiding in plain sight. Job websites such as LinkedIn and Indeed.com contain hundreds of profiles that reference classified NSA efforts, posted by everyone from career government employees to low-level IT workers who served in Iraq or Afghanistan. They offer a rare glimpse into the intelligence community's projects and how they operate. Now some researchers are using the same kinds of big-data tools employed by the NSA to scrape public LinkedIn profiles for classified programs. But the presence of so much classified information in public view raises serious concerns about security — and about the intelligence industry as a whole. “I’ve spent the past couple of years searching LinkedIn profiles for NSA programs,” said Christopher Soghoian, the principal technologist with the American Civil Liberties Union’s Speech, Privacy and Technology Project.
  • On Aug. 3, The Wall Street Journal published a story about the FBI’s growing use of hacking to monitor suspects, based on information Soghoian provided. The next day, Soghoian spoke at the Defcon hacking conference about how he uncovered the existence of the FBI’s hacking team, known as the Remote Operations Unit (ROU), using the LinkedIn profiles of two employees at James Bimen Associates, with which the FBI contracts for hacking operations. “Had it not been for the sloppy actions of a few contractors updating their LinkedIn profiles, we would have never known about this,” Soghoian said in his Defcon talk. Those two contractors were not the only ones being sloppy.
  • And there are many more. A quick search of Indeed.com using three code names unlikely to return false positives — Dishfire, XKeyscore and Pinwale — turned up 323 résumés. The same search on LinkedIn turned up 48 profiles mentioning Dishfire, 18 mentioning XKeyscore and 74 mentioning Pinwale. Almost all these people appear to work in the intelligence industry.

    Network-mapping the data

    Fabio Pietrosanti of the Hermes Center for Transparency and Digital Human Rights noticed all the code names on LinkedIn last December. While sitting with M.C. McGrath at the Chaos Communication Congress in Hamburg, Germany, Pietrosanti began searching the website for classified program names — and getting serious results. McGrath was already developing Transparency Toolkit, a Web application for investigative research, and knew he could improve on Pietrosanti’s off-the-cuff methods.
  • “I was, like, huh, maybe there’s more we can do with this — actually get a list of all these profiles that have these results and use that to analyze the structure of which companies are helping with which programs, which people are helping with which programs, try to figure out in what capacity, and learn more about things that we might not know about,” McGrath said.

    He set up a computer program called a scraper to search LinkedIn for public profiles that mention known NSA programs, contractors or jargon — such as SIGINT, the agency’s term for “signals intelligence” gleaned from intercepted communications. Once the scraper found the name of an NSA program, it searched nearby for other words in all caps. That allowed McGrath to find the names of unknown programs, too. Once McGrath had the raw data — thousands of profiles in all, with 70 to 80 different program names — he created a network graph that showed the relationships between specific government agencies, contractors and intelligence programs. Of course, the data are limited to what people are posting on their LinkedIn profiles. Still, the network graph gives a sense of which contractors work on several NSA programs, which ones work on just one or two, and even which programs military units in Iraq and Afghanistan are using. And that is just the beginning. (A sketch of the all-caps heuristic follows the annotations below.)
  • The article links to an interactive network illustration showing the relationships between specific national security surveillance programs (in red) and government organizations or private contractors (in blue).
  •  
    What a giggle, public spying on NSA and its contractors using Big Data. The interactive network graph with its sidebar display of relevant data derived from LinkedIn profiles is just too delightful. 
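    A hedged TypeScript sketch of the all-caps heuristic the article describes. The window size, token pattern, and seed list are guesses for illustration; this is not McGrath's actual Transparency Toolkit code:

```typescript
// Sketch of the described heuristic (parameters are guesses): around each
// occurrence of a known program name, collect other all-caps tokens as
// candidate code names.
const KNOWN_PROGRAMS = ["DISHFIRE", "XKEYSCORE", "PINWALE"];

function candidateCodeNames(profileText: string, windowChars = 200): Set<string> {
  const candidates = new Set<string>();
  const upper = profileText.toUpperCase();
  for (const program of KNOWN_PROGRAMS) {
    let at = upper.indexOf(program);
    while (at !== -1) {
      // Look at the text surrounding this occurrence of a known program name...
      const nearby = profileText.slice(Math.max(0, at - windowChars), at + windowChars);
      // ...and keep other all-caps words of four or more letters found there.
      for (const word of nearby.match(/\b[A-Z]{4,}\b/g) ?? []) {
        if (!KNOWN_PROGRAMS.includes(word)) candidates.add(word);
      }
      at = upper.indexOf(program, at + 1);
    }
  }
  return candidates;
}

console.log(candidateCodeNames("Analyst on DISHFIRE and NEWPROGRAM; SIGINT reporting."));
// -> Set { "NEWPROGRAM", "SIGINT" }
```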
Paul Merrell

Trump administration pulls back curtain on secretive cybersecurity process - The Washin... - 0 views

  • The White House on Wednesday made public for the first time the rules by which the government decides to disclose or keep secret software flaws that can be turned into cyberweapons — whether by U.S. agencies hacking for foreign intelligence, money-hungry criminals or foreign spies seeking to penetrate American computers. The move to publish an unclassified charter responds to years of criticism that the process was unnecessarily opaque, fueling suspicion that it cloaked a stockpile of software flaws that the National Security Agency was hoarding to go after foreign targets but that put Americans’ cybersecurity at risk.
  • The rules are part of the “Vulnerabilities Equities Process,” which the Obama administration revamped in 2014 as a multiagency forum to debate whether and when to inform companies such as Microsoft and Juniper that the government has discovered or bought a software flaw that, if weaponized, could affect the security of their product. The Trump administration has mostly not altered the rules under which the government reaches a decision but is disclosing its process.

    Under the VEP, an “equities review board” of at least a dozen national security and civilian agencies will meet monthly — or more often, if a need arises — to discuss newly discovered vulnerabilities. Besides the NSA, the CIA and the FBI, the list includes the Treasury, Commerce and State departments, and the Office of Management and Budget. The priority is on disclosure, the policy states, to protect core Internet systems, the U.S. economy and critical infrastructure, unless there is “a demonstrable, overriding interest” in using the flaw for intelligence or law enforcement purposes.

    The government has long said that it discloses the vast majority — more than 90 percent — of the vulnerabilities it discovers or buys in products from defense contractors or other sellers. In recent years, that has amounted to more than 100 a year, according to people familiar with the process. But because the process was classified, the National Security Council, which runs the discussion, was never able to reveal any numbers. Now, [White House cybersecurity coordinator Rob] Joyce said, the number of flaws disclosed and the number retained will be made public in an annual report. A classified version will be sent to Congress, he said.