The Future of Collaborative Networks : Aaron Fulkerson of MindTouch - 0 views
-
MindTouch was far and away the hottest property at the 2009 Web 2.0 Conference. And for good reason. They have figured out how to tap into the productivity value of enterprise collaborative networks. Most of their underlying stuff is based on REST-based data objects and services, but they also allow for proprietary data bindings. The key to MindTouch seemed to be the easy-to-fall-into-and-use collaborative interface: imagine a workgroup project centered around a Web page filled with data objects, graphics and content, with each object also having a collaborative conversation attached to it. Sounds complicated, but that's where the magic of MindTouch kicks in. It's simple. One of the things that most impressed me was an interactive graph placed on one of the wiki project pages. The graph was being fed data from a local Excel spreadsheet and could be interacted with in real time. It was simple to change from a pie chart to a bar graph and so on. It was also possible to interact with the data itself and create what-if scenarios. Great stuff. With considerable persistence, though, I was able to discover from Aaron that this interactivity and graphical richness was due to a Silverlight plug-in! From the article: "..... Rather than focusing on socialization, one-to-one interactions and individual enrichment, businesses must be concerned with creating an information fabric within their organizations. This information fabric is a federation of content from the multiplicity of data and application silos utilized on a daily basis; such as, ERP, CRM, file servers, email, databases, web-services infrastructures, etc. When you make this information fabric easy to edit between groups of individuals in a dynamic, secure, governed and real-time manner, it creates a Collaborative Network." "This is very different from social networks or social software, which is focused entirely on enabling conversations. Collaborative Networks are focused on groups accessing and organiz
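The object-plus-conversation model described above can be sketched in a few lines. The payload and element names below are purely illustrative (this is not MindTouch's actual REST schema); the point is simply that each data object on the page carries its own comment thread:

```python
import xml.etree.ElementTree as ET

# Hypothetical REST payload: one data object (a graph) with a
# collaborative conversation attached to it. Names are invented
# for illustration, not taken from MindTouch's API.
PAYLOAD = """
<object id="chart-1" type="graph">
  <title>Q3 Sales</title>
  <comments>
    <comment author="alice">Switch this to a bar graph?</comment>
    <comment author="bob">Done - see revision 4.</comment>
  </comments>
</object>
"""

def attached_conversation(xml_text):
    """Return (object title, list of (author, text)) for one data object."""
    root = ET.fromstring(xml_text)
    title = root.findtext("title")
    thread = [(c.get("author"), c.text) for c in root.iter("comment")]
    return title, thread

title, thread = attached_conversation(PAYLOAD)
print(title)        # Q3 Sales
print(len(thread))  # 2
```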
Archiveteam - 0 views
-
HISTORY IS OUR FUTURE And we've been trashing our history Archive Team is a loose collective of rogue archivists, programmers, writers and loudmouths dedicated to saving our digital heritage. Since 2009 this variant force of nature has caught wind of shutdowns, shutoffs, mergers, and plain old deletions - and done our best to save the history before it's lost forever. Along the way, we've gotten attention, resistance, press and discussion, but most importantly, we've gotten the message out: IT DOESN'T HAVE TO BE THIS WAY. This website is intended to be an offloading point and information depot for a number of archiving projects, all related to saving websites or data that is in danger of being lost. Besides serving as a hub for team-based pulling down and mirroring of data, this site will provide advice on managing your own data and rescuing it from the brink of destruction. Currently Active Projects (Get Involved Here!) Archive Team recruiting Want to code for Archive Team? Here's a starting point.
-
Who We Are and how you can join our cause! Deathwatch is where we keep track of sites that are sickly, dying or dead. Fire Drill is where we keep track of sites that seem fine but a lot depends on them. Projects is a comprehensive list of AT endeavors. Philosophy describes the ideas underpinning our work. Some Starting Points The Introduction is an overview of basic archiving methods. Why Back Up? Because they don't care about you. Back Up your Facebook Data Learn how to liberate your personal data from Facebook. Software will assist you in regaining control of your data by providing tools for information backup, archiving and distribution. Formats will familiarise you with the various data formats, and how to ensure your files will be readable in the future. Storage Media is about where to get it, what to get, and how to use it. Recommended Reading links to other sites for further information. Frequently Asked Questions is where we answer common questions.
-
The Archive Team Warrior is a virtual archiving appliance. You can run it to help with the ArchiveTeam archiving efforts. It will download sites and upload them to our archive - and it's really easy to do! The warrior is a virtual machine, so there is no risk to your computer. The warrior will only use your bandwidth and some of your disk space. It will get tasks from and report progress to the Tracker. Basic usage The warrior runs on Windows, OS X and Linux using a virtual machine. You'll need one of: VirtualBox (recommended) VMware Workstation/Player (free-gratis for personal use) See below for alternative virtual machines. Archive Team partners with and contributes lots of archives to the Wayback Machine. Here's how you can help by contributing some bandwidth if you run an always-on box with an internet connection.
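The core of any such mirroring job is walking a fetched page for links to download next. A minimal sketch of that step using only the Python standard library - this is an illustration of the idea, not the Warrior's actual code:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from a page - the first step any
    site-mirroring job has to perform before queueing downloads."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<html><body><a href="/about">About</a> <a href="data.zip">Data</a></body></html>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/about', 'data.zip']
```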
He Was a Hacker for the NSA and He Was Willing to Talk. I Was Willing to Listen. - 0 views
-
The message arrived at night and consisted of three words: “Good evening sir!” The sender was a hacker who had written a series of provocative memos at the National Security Agency. His secret memos had explained — with an earthy use of slang and emojis that was unusual for an operative of the largest eavesdropping organization in the world — how the NSA breaks into the digital accounts of people who manage computer networks, and how it tries to unmask people who use Tor to browse the web anonymously. Outlining some of the NSA’s most sensitive activities, the memos were leaked by Edward Snowden, and I had written about a few of them for The Intercept. There is no Miss Manners for exchanging pleasantries with a man the government has trained to be the digital equivalent of a Navy SEAL. Though I had initiated the contact, I was wary of how he might respond. The hacker had publicly expressed a visceral dislike for Snowden and had accused The Intercept of jeopardizing lives by publishing classified information. One of his memos outlined the ways the NSA reroutes (or “shapes”) the internet traffic of entire countries, and another memo was titled “I Hunt Sysadmins.” I felt sure he could hack anyone’s computer, including mine.
-
I got lucky with the hacker, because he recently left the agency for the cybersecurity industry; it would be his choice to talk, not the NSA’s. Fortunately, speaking out is his second nature.
Obama to propose legislation to protect firms that share cyberthreat data - The Washing... - 0 views
-
President Obama plans to announce legislation Tuesday that would shield companies from lawsuits for sharing computer threat data with the government in an effort to prevent cyberattacks. On the heels of a destructive attack at Sony Pictures Entertainment and major breaches at JPMorgan Chase and retail chains, Obama is intent on capitalizing on the heightened sense of urgency to improve the security of the nation’s networks, officials said. “He’s been doing everything he can within his executive authority to move the ball on this,” said a senior administration official who spoke on the condition of anonymity to discuss legislation that has not yet been released. “We’ve got to get something in place that allows both industry and government to work more closely together.”
-
But in a provision likely to raise concerns from privacy advocates, the administration wants to require DHS to share that information “in as near real time as possible” with other government agencies that have a cybersecurity mission, the official said. Those include the National Security Agency, the Pentagon’s Cyber Command, the FBI and the Secret Service. “DHS needs to take an active lead role in ensuring that unnecessary personal information is not shared with intelligence authorities,” Jaycox said. The debates over government surveillance prompted by disclosures from former NSA contractor Edward Snowden have shown that “the agencies already have a tremendous amount of unnecessary information,” he said.
-
“We think the current information-sharing regime is adequate,” said Mark Jaycox, legislative analyst at the Electronic Frontier Foundation, a privacy group. “More companies need to use it, but the idea of broad legal immunity isn’t needed right now.” The administration official disagreed. The lack of such immunity is what prevents many companies from greater sharing of data with the government, the official said. “We have heard that time and time again,” the official said. The proposal, which builds on a 2011 administration bill, grants liability protection to companies that provide indicators of cyberattacks and threats to the Department of Homeland Security.
How Edward Snowden Changed Everything | The Nation - 0 views
-
Ben Wizner, who is perhaps best known as Edward Snowden’s lawyer, directs the American Civil Liberties Union’s Speech, Privacy & Technology Project. Wizner, who joined the ACLU in August 2001, one month before the 9/11 attacks, has been a force in the legal battles against torture, watch lists, and extraordinary rendition since the beginning of the global “war on terror.” On October 15, we met with Wizner in an upstate New York pub to discuss the state of privacy advocacy today. In sometimes sardonic tones, he talked about the transition from litigating on issues of torture to privacy advocacy, differences between corporate and state-sponsored surveillance, recent developments in state legislatures and the federal government, and some of the obstacles impeding civil liberties litigation. The interview has been edited and abridged for publication.
-
Many of the technologies, both military technologies and surveillance technologies, that are developed for purposes of policing the empire find their way back home and get repurposed. You saw this in Ferguson, where we had military equipment in the streets to police nonviolent civil unrest, and we’re seeing this with surveillance technologies, where things that are deployed for use in war zones are now commonly in the arsenals of local police departments. For example, a cellphone surveillance tool that we call the StingRay—which mimics a cellphone tower and communicates with all the phones around—was really developed as a military technology to help identify targets. Now, because it’s so inexpensive, and because there is a surplus of these things that are being developed, it ends up getting pushed down into local communities without local democratic consent or control.
-
A must-read. Ben Wizner discusses the current climate in the courts in government surveillance cases and how Edward Snowden's disclosures have affected that, and much more. Wizner is not only Edward Snowden's lawyer, he is also the coordinator of all ACLU litigation on electronic surveillance matters.
Discoverer of JSON Recommends Suspension of HTML5 | Web Security Journal - 0 views
-
Fascinating conversation between Douglas Crockford and Jeremy Geelan. The issue is XSS - the cross-site scripting capabilities of HTML - and "the painful gap" in the HTML5 specification of the interface between JavaScript and the browser. I had to use the Evernote Clearly Chrome extension to read this page. Microsoft is running a huge JavaScript advertisement/pointer that totally blocks the page with no way of closing or escaping. Incredible. Clearly was able to knock it out though. Nicely done! The HTML5-XSS problem is very important, especially if you're someone like me who sees the HTML+ format (HTML5-CSS3-JSON-JavaScript-SVG/Canvas) as the undisputed Cloud Productivity Platform "compound document" model. The XSS discussion goes right to the heart of the matter of creating an HTML compound document in much the same way that an MSOffice productivity compound document worked. XSS mimics the functionality of embedded compound document components such as OLE, DDE, ODBC and Scripting. Crack open any client/server business document and it will be found to be loaded with these embedded components. It seems to me that any one of the Cloud Productivity Platform contenders could solve the HTML-XSS problem. I'm thinking Google Apps, Zoho, SalesForce.com, RackSpace and Amazon - with gApps and Zoho clearly leading the charge. Also let me add that RSS and XMPP (Jabber), while not normally mentioned with JSON, ought to be considered. Twitter uses RSS to transport and connect data. Jabber is of course a long-time favorite of mine. excerpt: The fundamental mistake in HTML5 was one of prioritization. It should have tackled the browser's most important problem first. Once the platform was secured, then shiny new features could be carefully added. There is much that is attractive about HTML5. But ultimately the thing that made the browser into a credible application delivery system was JavaScript, the ultimate workaround tool. There is a painful gap
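The XSS risk under discussion comes down to untrusted text being spliced into a page as live markup. A minimal Python sketch of the bug and the standard mitigation (the function names here are mine, for illustration only):

```python
import html

def render_comment_unsafe(comment):
    # Vulnerable: user input is spliced straight into the markup,
    # so a comment containing <script> would execute in the
    # victim's browser.
    return "<div class='comment'>%s</div>" % comment

def render_comment_safe(comment):
    # Escaping turns markup characters into entities, so the
    # payload is displayed as inert text instead of executed.
    return "<div class='comment'>%s</div>" % html.escape(comment)

payload = "<script>steal(document.cookie)</script>"
print(render_comment_unsafe(payload))  # script tag survives intact
print(render_comment_safe(payload))    # &lt;script&gt;... rendered harmless
```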
Operation Socialist: How GCHQ Spies Hacked Belgium's Largest Telco - 0 views
-
When the incoming emails stopped arriving, it seemed innocuous at first. But it would eventually become clear that this was no routine technical problem. Inside a row of gray office buildings in Brussels, a major hacking attack was in progress. And the perpetrators were British government spies. It was in the summer of 2012 that the anomalies were initially detected by employees at Belgium’s largest telecommunications provider, Belgacom. But it wasn’t until a year later, in June 2013, that the company’s security experts were able to figure out what was going on. The computer systems of Belgacom had been infected with a highly sophisticated malware, and it was disguising itself as legitimate Microsoft software while quietly stealing data. Last year, documents from National Security Agency whistleblower Edward Snowden confirmed that British surveillance agency Government Communications Headquarters was behind the attack, codenamed Operation Socialist. And in November, The Intercept revealed that the malware found on Belgacom’s systems was one of the most advanced spy tools ever identified by security researchers, who named it “Regin.”
-
The full story about GCHQ’s infiltration of Belgacom, however, has never been told. Key details about the attack have remained shrouded in mystery—and the scope of the attack unclear. Now, in partnership with Dutch and Belgian newspapers NRC Handelsblad and De Standaard, The Intercept has pieced together the first full reconstruction of events that took place before, during, and after the secret GCHQ hacking operation. Based on new documents from the Snowden archive and interviews with sources familiar with the malware investigation at Belgacom, The Intercept and its partners have established that the attack on Belgacom was more aggressive and far-reaching than previously thought. It occurred in stages between 2010 and 2011, each time penetrating deeper into Belgacom’s systems, eventually compromising the very core of the company’s networks.
Bulk Collection Under Section 215 Has Ended… What's Next? | Just Security - 0 views
-
The first (and thus far only) roll-back of post-9/11 surveillance authorities was implemented over the weekend: The National Security Agency shuttered its program for collecting and holding the metadata of Americans’ phone calls under Section 215 of the Patriot Act. While bulk collection under Section 215 has ended, the government can obtain access to this information under the procedures specified in the USA Freedom Act. Indeed, some experts have argued that the Agency likely has access to more metadata because its earlier dragnet didn’t cover cell phones or Internet calling. In addition, the metadata of calls made by an individual in the United States to someone overseas and vice versa can still be collected in bulk — this takes place abroad under Executive Order 12333. No doubt the NSA wishes that this was the end of the surveillance reform story, and the Paris attacks initially gave them an opening. John Brennan, the Director of the CIA, implied that the attacks were somehow related to “hand wringing” about spying, and Sen. Tom Cotton (R-Ark.) introduced a bill to delay the shutdown of the 215 program. Opponents of encryption were quick to say: “I told you so.”
-
But the facts that have emerged thus far tell a different story. It appears that much of the planning took place IRL (that’s “in real life” for those of you who don’t have teenagers). The attackers, several of whom were on law enforcement’s radar, communicated openly over the Internet. If France ever has a 9/11 Commission-type inquiry, it could well conclude that the Paris attacks were a failure of the intelligence agencies rather than a failure of intelligence authorities. Despite the passage of the USA Freedom Act, US surveillance authorities have remained largely intact. Section 702 of the FISA Amendments Act — which is the basis of programs like PRISM and the NSA’s Upstream collection of information from Internet cables — sunsets in the summer of 2017. While it’s difficult to predict the political environment that far out, meaningful reform of Section 702 faces significant obstacles. Unlike the Section 215 program, which was clearly aimed at Americans, Section 702 is supposedly targeted at foreigners and only picks up information about Americans “incidentally.” The NSA has refused to provide an estimate of how many Americans’ information it collects under Section 702, despite repeated requests from lawmakers and most recently a large cohort of advocates. The Section 215 program was held illegal by two federal courts (here and here), but civil attempts to challenge Section 702 have run into standing barriers. Finally, while two review panels concluded that the Section 215 program provided little counterterrorism benefit (here and here), they found that the Section 702 program had been useful.
-
There is, nonetheless, some pressure to narrow the reach of Section 702. The recent decision by the European Court of Justice in the safe harbor case suggests that data flows between Europe and the US may be restricted unless the PRISM program is modified to protect the information of Europeans (see here, here, and here for discussion of the decision and reform options). Pressure from Internet companies whose business is suffering — estimates run to the tune of $35 billion to $180 billion — as a result of disclosures about NSA spying may also nudge lawmakers towards reform. One of the courts currently considering criminal cases which rely on evidence derived from Section 702 surveillance may hold the program unconstitutional either on the basis of the Fourth Amendment or Article III for the reasons set out in this Brennan Center report. A federal district court in Colorado recently rejected such a challenge, although as explained in Steve’s post, the decision did not seriously explore the issues. Further litigation in the European courts too could have an impact on the debate.
In Hearing on Internet Surveillance, Nobody Knows How Many Americans Impacted in Data C... - 0 views
-
The Senate Judiciary Committee held an open hearing today on the FISA Amendments Act, the law that ostensibly authorizes the digital surveillance of hundreds of millions of people both in the United States and around the world. Section 702 of the law, scheduled to expire next year, is designed to allow U.S. intelligence services to collect signals intelligence on foreign targets related to our national security interests. However—thanks to the leaks of many whistleblowers including Edward Snowden, the work of investigative journalists, and statements by public officials—we now know that the FISA Amendments Act has been used to sweep up data on hundreds of millions of people who have no connection to a terrorist investigation, including countless Americans. What do we mean by “countless”? As became increasingly clear in the hearing today, the exact number of Americans impacted by this surveillance is unknown. Senator Franken asked the panel of witnesses, “Is it possible for the government to provide an exact count of how many United States persons have been swept up in Section 702 surveillance? And if not the exact count, then what about an estimate?”
-
Elizabeth Goitein, the Brennan Center director whose articulate and thought-provoking testimony was the highlight of the hearing, noted that at this time an exact number would be difficult to provide. However, she asserted that an estimate should be possible for most if not all of the government’s surveillance programs. None of the other panel participants—which included David Medine and Rachel Brand of the Privacy and Civil Liberties Oversight Board as well as Matthew Olsen of IronNet Cybersecurity and attorney Kenneth Wainstein—offered an estimate. Today’s hearing reaffirmed that it is not only the American people who are left in the dark about how many people or accounts are impacted by the NSA’s dragnet surveillance of the Internet. Even vital oversight committees in Congress like the Senate Judiciary Committee are left to speculate about just how far-reaching this surveillance is. It's part of the reason why we urged the House Judiciary Committee to demand that the Intelligence Community provide the public with a number.
-
The lack of information makes rigorous oversight of the programs all but impossible. As Senator Franken put it in the hearing today, “When the public lacks even a rough sense of the scope of the government’s surveillance program, they have no way of knowing if the government is striking the right balance, whether we are safeguarding our national security without trampling on our citizens’ fundamental privacy rights. But the public can’t know if we succeed in striking that balance if they don’t even have the most basic information about our major surveillance programs." Senator Patrick Leahy also questioned the panel about the “minimization procedures” associated with this type of surveillance, the privacy safeguard that is intended to ensure that irrelevant data and data on American citizens is swiftly deleted. Senator Leahy asked the panel: “Do you believe the current minimization procedures ensure that data about innocent Americans is deleted? Is that enough?” David Medine, who recently announced his pending retirement from the Privacy and Civil Liberties Oversight Board, answered unequivocally:
-
The 4th Amendment: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and *particularly describing the place to be searched, and the* persons or *things to be seized."* So much for the particularized description of the place to be searched and the things to be seized. Fah! Who needs a Constitution, anyway ....
Diary Of An x264 Developer » Flash, Google, VP8, and the future of internet v... - 0 views
-
In-depth technical discussion about Flash, HTML5, H.264, and Google's VP8. Excellent. Read the comments. Bottom line - Google has the juice to put Flash and H.264 in the dirt. The YouTube acquisition turns out to be very strategic. excerpt: The internet has been filled for quite some time with an enormous number of blog posts complaining about how Flash sucks - so much that it's sounding as if the entire internet is crying wolf. But, of course, despite the incessant complaining, they're right: Flash has terrible performance on anything other than Windows x86, and Adobe doesn't seem to care at all. But rather than repeat this ad nauseam, let's be a bit more intellectual and try to figure out what happened. Flash became popular because of its power and flexibility. At the time it was the only option for animated vector graphics and interactive content (stuff like VRML hardly counts). Furthermore, before Flash, the primary video options were Windows Media, Real, and Quicktime: all of which were proprietary, had no free software encoders or decoders, and (except for Windows Media) required the user to install a clunky external application, not merely a plugin. Given all this, it's clear why Flash won: it supported open multimedia formats like H.263 and MP3, used an ultra-simple container format that anyone could write (FLV), and worked far more easily and reliably than any alternative. Thus, Adobe (actually, at the time, Macromedia) got their 98% install base. And with that, they began to become complacent. Any suggestion of a competitor was immediately shrugged off; how could anyone possibly compete with Adobe, given their install base? It'd be insane, nobody would be able to do it. They committed the cardinal sin of software development: believing that a competitor being better is excusable. At x264, if we find a competitor that does something better, we immediately look into trying to put ourselves back on top. This is why
The Real Meaning Of Google Wave - Forbes.com - 0 views
-
Wave is a new way to build distributed applications, and it will open the door to an explosion of innovation.
-
So, if Wave is not just the demo application, what is it? Google Wave is a platform for creating distributed applications. Each Wave server can be involved in a number of conversations involving Wavelets, what most people would think of as a document. Wavelets are actually much more powerful and general because they are based on XML, which means you can have lots of depth of content, like headings and subheadings of a book, but on steroids. Adding a document repository to XMPP is just revolutionary.
-
The XMPP protocol manages the communication between the Wave servers so that all the Wavelets can synchronize as they are changed. Then Google finished the job by making Wavelets tag-able, searchable and versioned, so you can play back changes. But Google Wave goes beyond just managing the content--it also manages the programs that act on the content. At any level, a program can be assigned to a Wavelet to render it, that is, show it to a user and help manage the conversation. Google Wave also manages the distribution and management of these programs. The idea of a platform that combines management of the data and the code is really powerful.
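The tag-able, versioned, replayable wavelet idea can be modeled in miniature. This toy class is illustrative only - real Wave used XML documents and operational transforms synchronized over XMPP - but it shows the core notion of an append-only edit log from which any earlier version can be rebuilt by replaying a prefix:

```python
class Wavelet:
    """Toy model of a versioned, replayable document."""

    def __init__(self):
        self.ops = []  # append-only edit history

    def edit(self, author, text):
        """Record one edit; nothing is ever overwritten in place."""
        self.ops.append((author, text))

    def version(self):
        return len(self.ops)

    def playback(self, upto=None):
        """Reconstruct the content as of a given version number."""
        upto = self.version() if upto is None else upto
        return " ".join(text for _, text in self.ops[:upto])

w = Wavelet()
w.edit("alice", "Hello")
w.edit("bob", "world")
print(w.playback())   # Hello world
print(w.playback(1))  # Hello
```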
-
Good article. One of the first to go beyond the demo, recognizing that Wave is an application platform - a wrapper for the convergence of communications and content. Excerpt: What the Wave demo showed is support for a continuum from the shortest messages to longer and longer forms of content. All of it can be shared with precise control, tagged, searched. The version history is kept. No more mailing around a document. This takes the beauty of e-mail and wikis and extends it in a more flexible way to a much larger audience.
Needlebase - 2 views
-
Move over FlipBoard and QWiki and meet Needle. The emerging market space for automating the process of collecting Web information to analyse, re-purpose and re-publish is getting crowded. Needle is designed to: acquire data from multiple sources: a simple tagging process quickly imports structured data from complex websites, XML feeds, and spreadsheets into a unified database of your design; merge, deduplicate and cleanse: Needle uses intelligent semantics to help you find and merge variant forms of the same record, and your merges, edits and deletions persist even after the original data is refreshed from its source; build and publish custom data views: use Needle's visual UI and powerful query language to configure exactly your desired view of the data, whether as a list, table, grid, or map, then, with one click, publish the data for others to see, or export a feed of the clean data to your own local database. Flipboard is famous for its slick republishing / packaging process focused on iOS devices, and allows end users to choose sources. QWiki takes republishing to the extreme, blending voice-over (from Wikipedia text) with a slide show of multimedia information. The end user does not yet have control and selection of information sources with QWiki. The iOS Sports Illustrated app seems to be the starting point for "immersive webzines", with the NY Times close behind. Very, very slick packaging of basic Web information. Flipboard followed the iOS re-publishing wave with an end-user-facing immersive webzine packaging design. And now we have Needle. Still looking for a business document FlipBoard, where a "project" is packaged in a FlipBoard immersive container. The iPack would be similar to an iPUB book with the added featur
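Needle's merge-and-deduplicate step can be approximated with a simple normalization pass. The records and matching rule below are hypothetical and make no claim about Needle's actual "intelligent semantics":

```python
def normalize(name):
    """Crude canonical form: lowercase, drop punctuation, collapse spaces."""
    kept = "".join(c for c in name.lower() if c.isalnum() or c.isspace())
    return " ".join(kept.split())

def dedupe(records):
    """Merge variant forms of the same record, keeping the first form seen."""
    merged = {}
    for rec in records:
        merged.setdefault(normalize(rec["name"]), rec)
    return list(merged.values())

rows = [
    {"name": "Acme, Inc."},
    {"name": "ACME Inc"},   # variant form of the same record
    {"name": "Widget Co"},
]
print(len(dedupe(rows)))  # → 2
```

A production tool would also persist the merge decisions so they survive a refresh from the source, as the annotation notes Needle does.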
-
Note: On April 12th, 2011 Needle was acquired by Google.
Prepare to Hang Up the Phone, Forever - WSJ.com - 0 views
-
At decade's end, the trusty landline telephone could be nothing more than a memory. Telecom giants AT&T and Verizon Communications want to retire it.
-
The two providers want to lay the crumbling POTS to rest and replace it with Internet Protocol-based systems that use the same wired and wireless broadband networks that bring Web access, cable programming and, yes, even your telephone service, into your homes. You may think you have a traditional landline because your home phone plugs into a jack, but if you have bundled your phone with Internet and cable services, you're making calls over an IP network, not twisted copper wires. California, Florida, Texas, Georgia, North Carolina, Wisconsin and Ohio are among states that agree telecom resources would be better redirected into modern telephone technologies and innovations, and will kill copper-based technologies in the next three years or so. Kentucky and Colorado are weighing similar laws, which force people to go wireless whether they want to or not. In Mantoloking, N.J., Verizon wants to replace the landline system, which Hurricane Sandy wiped out, with its wireless Voice Link. That would make it the first entire town to go landline-less, a move that isn't sitting well with all residents.
-
Safety is one of them. Call 911 from a landline and the emergency operator pinpoints your exact address, down to the apartment number. Wireless phones lack those specifics, and even with GPS navigation aren't as precise. Matters are worse in rural and even suburban areas where signals don't reach, sometimes because they're blocked by buildings or the landscape. That's of concern to the Federal Communications Commission, which oversees all forms of U.S. communications services. Universal access is a tenet of its mission, and, despite the state-by-state degradation of the mandate, it's unwilling to let telecom companies simply drop geographically undesirable customers. Telecom firms need FCC approval to ax services completely, and can't do so unless there is a viable competitor to pick up the slack. Last year AT&T asked to turn off its legacy network, which could create gaps in universal coverage and would force people off the grid to find a wireless provider.
Electronic Imp: Former Apple, Google, Facebook engineers launch IoT startup - 2012-05-1... - 0 views
-
"We've put it in a user-installable module. The user buys the card and just plugs it into any device that has a slot," Fiennes explained. "All a developer needs to do is add a socket and a 3-pin Atmel ID chip to their product. That's 75 cents: 30 cents for the ID chip and 45 cents for the socket." This assumes the availability of 3.3 V. "But given that most things you want to control from the Internet are electrical, we think that's reasonable," he said. If not, developers can include a battery.
-
Fiennes demonstrated a power adaptor with an Imp socket. He installed a card and an appropriately labeled block appeared in a browser window. Fiennes plugged in a chain of decorative lights and we clicked on the box in our browser. After clicking, the box text went from "off" to "on." Over Skype, we could see the lights had come on. Fiennes emphasized that control need not be manual and could be linked to other Internet apps such as weather reports, or to Electric Imp sensor nodes that monitor conditions such as humidity. A second example is an Electric Imp enabled passive infrared sensor. Fiennes demonstrated how it could be programmed to report the time and date of detected motion to a client's Web pages on the Electric Imp server. In turn, those pages could be programmed to send an alarm to a mobile phone. The alarm could also be triggered if no motion was detected, allowing the sensor to serve as a monitor for the elderly in their homes, for example. If there is no activity before 9 a.m., a message is sent to a caregiver.
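The "no activity before 9 a.m." monitor described above reduces to a small rule. The function name and cutoff are hypothetical, not Electric Imp's API:

```python
from datetime import time

def should_alert(motion_times, cutoff=time(9, 0)):
    """Alert the caregiver if no motion was detected before the cutoff."""
    return not any(t < cutoff for t in motion_times)

print(should_alert([time(7, 30), time(10, 15)]))  # → False (normal morning)
print(should_alert([time(10, 15)]))               # → True  (no early activity)
```

On the real platform this logic would run server-side against the sensor's reported timestamps, with the alert delivered to a mobile phone.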
-
The final example is an Electric Imp washing machine. Machine operation can be made conditional on a number of variables, including the price of electricity. "Every washing machine has a microcontroller and that microcontroller has a lot of data," said Fiennes. "That data could be sent back to a washing machine service organization that could call the client up before the washing machine breaks down."
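The price-conditional operation described above is a one-line rule. The threshold and function name here are hypothetical:

```python
def ok_to_run(price_cents_per_kwh, threshold=12.0):
    """Run the wash cycle only while electricity is cheap enough (hypothetical threshold)."""
    return price_cents_per_kwh <= threshold

print(ok_to_run(9.5))   # → True  (off-peak pricing)
print(ok_to_run(18.0))  # → False (wait for cheaper power)
```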
-
Put Electric Imp at the top of the "Technologies to watch" list. Good stuff and great implementation - platform plan. Excerpt: "We've put it in a user-installable module. The user buys the card and just plugs it into any device that has a slot," Fiennes explained. "All a developer needs to do is add a socket and a 3-pin Atmel ID chip to their product. That's 75 cents: 30 cents for the ID chip and 45 cents for the socket." This assumes the availability of 3.3 V. "But given that most things you want to control from the Internet are electrical, we think that's reasonable," he said. If not, developers can include a battery. When the $25 card is installed in a slot and powered up, it will find the ID number and automatically transmit the information to Electric Imp's servers. Fiennes and his colleagues have written a virtual machine that runs under a proprietary embedded operating system on the node and looks for updates of itself on the Internet. SSL encryption is used for data security when transmitted over the link. ........
Government Market Drags Microsoft Deeper into the Cloud - 0 views
-
Nice article from Scott M. Fulton describing Microsoft's iron-fisted lock on government desktop productivity systems and the great transition to a Cloud Productivity Platform. Keep in mind that in 2005, Massachusetts tried to do the same thing with their SOA effort. Then-Governor Romney put over $1 million into a beta test that produced the now infamous 300-page report written by Sam Hiser. The details of this test resulted in the even more infamous da Vinci ODF plug-in for Microsoft Office desktops. The lessons of Massachusetts are simple enough; it's not the formats or office suite applications. It's the business process! Conversion of documents not only breaks the document. It also breaks the embedded "business process". The mystery here is that Microsoft owns the client side of client/server computing. Compound documents, loaded with intertwined OLE, ODBC, ActiveX, and other embedded protocols and interface dependencies connecting data sources with work flow, are the fuel of these client/server business productivity systems. Break a compound document and you break the business process. Even though Massachusetts workers were wonderfully enthusiastic and supportive of an SOA-based infrastructure that would include Linux servers and desktops as well as OSS productivity applications, at the end of the day it's all about getting the work done. Breaking the business process turned out to be a show stopper. Cloud Computing changes all that. The reason is that the Cloud is rapidly replacing client/server as the target architecture for new productivity developments; including data centers and transaction processing systems. There are many reasons for the great transition, but IMHO the most important is that the Web combines communications with content, data, and collaborative computing. Anyone who ever worked with the Microsoft desktop productivity environment knows that the desktop sucks as a communication device. There was
This 28-Year-Old's Startup Is Moving $350 Million And Wants To Completely Kill Credit C... - 0 views
-
The biggest difference between ideas like this and a PayPal — and PayPal is a phenomenal idea, Square is too — is that those are built on top of networks like Visa and MasterCard. We're building our own
-
Fascinating plan for totally disrupting the Banksters' Credit Card Golden Goose industry. Good explanation of how things work, and how Dwolla will disrupt things. PayPal and Square are built on the existing credit card transaction processing system; they make their money adding on to the basic credit card charge. Dwolla replaces the credit card processing system with a bank-direct model. Here's the thing: credit cards charge sellers 3% of the transaction. Dwolla charges a flat transaction fee of $0.25. Yes, 25 cents.
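The fee arithmetic makes the disruption concrete: at a flat $0.25, Dwolla undercuts a 3% card fee on any transaction above roughly $8.33. A quick sketch (the rates come from the interview; the function names are mine):

```python
def card_fee(amount, rate=0.03):
    """Typical merchant cost on a card network (3% of the transaction)."""
    return amount * rate

DWOLLA_FEE = 0.25  # flat, per transaction

for amount in (5, 100, 5000):
    winner = "Dwolla" if DWOLLA_FEE < card_fee(amount) else "card"
    print(f"${amount}: card fee ${card_fee(amount):.2f} vs $0.25 -> {winner} is cheaper")

# Break-even transaction size: 0.25 / 0.03
print(round(DWOLLA_FEE / 0.03, 2))  # → 8.33
```

The gap widens with transaction size: a $5,000 transfer costs $150 on a 3% network but still only 25 cents on Dwolla.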
-
-
All banks are connected by one ACH system. Credit card companies utilize that same system to pay off your credit card charges. Banks internally settle along that same system to move money within their own banks. This system in its own right is riddled with flaws: tons of fraud issues, waste and delays. If you've ever had a payment take a few days to clear, it's because they're waiting on that ACH system. We want to fix that system between the banks, take out the delays and make it instant. If we can create this ubiquitous cash layer of distribution between consumers and merchants and developers and financial institutions, that actually fixes the problem.
-
We don't believe in credit cards. We believe in authorization and in lower cost transfers. Our generation actually understands that when you buy sh*t, it comes out of your bank account and you have to pay for that.
-
Incredible interview with Ben Milne of Dwolla, the PayPal and Square killer that promises to take a huge chunk out of the Credit Card transaction industry. Incredible must read! This is page 2 out of four. Starts at: http://bit.ly/vzVUy3 excerpt: How does Dwolla work and how is it different from PayPal? With Dwolla, payments are made directly from your bank account. No credit or debit cards are allowed. And because they don't exist in the system, we don't have to bring the fees into the system. You can spend any amount of money and when you do that, the person on the other end doesn't have to pay 1, 2, 3 or 4%. They only pay $0.25 a transaction, which is especially helpful when it's $1,000, $2,000 or $5,000 transactions. Obviously PayPal becomes very cost prohibitive with those larger transactions. The biggest difference between ideas like this and a PayPal - and PayPal is a phenomenal idea, Square is too - is that those are built on top of networks like Visa and MasterCard. We're building our own.
DARPA seeks the Holy Grail of search engines - 0 views
-
The scientists at DARPA say the current methods of searching the Internet for all manner of information just won't cut it in the future. Today the agency announced a program that would aim to totally revamp Internet search and "revolutionize the discovery, organization and presentation of search results." Specifically, the goal of DARPA's Memex program is to develop software that will enable domain-specific indexing of public web content and domain-specific search capabilities. According to the agency the technologies developed in the program will also provide the mechanisms for content discovery, information extraction, information retrieval, user collaboration, and other areas needed to address distributed aggregation, analysis, and presentation of web content.
-
Memex also aims to produce search results that are more immediately useful to specific domains and tasks, and to improve the ability of military, government and commercial enterprises to find and organize mission-critical publicly available information on the Internet. "The current one-size-fits-all approach to indexing and search of web content limits use to the business case of web-scale commercial providers," the agency stated.
-
The Memex program will address the need to move beyond a largely manual process of searching for exact text in a centralized index, including overcoming shortcomings such as: limited scope and richness of indexed content, which may not include relevant components of the deep web such as temporary pages or pages behind forms; an impoverished index, which may not include shared content across pages, normalized content, automatic annotations, content aggregation or analysis; basic search interfaces, where every session is independent, there is no collaboration or history beyond the search term, and nearly exact text input is required; and standard practice for interacting with the majority of web content, which remains one-at-a-time manual queries that return federated lists of results. Memex would ultimately apply to any public domain content; initially, DARPA said it intends to develop Memex to address a key Defense Department mission: fighting human trafficking. Human trafficking is a factor in many types of military, law enforcement and intelligence investigations and has a significant web presence to attract customers. The use of forums, chats, advertisements, job postings, hidden services, etc., continues to enable a growing industry of modern slavery. An index curated for the counter-trafficking domain, along with configurable interfaces for search and analysis, would enable new opportunities to uncover and defeat trafficking enterprises.
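Domain-specific indexing of the kind described can be sketched as an inverted index restricted to a vocabulary of interest. The term list, page contents, and URLs below are illustrative, not DARPA's design:

```python
from collections import defaultdict

# Hypothetical vocabulary for a domain-curated index.
DOMAIN_TERMS = {"trafficking", "forum", "advertisement"}

def index_pages(pages):
    """Build an inverted index keeping only domain-relevant terms."""
    inverted = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            if word in DOMAIN_TERMS:
                inverted[word].add(url)
    return inverted

pages = {
    "a.example": "job advertisement posted to forum",
    "b.example": "cooking recipes and gardening tips",
}
idx = index_pages(pages)
print(sorted(idx["forum"]))  # → ['a.example']
```

Restricting the index to a curated vocabulary is what makes the results domain-specific: the second page never enters the index at all, unlike in a one-size-fits-all web index.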
-
DoD announces that they want to go beyond Google. Lots more detail in the proposal description linked from the article. Interesting tidbits: [i] the dark web is a specific target; [ii] they want the ability to crawl web pages blocked by robots.txt; [iii] they want to be able to search page source code and comments.
GSA picks Google Apps: What it means | ZDNet - 0 views
-
The General Services Administration made a bold decision to move its email and collaboration systems to the cloud. This is a huge win for cloud computing, but perhaps should have been expected since last week the Feds announced a new requisition and purchase mandate that cloud computing had to be the FIRST consideration for federal agency purchases. Note that the General Services Administration oversees requisitions and purchases for all Federal agencies! This is huge. Estimated to be worth $8 billion to cloud-computing providers. The cloud-computing market is estimated to be $30 billion, but Gartner did not anticipate or expect Federal agencies to embrace cloud computing, let alone issue a mandate for it. In the RFP issued last June, it was easy to see their goals in the statement of objectives: This Statement of Objectives (SOO) describes the goals that GSA expects to achieve with regard to the 1. modernization of its e-mail system; 2. provision of an effective collaborative working environment; 3. reduction of the government's in-house system maintenance burden by providing related business, technical, and management functions; and 4. application of appropriate security and privacy safeguards. GSA announced yesterday that they chose Google Apps for email and collaboration and Unisys as the implementation partner. So what does this mean? What it means (WIM) #1: GSA employees will be using a next-generation information workplace. And that means mobile, device-agnostic, and location-agile. Gmail on an iPad? No problem. Email from a home computer? Yep. For GSA and for every other agency and most companies, it's important to give employees the tools to be productive and engage from every location on every device. "Work becomes a thing you do and not a place you go." [Thanks to Earl Newsome of Estee Lauder for that quote.] WIM #2: GSA will save 50% of the cost of email over five years. This is also what our research on the cost of email o
"We could have built a social element into Mosaic. But back then the Internet was all about anonymity."
Anderson: Assuming you have enough bandwidth.
Andreessen: That's the very big if in this equation. If you have infinite network bandwidth, if you have an infinitely fast network, then this is what the technology wants. But we're not yet in a world of infinite speed, so that's why we have mobile apps and PC and Mac software on laptops and phones. That's why there are still Xbox games on discs. That's why everything isn't in the cloud. But eventually the technology wants it all to be up there.
Anderson: Back in 1995, Netscape began pursuing this vision by enabling the browser to do more.
Andreessen: We knew that you would need some pro