
Group items matching "Edge" in title, tags, annotations or url

Gary Edwards

Adobe Edge beta brings Flash-style design to HTML5 - 2 views

  •  
    While HTML5 developers are working directly with JavaScript, SVG, CSS, and other technologies, Flash designers enjoy a high-level environment with timelines, drawing tools, easy control of animation effects, and more. With Edge, released in beta Sunday, Adobe is striving to bring that same ease of use to HTML5 development. The user interface will be familiar to anyone who has used Flash or After Effects: a timeline that allows scrubbing and jumping to any point in an animation, properties panels to adjust objects, and a panel that shows the animation itself. Behind the scenes, Edge uses standard HTML5. Scripting is provided by a combination of jQuery and Adobe's own scripts, and animation and styling use both scripts and CSS. Pages produced by Edge encode the actual animations in a convenient JSON format. Edge itself embeds the WebKit rendering engine, the same one used in Apple's Safari browser and Google's Chrome, to display the animations.
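To make the "animations as data" idea concrete, here is a minimal, hypothetical sketch. Adobe has not published Edge's actual JSON schema here, and this sketch uses the standard Web Animations API rather than Edge's jQuery-based runtime, so the timeline format and element selector below are illustrative assumptions only.

```typescript
// Hypothetical timeline data of the kind a tool like Edge could serialize to
// JSON: each entry names an element and the keyframes it moves through.
// The schema is an assumption for illustration, not Adobe's actual format.
interface TimelineEntry {
  selector: string;      // element to animate (assumed id "#logo" below)
  keyframes: Keyframe[]; // plain CSS property snapshots
  durationMs: number;    // total length of the tween
}

const timeline: TimelineEntry[] = [
  {
    selector: "#logo",
    keyframes: [
      { transform: "translateX(0)", opacity: 0 },
      { transform: "translateX(120px)", opacity: 1 },
      { transform: "translateX(320px)", opacity: 1 },
    ],
    durationMs: 1500,
  },
];

// Because the timeline is ordinary data, it round-trips through JSON.
const serialized: string = JSON.stringify(timeline);
const restored: TimelineEntry[] = JSON.parse(serialized);

// Play back every entry with the browser's standard Web Animations API.
for (const entry of restored) {
  const el = document.querySelector<HTMLElement>(entry.selector);
  el?.animate(entry.keyframes, { duration: entry.durationMs, fill: "forwards" });
}
```

The design point is simply that a timeline authored in a visual tool can be stored as plain, JSON-serializable data and replayed with ordinary browser scripting, which matches the article's description of Edge's output.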
Gary Edwards

Does It Matter Who Wins the Browser Wars? Only if you care about the Future of the OpenWeb | BNET - 1 views

  •  
    The Future of the Open Web (extensive comment by ~ge~): You're right that the browser wars do not matter, except for one point of demarcation: browsers that support HTML+ versus browsers that support 1998-era HTML. Not all Web services and applications support HTML+, the rapidly advancing set of technologies that includes HTML5, CSS3, SVG/Canvas, and JavaScript (including the libraries and JSON). Microsoft has chosen to draw the Open Web line at what amounts to a 1998-2001 level of HTML/CSS. Above that line, it provisions a rich-client / rich-server Web model bound to the .NET-WPF platform, where C#, Silverlight, and XAML are very prominent. Noticeably, Open Web standards are for the most part replaced at this richer MSWeb level by proprietary technologies. Through its limited support for HTML/CSS, IE8 itself acts to dumb down the Open Web. The effect is that business systems and day-to-day workflow processes bound to the ubiquitous and very "rich" MSOffice Productivity Environment have little choice, when it comes to transitioning to the Web, but to stay on the Microsoft 2010 treadmill. Sure, at some point legacy business processes and systems will be rewritten for the Web. The question is, will it be the Open Web or the MS-Web? The Open Web standards are the dividing line between owning your information and content and having that content bound to a Web platform made up of proprietary Microsoft services, systems, and applications. Web designers and developers are still caught up in the browser wars. They worry incessantly about how to dumb down Web content and services to meet the limited functionality of IE. This sucks. So everyone continues to watch the "browser wars" stats. What they are really watching for, though, is that magic moment when "combined" HTML+ browser market share signals that they can start to implement highly graphical and collaboratively interactive HTML+-specific content. Meanwhile, the greater Web is a
Gary Edwards

Inbox Unchained: Mailbox just fixed email on the iPhone | The Verge - 0 views

  •  
    Good video demonstration of Mailbox, the new iOS app that will later be released for Android and the desktop. The founder of Mailbox explains what they are trying to do, in an excellent interview and discussion of mobile productivity. excerpt: "He asked himself, "What are people trying to do with email? What are the goals?" He started with Apple's Mail app for iPhone, which people were already familiar with, and injected elements of to-do apps he liked, since increasingly people are using their inboxes as to-do lists. The point was to create an experience that was distinctly mobile - an app that would let you take meaningful action while you're in line at Starbucks. Mailbox needed to intelligently display emails so you can parse and deal with them as quickly as possible. Most email apps require two or three taps to archive an email - perhaps the most common action you take on emails while you're mobile - but Mailbox only requires one: a swipe to the side. "Our biggest a-ha moment was when we realized that the primary use case of email on the phone is triage," Underwood says. Mailbox takes the reality of people using their inboxes as to-do lists and builds on what Mail and Sparrow did right (push notifications and nicely threaded messages, respectively). SNOOZING MESSAGES To conserve space, Mailbox turns email conversations into SMS-like bubbles, which lets you quickly fly through an entire email chain. Once you've read a message, it shrinks in size so skimming threads is a snap. "Email will feel more and more like chat, and we'll continue to iterate towards that," Underwood says. Mailbox introduces a few other gestures, such as a swipe to the left that lets you "snooze" a message to be reminded about later. You can choose between a few snooze options: Later Today, This Eveni
Gary Edwards

SMB cloud adoption begins to accelerate, study finds - 0 views

  •  
    Interesting chart describing the massive transition of small and medium sized businesses to the Cloud. Cloud-based email and messaging lead the way. The top two reasons for the great transition? Cost reduction and productivity improvement. Unfortunately this article fails to describe what this great transition to the Cloud means for legacy productivity systems - most of which are provided and provisioned by Microsoft. What happens to desktop and workgroup based business systems when the local data and transaction processing server systems are moved to the Cloud? How are desktop and workgroup systems rewritten or migrated? Another factor missing from this article is any discussion of what happens to productivity when communications, content and collaborative computing are interoperably entwined throughout the application layer. We know that the legacy Windows productivity platform seriously lacked communications capabilities, a fact that greatly reduced expected productivity gains. excerpt: Microsoft commissioned the study of 3,000 small and medium sized businesses in 13 countries. The survey was conducted by Wayland, Mass.-based research firm Edge Strategies. The most commonly used cloud services are email, instant messaging, voice communications, and backup. Edge also looked at SMB cloud plans over the next three years, and the same cloud services are in the IT plans of those embracing the cloud. From this data, it certainly could be argued that SMBs seem quick to embrace the cloud in order to enhance communication. It makes sense: in small business, communication is key to ensuring rapid growth. The biggest motivators for migration to the cloud among SMBs are saving money (54 percent), followed by increases in productivity. Decision makers also mentioned flexibility as a fairly common response. Of those already using the cloud, 59 percent reported productivity increases as a result.
Gary Edwards

Will Intel let Jen-Hsun Huang spread graphics beyond PCs? » VentureBeat - 1 views

  •  
    Nvidia chief executive Jen-Hsun Huang is on a mission to get graphics chips into everything from handheld computers to smart phones. He expects, for instance, that low-cost Netbooks will become the norm and that gadgets will need battery life lasting for days. Holding up an Ion platform, which couples an Intel low-cost Atom processor with an Nvidia integrated graphics chip set, he said his company is looking to determine "what is the soul of the new PC." With Ion, Huang said he is prepared for the future of the computer industry. But first, he has to deal with Intel. Good interview - see also his conversation with Charlie Rose. The Dance of the Sugarplum Documents is about the evolution of the Web document model from a text-typographical/calculation model to one that is visually rich, with graphical media streams meshing into traditional text/calc. The thing is, this visual document model is being defined on the edge. The challenge to the traditional desktop document model is coming from the edge, primarily from the WebKit - Chrome - iPhone community. Jen-Hsun argues on Charlie Rose that desktop computers featured processing power and applications designed to automate typewriter (word processing) and calculator (spreadsheet) functions. The x86 CPU design reflects this orientation. He argues that we are now entering the age of visual computing. A GPU is capable of dramatic increases in processing power because its architecture is geared to the volumes of graphical information being processed. Let the CPU do the traditional stuff, and let the GPU race into the future with the visual processing. That a GPU architecture can scale in parallel is an enormous advantage. But Jen-Hsun does not see the need to try to replicate CPU tasks in a GPU. The best way forward, in his opinion, is to combine the two.
Paul Merrell

Report: Microsoft is scrapping Edge, switching to just another Chrome clone | Ars Technica - 0 views

  • Windows Central reports that Microsoft is planning to replace its Edge browser, which uses Microsoft's own EdgeHTML rendering engine and Chakra JavaScript engine, with a new browser built on Chromium, the open source counterpart to Google's Chrome. The new browser has the codename Anaheim.
Gary Edwards

The End of the Battery - Getting All Charged Up over Supercapacitors - Casey Research - 0 views

  •  
    Very interesting article describing the near-market-ready potential of "supercapacitor" batteries. This is truly game-changer stuff, and very interesting to me since I've been following the research and development of "graphene technologies" for some time. Graphene supercapacitor technology targets the future of both energy and computing. But graphene is also at the cutting edge of "faster, better, cheaper" water desalination! Nor does it take a rocket scientist to see that a graphene nano-lattice will have an enormous impact on methods of splitting water (H2O) molecules to create an electrical current - a cost-free flow of electrons. Very well-written research! excerpt: "an article in the recent issue of Nature Communications on a novel way to mass-produce so-called supercapacitors on the super-cheap - using no more equipment than the average home CD/DVD burner. Hacked together by a group of research scientists at UCLA, the ingenious technique is a way of producing layers of microscopically nuanced lattices called graphene, an essential component of many supercapacitor designs. It holds the promise of rapidly dropping prices for what was until now a very expensive process. You see, we've known about the concept of supercapacitors for decades. In fact, their antecedent, the capacitor, is one of the fundamental building blocks of electronics. Long before the Energizer Bunny started banging its way around our television screens, engineers had been using capacitors to store electrical charge - originally as filters to help tune signals clearly on wireless radios of all sorts. The devices did so by storing and releasing excess energy, but only teeny amounts of it... we're talking millions of them to hold what a simple AA battery can. Over the years, however, scientists worked on increasing their storage capacity. Way back in 1957, engineers at General Electric came up with the first supercapacitor... but back then there were no uses for it. So, the technology
Gary Edwards

Crocodoc's HTML Document Viewer Infiltrates the Enterprise | Xconomy - 0 views

  •  
    Excellent report on Crocodoc and their ability to convert many different document file types to HTML5, including the MSOffice (OOXML) formats, ODF, and PDF. "Crocodoc, and took on the much larger problem of allowing groups to collaborate on editing a document online, no matter what the document type: PowerPoint, PDF, Word, Photoshop, JPEG, or PNG. In the process, they had to build an embeddable viewer that could take apart any document and reassemble it accurately within a Web browser. And as soon as they'd finished that, they had to tear their own system apart and rebuild it around HTML5 rather than Flash, the Adobe multimedia format that's edging closer and closer to extinction. The result of all that iterating is what's probably the world's most flexible and faithful HTML5-based document viewer: when you open a PDF, PowerPoint, or Word document in Crocodoc, the Web version looks exactly like the native version, even though it's basically been stripped down and re-rendered from scratch. When I talked with Damico in February of 2011, the startup had visions of building on this technology to become a kind of central, Web-based clearinghouse for everyone's documents - a cross between Scribd, Dropbox, and Google Docs, but with a focus on consumers, and with prettier viewing tools. In the last year, though, Crocodoc's direction has changed dramatically. Damico and his colleagues realized that it would be smarter to partner with the fastest growing providers of document-sharing services and social business-tool providers than to try to compete with them. "The massive, seismic change for us is that we had a huge opportunity to partner with Dropbox and LinkedIn and SAP and Yammer, and let them build on top of Crocodoc and make it into a core piece of their own products," Damico says. In other words, every time an office worker opens a document from within a Web app like Dropbox or Yammer, they're activating a white-label version
Gary Edwards

WhiteHat Aviator - The most secure browser online - 1 views

  •  
    "FREQUENTLY ASKED QUESTIONS What is WhiteHat Aviator? WhiteHat Aviator; is the most secure , most private Web browser available anywhere. By default, it provides an easy way to bank, shop, and use social networks while stopping viruses from infecting computers, preventing accounts from being hacked, and blocking advertisers from invisibly spying on every click. Why do I need a secure Web browser? According to CA Technologies, 84 percent of hacker attacks in 2009 took advantage of vulnerabilities in Web browsers. Similarly, Symantec found that four of the top five vulnerabilities being exploited were client-side vulnerabilities that were frequently targeted by Web-based attacks. The fact is, that when you visit any website you run the risk of having your surfing history, passwords, real name, workplace, home address, phone number, email, gender, political affiliation, sexual preferences, income bracket, education level, and medical history stolen - and your computer infected with viruses. Sadly, this happens on millions of websites every day. Before you have any chance at protecting yourself, other browsers force you to follow complicated how-to guides, modify settings that only serve advertising empires and install obscure third-party software. What makes WhiteHat Aviator so secure? WhiteHat Aviator; is built on Chromium, the same open-source foundation used by Google Chrome. Chromium has several unique, powerful security features. One is a "sandbox" that prevents websites from stealing files off your computer or infecting it with viruses. As good as Chromium is, we went much further to create the safest online experience possible. WhiteHat Aviator comes ready-to-go with hardened security and privacy settings, giving hackers less to work with. And our browser downloads to you - without any hidden user-tracking functionality. Our default search engine is DuckDuckGo - not Google, which logs your activity. For good measure, Aviator integrates Disconnect
Gary Edwards

Google Chrome 5 WebKit - Firefox - Opera Comparisons - BusinessWeek - 0 views

  •  
    Chrome runs as close as any browser can to the bleeding edge of Web standards. Though it uses the same open source WebKit rendering engine as Safari, it doesn't reliably support the controversial, proprietary CSS3 transformation and animation tricks that Apple has built into Safari. However, like every browser I tested, it earned a perfect score in a compatibility test for CSS3 selectors, and it joined Safari and Opera with a flawless score of 100 in the Acid3 web standards benchmark. Chrome 5 also supports both Apple's H.264 codec and Mozilla's preferred open source Ogg Theora technology for plugin-free HTML5 video, and it beautifully played back HTML5 demo videos from YouTube and Brightcove. In XHTML and CSS tests, Chrome was surprisingly slower than Safari, despite their shared rendering engine, but the race was close. Safari rendered a local XHTML test page in 0.58 seconds to Chrome's 0.78 seconds, and a local CSS test page in 33 milliseconds to Chrome's 51 milliseconds. Note that Chrome still rendered XHTML more than twice as fast as Opera (1.67 seconds) and left Firefox (12.42 seconds - no, that's not a typo) eating its dust. In CSS, it also beat the pants off Opera (193 milliseconds) and Firefox (342 milliseconds). But Chrome shines brightest when handling JavaScript. Its V8 engine zipped through the SunSpider JavaScript benchmark in 448.6 milliseconds, narrowly beating Opera's 485.8 milliseconds, and absolutely plastering Firefox's 1,161.4 milliseconds. However, Safari 5's time of 376.3 milliseconds in the SunSpider test beat Chrome 5 handily.
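For context on how timings like these are produced, here is a minimal sketch of a micro-benchmark loop; it is not the actual SunSpider or Acid3 harness, and the workload, run count, and names below are illustrative assumptions.

```typescript
// Time a small workload several times with performance.now() and report the
// best run in milliseconds; this is the same style of measurement the
// JavaScript benchmarks quoted above rely on (an illustration, not their
// actual harness).
function benchmark(name: string, workload: () => void, runs: number = 5): void {
  let best = Number.POSITIVE_INFINITY;
  for (let i = 0; i < runs; i++) {
    const start = performance.now();
    workload();
    best = Math.min(best, performance.now() - start);
  }
  console.log(`${name}: ${best.toFixed(1)} ms (best of ${runs} runs)`);
}

// Example workload: a small, self-contained string/number loop, similar in
// spirit to SunSpider's micro-tests.
benchmark("string-crunch", () => {
  let s = "";
  for (let i = 0; i < 50_000; i++) {
    s += ((i * 31) % 97).toString(36);
  }
  if (s.length === 0) console.log("unreachable"); // keep the result live
});
```

Taking the best of several runs is a common choice because a single run is easily skewed by garbage collection pauses or JIT warm-up.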
Gary Edwards

Reinventing Copy and Paste - Anil Dash - 0 views

  •  
    We can all learn a lot of lessons from the history of DDE/OLE/OLE3/COM/ActiveX/DCOM/COM+ (you can start reading up on Wikipedia to get some background) and how we went from everyone using best-of-breed standalone apps to one integrated, nearly monolithic Office. It basically all started with copy and paste. People who never spent a lot of time in single-tasking, character-mode operating environments like the DOS command line don't recall that simply copying and pasting information between apps was difficult at the time. And part of the revelation of Windows for mainstream users (or the Mac, for leading-edge tech fans) was being able to easily share data in that way. This was different from what Unix users were used to with the command-line pipe, and from what most applications do with feeds today, in allowing structured information flows between applications. There's a desire to combine data from different sources in an arbitrary way, and to have the user interface display the appropriate tools for whatever context you're in. The dominant model here, probably because of the influence of the early PARC demos, is to have toolbars or UI widgets change depending on what kind of content you're manipulating. Microsoft was really into this in the early 90s with OLE2, where your Word toolbars would morph into Excel toolbars if you double-clicked on an embedded spreadsheet. It was ungainly and ugly and slow, especially if you had less than an exorbitant 8MB of RAM, but the idea was pretty cool. And it still is. People are so focused on data formats and feeds that they're ignoring consensus around UI interoperability. The Atom API and MetaWeblog API give me a good-enough interface if I want to treat a discrete chunk of information (like a blog post) as an undifferentiated blob. But all the erstwhile spec work around microformats and structured blogging (I forget which one is for XML and which one's for XHTML) doesn't seem to have addressed user experience or editing behavior
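As a concrete illustration of the "undifferentiated blob" point, here is a minimal, hypothetical sketch of a MetaWeblog newPost call. The endpoint URL, credentials, and blog id are placeholders, and the hand-built XML-RPC payload is only meant to show that the whole post travels as a single opaque string.

```typescript
// Build a metaWeblog.newPost XML-RPC request. The method and its
// (blogid, username, password, struct, publish) parameters come from the
// MetaWeblog API; the endpoint and credentials below are made up.
const endpoint = "https://example.com/xmlrpc"; // hypothetical blog endpoint

function newPostXml(title: string, body: string): string {
  const esc = (s: string) =>
    s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
  return `<?xml version="1.0"?>
<methodCall>
  <methodName>metaWeblog.newPost</methodName>
  <params>
    <param><value><string>1</string></value></param>
    <param><value><string>user</string></value></param>
    <param><value><string>secret</string></value></param>
    <param><value><struct>
      <member><name>title</name><value><string>${esc(title)}</string></value></member>
      <member><name>description</name><value><string>${esc(body)}</string></value></member>
    </struct></value></param>
    <param><value><boolean>1</boolean></value></param>
  </params>
</methodCall>`;
}

// The entire post body rides inside one <string> element; the receiving side
// sees no internal structure it could surface in the UI, which is the
// "undifferentiated blob" limitation described above.
fetch(endpoint, {
  method: "POST",
  headers: { "Content-Type": "text/xml" },
  body: newPostXml("Hello", "<p>Post body</p>"),
}).then((res) => console.log("posted:", res.status));
```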
Gary Edwards

Cloud Computing IBM's Edge: IDC Numbers - 0 views

  •  
    According to the research group IDC, worldwide spending on cloud services is expected to grow almost threefold, reaching $44.2 billion by 2013. Enterprise spending remains robust, with manifold growth expected in the cloud computing space within a very short span of time. Small and medium size businesses (SMBs) are also rapidly adopting cloud computing technologies to improve their IT systems management at a lower cost.
Gary Edwards

J.C. Herz Ringling College Commencement Speech: Harvesting the Creative Life - 0 views

  •  
    Wow! Excellent advice, top to bottom. A very well-thought-out flow of wisdom. excerpt: The important work that you build your reputation on - you can't just Google it. You don't cut and paste it from Wikipedia. You roll up your sleeves, and bring all your creativity and meaningful skills to bear on the problem of building something. This is what the world requires - this is what the world rewards. Not just calling yourself creative, but understanding how to exercise your creative powers to some end, to bring your vision and skills together in a meaningful way. This is a powerful thing to be able to do. It gives you tremendous value in a society where attention is currency - being able to capture people's imaginations is the scarcest kind of power in a fractured culture. Creating work that transports and transcends is one of the few ways to create sustainable value in a disposable society. What you do, if you do it well, is never going to be a commodity. Vision, magic, delight. Heart-rocking spectacle. Pulse-pounding action. These things don't get outsourced to some cubicle drone in the developing world. You are an influential group of people, and today is an important moment, as you set forth to become the chief stewards of your gifts. Because this is what it means to be a creative professional: figuring out how to be the best steward of your gifts, so that your power to create grows and deepens meaningfully over time. So that your edges stay sharp, and your light stays bright. The life you've chosen is not one that simply requires clocking in and clocking out. You've got to bring your soul to it every day. You've got to be on your game. That takes discipline. And it takes awareness - of how you're spending your time, and of how what you're doing affects your capability and your capacity. You are going to have to ask yourself, at every turn: is this project making me smarter, or making me stupider? Is this job stoking my fire,
Gary Edwards

Google Bets Big on HTML 5: News from Google I/O - O'Reilly Radar - 0 views

  •  
    "Never underestimate the web," says Google VP of Engineering Vic Gundotra in his keynote at Google I/O this morning..... Tim O'Reilly provides us with his play-by-play account of the Google I/O event. Amazing stuff. The Web has arrived and it is no longer the "network of networks". It's rather quickly becoming the mother of all platforms. Great coverage.
  •  
    That article includes a link to an amazing web page, amazing if you've got a bleeding edge HTML 5 browser. http://htmlfive.appspot.com/ The browsers and versions needed are listed on that page. If you've got Google Chrome, upgrade to Chrome 2.0 (hot off the presses) from About Google Chrome (on the customization menu). Playtime with the bleeding edge of the Open Web.
Gary Edwards

On Mobiles, There's No Stopping Webkit - 0 views

  •  
    Great title, no substance. But who can pass this up, even if it's been obvious since the 2007 release of the iPhone? WebKit rules the edge of the Web today! Tomorrow, the greater Web will follow. Excerpt: There are a lot of brave souls out there making mobile browsers, hoping to gain traction with the phone makers. But most of them are fighting a losing battle, for the mobile browser war is increasingly being fought between two camps: the WebKit-based browser camp, which includes Safari on the iPhone, the Google Android browser, the Palm browser and the Nokia browser; and the Opera camp.
Gary Edwards

Push Pop Press: About Us - 0 views

  •  
    iOS visual eBooks and magazine "immersive media" category. Push Pop Press seeks to provide a platform for digital eBooks that are more multimedia content than text. It's more like writing in PowerPoint or Flash than in Word. Think Flipboard. Another interesting term used by Push Pop Press: this is a "layout platform" for rich content. Features:
    - A demo of the first book powered by Push Pop Press, Al Gore's Our Choice
    - A New Digital Publishing Platform
    - Easy to Publish: layout and publish interactive digital books without writing code
    - Mixed Media: tell rich stories using text, images, audio, video, maps and interactive graphics
    - Interactive Graphics: embed interactive graphics that use the microphone, accelerometer and more
    - Multi-Touch User Interface: edge-to-edge content without any distracting toolbars or buttons
    - Visual Table of Contents: browse through hundreds of pages quickly and easily
    - Pages Load Instantly: pages load as fast as your finger can swipe
    - Start Reading Immediately: start reading the first chapter as the rest of the book downloads in the background
    - Updatable Content: update your content without having to update the app
    - iPad, iPhone & iPod Touch: publish one universal app that can be read on an iPad, iPhone and iPod Touch
Paul Merrell

Opinion: Berkeley Can Become a City of Refuge | Opinion | East Bay Express - 0 views

  • The Berkeley City Council is poised to vote March 13 on the Surveillance Technology Use and Community Safety Ordinance, which will significantly protect people's right to privacy and safeguard the civil liberties of Berkeley residents in this age of surveillance and Big Data. The ordinance is based on an ACLU model that was first enacted by Santa Clara County in 2016. The Los Angeles Times has editorialized that the ACLU's model ordinance approach "is so pragmatic that cities, counties, and law enforcement agencies throughout California would be foolish not to embrace it." Berkeley's Peace and Justice and Police Review commissions agreed and unanimously approved a draft that will be presented to the council on Tuesday. The ordinance requires public notice and public debate prior to seeking funding, acquiring equipment, or otherwise moving forward with surveillance technology proposals. In neighboring Oakland, we saw the negative outcome that can occur from lack of such a discussion, when the city's administration pursued funding for, and began building, the citywide surveillance network known as the Domain Awareness Center ("DAC") without community input. Ultimately, the community rejected the project, and the fallout led to the establishment of a Privacy Advisory Commission and subsequent consideration of a similar surveillance ordinance to ensure proper vetting occurs up front, not after the fact.
Gary Edwards

No Jitter | Post | Cisco Or Microsoft? Who Wins the Line-of-Business War? - 0 views

  • The multitude of services gives Microsoft an early edge when it comes to cloud, but the channel-enablement model for Cisco can create much greater scale than a direct-to-line-of-business model. The key is ensuring its resellers are fully trained in selling to line-of-business, which isn't a simple undertaking. Bottom line: with regard to cloud, Microsoft has a faster route to market, but Cisco's channel model should give it an advantage over time.
  • Putting cloud aside, Cisco and Microsoft have markedly different approaches in selling to lines of business. For Microsoft, the key lies in its developer community. Developers build applications that business people use and buy. Many of these applications use Microsoft as an underlying technology without the purchaser really even being aware of that fact. Microsoft's technology gets pulled through with really no involvement from Microsoft, providing a low- to no-cost sales model for the company. The only downside is that the application brand often overshadows the underlying brand.
  • Microsoft has made a living off selling products, many of them sub-par, into business because of its developer relationships. Does anyone really think Microsoft gained monopoly-like share with desktop operating systems because of quality of product and ease of use? Hardly. Windows became the de facto standard for developers because of the quality of the developer program. Microsoft does a good job of meeting the needs of its large software vendors, but does an even better job of making sure those millions of small ISVs have access to Microsoft platforms and developer support.
  • Cisco has been trying to build its own "Cisco Developer Network" (CDN) for the better part of a decade. The company kicked off this initiative way back in the early 2000s when it bought a company called Metreos that had some interesting VoIP applications and a slick developer interface. Back then, the program was known as CTDP, Cisco Technology Developer Program, and was run by VoIP people, not individuals that understand software and how to build a developer environment. Since then the program has undergone a number of facelifts and Cisco appears to have some real software people running the group, so there is some potential.
  • With regard to UC, as this market transitions away from products to platforms, services will play a significant role. Cisco's services organization plays a role similar to IBM's services arm. IBM's consulting group works with its top-tier customers to understand how to solve business problems through compute-centric solutions. Cisco's services group works with its customers to create solutions through networking- and communications-related products. As more and more organizations look to leverage UC strategically, I would expect Cisco services to target its top-tier customers. The key for Cisco then is to take these solutions and push them down through its channel for scale and market share gains.
  • So developer-led or services-led?
  • Microsoft should get an early advantage, as many in-house developers will look to Lync; but the services strategy by Cisco should create longer, more sustainable value, as it has for IBM.
  • The key for Microsoft is being able to adapt its developer environment faster as market trends change. Obviously, compute is moving away from the traditional desktop to mobile clients and the cloud, and there are far more single-use, purpose-built applications being built in the consumer world. I think Microsoft's Developer Network is oriented towards more old-school developers.
  • The key for Cisco is having the patience to work with its lead customers and find those unique, game-changing applications and use cases that it can then push down into the channel. It's the right strategy for Cisco, but it might take a bit more time to bear some fruit.
  •  
    excerpt: "Developer-led or services-led? Microsoft should get an early advantage, but the services strategy by Cisco should create longer, more sustainable value, Last month I wrote a blog outlining how the line-of-business manager holds the key to winning the Cisco versus Microsoft war. A number of you commented that this was obvious and both companies are already doing it. I'll agree that this is something both companies are trying to do, but neither is doing a great job. Microsoft is a company with high appeal to IT pros and Cisco to network managers, with high brand familiarity to line of business managers but low appeal beyond this."
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 0 views

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputing a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying between three and 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated from January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explain in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”