Future of the Web / Group items tagged "control advantage"

Gonzalo San Gil, PhD.

The War Over Control Of The Net Is A War Over Information Advantage | TorrentFreak - 0 views

  •  
    " Rick Falkvinge on February 15, 2015 C: 0 Opinion Throughout history, you can observe that many groups have fought over the information advantage - to know more about other people than those others know in return. Whoever has held the information advantage has usually risen to power."
Gary Edwards

Two Microsofts: Mulling an alternate reality | ZDNet - 1 views

  • Judge Jackson had it right. And the Court of Appeals? Not so much
  • Judge Jackson is an American hero and news of his passing thumped me hard. His ruling against Microsoft and the subsequent overturn of that ruling resulted, IMHO, in two extraordinary directions that changed the world. Sure, the what-if game is interesting, but the reality itself is stunning enough. Of course, Judge Jackson sought to break the monopoly. The US Court of Appeals overturn resulted in the monopoly remaining intact, but the Internet remaining free and open. Judge Jackson's breakup plan had a good shot at achieving both a breakup of the monopoly and a free and open Internet. I admit though that at the time I did not favor the Judge's plan. And I actually did submit a proposal based on Microsoft having to both support the WiNE project and provide a complete port to WiNE to any software provider requesting a port. I wanted to break the monopolist's hold on the Windows Productivity Environment and the hundreds of millions of investment dollars and time that had been spent on application development forever trapped on that platform. For me, it was the productivity platform that had to be broken.
  • I assume the good Judge thought that separating the Windows OS from Microsoft Office / Applications would force the OS to open up the secret APIs even as the OS continued to evolve. Maybe. But a full disclosure of the APIs coupled with the community service "port to WiNE" requirement might have sped up the process. Incredibly, the "Undocumented Windows Secrets" industry continues to thrive, and the legendary Andrew Schulman's number is still at the top of Silicon Valley legal profession speed dials. http://goo.gl/0UGe8 Oh well. The Court of Appeals stopped the breakup, leaving the Windows Productivity Platform intact. Microsoft continues to own the "client" in "client/server" computing. Although Microsoft was temporarily stopped from leveraging their desktop monopoly into iron-fisted control and dominance of the Internet, I think what we're watching today with the Cloud is Judge Jackson's worst nightmare. And mine too. A great transition is now underway, as businesses and enterprises begin the move from legacy client/server business systems and processes to a newly emerging Cloud Productivity Platform. In this great transition, Microsoft holds an inside straight. They have all the aces because they own the legacy desktop productivity platform, and can control the transition to the Cloud. No doubt this transition is going to happen. And it will severely disrupt and change Microsoft's profit formula. But if the Redmond reprobate can provide a "value added" transition of legacy business systems and processes, and direct these new systems to the Microsoft Cloud, the profits will be immense.
  • ...1 more annotation...
  • Judge Jackson sought to break the ability of Microsoft to "leverage" their existing monopoly into the Internet, and his plan was overturned and replaced by one based on judicial oversight. Microsoft got a slap on the wrist from the Court of Appeals, but were wailed on with lawsuits from the hundreds of parties injured by their rampant criminality. Some put the price of that criminality as high as $14 billion in settlements. Plus, the shareholders forced Chairman Bill to resign. At the end of the day though, Chairman Bill was right. Keeping the monopoly intact was worth whatever penalty Microsoft was forced to pay. He knew that even the judicial oversight would end one day. Which it did. And now his company is ready to go for it all by leveraging and controlling the great productivity transition. No business wants to be hostage to a cold-hearted monopolist. But there is a huge difference between a non-disruptive and cost-effective, process-by-process value-added transition to a Cloud Productivity Platform, and the very disruptive and costly "rip-out-and-replace" transition offered by Google, ZOHO, Box, SalesForce and other Cloud Productivity contenders. Microsoft, and only Microsoft, can offer the value-added transition path. If they get the Cloud even halfway right, they will own business productivity far into the future. Rest in Peace, Judge Jackson. Your efforts were heroic and will be remembered as such. ~ge~
  •  
    Comments on the latest SVN article mulling the effects of Judge Thomas Penfield Jackson's antitrust ruling and proposed breakup of Microsoft. Comment: "Chinese Wall" Ummm, there was a Chinese Wall between the Microsoft OS and the MS Applications layer. At least that's what Chairman Bill promised developers at a 1990 OS/2-Windows Conference I attended. It was a developers' luncheon, hosted by Microsoft, with Chairman Bill speaking to about 40 developers with applications designed to run on the then soon-to-be-released Windows 3.0. In his remarks, the Chairman described his vision of commoditizing the personal computer market through an open hardware-reference platform on the one side of the Windows OS, and provisioning an open application developers' layer on the other using open and totally transparent APIs. Of course the question came up concerning the obvious advantage Microsoft applications would have. Chairman Bill answered the question by describing the Chinese Wall that existed between Microsoft's OS and Apps development departments. He promised that OS APIs would be developed privately, separate from the Apps department, and publicly disclosed to ALL developers at the same time. Oh yeah. There was lots of anti-IBM, evil-empire stuff too :) Of course we now know this was a line of crap. Microsoft Apps was discovered to have been using undocumented and secret Windows APIs. http://goo.gl/0UGe8. Microsoft Apps had a distinct advantage over the competition, and eventually the entire Windows Productivity Platform became dependent on the MSOffice core. The company I worked for back then, Pyramid Data, had the first contact management application for Windows: PowerLeads. Every Friday night we would release bug fixes and improvements using Wildcat BBS. By Monday morning we would be slammed with calls from users complaining that they had downloaded the Friday night patch, and now some other application would not load or function properly. Eventually we tracked th
Paul Merrell

Testosterone Pit - Home - The Other Reason Why IBM Throws A Billion At Linux ... - 0 views

  • IBM announced today that it would throw another billion at Linux, the open-source operating system, to run its Power System servers. The first time it had thrown a billion at Linux was in 2001, when Linux was a crazy, untested, even ludicrous proposition for the corporate world. So the moolah back then didn’t go to Linux itself, which was free, but to related technologies across hardware, software, and service, including things like sales and advertising – and into IBM’s partnership with Red Hat which was developing its enterprise operating system, Red Hat Enterprise Linux. “It helped start a flurry of innovation that has never slowed,” said Jim Zemlin, executive director of the Linux Foundation. IBM claims that the investment would “help clients capitalize on big data and cloud computing with modern systems built to handle the new wave of applications coming to the data center in the post-PC era.” Some of the moolah will be plowed into the Power Systems Linux Center in Montpellier, France, which opened today. IBM’s first Power Systems Linux Center opened in Beijing in May. IBM may be trying to make hay of the ongoing revelations that have shown that the NSA and other intelligence organizations in the US and elsewhere have roped in American tech companies of all stripes with huge contracts to perfect a seamless spy network. They even include physical aspects of surveillance, such as license plate scanners and cameras, which are everywhere [read.... Surveillance Society: If You Drive, You Get Tracked].
  • Then another boon for IBM. Experts at the German Federal Office for Security in Information Technology (BSI) determined that Windows 8 is dangerous for data security. It allows Microsoft to control the computer remotely through a “special surveillance chip,” the wonderfully named Trusted Platform Module (TPM), and a backdoor in the software – with keys likely accessible to the NSA and possibly other third parties, such as the Chinese. Risks: “Loss of control over the operating system and the hardware” [read.... LEAKED: German Government Warns Key Entities Not To Use Windows 8 – Links The NSA].
  • It would be an enormous competitive advantage for an IBM salesperson to walk into a government or corporate IT department and sell Big Data servers that don’t run on Windows, but on Linux. With the Windows 8 debacle now in public view, IBM salespeople don’t even have to mention it. In the hope of stemming the pernicious revenue decline their employer has been suffering from, they can politely and professionally hype the security benefits of IBM’s systems and mention in passing the comforting fact that some of it would be developed in the Power Systems Linux Centers in Montpellier and Beijing. Alas, Linux too is tarnished. The backdoors are there, though the code can be inspected, unlike Windows code. And then there is Security-Enhanced Linux (SELinux), which was integrated into the Linux kernel in 2003. It provides a mechanism for supporting “access control” (a backdoor) and “security policies.” Who developed SELinux? Um, the NSA – which helpfully discloses some details on its own website (emphasis mine): The results of several previous research projects in this area have yielded a strong, flexible mandatory access control architecture called Flask. A reference implementation of this architecture was first integrated into a security-enhanced Linux® prototype system in order to demonstrate the value of flexible mandatory access controls and how such controls could be added to an operating system. The architecture has been subsequently mainstreamed into Linux and ported to several other systems, including the Solaris™ operating system, the FreeBSD® operating system, and the Darwin kernel, spawning a wide range of related work.
  • ...1 more annotation...
  • Among a slew of American companies who contributed to the NSA’s “mainstreaming” efforts: Red Hat. And IBM? Like just about all of our American tech heroes, it looks at the NSA and other agencies in the Intelligence Community as “the Customer” with deep pockets, ever increasing budgets, and a thirst for technology and data. Which brings us back to Windows 8 and TPM. A decade ago, a group was established to develop and promote Trusted Computing that governs how operating systems and the “special surveillance chip” TPM work together. And it too has been cooperating with the NSA. The founding members of this Trusted Computing Group, as it’s called facetiously: AMD, Cisco, Hewlett-Packard, Intel, Microsoft, and Wave Systems. Oh, I almost forgot ... and IBM. And so IBM might not escape, despite its protestations and slick sales presentations, the suspicion by foreign companies and governments alike that its Linux servers too have been compromised – like the cloud products of other American tech companies. And now, they’re going to pay a steep price for their cooperation with the NSA. Read...  NSA Pricked The “Cloud” Bubble For US Tech Companies
Paul Merrell

Xcerion's 'Icloud' Promises Marriage of Remote And Local Computing -- Xcerion -- Inform... - 0 views

  • Xcerion has continued to work toward the general release of its XML-based "Cloud OS," a service based on Xcerion XML Internet Operating System/3 (XIOS/3). The announcement of an official name for the service brings the company a step closer to that goal; it also certainly reassures investors like Lou Perazzoli, one of the core architects of Microsoft Windows NT, and Terry Drayton, founder of HomeGrocer.com, that Xcerion's technology is almost ready for prime time.
  • Icloud relies on an XML virtual machine for local (and offline) operation. It thus combines the advantages of remote computing -- a central point for software distribution, storage, and updates -- with the advantages of local computing -- execution speed and user control without a bandwidth bottleneck.
  • Icloud offers an intriguing technology that Xcerion is calling "gesture-based computing." Jonas Thornholm, CFO of Xcerion, believes it may be the service's "killer app." Gesture-based computing is essentially real-time content sharing. It allows users to drag and drop documents from their computer to a friend's computer in real time, as if the two machines were dual monitors powered by a single machine.
  • ...1 more annotation...
  • Another point of differentiation between Icloud and other WebTop systems is the breadth of Xcerion's ambitions: It's aiming not just to move the desktop into the Internet "cloud" but also to reinvent the economics of software development. Icloud developers can look forward to an Internet-based marketplace for their Web applications that includes monetization technology. They will be able to offer free, ad-supported, or fee-based software with minimal hassle.
Paul Merrell

Google, Facebook made secret deal to divvy up market, Texas alleges - POLITICO - 1 views

  • Google and Facebook, the No. 1 and No. 2 players in online advertising, made a secret illegal pact in 2018 to divide up the market for ads on websites and apps, according to an antitrust suit filed Wednesday against the search giant. The suit — filed by Texas and eight other states — alleges that the companies colluded to fix prices and divvy up the market for mobile advertising between them.
  • The allegation that Google teamed up with Facebook to suppress competition mirrors a major claim in a separate antitrust suit the Justice Department filed against the company in October: that Google teamed up with Apple to help ensure the continued dominance of its search engine. Such allegations provide some of the strongest ammunition yet to advocates who argue that the major U.S. tech companies have gotten too big and are using their power — sometimes in conjunction with each other — to control markets. Many of the details about the Google-Facebook agreement, including its specific language, are redacted from the complaint. But the states say it “fixes prices and allocates markets between Google and Facebook as competing bidders in the auctions for publishers’ web display and in-app advertising inventory.”
  • The complaint alleges that the agreement was prompted by Facebook’s move in 2017 to use “header bidding” — a technology popular with website publishers that helped them increase the money they made from advertising. While Facebook sells ads on its own platform, it also operates a network to let advertisers offer ads on third-party apps and mobile websites.
  • ...1 more annotation...
  • Google was concerned about the move to header bidding, the complaint alleges, because it posed an “existential threat” to its own advertising exchange and limited the ability of the search giant to use information from its ad-buying and selling tools to its advantage. Those tools let Google cherry-pick the highest-value advertising spots and ads, according to the complaint. Within months of Facebook’s announcement, Google approached it to open negotiations, the complaint alleged, and the two companies eventually cut a deal: Facebook would cut back on the use of header bidding and use Google’s ad server. In exchange, the complaint alleges that Google gave Facebook advantages in its auctions.
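To make the mechanism at the center of the complaint concrete: header bidding is, mechanically, an in-page auction run before the publisher's primary ad server is called, so the ad server must beat the best outside bid instead of picking winners using information only it holds. A toy sketch in Python; the exchange names and numbers are invented for illustration and do not come from the complaint:

    # Hypothetical illustration of the header-bidding mechanism: the
    # publisher's page gathers CPM bids from several exchanges up front.
    bids = {
        "exchange_a": 2.10,
        "exchange_b": 2.45,
        "exchange_c": 1.95,
    }
    winner, price = max(bids.items(), key=lambda kv: kv[1])
    # The best external bid is passed to the primary ad server as a floor:
    # the house exchange wins the slot only if it can beat that price.
    ad_server_floor = price
    print(f"{winner} sets the floor at ${price:.2f} CPM")

The dispute turns on what happens when the party running the final auction also competes in it; the price floor above is what header bidding used to restore competition to that final step.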
Paul Merrell

Facebook's Marketplace Faces Antitrust Probes in EU, U.K. - WSJ - 1 views

  • The European Union and the U.K. opened formal antitrust investigations into Facebook Inc.’s classified-ads service Marketplace, ramping up regulatory scrutiny for the company in Europe. Both the European Commission—the EU’s top antitrust enforcer—and the U.K.’s Competition and Markets Authority said Friday they are investigating whether Facebook repurposes data it gathers from advertisers who buy ads in order to give illegal advantages to its own services, including its Marketplace online flea market. The U.K. added that it is also investigating whether Facebook uses advertiser data to give similar advantages to its online-dating service. The two competition watchdogs said they would coordinate their investigations.
  • Separately on Friday, Germany’s competition regulator announced that it is opening an investigation into Google’s News Showcase, in which the tech company pays to license certain content from news publishers. That probe, which is based on new powers Germany had granted the regulator, will look among other things at whether Google is imposing unfair conditions on publishers and how it selects participants, the Federal Cartel Office said.
  • The three newly opened cases are part of a new wave of antitrust enforcement in Europe. The European Commission filed formal charges last month against Apple Inc. for allegedly abusing its control over the distribution of music-streaming apps, including Spotify Technology SA. In November, it filed formal charges against Amazon.com Inc. for allegedly using nonpublic data it gathers from third-party sellers to unfairly compete against them. Both companies denied wrongdoing. At the same time, the U.K.’s CMA has opened investigations into Google’s announcement that it will retire third-party cookies, a technology advertisers use to track web users, and whether Apple imposes anticompetitive conditions on some app developers, including the use of Apple’s in-app payment system, which is also the subject of a lawsuit in the U.S. In the EU, the European Commission has been investigating Facebook for more than a year on multiple fronts. Facebook and the Commission have squabbled over access to internal documents as part of those investigations.
  • ...1 more annotation...
  • New York State Attorney General Letitia James outlined in December a sweeping antitrust suit against Facebook by the Federal Trade Commission and a bipartisan group of 46 state attorneys general, targeting the company’s tactics against competitors.
Paul Merrell

It's the business processes that are bound to MSOffice - Windows' dominance stifles dem... - 0 views

  • 15 years of workgroup oriented business process automation based on the MSOffice productivity environment has had an impact. Microsoft pretty much owns the "client" in "client/server" because so many of these day-to-day business processes are bound to the MSOffice productivity environment in some way.
  • The good news is that there is a great transition underway. The world is slowly but inexorably moving from "client/server" systems to an emerging architecture one might describe as "client/WebStack-Cloud-RIA/server." The reason for the great transition is simple; the productivity advantages of putting the Web at the center of information systems and workflows are extraordinary.
  • Now the bad news. Microsoft fully understands this and has spent years preparing for a very controlled transition. They are ready. The pieces are finally falling into place for a controlled transition connecting legacy MSOffice-bound business processes to the Microsoft WebStack-Cloud-RIA model (Exchange-SharePoint-SQL Server-Mesh-Silverlight).
  • ...2 more annotations...
  • Anyone with a pulse knows that the Web is the future. Yet, look at how much time and effort has been spent on formats, protocols and interfaces that at best would "break" the Web, and at worst seek to refight the 1995 office desktop wars. In Massachusetts, while the war between ODF and OOXML raged, Exchange and SharePoint servers were showing up everywhere. It was as if the outcome of the desktop office format decision didn't matter to the Web future.
  • And if we don't successfully re-purpose MSOffice to the Open Web? (And for that matter, OpenOffice). The Web will break. The great transition will be directed to the MS WebStack-Cloud-RiA model. Web enhanced business processes will be entangled with proprietary formats, protocols and interfaces. The barriers to this emerging desktop-Web-device platform of business processes and systems will prove even more impenetrable than the 1995 desktop productivity environment. Linux will not penetrate the business desktop arena. And we will all wonder what it was that we were doing as this unfolded before our eyes.
Paul Merrell

ZoooS Previews "OpenOffice.org 3.0 in a Browser" | Software Journal - 0 views

  • ZoooS LLC today previewed ZoooS Office, a web-based office suite that puts OpenOffice.org 3.0 in a browser, targeting enterprise, SMB, and individual users alike with a blend of software-as-a-service (SaaS) and desktop advantages.
  • Other key ZoooS Office implementations will include Mozilla XULRunner; Firefox, Opera, Safari as well as the new Google Chrome web browser; social networking sites such as Facebook, MySpace, and Second Life; and Nintendo Wii and Sony PlayStation. Regardless of implementation, ZoooS applications run entirely on the client machine, performing all file operations locally to reduce network traffic, improve application performance, and support offline access.
  • Public availability of ZoooS Office is scheduled for the fourth quarter of 2008. Initially, ZoooS will deliver the Mozilla XULRunner version, a Firefox plug-in, an Opera widget, and an intranet server. ZoooS will follow up with a Vista gadget and Internet Explorer support in the first half of 2009. For more information on ZoooS, please visit www.zooos.com.
  •  
    Yet another wrapper around OpenOffice.org, this time the 3.0 version still in beta. $99.90 per seat for 10 users. Lots of JavaScript to give a web collaboration capability. Perhaps most notable so far: [i] a sniff that there's a fair amount of money behind this one; and [ii] an article by Eric Lai says they approached the OOo Project but were rebuffed because they compete with desktop OOo. Support for different browsers is planned, with an XULRunner plug-in in the works. Several mashups mentioned. Claims 80 percent of OOo features available, which is another way of saying that 20 percent of the features are not supported. The claim is that ZoooS's own code will be released under GPL. Apparently that's just their custom stuff, because OOo 3.0 beta is LGPL. Building a business atop a code base controlled by a malevolent branch of Sun Microsystems seems less than wise. More at zooos.com. Preliminary impression: Like OOo itself, dead-end technology that sucks mind and market share from software that supports truly open standards. The world needs to figure out that the OpenDocument format is roughly as open as OOXML. Open standards are fully specified so anyone can implement them.
Paul Merrell

Internet Giants Erect Barriers to Spy Agencies - NYTimes.com - 0 views

  • As fast as it can, Google is sealing up cracks in its systems that Edward J. Snowden revealed the N.S.A. had brilliantly exploited. It is encrypting more data as it moves among its servers and helping customers encode their own emails. Facebook, Microsoft and Yahoo are taking similar steps.
  • After years of cooperating with the government, the immediate goal now is to thwart Washington — as well as Beijing and Moscow. The strategy is also intended to preserve business overseas in places like Brazil and Germany that have threatened to entrust data only to local providers. Google, for example, is laying its own fiber optic cable under the world’s oceans, a project that began as an effort to cut costs and extend its influence, but now has an added purpose: to assure that the company will have more control over the movement of its customer data.
  • A year after Mr. Snowden’s revelations, the era of quiet cooperation is over. Telecommunications companies say they are denying requests to volunteer data not covered by existing law. A.T.&T., Verizon and others say that compared with a year ago, they are far more reluctant to cooperate with the United States government in “gray areas” where there is no explicit requirement for a legal warrant.
  • ...8 more annotations...
  • Eric Grosse, Google’s security chief, suggested in an interview that the N.S.A.'s own behavior invited the new arms race. “I am willing to help on the purely defensive side of things,” he said, referring to Washington’s efforts to enlist Silicon Valley in cybersecurity efforts. “But signals intercept is totally off the table,” he said, referring to national intelligence gathering. “No hard feelings, but my job is to make their job hard,” he added.
  • In Washington, officials acknowledge that covert programs are now far harder to execute because American technology companies, fearful of losing international business, are hardening their networks and saying no to requests for the kind of help they once quietly provided. Robert S. Litt, the general counsel of the Office of the Director of National Intelligence, which oversees all 17 American spy agencies, said on Wednesday that it was “an unquestionable loss for our nation that companies are losing the willingness to cooperate legally and voluntarily” with American spy agencies.
  • Many point to an episode in 2012, when Russian security researchers uncovered a state espionage tool, Flame, on Iranian computers. Flame, like the Stuxnet worm, is believed to have been produced at least in part by American intelligence agencies. It was created by exploiting a previously unknown flaw in Microsoft’s operating systems. Companies argue that others could have later taken advantage of this defect. Worried that such an episode undercuts confidence in its wares, Microsoft is now fully encrypting all its products, including Hotmail and Outlook.com, by the end of this year with 2,048-bit encryption, a stronger protection that would take a government far longer to crack. The software is protected by encryption both when it is in data centers and when data is being sent over the Internet, said Bradford L. Smith, the company’s general counsel.
  • Mr. Smith also said the company was setting up “transparency centers” abroad so that technical experts of foreign governments could come in and inspect Microsoft’s proprietary source code. That will allow foreign governments to check to make sure there are no “back doors” that would permit snooping by United States intelligence agencies. The first such center is being set up in Brussels. Microsoft has also pushed back harder in court. In a Seattle case, the government issued a “national security letter” to compel Microsoft to turn over data about a customer, along with a gag order to prevent Microsoft from telling the customer it had been compelled to provide its communications to government officials. Microsoft challenged the gag order as violating the First Amendment. The government backed down.
  • Hardware firms like Cisco, which makes routers and switches, have found their products a frequent subject of Mr. Snowden’s disclosures, and their business has declined steadily in places like Asia, Brazil and Europe over the last year. The company is still struggling to convince foreign customers that their networks are safe from hackers — and free of “back doors” installed by the N.S.A. The frustration, companies here say, is that it is nearly impossible to prove that their systems are N.S.A.-proof.
  • In one slide from the disclosures, N.S.A. analysts pointed to a sweet spot inside Google’s data centers, where they could catch traffic in unencrypted form. Next to a quickly drawn smiley face, an N.S.A. analyst, referring to an acronym for a common layer of protection, had noted, “SSL added and removed here!”
  • Facebook and Yahoo have also been encrypting traffic among their internal servers. And Facebook, Google and Microsoft have been moving to more strongly encrypt consumer traffic with so-called Perfect Forward Secrecy, specifically devised to make it more labor intensive for the N.S.A. or anyone to read stored encrypted communications. One of the biggest indirect consequences from the Snowden revelations, technology executives say, has been the surge in demands from foreign governments that saw what kind of access to user information the N.S.A. received — voluntarily or surreptitiously. Now they want the same.
  • The latest move in the war between intelligence agencies and technology companies arrived this week, in the form of a new Google encryption tool. The company released a user-friendly, email encryption method to replace the clunky and often mistake-prone encryption schemes the N.S.A. has readily exploited. But the best part of the tool was buried in Google’s code, which included a jab at the N.S.A.'s smiley-face slide. The code included the phrase: “ssl-added-and-removed-here-; - )”
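A note on the “Perfect Forward Secrecy” mentioned in the excerpt above: the property comes from deriving each session's keys from throwaway, per-session key pairs, so a long-term private key captured later cannot decrypt traffic that was recorded earlier. A minimal sketch of the underlying exchange using Python's third-party cryptography package; this illustrates the idea only and is not any provider's actual deployment:

    # Each side generates a fresh ephemeral key pair for this session only.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    client_eph = X25519PrivateKey.generate()
    server_eph = X25519PrivateKey.generate()

    # Each side combines its own private key with the peer's public key;
    # both arrive at the same shared secret without sending it on the wire.
    client_secret = client_eph.exchange(server_eph.public_key())
    server_secret = server_eph.exchange(client_eph.public_key())
    assert client_secret == server_secret

    # Stretch the raw secret into a symmetric session key.
    session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                       salt=None, info=b"pfs demo").derive(client_secret)

    # The ephemeral private keys are now discarded; compromising a server's
    # long-term key later reveals nothing about this recorded session.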
Gary Edwards

XML Production Workflows? Start with the Web and XHTML - 0 views

  • Challenges: Some Ugly Truths. The challenges of building—and living with—an XML workflow are clear enough. The return on investment is a long-term proposition. Regardless of the benefits XML may provide, the starting reality is that it represents a very different way of doing things than the one we are familiar with. The Word Processing and Desktop Publishing paradigm, based on the promise of onscreen, WYSIWYG layout, is so dominant as to be practically inescapable. It has proven really hard to get from here to there, no matter how attractive XML might be on paper. A considerable amount of organizational effort and labour must be expended up front in order to realize the benefits. This is why XML is often referred to as an “investment”: you sink a bunch of time and money up front, and realize the benefits—greater flexibility, multiple output options, searching and indexing, and general futureproofing—later, over the long haul. It is not a short-term return proposition. And, of course, the returns you are able to realize from your XML investment are commensurate with what you put in up front: fine-grained, semantically rich tagging is going to give you more potential for searchability and recombination than a looser, more general-purpose approach, but it sure costs more. For instance, the Text Encoding Initiative (TEI) is the grand example of pouring enormous amounts of energy into the up-front tagging, with a very open-ended set of possibilities down the line. TEI helpfully defines a level to which most of us do not have to aspire.[5] But understanding this on a theoretical level is only part of the challenge. There are many practical issues that must be addressed. Software and labour are two of the most critical. How do you get the content into XML in the first place? Unfortunately, despite two decades of people doing SGML and XML, this remains an ugly question.
  • Practical Challenges. In 2009, there is still no truly likeable—let alone standard—editing and authoring software for XML. For many (myself included), the high-water mark here was Adobe’s FrameMaker, substantially developed by the late 1990s. With no substantial market for it, it is relegated today mostly to the tech writing industry, unavailable for the Mac, and just far enough afield from the kinds of tools we use today that its adoption represents a significant hurdle. And FrameMaker was the best of the breed; most of the other software in decent circulation are programmers’ tools—the sort of things that, as Michael Tamblyn pointed out, encourage editors to drink at their desks. The labour question represents a stumbling block as well. The skill-sets and mind-sets that effective XML editors need have limited overlap with those needed by literary and more traditional production editors. The need to think of documents as machine-readable databases is not something that comes naturally to folks steeped in literary culture. In combination with the sheer time and effort that rich tagging requires, many publishers simply outsource the tagging to India, drawing a division of labour that spans oceans, to put it mildly. Once you have XML content, then what do you do with it? How do you produce books from it? Presumably, you need to be able to produce print output as well as digital formats. But while the latter are new enough to be generally XML-friendly (e-book formats being largely XML based, for instance), there aren’t any straightforward, standard ways of moving XML content into the kind of print production environments we are used to seeing. This isn’t to say that there aren’t ways of getting print—even very high-quality print—output from XML, just that most of them involve replacing your prepress staff with Java programmers.
  • Why does this have to be so hard? It’s not that XML is new, or immature, or untested. Remember that the basics have been around, and in production, since the early 1980s at least. But we have to take account of a substantial and long-running cultural disconnect between traditional editorial and production processes (the ones most of us know intimately) and the ways computing people have approached things. Interestingly, this cultural divide looked rather different in the 1970s, when publishers were looking at how to move to digital typesetting. Back then, printers and software developers could speak the same language. But that was before the ascendancy of the Desktop Publishing paradigm, which computerized the publishing industry while at the same time isolating it culturally. Those of us who learned how to do things the Quark way or the Adobe way had little in common with people who programmed databases or document-management systems. Desktop publishing technology isolated us in a smooth, self-contained universe of toolbars, grid lines, and laser proofs. So, now that the reasons to get with this program, XML, loom large, how can we bridge this long-standing divide?
  • ...44 more annotations...
  • Using the Web as a Production Platform. The answer, I think, is right in front of you. The bridge is the Web, a technology and platform that is fundamentally based on XML, and which many publishers are by now comfortably familiar with. Perhaps not entirely comfortably, but at least most publishers are already working with the Web; they already either know or have on staff people who understand it and can work with it. The foundation of our argument is this: rather than looking at jumping to XML in its full, industrial complexity, which seems to be what the O'Reilly-backed StartWithXML initiative[6] is suggesting, publishers instead leverage existing tools and technologies—starting with the Web—as a means of getting XML workflows in place. This means making small investments and working with known tools rather than spending tens of thousands of dollars on XML software and rarefied consultants. It means re-thinking how the existing pieces of the production toolchain fit together; re-thinking the existing roles of software components already in use. It means, fundamentally, taking the Web seriously as a content platform, rather than thinking of it as something you need to get content out to, somehow. If nothing else, the Web represents an opportunity to think about editorial and production from outside the shrink-wrapped Desktop Publishing paradigm.
  • Is the Web made of Real XML? At this point some predictable objections can be heard: wait a moment, the Web isn’t really made out of XML; the HTML that makes up most of the Web is at best the bastard child of SGML, and it is far too flaky/unstructured/underpowered to be taken seriously. We counter by arguing that although HTML on the Web exists in a staggering array of different incarnations, and that the majority of it is indeed an unstructured mess, this does not undermine the general principle that basic, ubiquitous Web technologies can make a solid platform for content management, editorial process, and production workflow.
  • With the advent of a published XML standard in the late 1990s came the W3C’s adoption of XHTML: the realization of the Web’s native content markup as a proper XML document type. Today, its acceptance is almost ubiquitous, even while the majority of actual content out there may not be strictly conforming. The more important point is that most contemporary Web software, from browsers to authoring tools to content management systems (from blogs to enterprise systems), are capable of working with clean, valid XHTML. Or, to put the argument the other way around, clean, valid XHTML content plays absolutely seamlessly with everything else on the Web.[7]
  • The objection which follows, then, will be that even if we grant that XHTML is a real XML document type, it is underpowered for “serious” content because it is almost entirely presentation (formatting) oriented; it lacks any semantic depth. In XHTML, a paragraph is a paragraph is a paragraph, as opposed to a section or an epigraph or a summary.
  • In contrast, more “serious” XML document types like DocBook[8] or DITA-derived schemas[9] are capable of making semantic distinctions about content chunks at a fine level of granularity and with a high degree of specificity.
  • So there is an argument for recalling the 80:20 rule here. If XHTML can provide 80% of the value with just 20% of the investment, then what exactly is the business case for spending the other 80% to achieve that last 20% of value? We suspect the ratio is actually quite a bit steeper than 80:20 for most publishers.
  • Furthermore, just to get technical for a moment, XHTML is extensible in a fairly straightforward way, through the common “class” attribute on each element. Web developers have long leveraged this kind of extensibility in the elaboration of “microformats” for semantic-web applications.[10] There is no reason why publishers shouldn’t think to use XHTML’s simple extensibility in a similar way for their own ends.
  • XHTML, on the other hand, is supported by a vast array of quotidian software, starting with the ubiquitous Web browser. For this very reason, XHTML is in fact employed as a component part of several more specialized document types (ONIX and ePub among them).
  • Why re-invent a general-purpose prose representation when XHTML already does the job?
  • It is worth pausing for a moment to consider the role of XHTML in the ePub standard for ebook content. An ePub file is, anatomically, a simply disguised zip archive. Inside the zip archive are a few standard component parts: there are specialized files that declare metadata about the book, and about the format of the book. And then there is the book’s content, represented in XHTML. An ePub book is a Web page in a wrapper.
  • To sum up the general argument: the Web as it already exists presents incredible value to publishers, as a platform for doing XML content management with existing (and often free) tools, and without having to go blindly into the unknown. At this point, we can offer a few design guidelines: prefer existing and/or ubiquitous tools over specialized ones wherever possible; prefer free software over proprietary systems where possible; prefer simple tools controlled and coordinated by human beings over fully automated (and therefore complex) systems; play to our strengths: use Web software for storing and managing content, use layout software for layout, and keep editors and production people in charge of their own domains.
  • Putting the Pieces Together: A Prototype
  • At the SFU Master of Publishing Program, we have been chipping away at this general line of thinking for a few years. Over that time, Web content management systems have been getting more and more sophisticated, all the while getting more streamlined and easier to use. (NB: if you have a blog, you have a Web content management system.) The Web is beginning to be recognized as a writing and editing environment used by millions of people. And the ways in which content is represented, stored, and exchanged online have become increasingly robust and standardized.
  • The missing piece of the puzzle has been print production: how can we move content from its malleable, fluid form on line into the kind of high-quality print production environments we’ve come to expect after two decades of Desktop Publishing?
  • Anyone who has tried to print Web content knows that the existing methods leave much to be desired (hyphenation and justification, for starters). In the absence of decent tools for this, most publishers quite naturally think of producing the print content first, and then think about how to get material onto the Web for various purposes. So we tend to export from Word, or from Adobe, as something of an afterthought.
  • While this sort of works, it isn’t elegant, and it completely ignores the considerable advantages of Web-based content management.
  • Content managed online is stored in one central location, accessible simultaneously to everyone in your firm, available anywhere you have an Internet connection, and usually exists in a much more fluid format than Word files. If only we could manage the editorial flow online, and then go to print formats at the end, instead of the other way around. At SFU, we made several attempts to make this work by way of the supposed “XML import” capabilities of various Desktop Publishing tools, without much success.[12]
  • In the winter of 2009, Adobe solved this part of the problem for us with the introduction of its Creative Suite 4. What CS4 offers is the option of a complete XML representation of an InDesign document: what Adobe calls IDML (InDesign Markup Language).
  • The IDML file format is—like ePub—a simply disguised zip archive that, when unpacked, reveals a cluster of XML files that represent all the different facets of an InDesign document: layout spreads, master pages, defined styles, colours, and of course, the content.
  • IDML is a well thought-out XML standard that achieves two very different goals simultaneously: it preserves all of the information that InDesign needs to do what it does; and it is broken up in a way that makes it possible for mere mortals (or at least our Master of Publishing students) to work with it.
  • What this represented to us in concrete terms was the ability to take Web-based content and move it into InDesign in a straightforward way, thus bridging Web and print production environments using existing tools and skillsets, with a little added help from free software.
  • We would take clean XHTML content, transform it to IDML-marked content, and merge that with nicely designed templates in InDesign.
  • The result is an almost push-button publication workflow, which results in a nice, familiar InDesign document that fits straight into the way publishers actually do production.
  • Tracing the steps. To begin with, we worked backwards, moving the book content back to clean XHTML.
  • The simplest method for this conversion—and if you want to create Web content, this is an excellent route—was to use Adobe’s “Export to Digital Editions” option, which creates an ePub file.
  • Recall that ePub is just XHTML in a wrapper, so within the ePub file was a relatively clean XHTML document. It was somewhat cleaner (that is, the XHTML tagging was simpler and less cluttered) than InDesign’s other Web-oriented exports, possibly because Digital Editions is a well understood target, compared with somebody’s website.
  • In order to achieve our target of clean XHTML, we needed to do some editing; the XHTML produced by InDesign’s “Digital Editions” export was presentation-oriented. For instance, bulleted list items were tagged as paragraphs, with a class attribute identifying them as list items. Using the search-and-replace function, we converted such structures to proper XHTML list and list-item elements. Our guiding principle was to make the XHTML as straightforward as possible, not dependent on any particular software to interpret it.
  • We broke the book’s content into individual chapter files; each chapter could then carry its own basic metadata, and the pages conveniently fit our Web content management system (which is actually just a wiki). We assembled a dynamically generated table of contents for the 12 chapters, and created a cover page. Essentially, the book was entirely Web-based at this point.
  • When the book chapters are viewed online, they are formatted via a CSS2 stylesheet that defines a main column for content as well as dedicating screen real estate for navigational elements. We then created a second template to render the content for exporting; this was essentially a bare-bones version of the book with no navigation and minimal styling. Pages (or even the entire book) can be exported (via the “Save As...” function in a Web browser) for use in either print production or ebook conversion. At this point, we required no skills beyond those of any decent Web designer.
  • Integrating with CS4 for Print. Adobe’s IDML language defines elements specific to InDesign; there is nothing in the language that looks remotely like XHTML. So a mechanical transformation step is needed to convert the XHTML content into something InDesign can use. This is not as hard as it might seem.
  • Both XHTML and IDML are composed of straightforward, well-documented structures, and so transformation from one to the other is, as they say, “trivial.” We chose to use XSLT (Extensible Stylesheet Language Transformations) to do the work. XSLT is part of the overall XML specification, and thus is very well supported in a wide variety of tools. Our prototype used a scripting engine called xsltproc, a nearly ubiquitous piece of software that we found already installed as part of Mac OS X (contemporary Linux distributions also have this as a standard tool), though any XSLT processor would work.
  • In other words, we don’t need to buy InCopy, because we just replaced it with the Web. Our wiki is now plugged directly into our InDesign layout. It even automatically updates the InDesign document when the content changes. Credit is due at this point to Adobe: this integration is possible because of the open file format in the Creative Suite 4.
  • We wrote an XSLT transformation script[18] that converted the XHTML content from the Web into an InCopy ICML file. The script itself is less than 500 lines long, and was written and debugged over a period of about a week by amateurs (again, the people named at the start of this article). The script runs in a couple of seconds, and the resulting .icml file can then be “placed” directly into an InDesign template. The ICML file references an InDesign stylesheet, so the template file can be set up with a house-styled layout, master pages, and stylesheet definitions for paragraphs and character ranges.
  • The result is very simple and easy to use. Our demonstration requires that a production editor run the XSLT transformation script manually, but there is no reason why this couldn’t be built directly into the Web content management system so that exporting the content to print ran the transformation automatically. The resulting file would then be “placed” in InDesign and proofed.
  • It should be noted that the Book Publishing 1 proof-of-concept was artificially complex; we began with a book laid out in InDesign and ended up with a look-alike book laid out in InDesign. But next time—for instance, when we publish Book Publishing 2—we can begin the process with the content on the Web, and keep it there throughout the editorial process. The book’s content could potentially be written and edited entirely online, as Web content, and then automatically poured into an InDesign template at proof time. “Just in time,” as they say. This represents an entirely new way of thinking of book production. With a Web-first orientation, it makes little sense to think of the book as “in print” or “out of print”—the book is simply available, in the first place online; in the second place in derivative digital formats; and third, but really not much more difficult, in print-ready format, via the usual InDesign CS print production system publishers are already familiar with.
  • Creating Ebook Files. Creating electronic versions from XHTML source is vastly simpler than trying to generate these out of the existing print process. The ePub version is extremely easy to generate; so is online marketing copy or excerpts for the Web, since the content begins life Web-native.
  • Since an ePub file is essentially XHTML content in a special wrapper, all that is required is that we properly “wrap” our XHTML content. Ideally, the content in an ePub file is broken into chapters (as ours was) and a table of contents file is generated in order to allow easy navigation within an ebook reader. We used Julian Smart’s free tool eCub[19] to simply and automatically generate the ePub wrapper and the table of contents. The only custom development we did was to create a CSS stylesheet for the ebook so that headings and paragraph indents looked the way we wanted. Starting with XHTML content, creating ePub is almost too easy.
  • Such a workflow—beginning with the Web and exporting to print—is surely more in line with the way we will do business in the 21st century, where the Web is the default platform for reaching audiences, developing content, and putting the pieces together. It is time, we suggest, for publishers to re-orient their operations and start with the Web.
  • Our project demonstrates that Web technologies are indeed good enough to use in an XML-oriented workflow; more specialized and expensive options are not necessarily required. For massive-scale enterprise publishing, this approach may not offer enough flexibility, and the challenge of adding and extracting extra semantic richness may prove more trouble than it's worth.
  • But for smaller firms who are looking at the straightforward benefits of XML-based processes—single source publishing, online content and workflow management, open and accessible archive formats, greater online discoverability—here is a way forward.
  • Rather than a public-facing website, our system relies on the Web as a content management platform—of course a public face could easily be added.
  • The final piece of our puzzle, the ability to integrate print production, was made possible by Adobe's release of InDesign with an open XML file format. Since the Web's XHTML is also XML, it can be easily and confidently transformed to the InDesign format.
  • today, we are able to put the process together using nothing but standard, relatively ubiquitous Web tools: the Web itself as an editing and content management environment, standard Web scripting tools for the conversion process, and the well-documented IDML file format to integrate the layout tool.
  • Using the Web as a Production Platform
  •  
    I was looking for an answer to a problem Marbux had presented, and found this interesting article. The issue was that of the upcoming conversion of the Note Case Pro (NCP) layout engine to the WebKit layout engine, and what to do about the NCP document format. My initial reaction was to encode the legacy NCP document format in XML, and run an XSLT to a universal pivot format like TEI-XML. From there, the TEI-XML community would provide all the XSLT transformation routines for conversion to ODF, OOXML, XHTML, ePUB and HTML/CSS. Researching the problems one might encounter with this approach, I found this article. Fascinating stuff. My takeaway is that TEI-XML would not be as effective a "universal pivot point" as XHTML. Or perhaps, if NCP really wants to get aggressive, IDML - InDesign Markup Language. The important point though is that XHTML is a browser-specific version of XML, and compatible with the WebKit layout engine Miro wants to move NCP to. The concept of encoding an existing application-specific format in XML has been around since 1998, when XML was first introduced as a W3C standard, a "structured" subset of SGML. (HTML is an application of SGML.) The multiplatform StarOffice productivity suite became "OpenOffice" when Sun purchased the company in 1999 and open-sourced the code base. The OpenOffice developer team came out with an XML encoding of their existing document formats in 2000. The application-specific encoding became an OASIS document format standard proposal in 2002 - also known as ODF. Microsoft followed OpenOffice with an XML encoding of their application-specific binary document formats, known as OOXML. Encoding the existing NCP format in XML, specifically targeting XHTML as a "universal pivot point", would put the NCP Outliner in the Web editor category, without breaking backwards compatibility. The trick is in the XSLT conversion process. But I think that is something much easier to handle than trying to
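The clean-up step the authors describe, regrouping InDesign's “Digital Editions” export into proper list markup, can be scripted rather than done by hand with search-and-replace. A minimal sketch in Python with lxml; the class name "list-item" and the file names are illustrative assumptions, not details taken from the article:

    # Regroup consecutive <p class="list-item"> paragraphs into <ul>/<li>.
    # Assumes the un-namespaced markup that lxml's HTML parser produces.
    from lxml import etree, html

    def regroup_list_items(doc):
        for para in doc.xpath('//p[@class="list-item"]'):
            prev = para.getprevious()
            if prev is not None and prev.tag == 'ul':
                ul = prev                    # extend the list already open
            else:
                ul = etree.Element('ul')     # start a new list before this item
                para.addprevious(ul)
            li = etree.SubElement(ul, 'li')
            li.text = para.text
            for child in list(para):         # carry over inline markup (em, a, ...)
                li.append(child)
            para.getparent().remove(para)
        return doc

    doc = html.parse('chapter1.xhtml')       # illustrative file name
    regroup_list_items(doc)
    doc.write('chapter1-clean.xhtml', method='xml', encoding='utf-8')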
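The XHTML-to-ICML conversion the article calls “trivial” is a single XSLT pass; the authors ran theirs with xsltproc. The sketch below expresses the same idea with lxml's built-in XSLT engine so the example stays in one language. The ICML element names here are simplified stand-ins for illustration; the real vocabulary, documented in Adobe's IDML specification, is more involved:

    # Map XHTML paragraphs onto ICML-like paragraph ranges via XSLT 1.0.
    from lxml import etree

    XSL = etree.XML(b"""
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/">
        <Story><xsl:apply-templates select="//p"/></Story>
      </xsl:template>
      <xsl:template match="p">
        <ParagraphStyleRange AppliedParagraphStyle="ParagraphStyle/Body">
          <Content><xsl:value-of select="."/></Content>
        </ParagraphStyleRange>
      </xsl:template>
    </xsl:stylesheet>
    """)

    transform = etree.XSLT(XSL)
    # Note: if the source declares the XHTML namespace, '//p' would need a
    # namespace prefix; the file produced in the previous sketch does not.
    xhtml = etree.parse('chapter1-clean.xhtml')
    icml = transform(xhtml)
    print(etree.tostring(icml, pretty_print=True).decode())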
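Finally, the ePub wrapping the authors delegated to eCub can be done directly, since, as the article notes, an ePub is a zip archive whose first entry is an uncompressed mimetype file, plus a container pointer, a package manifest, and the XHTML content. A bare-bones sketch; the title, identifier, and file names are placeholders, and the manifest omits the NCX table of contents a strict EPUB 2 validator would also expect:

    # Wrap XHTML content as an ePub: a zip with a leading, uncompressed
    # 'mimetype' entry, a META-INF pointer, and an OPF package manifest.
    import zipfile

    CONTAINER = """<?xml version="1.0"?>
    <container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
      <rootfiles>
        <rootfile full-path="content.opf" media-type="application/oebps-package+xml"/>
      </rootfiles>
    </container>"""

    OPF = """<?xml version="1.0"?>
    <package version="2.0" xmlns="http://www.idpf.org/2007/opf" unique-identifier="bookid">
      <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
        <dc:title>Book Publishing 1</dc:title>
        <dc:identifier id="bookid">urn:example:bookpub1</dc:identifier>
        <dc:language>en</dc:language>
      </metadata>
      <manifest>
        <item id="c1" href="chapter1-clean.xhtml" media-type="application/xhtml+xml"/>
      </manifest>
      <spine><itemref idref="c1"/></spine>
    </package>"""

    with zipfile.ZipFile('book.epub', 'w') as z:
        # 'mimetype' must be first in the archive and stored uncompressed.
        z.writestr('mimetype', 'application/epub+zip', zipfile.ZIP_STORED)
        z.writestr('META-INF/container.xml', CONTAINER, zipfile.ZIP_DEFLATED)
        z.writestr('content.opf', OPF, zipfile.ZIP_DEFLATED)
        z.write('chapter1-clean.xhtml', 'chapter1-clean.xhtml', zipfile.ZIP_DEFLATED)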
Paul Merrell

LocalOrg: Decentralizing Telecom - 0 views

  • SOPA, ACTA, the criminalization of sharing, and a myriad of other measures taken to perpetuate antiquated business models propping up enduring monopolies - all have become increasingly taxing on the tech community and informed citizens alike. When the storm clouds gather and torrential rain begins to fall, the people have managed to stave off the flood waters through collective effort and well organized activism - stopping, or at least delaying SOPA and ACTA. However, is it really sustainable to mobilize each and every time multi-billion dollar corporations combine their resources and attempt to pass another series of draconian rules and regulations? Instead of manning the sandbags during each storm, wouldn't it suit us all better to transform the surrounding landscape in such a way as to harmlessly divert the floods, or better yet, harness them to our advantage? In many ways the transformation has already begun.
  • While open source software and hardware, as well as innovative business models built around collaboration and crowd-sourcing, have done much to build a paradigm independent of current centralized proprietary business models, large centralized corporations and the governments that do their bidding still guard all the doors and carry all the keys. The Internet, the phone networks, radio waves, and satellite systems still remain firmly in the hands of big business. As long as they do, they retain the ability not only to reassert themselves in areas where gains have been made, but to impose preemptive measures that prevent any future progress. With the advent of hackerspaces, we increasingly see projects with the potential to replace, at least on a local level, much of the centralized infrastructure we take for granted until disasters or greed-driven rules and regulations upset the balance. It is by further developing our local infrastructure that we can leave behind the sandbags of perpetual activism and enjoy a permanently altered landscape that favors our peace and prosperity.
  • As impressive as a hydroelectric dam may be, and as overwhelming as it may seem as a project to undertake, it will always start with but a single shovelful of dirt. The work required becomes in its own way part of the payoff - with experience gained and a magnificent accomplishment to aspire toward. In the same way, a communication network that runs parallel to existing networks, with global coverage but locally controlled, may seem an impossible, overwhelming objective - and for one individual, or even a small group of individuals, it is. However, the paradigm has shifted. In the age of digital collaboration made possible by existing networks, the building of such a network can be done in parallel. In an act of digital judo, we can use the system's infrastructure as a means of supplanting and replacing it with something superior in both function and form.
Paul Merrell

For sale: Systems that can secretly track where cellphone users go around the globe - T... - 0 views

  • Makers of surveillance systems are offering governments across the world the ability to track the movements of almost anybody who carries a cellphone, whether they are blocks away or on another continent. The technology works by exploiting an essential fact of all cellular networks: They must keep detailed, up-to-the-minute records on the locations of their customers to deliver calls and other services to them. Surveillance systems are secretly collecting these records to map people’s travels over days, weeks or longer, according to company marketing documents and experts in surveillance technology.
  • The world’s most powerful intelligence services, such as the National Security Agency and Britain’s GCHQ, long have used cellphone data to track targets around the globe. But experts say these new systems allow less technically advanced governments to track people in any nation — including the United States — with relative ease and precision.
  • It is unclear which governments have acquired these tracking systems, but one industry official, speaking on the condition of anonymity to share sensitive trade information, said that dozens of countries have bought or leased such technology in recent years. This rapid spread underscores how the burgeoning, multibillion-dollar surveillance industry makes advanced spying technology available worldwide. “Any tin-pot dictator with enough money to buy the system could spy on people anywhere in the world,” said Eric King, deputy director of Privacy International, a London-based activist group that warns about the abuse of surveillance technology. “This is a huge problem.”
  • Security experts say hackers, sophisticated criminal gangs and nations under sanctions also could use this tracking technology, which operates in a legal gray area. It is illegal in many countries to track people without their consent or a court order, but there is no clear international legal standard for secretly tracking people in other countries, nor is there a global entity with the authority to police potential abuses.
  • Tracking systems that access carrier location databases are unusual in their ability to allow virtually any government to track people across borders, with any type of cellular phone, across a wide range of carriers — without the carriers even knowing. These systems also can be used in tandem with other technologies that, when the general location of a person is already known, can intercept calls and Internet traffic, activate microphones, and access contact lists, photos and other documents. Companies that make and sell surveillance technology seek to limit public information about their systems’ capabilities and client lists, typically marketing their technology directly to law enforcement and intelligence services through international conferences that are closed to journalists and other members of the public.
  • Yet marketing documents obtained by The Washington Post show that companies are offering powerful systems that are designed to evade detection while plotting movements of surveillance targets on computerized maps. The documents claim system success rates of more than 70 percent. A 24-page marketing brochure for SkyLock, a cellular tracking system sold by Verint, a maker of analytics systems based in Melville, N.Y., carries the subtitle “Locate. Track. Manipulate.” The document, dated January 2013 and labeled “Commercially Confidential,” says the system offers government agencies “a cost-effective, new approach to obtaining global location information concerning known targets.”
  • (Privacy International has collected several marketing brochures on cellular surveillance systems, including one that refers briefly to SkyLock, and posted them on its Web site. The 24-page SkyLock brochure and other material was independently provided to The Post by people concerned that such systems are being abused.)
  • Verint, which also has substantial operations in Israel, declined to comment for this story. It says in the marketing brochure that it does not use SkyLock against U.S. or Israeli phones, which could violate national laws. But several similar systems, marketed in recent years by companies based in Switzerland, Ukraine and elsewhere, likely are free of such limitations.
  • The tracking technology takes advantage of the lax security of SS7, a global network that cellular carriers use to communicate with one another when directing calls, texts and Internet data. The system was built decades ago, when only a few large carriers controlled the bulk of global phone traffic. Now thousands of companies use SS7 to provide services to billions of phones and other mobile devices, security experts say. All of these companies have access to the network and can send queries to other companies on the SS7 system, making the entire network more vulnerable to exploitation. Any one of these companies could share its access with others, including makers of surveillance systems.
  • Companies that market SS7 tracking systems recommend using them in tandem with “IMSI catchers,” increasingly common surveillance devices that use cellular signals collected directly from the air to intercept calls and Internet traffic, send fake texts, install spyware on a phone, and determine precise locations. IMSI catchers — also known by one popular trade name, StingRay — can home in on somebody a mile or two away but are useless if a target’s general location is not known. SS7 tracking systems solve that problem by locating the general area of a target so that IMSI catchers can be deployed effectively. (The term “IMSI” refers to a unique identifying code on a cellular phone.)
  • Verint can install SkyLock on the networks of cellular carriers if they are cooperative — something that telecommunications experts say is common in countries where carriers have close relationships with their national governments. Verint also has its own “worldwide SS7 hubs” that “are spread in various locations around the world,” says the brochure. It does not list prices for the services, though it says that Verint charges more for the ability to track targets in many far-flung countries, as opposed to only a few nearby ones. Among the most appealing features of the system, the brochure says, is its ability to sidestep the cellular operators that sometimes protect their users’ personal information by refusing government requests or insisting on formal court orders before releasing information.
  • Another company, Defentek, markets a similar system called Infiltrator Global Real-Time Tracking System on its Web site, claiming to “locate and track any phone number in the world.” The site adds: “It is a strategic solution that infiltrates and is undetected and unknown by the network, carrier, or the target.”
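    These excerpts turn on one technical fact: to route calls at all, every cellular network keeps a live register mapping each subscriber to the switch and tower currently serving them. A toy model of that bookkeeping (a sketch only; the field and function names are illustrative, not any carrier's real schema) makes the attack surface plain:

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LocationRecord:
    imsi: str          # subscriber identity stored on the SIM
    serving_vlr: str   # the visitor register / switch now serving the phone
    lac: int           # Location Area Code: a cluster of nearby towers
    cell_id: int       # the specific tower, often within a few hundred meters
    updated: datetime  # refreshed whenever the phone moves or traffic arrives

# The home register: one live entry per subscriber.
hlr: dict[str, LocationRecord] = {}

def location_update(imsi: str, vlr: str, lac: int, cell_id: int) -> None:
    """What happens, in effect, each time a phone attaches to a new tower."""
    hlr[imsi] = LocationRecord(imsi, vlr, lac, cell_id,
                               datetime.now(timezone.utc))

def locate(imsi: str) -> LocationRecord:
    """The routing lookup a tracking system piggybacks on."""
    return hlr[imsi]

    The abuse described above is less a break-in than a misuse of design: the network must answer locate() to route calls, and SS7 gives every interconnected company - and whoever they share access with - the standing to ask.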
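    The location answer itself is compact. In GSM networks the serving cell is reported as a seven-octet Cell Global Identity; a short decoder, assuming the standard 3GPP TS 24.008 BCD layout (a sketch, not production parsing), shows that country, carrier, area and tower all fit in those seven bytes:

def decode_cgi(cgi: bytes) -> dict:
    """Decode a 7-octet GSM Cell Global Identity (3GPP TS 24.008 layout)."""
    if len(cgi) != 7:
        raise ValueError("CGI must be exactly 7 octets")
    # Octets 1-3: MCC and MNC digits packed as nibble-swapped BCD.
    mcc = f"{cgi[0] & 0x0F}{cgi[0] >> 4}{cgi[1] & 0x0F}"
    mnc3 = cgi[1] >> 4                      # 0xF marks a two-digit MNC
    mnc = f"{cgi[2] & 0x0F}{cgi[2] >> 4}" + ("" if mnc3 == 0xF else str(mnc3))
    lac = int.from_bytes(cgi[3:5], "big")   # Location Area Code
    ci = int.from_bytes(cgi[5:7], "big")    # Cell Identity
    return {"mcc": mcc, "mnc": mnc, "lac": lac, "ci": ci}

# Example: MCC 310 (US), two-digit MNC 26, LAC 0x1234, cell 0x00A1.
print(decode_cgi(bytes([0x13, 0xF0, 0x62, 0x12, 0x34, 0x00, 0xA1])))

    Turning that tuple into map coordinates is then a lookup against a cell-tower database, which is how an SS7 query and an IMSI catcher combine into the precise tracking described above.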
  •  
    The Verint company has very close ties to the Israeli government. Its former parent company, Comverse, was heavily subsidized by Israel, and the bulk of its manufacturing and code development was done in Israel. See https://en.wikipedia.org/wiki/Comverse_Technology "In December 2001, a Fox News report raised the concern that wiretapping equipment provided by Comverse Infosys to the U.S. government for electronic eavesdropping may have been vulnerable, as these systems allegedly had a back door through which the wiretaps could be intercepted by unauthorized parties.[55] Fox News reporter Carl Cameron said there was no reason to believe the Israeli government was implicated, but that "a classified top-secret investigation is underway".[55] A March 2002 story by Le Monde recapped the Fox report and concluded: "Comverse is suspected of having introduced into its systems of the 'catch gates' in order to 'intercept, record and store' these wire-taps. This hardware would render the 'listener' himself 'listened to'."[56] Fox News did not pursue the allegations, and in the years since, there have been no legal or commercial actions of any type taken against Comverse by the FBI or any other branch of the US Government related to data access and security issues. While no real evidence has been presented against Comverse or Verint, the allegations have become a favorite topic of conspiracy theorists.[57] By 2005, the company had $959 million in sales and employed over 5,000 people, of whom about half were located in Israel.[16]" Verint is also the company that got the Dept. of Homeland Security contract to provide and install an electronic and video surveillance system across the entire U.S. border with Mexico. One need not be much of a conspiracy theorist to have concerns about Verint's likely interactions and data sharing with the NSA and its Israeli equivalent, Unit 8200.
Paul Merrell

Microsoft pledges to tell email customers of state-sponsored hacking in future - Techno... - 0 views

  • Microsoft Corp. has agreed to change its policies and always tell email customers when it suspects there has been a government hacking attempt after widespread hacking by Chinese authorities was exposed. Microsoft experts concluded several years ago that Chinese authorities had hacked into more than a thousand Hotmail email accounts, targeting international leaders of China's Tibetan and Uighur minorities in particular — but it decided not to tell the victims, allowing the hackers to continue their campaign, according to former employees of the company. On Wednesday, after a series of requests for comment from Reuters, Microsoft said it would change its policy on notifying customers. Microsoft spokesman Frank Shaw said the company was never certain of the origin of the Hotmail attacks.
  • The company also confirmed for the first time that it had not called, emailed or otherwise told the Hotmail users that their electronic correspondence had been collected. The company declined to say what role the exposure of the Hotmail campaign played in its decision to make the policy shift. The first public signal of the attacks came in May 2011, though no direct link was immediately made with the Chinese authorities.
  • That's when security firm Trend Micro Inc. announced it had found an email sent to someone in Taiwan that contained a miniature computer program. The program took advantage of a previously undetected flaw in Microsoft's own web pages to direct Hotmail and other free Microsoft email services to secretly forward copies of all of a recipient's incoming mail to an account controlled by the attacker. Trend Micro found more than a thousand victims, and Microsoft patched the vulnerability before the security company announced its findings publicly.