
Open Web: Group items tagged internet-access


Paul Merrell

US websites should inform EU citizens about NSA surveillance, says report - 0 views

  • All existing data sharing agreements between Europe and the US should be revoked, and US web site providers should prominently inform European citizens that their data may be subject to government surveillance, according to the recommendations of a briefing report for the European Parliament. The report was produced in response to revelations about the US National Security Agency (NSA) snooping on internet traffic, and aims to highlight the subsequent effect on European Union (EU) citizens' rights.
  • The report warns that EU data protection authorities have failed to understand the “structural shift of data sovereignty implied by cloud computing”, and the associated risks to the rights of EU citizens. It suggests “a full industrial policy for development of an autonomous European cloud computing capacity” should be set up to reduce exposure of EU data to NSA surveillance that is undertaken by the use of US legislation that forces US-based cloud providers to provide access to data they hold.
  • To put pressure on the US government, the report recommends that US websites should ask EU citizens for their consent before gathering data that could be used by the NSA. “Prominent notices should be displayed by every US web site offering services in the EU to inform consent to collect data from EU citizens. The users should be made aware that the data may be subject to surveillance by the US government for any purpose which furthers US foreign policy,” it said. “A consent requirement will raise EU citizen awareness and favour growth of services solely within EU jurisdiction. This will thus have economic impact on US business and increase pressure on the US government to reach a settlement.”
  • Other recommendations include the EU offering protection and rewards for whistleblowers, including “strong guarantees of immunity and asylum”. Such a move would be seen as a direct response to the plight of Edward Snowden, the former NSA analyst who leaked documents that revealed the extent of the NSA’s global internet surveillance programmes. The report also says that, “Encryption is futile to defend against NSA accessing data processed by US clouds,” and that there is “no technical solution to the problem”. It calls for the EU to press for changes to US law.
  • “It seems that the only solution which can be trusted to resolve the Prism affair must involve changes to the law of the US, and this should be the strategic objective of the EU,” it said. The report was produced for the European Parliament committee on civil liberties, justice and home affairs, and comes before the latest hearing of an inquiry into electronic mass surveillance of EU citizens, due to take place in Brussels on 24 September.
  •  
    Yee-haw! E.U. sanctuary and rewards for NSA whistle-blowers. Mandatory warnings for customers of U.S. cloud services that their data may be turned over to the NSA. Pouring more gasoline on the NSA diplomatic fire. 
Gary Edwards

In Mobile, Fragmentation is Forever. Deal With It. - washingtonpost.com - 0 views

  •  
    I disagree with the author's conclusions here. He misses some very significant developments, particularly around Google, WebKit, and WebKit-HTML5. For instance, there is this article out today: "Google Really is Giving Away Free Nexus One and Droid Handsets to Developers". Also, Palm is working on a WiMAX/WiFi version of their WebOS (WebKit) smartphone for Sprint.

    Sprint and ClearWire are pushing forward with a very aggressive WiMAX rollout in the USA. San Francisco should go online this year! One of the more interesting things about the Sprint WiMAX plan is that they have a set fee of $69.00 per month that covers EVERYTHING: cellphone, WiMAX Web browsing, video and data connectivity, texting (SMS), and VOIP. Major Sprint competitors Verizon, AT&T and T-Mobile charge $69 per month, but it only covers cellphone access. Everything else is extra, and at low speed/low bandwidth: 3G at best. WiMAX, however, is a 4G screamer. It's also an open standard. (Verizon FiOS and LTE are comparable and said to be coming soon, but they are proprietary technologies.)

    The cable guys are interesting in that they are major backers of WiMAX, but they also have a bandwidth-explosive technology called DOCSIS.

    There is an interesting article at TechCrunch, "In Mobile, Fragmentation is Forever. Deal With It." I disagree entirely with the author's conclusion. WebKit is capable of providing a universal HTML5 application developer layer for mobile and desktop browser computing. It's supported by Apple, Google, Palm (WebOS), Nokia, RIM (Blackberry) and others to such an extent that 85% of all smartphones shipped this year will either ship with WebKit or with an Opera browser compatible with the WebKit HTML5 document layout/rendering model.

    I would even go as far as to say that WebKit-HTML5 owns the Web's document model and application layer for the future. Excepting for Silverlight, which features the OOXML document model with over 500 million desktop develop
Gary Edwards

RuleLab.Net Server: Web system for design, implementation and management of business pr... - 0 views

  •  
    RuleLab.Net is a web-based system for designing and implementing the business rules that operate on an application's XML data. Extend your existing applications by adding rule-building and Business Rules Engine (BRE) capabilities. Consolidate your business logic in an easy-to-read format; build, test, share, and deploy your Rules using the web browser; and integrate them into your system via the BRE. An intuitive GUI, English-like syntax, and a centralized repository give business users direct access to the Rules.

    In the RuleLab.Net system, Business Rules are composed and managed over the Internet or an intranet using the web-based Rules Designer. It allows users to associate an application XML data template with Rules, create a vocabulary of natural terms, graphically build complex logical expressions, test the Rules on data samples, and store the Rules in a database. Features include strong data types, reasoning, rule priorities and dependencies, calculation formulas, looping-data-structure support, and a built-in set of computational, aggregate, and other data-processing functions. Rules and other system objects are stored in XML files that can be downloaded, modified, and uploaded to the online repository. Rule changes made online can be instantly deployed for runtime use by the applications integrated with the BRE. The forward-chaining BRE parses XML application data against the ruleset, updates your data XML document, and returns it to the application along with comprehensive state information. Written in .NET, the BRE component can be utilized as a managed assembly, a COM object, or through the Web Service.
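    To make the forward-chaining idea concrete, here is a minimal Python sketch. The rule format is hypothetical (RuleLab.Net's actual XML schemas and .NET API differ); it only shows how a BRE keeps re-firing rules against an XML document until nothing more changes:

        import xml.etree.ElementTree as ET

        def set_text(doc, tag, value):
            doc.find(tag).text = value

        # Hypothetical ruleset: (condition on the XML data, action that updates it).
        RULES = [
            (lambda d: float(d.findtext("total")) > 1000 and not d.findtext("tier"),
             lambda d: set_text(d, "tier", "gold")),
            (lambda d: d.findtext("tier") == "gold" and not d.findtext("discount"),
             lambda d: set_text(d, "discount", "0.10")),
        ]

        def run_rules(doc):
            # Forward chaining: keep firing until no rule changes the document.
            changed = True
            while changed:
                changed = False
                for condition, action in RULES:
                    if condition(doc):
                        action(doc)
                        changed = True
            return doc

        order = ET.fromstring("<order><total>1200</total><tier/><discount/></order>")
        print(ET.tostring(run_rules(order)).decode())
        # <order><total>1200</total><tier>gold</tier><discount>0.10</discount></order>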
Paul Merrell

IEEE wraps up standard for 22Mbps white space wireless | Electronista - 1 views

  • The IEEE announced today that it had finalized the 802.22™ white space wireless standard for Wireless Regional Area Networks (WRAN). White space wireless, sometimes called "super Wi-Fi" because of its long-range capability and faster throughput, broadcasts data on unused, unlicensed frequencies that were designated for VHF and UHF television broadcast. The new standard will be capable of providing broadband wireless access over a large area, with a range of more than 60 miles from the transmitter. White space wireless can deliver up to 22 Mbps per channel. The white space standard was sought by Microsoft and other companies to make broadband access available in areas where extending wired internet service is impractical. Consumers will also benefit from extended-range wireless hubs. A test network was established at Rice University in Houston in April.
Gary Edwards

Are the feds the first to a common cloud definition? | The Wisdom of Clouds - CNET News - 0 views

  •  
    Cisco's James Urquhart discusses the NIST definition of Cloud Computing. The National Institute of Standards and Technology is a non-regulatory branch of the Commerce Department and is responsible for much of the USA's official participation in World Standards organizations. This is an important discussion, but I'm a bit disappointed by the loose use of the term "network". I guess they mean the Internet? No mention of RESTful computing or Open Web Standards either.

    Some interesting clips: "...(The NIST's) definition of cloud computing will be the de facto standard definition that the entire US government will be given... In creating this definition, NIST consulted extensively with the private sector, including a wide range of vendors, consultants and industry pundits, including yours truly. Below is the draft NIST working definition of Cloud Computing. I should note, this definition is a work in progress and therefore is open to public ratification & comment. The initial feedback was very positive from the federal CIOs who were presented it yesterday in DC. Barring any last-minute lobbying I doubt we'll see many more major revisions."

    The draft definition: "Cloud computing is a pay-per-use model for enabling available, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is comprised of five key characteristics, three delivery models, and four deployment models."
  •  
    Gary, NIST really is not "responsible for much of the USA's official participation in World Standards organizations." Lots of legal analysis omitted, but the bottom line is that NIST would have had to be delegated that responsibility by the President, but never was. However, that did not stop NIST from signing over virtually all responsibility for U.S. participation in international standard development to the private ANSI, without so much as a public notice and comment rulemaking process. See section 3 at http://ts.nist.gov/Standards/Conformity/ansimou.cfm. Absolutely illegal, including at least two bright-line violations of the U.S. Constitution. But the Feds have unmistakably abdicated their legal responsibilities in regard to international standards to the private sector.
Gary Edwards

AppleInsider | Inside Mac OS X Snow Leopard: Exchange Support - 0 views

  •  
    Apple desktop and iPhone support of Microsoft Exchange is not support for Microsoft, as some think. It's actually a strategy to erode Microsoft's desktop monopoly. It's also part of a longer-term plan to thwart Microsoft's hopes of leveraging their desktop monopoly into a Web server monopoly.

    Excerpt: Apple is reducing its dependence upon Microsoft's client software, weakening Microsoft's ability to hold back and dumb down its Mac offerings at Apple's expense. More importantly, Apple is providing its users with additional options that benefit both Mac users and the open source community.

    In the software business, Microsoft has long known the importance of owning the client end. It worked hard to displace Netscape's web browser in the late 90s, not because there was any money to be made in giving away browser clients, but because it knew that whoever controlled the client could set up proprietary demands for a specific web server. That's what Netscape had worked to do as it gave away its web browser in hopes that it could make money selling Netscape web servers; Microsoft first took control of the client with Internet Explorer and then began tying its IE client to its own IIS on the server side with features that gave companies reasons to buy all of their server software from Microsoft.

    As Apple takes over the client end of Exchange, it similarly gains market leverage. First and foremost, the move allows Apple to improve the Exchange experience of Mac users so that business users have no reason not to buy Macs. Secondly, it gives Apple a client audience to market its own server solutions, including MobileMe to individual users and Snow Leopard Server to organizations. In concert with providing Exchange Server support, Apple is also delivering integrated support for its own Exchange alternatives in both MobileMe and with Snow Leopard Server's improved Dovecot email services, Address Book Server, iCal Server, the new Mobile Access secure gateway, and its include
Paul Merrell

British Prime Minister Suggests Banning Some Online Messaging Apps - NYTimes.com - 0 views

  • Popular messaging services like Snapchat and WhatsApp are in the cross hairs in Britain. That was the message delivered on Monday by Prime Minister David Cameron, who said he would pursue banning encrypted messaging services if Britain’s intelligence services were not given access to the communications. The statement comes as many European politicians are demanding that Internet companies like Google and Facebook provide greater information about people’s online activities after several recent terrorist threats, including the attacks in Paris.
  • Mr. Cameron, who has started to campaign ahead of a national election in Britain in May, said his government, if elected, would ban encrypted online communication tools that could potentially be used by terrorists if the country’s intelligence agencies were not given increased access. The reforms are part of new legislation that would force telecom operators and Internet services providers to store more data on people’s online activities, including social network messages. “Are we going to allow a means of communications which it simply isn’t possible to read?” Mr. Cameron said at an event on Monday, in reference to services like WhatsApp, Snapchat and other encrypted online applications. “My answer to that question is: ‘No, we must not.’ ” Mr. Cameron said his first duty was to protect the country against terrorist attacks.
  • “The attacks in Paris demonstrated the scale of the threat that we face and the need to have robust powers through our intelligence and security agencies in order to keep our people safe,” he added. Any restriction on these online services, however, would not take effect until 2016, at the earliest, and it remained unclear how the British government could stop people from using these apps, which are used by hundreds of millions of people worldwide.
Paul Merrell

Internet Archive: Scanning Services - 0 views

  •  
    I've been looking for a permanent online home for a couple of historical works I co-authored. My guiding criterion has been the best chance of the works' long-term survival in a publicly accessible form after my death. I think I may have just found my solution.
Paul Merrell

First Look Publishes Open Source Code To Advance Privacy, Security, and Journalism - Th... - 0 views

  • today we’re excited to contribute back to the open source community by launching First Look Code, the home for our own open source projects related to privacy, security, data, and journalism. To begin with, First Look Code is the new home for document sanitization software PDF Redact Tools, and we’ve launched a brand new anti-gag order project called AutoCanary.
  • AutoCanary: A warrant canary is a regularly published statement that a company hasn’t received any legal orders that it’s not allowed to talk about, such as a national security letter. Canaries can help prevent web publishers from misleading visitors and prevent tech companies from misleading users when they share data with the government and are prevented from talking about it. One such situation arose — without a canary in place — in 2013, when the U.S. government sent Lavabit, a provider of encrypted email services apparently used by Snowden, a legal request to access Snowden’s email, thwarting some of the very privacy protections Lavabit had promised users. This request included a gag order, so the company was legally prohibited from talking about it. Rather than becoming “complicit in crimes against the American people,” in his words, Lavabit founder Ladar Levison chose to shut down the service.
  • Warrant canaries are designed to help companies in this kind of situation. You can see a list of companies that publish warrant canary statements at Canary Watch. As of today, First Look Media is among the companies that publish canaries. We’re happy to announce the first version of AutoCanary, a desktop program for Windows, Mac OS X, and Linux that makes the process of generating machine-readable, digitally-signed warrant canary statements simpler. Read more about AutoCanary on its new website.
  •  
    The internet continues to fight back against the Dark State. On the unsettled nature of the law in regard to use of warrant canaries in the U.S. see EFF's faq: https://www.eff.org/deeplinks/2014/04/warrant-canary-faq (it needs a test case).
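    The signing half of a canary generator is small enough to sketch. This is a conceptual Python sketch only, not AutoCanary's actual code; it assumes GnuPG is installed and that a signing key for the hypothetical address canary@example.com already exists:

        import subprocess
        from datetime import date, timedelta

        today = date.today()
        statement = (
            f"As of {today.isoformat()}, Example Co. has received:\n"
            "  0 national security letters\n"
            "  0 gag orders\n"
            f"This canary will be re-signed by {(today + timedelta(days=90)).isoformat()}.\n"
        )

        # gpg --clearsign wraps the statement in a signature readers can verify.
        signed = subprocess.run(
            ["gpg", "--clearsign", "--local-user", "canary@example.com"],
            input=statement.encode(), capture_output=True, check=True,
        ).stdout
        print(signed.decode())

    The published, signed statement is the whole mechanism: if the canary goes stale or disappears, readers may infer that a gag order arrived, without the company ever saying so.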
Paul Merrell

Activists send the Senate 6 million faxes to oppose cyber bill - CBS News - 0 views

  • Activists worried about online privacy are sending Congress a message with some old-school technology: They're sending faxes -- more than 6.2 million, they claim -- to express opposition to the Cybersecurity Information Sharing Act (CISA). Why faxes? "Congress is stuck in 1984 and doesn't understand modern technology," according to the campaign Fax Big Brother. The week-long campaign was organized by the nonpartisan Electronic Frontier Foundation, the group Access, and Fight for the Future, the activist group behind the major Internet protests that helped derail a pair of anti-piracy bills in 2012. It also has the backing of a dozen groups like the ACLU, the American Library Association, National Association of Criminal Defense Lawyers and others.
  • CISA aims to facilitate information sharing regarding cyberthreats between the government and the private sector. The bill gained more attention following the massive hack in which the records of nearly 22 million people were stolen from government computers. "The ability to easily and quickly share cyber attack information, along with ways to counter attacks, is a key method to stop them from happening in the first place," Sen. Dianne Feinstein, D-California, who helped introduce CISA, said in a statement after the hack. Senate leadership had planned to vote on CISA this week before leaving for its August recess. However, the bill may be sidelined for the time being as the Republican-led Senate gives precedence to a legislative effort to defund Planned Parenthood. Even as the bill was put on the back burner, the grassroots campaign to stop it gained steam. Fight for the Future started sending faxes to all 100 Senate offices on Monday, but the campaign really took off after it garnered attention on the website Reddit and on social media. The faxed messages are generated by Internet users who visit faxbigbrother.com or stopcyberspying.com -- or who simply send a message via Twitter with the hashtag #faxbigbrother. To send all those faxes, Fight for the Future set up a dedicated server and a dozen phone lines and modems they say are capable of sending tens of thousands of faxes a day.
  • Fight for the Future told CBS News that it has so many faxes queued up at this point, that it may take months for Senate offices to receive them all, though the group is working on scaling up its capability to send them faster. They're also limited by the speed at which Senate offices can receive them.
  •  
    From a Fight for the Future mailing: "Here's the deal: yesterday the Senate delayed its expected vote on CISA, the Cybersecurity Information Sharing Act that would let companies share your private information--like emails and medical records--with the government. "The delay is good news; but it's a delay, not a victory. "We just bought some precious extra time to fight CISA, but we need to use it to go big like we did with SOPA or this bill will still pass. Even if we stop it in September, they'll try again after that. "The truth is that right now, things are looking pretty grim. Democrats and Republicans have been holding closed-door meetings to work out a deal to pass CISA quickly when they return from recess. "Right before the expected Senate vote on CISA, the Obama Administration endorsed the bill, which means if Congress passes it, the White House will definitely sign it. "We've stalled and delayed CISA and bills like it nearly half a dozen times, but this month could be our last chance to stop it for good."

    See also:
    http://tumblr.fightforthefuture.org/post/125953876003/senate-fails-to-advance-cisa-before-recess-amid
    http://www.cbsnews.com/news/activists-send-the-senate-6-million-faxes-to-oppose-cyber-bill/
    http://www.npr.org/2015/08/04/429386027/privacy-advocates-to-senate-cyber-security-bill
Paul Merrell

American Surveillance Now Threatens American Business - The Atlantic - 0 views

  • What does it look like when a society loses its sense of privacy? In the almost 18 months since the Snowden files first received coverage, writers and critics have had to guess at the answer. Does a certain trend, consumer complaint, or popular product epitomize some larger shift? Is trust in tech companies eroding—or is a subset just especially vocal about it? Polling would make those answers clear, but polling so far has been… confused. A new study, conducted by the Pew Internet Project last January and released last week, helps make the average American’s view of his or her privacy a little clearer. And their confidence in their own privacy is ... low. The study's findings—and the statistics it reports—stagger. Vast majorities of Americans are uncomfortable with how the government uses their data, how private companies use and distribute their data, and what the government does to regulate those companies. No summary can equal a recounting of the findings. Americans are displeased with government surveillance en masse:
  • A new study finds that a vast majority of Americans trust neither the government nor tech companies with their personal data.
  • According to the study, 70 percent of Americans are “at least somewhat concerned” with the government secretly obtaining information they post to social networking sites. Eighty percent of respondents agreed that “Americans should be concerned” with government surveillance of telephones and the web. They are also uncomfortable with how private corporations use their data: Ninety-one percent of Americans believe that “consumers have lost control over how personal information is collected and used by companies,” according to the study. Eighty percent of Americans who use social networks “say they are concerned about third parties like advertisers or businesses accessing the data they share on these sites.” And even though they’re squeamish about the government’s use of data, they want it to regulate tech companies and data brokers more strictly: 64 percent wanted the government to do more to regulate private data collection. Since June 2013, American politicians and corporate leaders have fretted over how much the leaks would cost U.S. businesses abroad.
  • “It’s clear the global community of Internet users doesn’t like to be caught up in the American surveillance dragnet,” Senator Ron Wyden said last month. At the same event, Google chairman Eric Schmidt agreed with him. “What occurred was a loss of trust between America and other countries,” he said, according to the Los Angeles Times. “It's making it very difficult for American firms to do business.” But never mind the world. Americans don’t trust American social networks. More than half of the poll’s respondents said that social networks were “not at all secure.” Only 40 percent of Americans believe email or texting is at least “somewhat” secure. Indeed, Americans trusted most of all communication technologies where some protections have been enshrined into the law (though the report didn’t ask about snail mail). That is: Talking on the telephone, whether on a landline or cell phone, is the only kind of communication that a majority of adults believe to be “very secure” or “somewhat secure.”
  • (That may seem a bit incongruous, because making a telephone call is one area where you can be almost sure you are being surveilled: The government has requisitioned mass call records from phone companies since 2001. But Americans appear, when discussing security, to differentiate between the contents of the call and data about it.) Last month, Ramsey Homsany, the general counsel of Dropbox, said that one big thing could take down the California tech scene. “We have built this incredible economic engine in this region of the country,” said Homsany in the Los Angeles Times, “and [mistrust] is the one thing that starts to rot it from the inside out.” According to this poll, the mistrust has already begun corroding—and is already, in fact, well advanced. We’ve always assumed that the great hurt to American business will come globally—that citizens of other nations will stop using tech companies’ services. But the new Pew data shows that Americans suspect American businesses just as much. And while, unlike citizens of other nations, they may not have other places to turn, they may stop putting sensitive or delicate information online.
Paul Merrell

Exclusive: Tim Berners-Lee tells us his radical new plan to upend the - 0 views

  • “The intent is world domination,” Berners-Lee says with a wry smile. The British-born scientist is known for his dry sense of humor. But in this case, he is not joking. This week, Berners-Lee will launch Inrupt, a startup that he has been building, in stealth mode, for the past nine months. Backed by Glasswing Ventures, its mission is to turbocharge a broader movement afoot, among developers around the world, to decentralize the web and take back power from the forces that have profited from centralizing it. In other words, it’s game on for Facebook, Google, Amazon. For years now, Berners-Lee and other internet activists have been dreaming of a digital utopia where individuals control their own data and the internet remains free and open. But for Berners-Lee, the time for dreaming is over.
  • In a post published this weekend, Berners-Lee explains that he is taking a sabbatical from MIT to work full time on Inrupt. The company will be the first major commercial venture built off of Solid, a decentralized web platform he and others at MIT have spent years building.
  • If all goes as planned, Inrupt will be to Solid what Netscape once was for many first-time users of the web: an easy way in. And like with Netscape, Berners-Lee hopes Inrupt will be just the first of many companies to emerge from Solid.
  • On his screen, there is a simple-looking web page with tabs across the top: Tim’s to-do list, his calendar, chats, address book. He built this app–one of the first on Solid–for his personal use. It is simple, spare. In fact, it’s so plain that, at first glance, it’s hard to see its significance. But to Berners-Lee, this is where the revolution begins. The app, using Solid’s decentralized technology, allows Berners-Lee to access all of his data seamlessly–his calendar, his music library, videos, chat, research. It’s like a mashup of Google Drive, Microsoft Outlook, Slack, Spotify, and WhatsApp. The difference here is that, on Solid, all the information is under his control. Every bit of data he creates or adds on Solid exists within a Solid pod–which is an acronym for personal online data store. These pods are what give Solid users control over their applications and information on the web. Anyone using the platform will get a Solid identity and Solid pod. This is how people, Berners-Lee says, will take back the power of the web from corporations.
  • For example, one idea Berners-Lee is currently working on is a way to create a decentralized version of Alexa, Amazon’s increasingly ubiquitous digital assistant. He calls it Charlie. Unlike with Alexa, on Charlie people would own all their data. That means they could trust Charlie with, for example, health records, children’s school events, or financial records. That is the kind of machine Berners-Lee hopes will spring up all over Solid to flip the power dynamics of the web from corporation to individuals.
  • Berners-Lee believes Solid will resonate with the global community of developers, hackers, and internet activists who bristle over corporate and government control of the web. “Developers have always had a certain amount of revolutionary spirit,” he observes. Circumventing government spies or corporate overlords may be the initial lure of Solid, but the bigger draw will be something even more appealing to hackers: freedom. In the centralized web, data is kept in silos–controlled by the companies that build them, like Facebook and Google. In the decentralized web, there are no silos. Starting this week, developers around the world will be able to start building their own decentralized apps with tools through the Inrupt site. Berners-Lee will spend this fall crisscrossing the globe, giving tutorials and presentations to developers about Solid and Inrupt.
  • When asked about this, Berners-Lee says flatly: “We are not talking to Facebook and Google about whether or not to introduce a complete change where all their business models are completely upended overnight. We are not asking their permission.” Game on.
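    A rough sketch of what "data in a pod" means in practice: every resource lives at an HTTP URL the user controls, so an app reads and writes it with ordinary web requests. The pod URL below is hypothetical, and a real Solid app would also authenticate with the user's WebID, which this Python sketch omits:

        import requests

        POD_RESOURCE = "https://alice.example.org/todos/today.ttl"

        # Write a to-do item into the pod (Turtle is Solid's usual RDF format).
        todo = '@prefix t: <https://example.org/todo#> . <#1> t:label "Buy milk" .'
        requests.put(POD_RESOURCE, data=todo,
                     headers={"Content-Type": "text/turtle"})

        # Any app the user authorizes can read the same data back: the data
        # belongs to the pod and its owner, not to the app that wrote it.
        print(requests.get(POD_RESOURCE).text)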
Paul Merrell

The Senate has its own insincere net neutrality bill - 0 views

  • Now that the House of Representatives has floated a superficial net neutrality bill, it's the Senate's turn. Louisiana Senator John Kennedy has introduced a companion version of the Open Internet Preservation Act that effectively replicates the House measure put forward by Tennessee Representative Marsha Blackburn. As before, it supports net neutrality only on a basic level -- and there are provisions that would make it difficult to combat other abuses. The legislation would technically forbid internet providers from blocking and throttling content, but it wouldn't bar paid prioritization. Theoretically, ISPs could create de facto "slow lanes" for competing services by offering mediocre speeds unless they pay for faster connections. The bill would also curb the FCC's ability to deal with other violations, and would prevent states from passing their own net neutrality laws. In short, the bill is much more about limiting regulation than protecting open access and competition.

    Kennedy's bill isn't expected to go far in the Senate, just as Blackburn's hasn't done much in the House. However, his proposal comes mere days after senators put forward a Congressional Review Act that would undo the FCC's decision to kill net neutrality. Kennedy had claimed he was considering support for the CRA, but his proposal contradicts that -- why push a heavily watered-down bill if you were willing to revert to the stronger legislation? It's not a completely surprising move and is largely symbolic, but it's disappointing for those who hoped there would be truly bipartisan support for a return to net neutrality.
Paul Merrell

ACLU Demands Secret Court Hand Over Crucial Rulings On Surveillance Law - 0 views

  • The American Civil Liberties Union (ACLU) has filed a motion to reveal the secret court opinions with “novel or significant interpretations” of surveillance law, in a renewed push for government transparency. The motion, filed Wednesday by the ACLU and Yale Law School’s Media Freedom and Information Access Clinic, asks the Foreign Intelligence Surveillance Act (FISA) Court, which rules on intelligence gathering activities in secret, to release 23 classified decisions it made between 9/11 and the passage of the USA Freedom Act in June 2015. As ACLU National Security Project staff attorney Patrick Toomey explains, the opinions are part of a “much larger collection of hidden rulings on all sorts of government surveillance activities that affect the privacy rights of Americans.” Among them is the court order that the government used to direct Yahoo to secretly scan its users’ emails for “a specific set of characters.” Toomey writes: These court rulings are essential for the public to understand how federal laws are being construed and implemented. They also show how constitutional protections for personal privacy and expressive activities are being enforced by the courts. In other words, access to these opinions is necessary for the public to properly oversee their government.
  • Although the USA Freedom Act requires the release of novel FISA court opinions on surveillance law, the government maintains that the rule does not apply retroactively—thereby protecting the panel from publishing many of its post-9/11 opinions, which helped create an “unprecedented buildup” of secret surveillance laws. Even after National Security Agency (NSA) whistleblower Edward Snowden revealed the scope of mass surveillance in 2013, sparking widespread outcry, dozens of rulings on spying operations remain hidden from the public eye, which stymies efforts to keep the government accountable, civil liberties advocates say. “These rulings are necessary to inform the public about the scope of the government’s surveillance powers today,” the ACLU’s motion states.
  • Toomey writes that the rulings helped influence a number of novel spying activities, including:
    - The government’s use of malware, which it calls “Network Investigative Techniques”
    - The government’s efforts to compel technology companies to weaken or circumvent their own encryption protocols
    - The government’s efforts to compel technology companies to disclose their source code so that it can identify vulnerabilities
    - The government’s use of “cybersignatures” to search through internet communications for evidence of computer intrusions
    - The government’s use of stingray cell-phone tracking devices under the Foreign Intelligence Surveillance Act (FISA)
    - The government’s warrantless surveillance of Americans under FISA Section 702—a controversial authority scheduled to expire in December 2017
    - The bulk collection of financial records by the CIA and FBI under Section 215 of the Patriot Act

    Without these rulings being made public, “it simply isn’t possible to understand the government’s claimed authority to conduct surveillance,” Toomey writes. As he told The Intercept on Wednesday, “The people of this country can’t hold the government accountable for its surveillance activities unless they know what our laws allow. These secret court opinions define the limits of the government’s spying powers. Their disclosure is essential for meaningful public oversight in our democracy.”
Paul Merrell

'Manhunting Timeline' Further Suggests US Pressured Countries to Prosecute WikiLeaks Ed... - 0 views

  • An entry in something the government calls a “Manhunting Timeline” suggests that the United States pressured officials of countries around the world to prosecute WikiLeaks editor-in-chief, Julian Assange, in 2010. The file—marked unclassified, revealed by National Security Agency whistleblower Edward Snowden and published by The Intercept—is dated August 2010. Under the headline, “United States, Australia, Great Britain, Germany, Iceland” – it states: The United States on 10 August urged other nations with forces in Afghanistan, including Australia, United Kingdom and Germany, to consider filing criminal charges against Julian Assange, founder of the rogue WikiLeaks Internet website and responsible for the unauthorized publication of over 70,000 classified documents covering the war in Afghanistan. The documents may have been provided to WikiLeaks by Army Private First Class Bradley Manning. The appeal exemplifies the start of an international effort to focus the legal element of national power upon non-state actor Assange and the human network that supports WikiLeaks.

    Another document—a top-secret page from an internal wiki—indicates there has been discussion in the NSA with the Threat Operations Center Oversight and Compliance (NOC) and Office of General Counsel (OGC) on the legality of designating WikiLeaks a “malicious foreign actor” and whether this would make it permissible to conduct surveillance on Americans accessing the website: “Can we treat a foreign server who stores or potentially disseminates leaked or stolen data on its server as a ‘malicious foreign actor’ for the purpose of targeting with no defeats? Examples: WikiLeaks, thepiratebay.org.” The NOC/OGC answered, “Let me get back to you.” (The page does not indicate if anyone ever got back to the NSA. And “defeats” essentially means protections.)
  • GCHQ, the NSA’s counterpart in the UK, had a program called “ANTICRISIS GIRL,” which could engage in “targeted website monitoring.” This means data of hundreds of users accessing a website, like WikiLeaks, could be collected. The IP addresses of readers and supporters could be monitored. The agency could even target the publisher if it had a public dropbox or submission system. NSA and GCHQ could also target the foreign “branches” of the hacktivist group Anonymous. Another question from the wiki entry asks, “Is it okay to query against a foreign server known to be malicious even if there is a possibility that US persons could be using it as well? Example: thepiratebay.org.” The NOC/OGC responded, “Okay to go after foreign servers which US people use also (with no defeats). But try to minimize to ‘post’ only for example to filter out non-pertinent information.” WikiLeaks is not an example in this question; however, if it was designated as a “malicious foreign actor,” then the NSA would be able to run queries against American users.
  • Michael Ratner, a lawyer from the Center for Constitutional Rights (CCR) who represents WikiLeaks, said on “Democracy Now!” that this shows Assange has every reason to fear what would happen if he set foot outside of the embassy. The files show some of the extent to which the US and UK have tried to destroy WikiLeaks. CCR added in a statement, “These NSA documents should make people understand why Julian Assange was granted diplomatic asylum, why he must be given safe passage to Ecuador, and why he must keep himself out of the hands of the United States and apparently other countries as well. These revelations only corroborate the expectation that Julian Assange is on a US target list for prosecution under the archaic “Espionage Act,” for what is nothing more than publishing evidence of government misconduct.” “These documents demonstrate that the political persecution of WikiLeaks is very much alive,” Baltasar Garzón, the Spanish former judge who now represents the group, told The Intercept. “The paradox is that Julian Assange and the WikiLeaks organization are being treated as a threat instead of what they are: a journalist and a media organization that are exercising their fundamental right to receive and impart information in its original form, free from omission and censorship, free from partisan interests, free from economic or political pressure.”
Paul Merrell

Lawmakers warn of 'radical' move by NSA to share information | TheHill - 0 views

  • A bipartisan pair of lawmakers is expressing alarm at reported changes at the National Security Agency that would allow the intelligence service’s information to be used for policing efforts in the United States. “If media accounts are true, this radical policy shift by the NSA would be unconstitutional, and dangerous,” Reps. Ted Lieu (D-Calif.) and Blake Farenthold (R-Texas) wrote in a letter to the spy agency this week. “The proposed shift in the relationship between our intelligence agencies and the American people should not be done in secret. NSA’s mission has never been, and should never be, domestic policing or domestic spying.” The NSA has yet to publicly announce the change, but The New York Times reported last month that the administration was poised to expand the agency's ability to share information that it picks up about people’s communications with other intelligence agencies. The modification would open the door for the NSA to give the FBI and other federal agencies uncensored communications of foreigners and Americans picked up incidentally — but without a warrant — during sweeps.
  • Robert Litt, the general counsel at the Office of the Director of National Intelligence, told the Times that it was finalizing a 21-page draft of procedures to allow the expanded sharing. Separately, the Guardian reported earlier this month that the FBI had quietly changed its internal privacy rules to allow direct access to the NSA’s massive storehouse of communication data picked up on Internet service providers and websites. The revelations unnerved civil liberties advocates, who encouraged lawmakers to demand answers of the spy agency. “Under a policy like this, information collected by the NSA would be available to a host of federal agencies that may use it to investigate and prosecute domestic crimes,” said Neema Singh Guliani, legislative counsel at the American Civil Liberties Union. “Making such a change without authorization from Congress or the opportunity for debate would ignore public demands for greater transparency and oversight over intelligence activities.” In their letter this week, Lieu and Farenthold warned that the NSA’s changes would undermine Congress and unconstitutionally violate people’s privacy rights.
  • “The executive branch would be violating the separation of powers by unilaterally transferring warrantless data collected under the NSA’s extraordinary authority to domestic agencies, which do not have such authority,” they wrote.“Domestic law enforcement agencies — which need a warrant supported by probable cause to search or seize — cannot do an end run around the Fourth Amendment by searching warrantless information collected by the NSA.”
Paul Merrell

WhatsApp Encryption Said to Stymie Wiretap Order - The New York Times - 0 views

  • While the Justice Department wages a public fight with Apple over access to a locked iPhone, government officials are privately debating how to resolve a prolonged standoff with another technology company, WhatsApp, over access to its popular instant messaging application, officials and others involved in the case said. No decision has been made, but a court fight with WhatsApp, the world’s largest mobile messaging service, would open a new front in the Obama administration’s dispute with Silicon Valley over encryption, security and privacy. WhatsApp, which is owned by Facebook, allows customers to send messages and make phone calls over the Internet. In the last year, the company has been adding encryption to those conversations, making it impossible for the Justice Department to read or eavesdrop, even with a judge’s wiretap order.
  • As recently as this past week, officials said, the Justice Department was discussing how to proceed in a continuing criminal investigation in which a federal judge had approved a wiretap, but investigators were stymied by WhatsApp’s encryption. The Justice Department and WhatsApp declined to comment. The government officials and others who discussed the dispute did so on condition of anonymity because the wiretap order and all the information associated with it were under seal. The nature of the case was not clear, except that officials said it was not a terrorism investigation. The location of the investigation was also unclear.
  • To understand the battle lines, consider this imperfect analogy from the predigital world: If the Apple dispute is akin to whether the F.B.I. can unlock your front door and search your house, the issue with WhatsApp is whether it can listen to your phone calls. In the era of encryption, neither question has a clear answer. Some investigators view the WhatsApp issue as even more significant than the one over locked phones because it goes to the heart of the future of wiretapping. They say the Justice Department should ask a judge to force WhatsApp to help the government get information that has been encrypted. Others are reluctant to escalate the dispute, particularly with senators saying they will soon introduce legislation to help the government get data in a format it can read.
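    To see why end-to-end encryption stymies a wiretap even with a court order, consider this conceptual Python sketch using the PyNaCl library (WhatsApp actually uses the Signal protocol, which is far more elaborate, but the core property is the same):

        from nacl.public import PrivateKey, Box

        # Each party generates a key pair; private keys never leave the devices.
        alice_key, bob_key = PrivateKey.generate(), PrivateKey.generate()

        # Alice encrypts to Bob's public key on her own phone.
        wire_bytes = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

        # A wiretap at the carrier or at the messaging servers sees only this:
        print(wire_bytes.hex())

        # Only Bob's private key, held on his phone, can open the message.
        print(Box(bob_key, alice_key.public_key).decrypt(wire_bytes))

    The provider in the middle holds only ciphertext, so there is nothing useful it could hand over in response to the wiretap order.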
Gary Edwards

XML Production Workflows? Start with the Web and XHTML - 1 views

  • Challenges: Some Ugly Truths The challenges of building—and living with—an XML workflow are clear enough. The return on investment is a long-term proposition. Regardless of the benefits XML may provide, the starting reality is that it represents a very different way of doing things than the one we are familiar with. The Word Processing and Desktop Publishing paradigm, based on the promise of onscreen, WYSIWYG layout, is so dominant as to be practically inescapable. It has proven really hard to get from here to there, no matter how attractive XML might be on paper. A considerable amount of organizational effort and labour must be expended up front in order to realize the benefits. This is why XML is often referred to as an “investment”: you sink a bunch of time and money up front, and realize the benefits—greater flexibility, multiple output options, searching and indexing, and general futureproofing—later, over the long haul. It is not a short-term return proposition. And, of course, the returns you are able to realize from your XML investment are commensurate with what you put in up front: fine-grained, semantically rich tagging is going to give you more potential for searchability and recombination than a looser, more general-purpose approach, but it sure costs more. For instance, the Text Encoding Initiative (TEI) is the grand example of pouring enormous amounts of energy into the up-front tagging, with a very open-ended set of possibilities down the line. TEI helpfully defines a level to which most of us do not have to aspire.[5] But understanding this on a theoretical level is only part of the challenge. There are many practical issues that must be addressed. Software and labour are two of the most critical. How do you get the content into XML in the first place? Unfortunately, despite two decades of people doing SGML and XML, this remains an ugly question.
  • Practical Challenges In 2009, there is still no truly likeable—let alone standard—editing and authoring software for XML. For many (myself included), the high-water mark here was Adobe’s FrameMaker, substantially developed by the late 1990s. With no substantial market for it, it is relegated today mostly to the tech writing industry, unavailable for the Mac, and just far enough afield from the kinds of tools we use today that its adoption represents a significant hurdle. And FrameMaker was the best of the breed; most of the other software in decent circulation are programmers’ tools—the sort of things that, as Michael Tamblyn pointed out, encourage editors to drink at their desks. The labour question represents a stumbling block as well. The skill-sets and mind-sets that effective XML editors need have limited overlap with those needed by literary and more traditional production editors. The need to think of documents as machine-readable databases is not something that comes naturally to folks steeped in literary culture. In combination with the sheer time and effort that rich tagging requires, many publishers simply outsource the tagging to India, drawing a division of labour that spans oceans, to put it mildly. Once you have XML content, then what do you do with it? How do you produce books from it? Presumably, you need to be able to produce print output as well as digital formats. But while the latter are new enough to be generally XML-friendly (e-book formats being largely XML based, for instance), there aren’t any straightforward, standard ways of moving XML content into the kind of print production environments we are used to seeing. This isn’t to say that there aren’t ways of getting print—even very high-quality print—output from XML, just that most of them involve replacing your prepress staff with Java programmers.
  • Why does this have to be so hard? It’s not that XML is new, or immature, or untested. Remember that the basics have been around, and in production, since the early 1980s at least. But we have to take account of a substantial and long-running cultural disconnect between traditional editorial and production processes (the ones most of us know intimately) and the ways computing people have approached things. Interestingly, this cultural divide looked rather different in the 1970s, when publishers were looking at how to move to digital typesetting. Back then, printers and software developers could speak the same language. But that was before the ascendancy of the Desktop Publishing paradigm, which computerized the publishing industry while at the same time isolating it culturally. Those of us who learned how to do things the Quark way or the Adobe way had little in common with people who programmed databases or document-management systems. Desktop publishing technology isolated us in a smooth, self-contained universe of toolbars, grid lines, and laser proofs. So, now that the reasons to get with this program, XML, loom large, how can we bridge this long-standing divide?
  • Using the Web as a Production Platform The answer, I think, is right in front of you. The bridge is the Web, a technology and platform that is fundamentally based on XML, and which many publishers are by now comfortably familiar with. Perhaps not entirely comfortably, but at least most publishers are already working with the Web; they already either know or have on staff people who understand it and can work with it. The foundation of our argument is this: rather than looking at jumping to XML in its full, industrial complexity, which seems to be what the O'Reilly-backed StartWithXML initiative[6] is suggesting, publishers instead leverage existing tools and technologies—starting with the Web—as a means of getting XML workflows in place. This means making small investments and working with known tools rather than spending tens of thousands of dollars on XML software and rarefied consultants. It means re-thinking how the existing pieces of the production toolchain fit together; re-thinking the existing roles of software components already in use. It means, fundamentally, taking the Web seriously as a content platform, rather than thinking of it as something you need to get content out to, somehow. If nothing else, the Web represents an opportunity to think about editorial and production from outside the shrink-wrapped Desktop Publishing paradigm.
  • Is the Web made of Real XML? At this point some predictable objections can be heard: wait a moment, the Web isn’t really made out of XML; the HTML that makes up most of the Web is at best the bastard child of SGML, and it is far too flaky/unstructured/underpowered to be taken seriously. We counter by arguing that although HTML on the Web exists in a staggering array of different incarnations, and that the majority of it is indeed an unstructured mess, this does not undermine the general principle that basic, ubiquitous Web technologies can make a solid platform for content management, editorial process, and production workflow.
  • With the advent of a published XML standard in the late 1990s came the W3C’s adoption of XHTML: the realization of the Web’s native content markup as a proper XML document type. Today, its acceptance is almost ubiquitous, even while the majority of actual content out there may not be strictly conforming. The more important point is that most contemporary Web software, from browsers to authoring tools to content management systems (from blogs to enterprise systems), are capable of working with clean, valid XHTML. Or, to put the argument the other way around, clean, valid XHTML content plays absolutely seamlessly with everything else on the Web.[7]
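    The point is easy to demonstrate: a clean, valid XHTML document loads in any stock XML toolchain, with no HTML-specific parser at all. A small Python illustration:

        import xml.etree.ElementTree as ET

        XHTML = """<html xmlns="http://www.w3.org/1999/xhtml">
          <head><title>Chapter One</title></head>
          <body><p>Call me Ishmael.</p></body>
        </html>"""

        doc = ET.fromstring(XHTML)  # a generic XML parser, nothing Web-specific
        ns = {"x": "http://www.w3.org/1999/xhtml"}
        print(doc.findtext("x:head/x:title", namespaces=ns))  # Chapter One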
  • The objection which follows, then, will be that even if we grant that XHTML is a real XML document type, it is underpowered for “serious” content because it is almost entirely presentation (formatting) oriented; it lacks any semantic depth. In XHTML, a paragraph is a paragraph is a paragraph, as opposed to a section or an epigraph or a summary.
  • In contrast, more “serious” XML document types like DocBook[8] or DITA-derived schemas[9] are capable of making semantic distinctions about content chunks at a fine level of granularity and with a high degree of specificity.
  • So there is an argument for recalling the 80:20 rule here. If XHTML can provide 80% of the value with just 20% of the investment, then what exactly is the business case for spending the other 80% to achieve that last 20% of value? We suspect the ratio is actually quite a bit steeper than 80:20 for most publishers.
  • Furthermore, just to get technical for a moment, XHTML is extensible in a fairly straightforward way, through the common “class” attribute on each element. Web developers have long leveraged this kind of extensibility in the elaboration of “microformats” for semantic-web applications.[10] There is no reason why publishers shouldn’t think to use XHTML’s simple extensibility in a similar way for their own ends.
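    As a sketch of that extensibility (the class vocabulary here is invented for illustration), a publisher can tag prose with its own semantics and recover exactly those chunks later with a few lines of XML processing:

        import xml.etree.ElementTree as ET

        BODY = """<body xmlns="http://www.w3.org/1999/xhtml">
          <p class="epigraph">Call me Ishmael.</p>
          <p>Some years ago, never mind how long precisely...</p>
          <p class="summary">A sailor joins a doomed whaling voyage.</p>
        </body>"""

        XHTML_NS = "{http://www.w3.org/1999/xhtml}"
        body = ET.fromstring(BODY)
        # Pull out only the chunks the publisher marked as epigraphs.
        for p in body.iter(XHTML_NS + "p"):
            if p.get("class") == "epigraph":
                print("epigraph:", p.text)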
  • XHTML, on the other hand, is supported by a vast array of quotidian software, starting with the ubiquitous Web browser. For this very reason, XHTML is in fact employed as a component part of several more specialized document types (ONIX and ePub among them).
  • Why re-invent a general-purpose prose representation when XHTML already does the job?
  • It is worth pausing for a moment to consider the role of XHTML in the ePub standard for ebook content. An ePub file is, anatomically, a simply disguised zip archive. Inside the zip archive are a few standard component parts: there are specialized files that declare metadata about the book, and about the format of the book. And then there is the book’s content, represented in XHTML. An ePub book is a Web page in a wrapper.
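    That anatomy is simple enough to verify with the Python standard library alone ("book.epub" is a placeholder path):

        import zipfile

        with zipfile.ZipFile("book.epub") as epub:
            # mimetype, META-INF/container.xml, the OPF metadata, and the content.
            print(epub.namelist())
            # The content documents are ordinary XHTML files in a wrapper.
            for name in epub.namelist():
                if name.endswith((".xhtml", ".html")):
                    print(epub.read(name)[:200])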
  • To sum up the general argument: the Web as it already exists presents incredible value to publishers, as a platform for doing XML content management with existing (and often free) tools, and without having to go blindly into the unknown. At this point, we can offer a few design guidelines: prefer existing and/or ubiquitous tools over specialized ones wherever possible; prefer free software over proprietary systems where possible; prefer simple tools controlled and coordinated by human beings over fully automated (and therefore complex) systems; play to our strengths: use Web software for storing and managing content, use layout software for layout, and keep editors and production people in charge of their own domains.
  • Putting the Pieces Together: A Prototype
  • At the SFU Master of Publishing Program, we have been chipping away at this general line of thinking for a few years. Over that time, Web content management systems have been getting more and more sophisticated, all the while getting more streamlined and easier to use. (NB: if you have a blog, you have a Web content management system.) The Web is beginning to be recognized as a writing and editing environment used by millions of people. And the ways in which content is represented, stored, and exchanged online have become increasingly robust and standardized.
  • The missing piece of the puzzle has been print production: how can we move content from its malleable, fluid form on line into the kind of high-quality print production environments we’ve come to expect after two decades of Desktop Publishing?
  • Anyone who has tried to print Web content knows that the existing methods leave much to be desired (hyphenation and justification, for starters). In the absence of decent tools for this, most publishers quite naturally think of producing the print content first, and then think about how to get material onto the Web for various purposes. So we tend to export from Word, or from Adobe, as something of an afterthought.
  • While this sort of works, it isn’t elegant, and it completely ignores the considerable advantages of Web-based content management.
  • Content managed online is stored in one central location, accessible simultaneously to everyone in your firm, available anywhere you have an Internet connection, and usually exists in a much more fluid format than Word files. If only we could manage the editorial flow online, and then go to print formats at the end, instead of the other way around. At SFU, we made several attempts to make this work by way of the supposed “XML import” capabilities of various Desktop Publishing tools, without much success.[12]
  • In the winter of 2009, Adobe solved this part of the problem for us with the introduction of its Creative Suite 4. What CS4 offers is the option of a complete XML representation of an InDesign document: what Adobe calls IDML (InDesign Markup Language).
  • The IDML file format is—like ePub—a simply disguised zip archive that, when unpacked, reveals a cluster of XML files that represent all the different facets of an InDesign document: layout spreads, master pages, defined styles, colours, and of course, the content.
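    Unpacked, an IDML archive looks roughly like this (the u-numbered file names are representative; InDesign generates its own identifiers):

        mimetype                    "application/vnd.adobe.indesign-idml-package"
        designmap.xml               the map tying all the parts together
        MasterSpreads/MasterSpread_uXX.xml
        Spreads/Spread_uXX.xml      layout geometry, page by page
        Stories/Story_uXX.xml       the actual text content, one file per story
        Resources/Styles.xml        paragraph and character style definitions
        Resources/Graphic.xml       colours and swatches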
  • IDML is a well thought-out XML standard that achieves two very different goals simultaneously: it preserves all of the information that InDesign needs to do what it does; and it is broken up in a way that makes it possible for mere mortals (or at least our Master of Publishing students) to work with it.
  • What this represented to us in concrete terms was the ability to take Web-based content and move it into InDesign in a straightforward way, thus bridging Web and print production environments using existing tools and skillsets, with a little added help from free software.
  • We would take clean XHTML content, transform it to IDML-marked content, and merge that with nicely designed templates in InDesign.
  • The result is an almost push-button publication workflow, which results in a nice, familiar InDesign document that fits straight into the way publishers actually do production.
  • Tracing the steps: To begin with, we worked backwards, moving the book content back to clean XHTML.
  • The simplest method for this conversion—and if you want to create Web content, this is an excellent route—was to use Adobe’s “Export to Digital Editions” option, which creates an ePub file.
  • Recall that ePub is just XHTML in a wrapper, so within the ePub file was a relatively clean XHTML document. It was somewhat cleaner (that is, the XHTML tagging was simpler and less cluttered) than InDesign’s other Web-oriented exports, possibly because Digital Editions is a well understood target, compared with somebody’s website.
  • In order to achieve our target of clean XHTML, we needed to do some editing; the XHTML produced by InDesign’s “Digital Editions” export was presentation-oriented. For instance, bulleted list items were tagged as paragraphs, with a class attribute identifying them as list items. Using the search-and-replace function, we converted such structures to proper XHTML list and list-item elements. Our guiding principle was to make the XHTML as straightforward as possible, not dependent on any particular software to interpret it.
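    A concrete example of that cleanup (the exported class name is approximate; InDesign's actual output varies with the document's style names):

        <!-- As exported: presentation-oriented -->
        <p class="bullet-item">First point</p>
        <p class="bullet-item">Second point</p>

        <!-- After cleanup: structural XHTML -->
        <ul>
          <li>First point</li>
          <li>Second point</li>
        </ul>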
  • We broke the book’s content into individual chapter files; each chapter could then carry its own basic metadata, and the pages conveniently fit our Web content management system (which is actually just a wiki). We assembled a dynamically generated table of contents for the 12 chapters, and created a cover page. Essentially, the book was entirely Web-based at this point.
  • When the book chapters are viewed online, they are formatted via a CSS2 stylesheet that defines a main column for content as well as dedicating screen real estate for navigational elements. We then created a second template to render the content for exporting; this was essentially a bare-bones version of the book with no navigation and minimal styling. Pages (or even the entire book) can be exported (via the “Save As...” function in a Web browser) for use in either print production or ebook conversion. At this point, we required no skills beyond those of any decent Web designer.
  • Integrating with CS4 for Print: Adobe’s IDML language defines elements specific to InDesign; there is nothing in the language that looks remotely like XHTML. So a mechanical transformation step is needed to convert the XHTML content into something InDesign can use. This is not as hard as it might seem.
  • Both XHTML and IDML are composed of straightforward, well-documented structures, and so transformation from one to the other is, as they say, “trivial.” We chose to use XSLT (Extensible Stylesheet Language Transforms) to do the work. XSLT is part of the W3C’s XML family of standards, and thus is very well supported in a wide variety of tools. Our prototype used a command-line XSLT processor called xsltproc, a nearly ubiquitous piece of software that we found already installed as part of Mac OS X (contemporary Linux distributions also have it as a standard tool), though any XSLT processor would work.
  • In other words, we don’t need to buy InCopy, because we just replaced it with the Web. Our wiki is now plugged directly into our InDesign layout. It even automatically updates the InDesign document when the content changes. Credit is due at this point to Adobe: this integration is possible because of the open file format in the Creative Suite 4.
  • We wrote an XSLT transformation script[18] that converted the XHTML content from the Web into an InCopy ICML file. The script itself is less than 500 lines long, and was written and debugged over a period of about a week by amateurs (again, the people named at the start of this article). The script runs in a couple of seconds, and the resulting .icml file can then be “placed” directly into an InDesign template. The ICML file references an InDesign stylesheet, so the template file can be set up with a house-styled layout, master pages, and stylesheet definitions for paragraphs and character ranges.
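    The script itself is not reproduced here, but the heart of any such transform is a handful of templates mapping XHTML structures onto ICML ones. A minimal sketch, with placeholder style names that would have to match those defined in the InDesign template (a real script must also emit the ICML document wrapper and handle inline markup, which this one flattens):

        <xsl:stylesheet version="1.0"
            xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
            xmlns:h="http://www.w3.org/1999/xhtml">
          <xsl:output method="xml" indent="yes"/>

          <xsl:template match="/">
            <xsl:apply-templates select="//h:p"/>
          </xsl:template>

          <!-- Each XHTML paragraph becomes an InCopy paragraph range. -->
          <xsl:template match="h:p">
            <ParagraphStyleRange AppliedParagraphStyle="ParagraphStyle/Body">
              <CharacterStyleRange AppliedCharacterStyle="CharacterStyle/Default">
                <Content><xsl:value-of select="normalize-space(.)"/></Content>
                <Br/>
              </CharacterStyleRange>
            </ParagraphStyleRange>
          </xsl:template>
        </xsl:stylesheet>

    Run with something like: xsltproc xhtml2icml.xsl chapter01.xhtml > chapter01.icml (the file names are placeholders).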
  • Rather than a public-facing website, our system relies on the Web as a content management platform—of course a public face could easily be added.
  • It should be noted that the Book Publishing 1 proof-of-concept was artificially complex; we began with a book laid out in InDesign and ended up with a look-alike book laid out in InDesign. But next time—for instance, when we publish Book Publishing 2—we can begin the process with the content on the Web, and keep it there throughout the editorial process. The book’s content could potentially be written and edited entirely online, as Web content, and then automatically poured into an InDesign template at proof time. “Just in time,” as they say. This represents an entirely new way of thinking of book production. With a Web-first orientation, it makes little sense to think of the book as “in print” or “out of print”—the book is simply available, in the first place online; in the second place in derivative digital formats; and third, but really not much more difficult, in print-ready format, via the usual InDesign CS print production system publishers are already familiar with.
  • Creating Ebook Files: Creating electronic versions from XHTML source is vastly simpler than trying to generate these out of the existing print process. The ePub version is extremely easy to generate; so is online marketing copy or excerpts for the Web, since the content begins life Web-native.
  • Since an ePub file is essentially XHTML content in a special wrapper, all that is required is that we properly “wrap” our XHTML content. Ideally, the content in an ePub file is broken into chapters (as ours was) and a table of contents file is generated in order to allow easy navigation within an ebook reader. We used Julian Smart’s free tool eCub[19] to simply and automatically generate the ePub wrapper and the table of contents. The only custom development we did was to create a CSS stylesheet for the ebook so that headings and paragraph indents looked the way we wanted. Starting with XHTML content, creating ePub is almost too easy.
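    The stylesheet was nothing exotic; a few rules along these lines (the values are illustrative, not the exact ones used) are enough for headings and paragraph indents:

        h1, h2 { font-family: serif; page-break-after: avoid; }
        p      { margin: 0; text-indent: 1.5em; }
        h1 + p, h2 + p { text-indent: 0; }  /* no indent on a section's first paragraph */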
  • Today, we are able to put the process together using nothing but standard, relatively ubiquitous Web tools: the Web itself as an editing and content management environment, standard Web scripting tools for the conversion process, and the well-documented IDML file format to integrate the layout tool.
  • Our project demonstrates that Web technologies are indeed good enough to use in an XML-oriented workflow; more specialized and expensive options are not necessarily required. For massive-scale enterprise publishing, this approach may not offer enough flexibility, and the challenge of adding and extracting extra semantic richness may prove more trouble than it's worth.
  • But for smaller firms who are looking at the straightforward benefits of XML-based processes—single source publishing, online content and workflow management, open and accessible archive formats, greater online discoverability—here is a way forward.
  • The result is very simple and easy to use. Our demonstration requires that a production editor run the XSLT transformation script manually, but there is no reason why this couldn’t be built directly into the Web content management system so that exporting the content to print ran the transformation automatically. The resulting file would then be “placed” in InDesign and proofed.
  • The final piece of our puzzle, the ability to integrate print production, was made possible by Adobe's release of InDesign with an open XML file format. Since the Web's XHTML is also XML, it can be easily and confidently transformed to the InDesign format.
  • Such a workflow—beginning with the Web and exporting to print—is surely more in line with the way we will do business in the 21st century, where the Web is the default platform for reaching audiences, developing content, and putting the pieces together. It is time, we suggest, for publishers to re-orient their operations and start with the Web.
  • Using the Web as a Production Platform
  •  
    I was looking for an answer to a problem Marbux had presented, and found this interesting article. The issue was the upcoming conversion of the Note Case Pro (NCP) layout engine to the WebKit layout engine, and what to do about the NCP document format. My initial reaction was to encode the legacy NCP document format in XML, and run an XSLT to a universal pivot format like TEI-XML. From there, the TEI-XML community would provide all the XSLT transformation routines for conversion to ODF, OOXML, XHTML, ePub and HTML/CSS. Researching the problems one might encounter with this approach, I found this article. Fascinating stuff. My takeaway is that TEI-XML would not be as effective a "universal pivot point" as XHTML. Or perhaps, if NCP really wants to get aggressive: IDML, the InDesign Markup Language. As an afterthought, I was thinking that an alternative title to this article might have been "Working with the Web as the Center of Everything."
Paul Merrell

Dr Dobbs - HTML5 Web Storage - 0 views

  • HTML5 Web Storage is an API that makes it easy to persist data across web requests. Before the Web Storage API, remote web servers had to store any data that persisted by sending it back and forth from client to server. With the advent of the Web Storage API, however, developers can now store data directly in a browser for repeated access across requests, or to be retrieved long after you completely close the browser, thus greatly reducing network traffic. One more reason to use Web Storage is that it is one of the few HTML5 APIs that is already supported in all browsers, including Internet Explorer 8.
  • In many cases, the same results can be achieved without involving a network or remote server. This is where the HTML5 Web Storage API comes in. By using this simple API, developers can store values in easily retrievable JavaScript objects, which persist across page loads. By using either sessionStorage or localStorage, developers can choose to let values survive either across page loads in a single window or tab, or across browser restarts, respectively. Stored data is not transmitted across the network, and is easily accessed on return visits to a page. Furthermore, larger values -- as high as a few megabytes -- can be persisted using the HTML5 Web Storage API. This makes Web Storage suitable for document and file data that would quickly blow out the size limit of a cookie.
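    A minimal sketch of the API in JavaScript (the key names are invented for the example):

        // localStorage survives browser restarts; sessionStorage lasts
        // only for the lifetime of the window or tab.
        localStorage.setItem("draftTitle", "Chapter One");
        var title = localStorage.getItem("draftTitle");   // "Chapter One"

        sessionStorage.setItem("scrollPosition", "1200");

        // Values are stored as strings, so structured data must be
        // serialized explicitly.
        localStorage.setItem("prefs", JSON.stringify({ theme: "dark" }));
        var prefs = JSON.parse(localStorage.getItem("prefs"));

        // Remove one key, or clear the whole store.
        localStorage.removeItem("draftTitle");
        localStorage.clear();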
Paul Merrell

"Windows Management Framework" is here for Windows XP, Vista, Server 2003, 2008 - Remot... - 0 views

  • “Windows Management Framework, which includes Windows PowerShell 2.0, WinRM 2.0, and BITS 4.0, was officially released to the world this morning. By providing a consistent management interface across the various flavors of Windows, we are making our platform that much more attractive to deploy. IT Professionals can now easily manage their Windows XP, Windows Server 2003, Windows Vista, Windows Server 2008, Windows 7, and Windows Server 2008 R2 machines through PowerShell remoting.”
  • WinRM is the Microsoft implementation of WS-Management Protocol, a standard Simple Object Access Protocol (SOAP)–based, firewall-friendly protocol that allows for hardware and operating systems from different vendors to interoperate. The WS-Management Protocol specification provides a common way for systems to access and exchange management information across an IT infrastructure.
  • ...1 more annotation...
  • BITS is a service that transfers files between a client and a server. BITS provides a simple way to reliably and politely transfer files over HTTP or HTTPS. File downloads and file uploads are supported. By default, BITS transfers files in the background, unlike other protocols that transfer files in the foreground. Background transfers use only idle network bandwidth in order to preserve the user’s interactive experience with other network applications, such as Internet Explorer. Foreground or typical transfers are also supported.
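    As a sketch of what the remoting described above looks like in practice (the computer names are placeholders, and remoting must first be enabled on each target, e.g. with Enable-PSRemoting):

        # Run a command on several machines at once over WinRM
        Invoke-Command -ComputerName XPBOX, WIN2003SRV -ScriptBlock {
            Get-Service | Where-Object { $_.Status -eq "Running" }
        }

        # Or open an interactive session on a single remote machine
        Enter-PSSession -ComputerName WIN2008R2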