
Future of the Web / Group items tagged: article


Paul Merrell

Eric Holder: The Justice Department could strike deal with Edward Snowden - 0 views

  • Eric Holder: The Justice Department could strike a deal with Edward Snowden. By Michael Isikoff, Chief Investigative Correspondent, July 6, 2015. Former Attorney General Eric Holder said today that a “possibility exists” for the Justice Department to cut a deal with former NSA contractor Edward Snowden that would allow him to return to the United States from Moscow. In an interview with Yahoo News, Holder said “we are in a different place as a result of the Snowden disclosures” and that “his actions spurred a necessary debate” that prompted President Obama and Congress to change policies on the bulk collection of phone records of American citizens. Asked if that meant the Justice Department might now be open to a plea bargain that allows Snowden to return from his self-imposed exile in Moscow, Holder replied: “I certainly think there could be a basis for a resolution that everybody could ultimately be satisfied with. I think the possibility exists.”
  • But his remarks to Yahoo News go further than those of any current or former Obama administration official in suggesting that Snowden’s disclosures had a positive impact and that the administration might be open to a negotiated plea that the self-described whistleblower could accept, according to his lawyer, Ben Wizner.
  • It’s also not clear whether Holder’s comments signal a shift in Obama administration attitudes that could result in a resolution of the charges against Snowden. Melanie Newman, chief spokeswoman for Attorney General Loretta Lynch, Holder’s successor, immediately shot down the idea that the Justice Department was softening its stance on Snowden. “This is an ongoing case so I am not going to get into specific details but I can say our position regarding bringing Edward Snowden back to the United States to face charges has not changed,” she said in an email.
  • ...1 more annotation...
  • Three sources familiar with informal discussions of Snowden’s case told Yahoo News that one top U.S. intelligence official, Robert Litt, the chief counsel to Director of National Intelligence James Clapper, recently privately floated the idea that the government might be open to a plea bargain in which Snowden returns to the United States, pleads guilty to one felony count and receives a prison sentence of three to five years in exchange for full cooperation with the government.
Paul Merrell

NSA Director Finally Admits Encryption Is Needed to Protect Public's Privacy - 0 views

  • NSA Director Finally Admits Encryption Is Needed to Protect Public’s Privacy. The new stance denotes a growing awareness within the government that Americans are not comfortable with the State’s grip on their data. By Carey Wedler | AntiMedia | January 22, 2016
  • Rogers cited the recent Office of Personnel Management hack of over 20 million users as a reason to increase encryption rather than scale it back. “What you saw at OPM, you’re going to see a whole lot more of,” he said, referring to the massive hack that compromised the personal data of about 20 million people who obtained background checks. Rogers’ comments, while forward-thinking, signify an about-face in his stance on encryption. In February 2015, he said he “shares [FBI] Director [James] Comey’s concern” about cell phone companies’ decision to add encryption features to their products. Comey has been one of the loudest critics of encryption. However, Rogers’ comments on Thursday now directly conflict with Comey’s stated position. The FBI director has publicly chastised encryption, as well as the companies that provide it. In 2014, he claimed Apple’s then-new encryption feature could lead the world to “a very dark place.” At a Department of Justice hearing in November, Comey testified that “Increasingly, the shadow that is ‘going dark’ is falling across more and more of our work.” Though he claimed, “We support encryption,” he insisted “we have a problem that encryption is crashing into public safety and we have to figure out, as people who care about both, to resolve it. So, I think the conversation’s in a healthier place.”
  • At the same hearing, Comey and Attorney General Loretta Lynch declined to comment on whether they had proof the Paris attackers used encryption. Even so, Comey recently lobbied for tech companies to do away with end-to-end encryption. However, his crusade has fallen on unsympathetic ears, both from the private companies he seeks to control — and from the NSA. Prior to Rogers’ statements in support of encryption Thursday, former NSA chief Michael Hayden said, “I disagree with Jim Comey. I actually think end-to-end encryption is good for America.” Still another former NSA chief has criticized calls for backdoor access to information. In October, Mike McConnell told a panel at an encryption summit that the United States is “better served by stronger encryption, rather than baking in weaker encryption.” Former Department of Homeland Security chief Michael Chertoff has also spoken out against the government being able to bypass encryption.
  • ...2 more annotations...
  • Regardless of these individual defenses of encryption, the Intercept explained why these statements may be irrelevant: “Left unsaid is the fact that the FBI and NSA have the ability to circumvent encryption and get to the content too — by hacking. Hacking allows law enforcement to plant malicious code on someone’s computer in order to gain access to the photos, messages, and text before they were ever encrypted in the first place, and after they’ve been decrypted. The NSA has an entire team of advanced hackers, possibly as many as 600, camped out at Fort Meade.”
  • Rogers’ statements, of course, are not a full-fledged endorsement of privacy, nor can the NSA be expected to make it a priority. Even so, his new stance denotes a growing awareness within the government that Americans are not comfortable with the State’s grip on their data. “So spending time arguing about ‘hey, encryption is bad and we ought to do away with it’ … that’s a waste of time to me,” Rogers said Thursday. “So what we’ve got to ask ourselves is, with that foundation, what’s the best way for us to deal with it? And how do we meet those very legitimate concerns from multiple perspectives?”
Paul Merrell

American Surveillance Now Threatens American Business - The Atlantic - 0 views

  • What does it look like when a society loses its sense of privacy? In the almost 18 months since the Snowden files first received coverage, writers and critics have had to guess at the answer. Does a certain trend, consumer complaint, or popular product epitomize some larger shift? Is trust in tech companies eroding—or is a subset just especially vocal about it? Polling would make those answers clear, but polling so far has been… confused. A new study, conducted by the Pew Internet Project last January and released last week, helps make the average American’s view of his or her privacy a little clearer. And their confidence in their own privacy is ... low. The study’s findings—and the statistics it reports—stagger. Vast majorities of Americans are uncomfortable with how the government uses their data, how private companies use and distribute their data, and what the government does to regulate those companies. No summary can equal a recounting of the findings. Americans are displeased with government surveillance en masse:
  • A new study finds that a vast majority of Americans trust neither the government nor tech companies with their personal data.
  • According to the study, 70 percent of Americans are “at least somewhat concerned” with the government secretly obtaining information they post to social networking sites. Eighty percent of respondents agreed that “Americans should be concerned” with government surveillance of telephones and the web. They are also uncomfortable with how private corporations use their data: Ninety-one percent of Americans believe that “consumers have lost control over how personal information is collected and used by companies,” according to the study. Eighty percent of Americans who use social networks “say they are concerned about third parties like advertisers or businesses accessing the data they share on these sites.” And even though they’re squeamish about the government’s use of data, they want it to regulate tech companies and data brokers more strictly: 64 percent wanted the government to do more to regulate private data collection. Since June 2013, American politicians and corporate leaders have fretted over how much the leaks would cost U.S. businesses abroad.
  • ...3 more annotations...
  • “It’s clear the global community of Internet users doesn’t like to be caught up in the American surveillance dragnet,” Senator Ron Wyden said last month. At the same event, Google chairman Eric Schmidt agreed with him. “What occurred was a loss of trust between America and other countries,” he said, according to the Los Angeles Times. “It’s making it very difficult for American firms to do business.” But never mind the world. Americans don’t trust American social networks. More than half of the poll’s respondents said that social networks were “not at all secure.” Only 40 percent of Americans believe email or texting is at least “somewhat” secure. Indeed, Americans most trusted those communication technologies where some protections have been enshrined in law (though the report didn’t ask about snail mail). That is: Talking on the telephone, whether on a landline or cell phone, is the only kind of communication that a majority of adults believe to be “very secure” or “somewhat secure.”
  • (That may seem a bit incongruous, because making a telephone call is one area where you can be almost sure you are being surveilled: The government has requisitioned mass call records from phone companies since 2001. But Americans appear, when discussing security, to differentiate between the contents of the call and data about it.) Last month, Ramsey Homsany, the general counsel of Dropbox, said that one big thing could take down the California tech scene. “We have built this incredible economic engine in this region of the country,” said Homsany in the Los Angeles Times, “and [mistrust] is the one thing that starts to rot it from the inside out.” According to this poll, the mistrust has already begun corroding—and is, in fact, well advanced. We’ve always assumed that the great hurt to American business will come globally—that citizens of other nations will stop using tech companies’ services. But the new Pew data shows that Americans suspect American businesses just as much. And while, unlike citizens of other nations, they may not have other places to turn, they may stop putting sensitive or delicate information online.
Paul Merrell

Sick Of Facebook? Read This. - 2 views

  • In 2012, The Guardian reported on Facebook’s arbitrary and ridiculous nudity and violence guidelines which allow images of crushed limbs but – dear god spare us the image of a woman breastfeeding. Still, people stayed – and Facebook grew. In 2014, Facebook admitted to mind control games via positive or negative emotional content tests on unknowing and unwilling platform users. Still, people stayed – and Facebook grew. Following the 2016 election, Facebook responded to the harpy shrieks from the corporate Democrats by setting up a so-called “fake news” task force to weed out those dastardly commies (or socialists or anarchists or leftists or libertarians or dissidents or…). And since then, I’ve watched my reach on Facebook drain like water in a bathtub – hard to notice at first and then a spastic swirl while people bicker about how to plug the drain. And still, we stayed – and the censorship tightened. Roughly a year ago, my show Act Out! reported on both the censorship we were experiencing and the cramped filter bubbling that Facebook employs in order to keep the undesirables out of everyone’s news feed. Still, I stayed – and the censorship tightened. 2017 into 2018 saw more and more activist organizers, particularly black and brown, thrown into Facebook jail for questioning systemic violence and demanding better. In August, puss bag ass hat in a human suit Alex Jones was banned from Facebook – YouTube, Apple and Twitter followed suit shortly thereafter. Some folks celebrated. Some others of us skipped the party because we could feel what was coming.
  • On Thursday, October 11th of this year, Facebook purged more than 800 pages including The Anti-Media, Police the Police, Free Thought Project and many other social justice and alternative media pages. Their explanation rested on the painfully flimsy foundation of “inauthentic behavior.” Meanwhile, their fake-news checking team is stacked with the likes of the Atlantic Council and the Weekly Standard, neocon junk organizations that peddle such drivel as “The Character Assassination of Brett Kavanaugh.” Soon after, on the Monday before the midterm elections, Facebook blocked another 115 accounts, citing, once again, “inauthentic behavior.” Then, in mid-November, a massive New York Times piece chronicled Facebook’s long road to not only save its image amid rising authoritarian behavior, but “to discredit activist protesters, in part by linking them to the liberal financier George Soros.” (I consistently find myself waiting for those Soros and Putin checks in the mail that just never appear.)
  • What we need is an open source, non-surveillance platform. And right now, that platform is Minds. Before you ask, I’m not being paid to write that.
  • ...2 more annotations...
  • Fashioned as an alternative to the closed and creepy Facebook behemoth, Minds advertises itself as “an open source and decentralized social network for Internet freedom.” Minds prides itself on being hands-off with regards to any content that falls in line with what’s permitted by law, which has elicited critiques from some on the left who say Minds is a safe haven for fascists and right-wing extremists. Yet, Ottman has himself stated openly that he wants ideas on content moderation and ways to make Minds a better place for social network users as well as radical content creators. What a few fellow journos and I are calling #MindsShift is an important step in not only moving away from our gagged existence on Facebook but in building a social network that can serve up the real news folks are now aching for.
  • To be clear, we aren’t advocating that you delete your Facebook account – unless you want to. For many, Facebook is still an important tool and our goal is to add to the outreach toolkit, not suppress it. We have set January 1st, 2019 as the ultimate date for this #MindsShift. Several outlets with a combined reach of millions of users will be making the move – and asking their readerships/viewerships to move with them. Along with fellow journalists, I am working with Minds to brainstorm new user-friendly functions and ways to make this #MindsShift a loud and powerful move. We ask that you, the reader, add to the conversation by joining the #MindsShift and spreading the word to your friends and family. (Join Minds via this link) We have created the #MindsShift open group on Minds.com so that you can join and offer up suggestions and ideas to make this platform a new home for radical and progressive media.
Paul Merrell

NSA Based Malware Used In Massive Cyber-Attack Hitting 74 Countries - 0 views

  • Apparent National Security Agency (NSA) malware has been used in a global cyber-attack, including on British hospitals, in what whistleblower Edward Snowden described as the repercussion of the NSA’s reckless decision to build the tools. “Despite warnings, @NSAGov built dangerous attack tools that could target Western software. Today we see the cost,” Snowden tweeted Friday. At least two hospitals in London were forced to shut down and stop admitting patients after being attacked by the malware, which operates by locking out the user, encrypting data, and demanding a ransom to release it. The attacks hit dozens of other hospitals, ambulance operators, and doctors’ offices as well.
  • The Blackpool Gazette in the northwest reported that medical staff had resorted to using pen and paper when phone and computer systems shut down. Elsewhere, journalist Ollie Cowan tweeted a photo of ambulances “backed up” at Southport Hospital as the staff attempted to cope with the crisis.
  • Other disruptions were reported in at least 74 countries, including Russia, Spain, Turkey, and Japan, and the number is “growing fast,” according to Kaspersky Lab chief Costin Raiu. Security architect Kevin Beau said it was spreading into the U.S. as well. The malware, which Microsoft tested briefly earlier this year, was leaked by a group calling itself the Shadow Brokers, which has been releasing NSA hacking tools online since last year, the New York Times reports. Times journalists Dan Bilefsky and Nicole Perlroth wrote: Microsoft rolled out a patch for the vulnerability in March, but hackers apparently took advantage of the fact that vulnerable targets—particularly hospitals—had yet to update their systems. The malware was circulated by email. Targets were sent an encrypted, compressed file that, once loaded, allowed the ransomware to infiltrate its targets. Reuters reported that the National Health Service (NHS), England’s public health system, was warned about possible hacking earlier in the day, but that by then it was already too late.
  • ...2 more annotations...
  • A Twitter account with the handle @HackerFantastic, the co-founder of the cyber security company Hacker House, tweeted that the firm had “warned the NHS with Sky news about vulnerabilities they had last year, this was inevitable and bound to happen at some stage.” “In light of today’s attack, Congress needs to be asking @NSAgov if it knows of any other vulnerabilities in software used in our hospitals,” Snowden tweeted. “If @NSAGov had privately disclosed the flaw used to attack hospitals when they *found* it, not when they lost it, this may not have happened.” Disclosing the vulnerability when it was found would have given hospitals years, not months, to update their systems and prepare for an attack, he added.
  • Twitter user @MalwareTechBlog added, “Something like this is incredibly significant, we’ve not seen P2P spreading on PC via exploits at this scale in nearly a decade.” Patrick Toomey, a staff attorney with the American Civil Liberties Union’s (ACLU) National Security Project, said, “It would be shocking if the NSA knew about this vulnerability but failed to disclose it to Microsoft until after it was stolen.” “These attacks underscore the fact that vulnerabilities will be exploited not just by our security agencies, but by hackers and criminals around the world,” Toomey said. “It is past time for Congress to enhance cybersecurity by passing a law that requires the government to disclose vulnerabilities to companies in a timely manner. Patching security holes immediately, not stockpiling them, is the best way to make everyone’s digital life safer.”
Paul Merrell

Ohio's attorney general wants Google to be declared a public utility. - The New York Times - 2 views

  • Ohio’s attorney general, Dave Yost, filed a lawsuit on Tuesday in pursuit of a novel effort to have Google declared a public utility and subject to government regulation. The lawsuit, which was filed in a Delaware County, Ohio court, seeks to use a law that’s over a century old to regulate Google by applying a legal designation historically used for railroads, electricity and the telephone to the search engine. “When you own the railroad or the electric company or the cellphone tower, you have to treat everyone the same and give everybody access,” Mr. Yost, a Republican, said in a statement. He added that Ohio was the first state to bring such a lawsuit against Google. If Google were declared a so-called common carrier like a utility company, it would prevent the company from prioritizing its own products, services and websites in search results. Google said it had none of the attributes of a common carrier, which usually provides a standardized service for a fee using public assets, such as rights of way. The “lawsuit would make Google Search results worse and make it harder for small businesses to connect directly with customers,” José Castañeda, a Google spokesman, said in a statement. “Ohioans simply don’t want the government to run Google like a gas or electric company. This lawsuit has no basis in fact or law and we’ll defend ourselves against it in court.” Though the Ohio lawsuit is a stretch, there is a long history of government control of certain kinds of companies, said Andrew Schwartzman, a senior fellow at the nonprofit Benton Institute for Broadband & Society. “Think of ‘The Canterbury Tales.’ Travelers needed a place to stay and eat on long road treks, and innkeepers were not allowed to deny them accommodations or rip them off,” he said.
  • After a series of federal lawsuits filed against Google last year, Ohio’s lawsuit is part of a next wave of state actions aimed at regulating and curtailing the power of Big Tech. Also on Tuesday, Colorado’s legislature passed a data privacy law that would allow consumers to opt out of data collection. On Monday, New York’s Senate passed antitrust legislation that would make it easier for plaintiffs to sue dominant platforms for abuse of power. After years of inaction in Congress with tech legislation, states are beginning to fill the regulatory vacuum. Ohio was also one of 38 states that filed an antitrust lawsuit in December accusing Google of being a monopoly and using its dominant position in internet search to squeeze out smaller rivals.
Paul Merrell

Meta reaches $37.5 mln settlement of Facebook location tracking lawsuit | Reuters - 1 views

  • Meta Platforms Inc (META.O) reached a $37.5 million settlement of a lawsuit accusing the parent of Facebook of violating users’ privacy by tracking their movements through their smartphones without permission. A preliminary settlement of the proposed class action was filed on Monday in San Francisco federal court, and requires a judge’s approval. It resolved claims that Facebook violated California law and its own privacy policy by gathering data from users who turned off Location Services on their mobile devices. The users said that while they did not want to share their locations with Facebook, the company nevertheless inferred where they were from their IP (internet protocol) addresses, and used that information to send them targeted advertising. Monday’s settlement covers people in the United States who used Facebook after Jan. 30, 2015. Meta denied wrongdoing in agreeing to settle. It did not immediately respond on Tuesday to requests for comment. In June 2018, Facebook and Chief Executive Mark Zuckerberg told the U.S. Congress that the Menlo Park, California-based company uses location data "to help advertisers reach people in particular areas." As an example, it said users who dined at particular restaurants might receive posts from friends who also ate there, or ads from businesses that wanted to provide services nearby. The lawsuit began in November 2018. Lawyers for the plaintiffs may seek up to 30% of Monday’s settlement for legal fees, settlement papers show. The case is Lundy et al v Facebook Inc, U.S. District Court, Northern District of California, No. 18-06793.
Paul Merrell

Glassholes: A Mini NSA on Your Face, Recorded by the Spy Agency | Global Research - 2 views

  • eOnline reports: A new app will allow total strangers to ID you and pull up all your information, just by looking at you and scanning your face with their Google Glass. The app is called NameTag and it sounds CREEPY. The “real-time facial recognition” software “can detect a face using the Google Glass camera, send it wirelessly to a server, compare it to millions of records, and in seconds return a match complete with a name, additional photos and social media profiles.” The information listed could include your name, occupation, any social media profiles you have set up and whether or not you have a criminal record (“CRIMINAL HISTORY FOUND” pops up in bright red letters according to the demo).
  • Since the NSA is tapping into all of our digital communications, it is not unreasonable to assume that all of the info from your digital glasses – yup, everything – may be recorded by the spy agency. Are we going to have millions of mini NSAs walking around recording everything … glassholes? It doesn’t help inspire confidence that America’s largest police force and Taser are beta-testing Google Glasses. Postscript: I love gadgets and tech, and previously discussed the exciting possibilities of Google Glasses. But the NSA is ruining the fun, just like it’s harming U.S. Internet business.
  •  
    Thankfully, there's budding technology to block computer facial-recognition algorithms. http://tinyurl.com/mzfyfra On the other hand, used Hallowe'en masks can usually be purchased inexpensively from some nearby school kids at this time of year. Now if I could just put together a few near-infrared LEDs to fry a license plate-scanner's view ...
Paul Merrell

The New Snowden? NSA Contractor Arrested Over Alleged Theft Of Classified Data - 0 views

  • A contractor working for the National Security Agency (NSA) was arrested by the FBI following his alleged theft of “state secrets.” More specifically, the contractor, Harold Thomas Martin, is charged with stealing highly classified source codes developed to covertly hack the networks of foreign governments, according to several senior law enforcement and intelligence officials. The Justice Department has said that these stolen materials were “critical to national security.” Martin was employed by Booz Allen Hamilton, the company responsible for most of the NSA’s most sensitive cyber-operations. Edward Snowden, the most well-known NSA whistleblower, also worked for Booz Allen Hamilton until he fled to Hong Kong in 2013 where he revealed a trove of documents exposing the massive scope of the NSA dragnet surveillance. That surveillance system was shown to have targeted untold numbers of innocent Americans. According to the New York Times, the theft “raises the embarrassing prospect” that an NSA insider managed to steal highly damaging secret information from the NSA for the second time in three years, not to mention the “Shadow Broker” hack this past August, which made classified NSA hacking tools available to the public.
  • Snowden himself took to Twitter to comment on the arrest. In a tweet, he said the news of Martin’s arrest “is huge” and asked, “Did the FBI secretly arrest the person behind the reports [that the] NSA sat on huge flaws in US products?” It is currently unknown if Martin was connected to those reports as well.
  • It also remains to be seen what Martin’s motivations were in removing classified data from the NSA. Though many suspect that he planned to follow in Snowden’s footsteps, the government will more likely argue that he had planned to commit espionage by selling state secrets to “adversaries.” According to the New York Times article on the arrest, Russia, China, Iran, and North Korea are named as examples of the “adversaries” who would have been targeted by the NSA codes that Martin is accused of stealing. However, Snowden revealed widespread US spying on foreign governments including several US allies such as France and Germany. This suggests that the stolen “source codes” were likely utilized on a much broader scale.
Gonzalo San Gil, PhD.

Linux Security Guide (extended version) - Linux Audit - 0 views

  •  
    "With so many articles about Linux security on the internet, you may feel overwhelmed by how to properly secure your Linux systems. With this guide, we walk through different steps, tools, and resources. The main goal is to have you make an educated choice on what security defenses to implement on Linux. For this reason, this article won't show any specific configuration values, as it would implicate a possible best value. Instead, related articles and resources will be available in the text. The goal is to make this guide into a go-to article for when you need to secure your Linux installation. If you like this article, help others and share it on your favorite social media channels. Got feedback? Use the comments at the bottom. This document in work in progress and last updated in September 2016"
Paul Merrell

Commentary: Don't be so sure Russia hacked the Clinton emails | Reuters - 0 views

  • By James Bamford. Last summer, cyber investigators plowing through the thousands of leaked emails from the Democratic National Committee uncovered a clue. A user named “Феликс Эдмундович” modified one of the documents using settings in the Russian language. Translated, his name was Felix Edmundovich, a pseudonym referring to Felix Edmundovich Dzerzhinsky, the chief of the Soviet Union’s first secret-police organization, the Cheka. It was one more link in the chain of evidence pointing to Russian President Vladimir Putin as the man ultimately behind the operation. During the Cold War, when Soviet intelligence was headquartered in Dzerzhinsky Square in Moscow, Putin was a KGB officer assigned to the First Chief Directorate. Its responsibilities included “active measures,” a form of political warfare that included media manipulation, propaganda and disinformation. Soviet active measures, retired KGB Major General Oleg Kalugin told Army historian Thomas Boghart, aimed to discredit the United States and “conquer world public opinion.” As the Cold War has turned into the code war, Putin recently unveiled his new, greatly enlarged spy organization: the Ministry of State Security, taking the name from Joseph Stalin’s secret service. Putin also resurrected, according to James Clapper, the U.S. director of national intelligence, some of the KGB’s old active-measures tactics. On October 7, Clapper issued a statement: “The U.S. Intelligence community is confident that the Russian government directed the recent compromises of emails from U.S. persons and institutions, including from U.S. political organizations.” Notably, however, the FBI declined to join the chorus, according to reports by the New York Times and CNBC. A week later, Vice President Joe Biden said on NBC’s Meet the Press that "we're sending a message" to Putin and "it will be at the time of our choosing, and under the circumstances that will have the greatest impact." When asked if the American public would know a message was sent, Biden replied, "Hope not." Meanwhile, the CIA was asked, according to an NBC report on October 14, “to deliver options to the White House for a wide-ranging ‘clandestine’ cyber operation designed to harass and ‘embarrass’ the Kremlin leadership.” But as both sides begin arming their cyberweapons, it is critical for the public to be confident that the evidence is really there, and to understand the potential consequences of a tit-for-tat cyberwar escalating into a real war.
  • This is a prospect that has long worried Richard Clarke, the former White House cyber czar under President George W. Bush. “It’s highly likely that any war that began as a cyberwar,” Clarke told me last year, “would ultimately end up being a conventional war, where the United States was engaged with bombers and missiles.” The problem with attempting to draw a straight line from the Kremlin to the Clinton campaign is the number of variables that get in the way. For one, there is little doubt about Russian cyber fingerprints in various U.S. campaign activities. Moscow, like Washington, has long spied on such matters. The United States, for example, inserted malware in the recent Mexican election campaign. The question isn’t whether Russia spied on the U.S. presidential election, it’s whether it released the election emails. Then there’s the role of Guccifer 2.0, the person or persons supplying WikiLeaks and other organizations with many of the pilfered emails. Is this a Russian agent? A free agent? A cybercriminal? A combination, or some other entity? No one knows. There is also the problem of groupthink that led to the war in Iraq. For example, just as the National Security Agency, the Central Intelligence Agency and the rest of the intelligence establishment are convinced Putin is behind the attacks, they also believed it was a slam-dunk that Saddam Hussein had a trove of weapons of mass destruction. Consider as well the speed of the political-hacking investigation, followed by a lack of skepticism, culminating in a rush to judgment. After the Democratic committee discovered the potential hack last spring, it called in the cybersecurity firm CrowdStrike in May to analyze the problem.
  • CrowdStrike took just a month or so before it conclusively determined that Russia’s FSB, the successor to the KGB, and the Russian military intelligence organization, GRU, were behind it. Most of the other major cybersecurity firms quickly fell in line and agreed. By October, the intelligence community made it unanimous. That speed and certainty contrast sharply with a previous suspected Russian hack in 2010, when the target was the Nasdaq stock market. According to an extensive investigation by Bloomberg Businessweek in 2014, the NSA and FBI made numerous mistakes over many months that stretched to nearly a year. “After months of work,” the article said, “there were still basic disagreements in different parts of government over who was behind the incident and why.” There was no consensus, with just a 70 percent certainty that the hack was a cybercrime. Months later, this determination was revised again: It was just a Russian attempt to spy on the exchange in order to design its own. The federal agents also considered the possibility that the Nasdaq snooping was not connected to the Kremlin. Instead, “someone in the FSB could have been running a for-profit operation on the side, or perhaps sold the malware to a criminal hacking group.” Again, that’s why it’s necessary to better understand the role of Guccifer 2.0 in releasing the Democratic National Committee and Clinton campaign emails before launching any cyberweapons.
  • ...2 more annotations...
  • It is strange that clues in the Nasdaq hack were very difficult to find ― as one would expect from a professional, state-sponsored cyber operation. Conversely, the sloppy, Inspector Clouseau-like nature of the Guccifer 2.0 operation, with someone hiding behind a silly Bolshevik cover name, and Russian language clues in the metadata, smacked more of either an amateur operation or a deliberate deception. Then there’s the Shadow Brokers, that mysterious person or group that surfaced in August with its farcical “auction” to profit from a stolen batch of extremely secret NSA hacking tools, in essence, cyberweapons. Where do they fit into the picture? They have a small armory of NSA cyberweapons, and they appeared just three weeks after the first DNC emails were leaked. On Monday, the Shadow Brokers released more information, including what they claimed is a list of hundreds of organizations that the NSA has targeted over more than a decade, complete with technical details. This offers further evidence that their information comes from a leaker inside the NSA rather than the Kremlin. The Shadow Brokers also discussed Obama’s threat of cyber retaliation against Russia. Yet they seemed most concerned that the CIA, rather than the NSA or Cyber Command, was given the assignment. This may be a possible indication of a connection to NSA’s elite group, Tailored Access Operations, considered by many the A-Team of hackers. “Why is DirtyGrandpa threating CIA cyberwar with Russia?” they wrote. “Why not threating with NSA or Cyber Command? CIA is cyber B-Team, yes? Where is cyber A-Team?” Because of legal and other factors, the NSA conducts cyber espionage, Cyber Command conducts cyberattacks in wartime, and the CIA conducts covert cyberattacks.
  • The Shadow Brokers connection is important because Julian Assange, the founder of WikiLeaks, claimed to have received identical copies of the Shadow Brokers cyberweapons even before they announced their “auction.” Did he get them from the Shadow Brokers, from Guccifer, from Russia or from an inside leaker at the NSA? Despite the rushed, incomplete investigation and unanswered questions, the Obama administration has announced its decision to retaliate against Russia. But a public warning about a secret attack makes little sense. If a major cyber crisis happens in Russia sometime in the future, such as a deadly power outage in frigid winter, the United States could be blamed even if it had nothing to do with it. That could then trigger a major retaliatory cyberattack against the U.S. cyber infrastructure, which would call for another reprisal attack ― potentially leading to Clarke’s fear of a cyberwar triggering a conventional war. President Barack Obama has also not taken a nuclear strike off the table as an appropriate response to a devastating cyberattack.
  •  
    Article by James Bamford, the first NSA whistleblower and author of three books on the NSA.
Gary Edwards

XML Production Workflows? Start with the Web and XHTML - 0 views

  • Challenges: Some Ugly Truths The challenges of building—and living with—an XML workflow are clear enough. The return on investment is a long-term proposition. Regardless of the benefits XML may provide, the starting reality is that it represents a very different way of doing things than the one we are familiar with. The Word Processing and Desktop Publishing paradigm, based on the promise of onscreen, WYSIWYG layout, is so dominant as to be practically inescapable. It has proven really hard to get from here to there, no matter how attractive XML might be on paper. A considerable amount of organizational effort and labour must be expended up front in order to realize the benefits. This is why XML is often referred to as an “investment”: you sink a bunch of time and money up front, and realize the benefits—greater flexibility, multiple output options, searching and indexing, and general futureproofing—later, over the long haul. It is not a short-term return proposition. And, of course, the returns you are able to realize from your XML investment are commensurate with what you put in up front: fine-grained, semantically rich tagging is going to give you more potential for searchability and recombination than a looser, more general-purpose approach, but it sure costs more. For instance, the Text Encoding Initiative (TEI) is the grand example of pouring enormous amounts of energy into the up-front tagging, with a very open-ended set of possibilities down the line. TEI helpfully defines a level to which most of us do not have to aspire.[5] But understanding this on a theoretical level is only part of the challenge. There are many practical issues that must be addressed. Software and labour are two of the most critical. How do you get the content into XML in the first place? Unfortunately, despite two decades of people doing SGML and XML, this remains an ugly question.
  • Practical Challenges In 2009, there is still no truly likeable—let alone standard—editing and authoring software for XML. For many (myself included), the high-water mark here was Adobe’s FrameMaker, substantially developed by the late 1990s. With no substantial market for it, it is relegated today mostly to the tech writing industry, unavailable for the Mac, and just far enough afield from the kinds of tools we use today that its adoption represents a significant hurdle. And FrameMaker was the best of the breed; most of the other software in decent circulation are programmers’ tools—the sort of things that, as Michael Tamblyn pointed out, encourage editors to drink at their desks. The labour question represents a stumbling block as well. The skill-sets and mind-sets that effective XML editors need have limited overlap with those needed by literary and more traditional production editors. The need to think of documents as machine-readable databases is not something that comes naturally to folks steeped in literary culture. In combination with the sheer time and effort that rich tagging requires, many publishers simply outsource the tagging to India, drawing a division of labour that spans oceans, to put it mildly. Once you have XML content, then what do you do with it? How do you produce books from it? Presumably, you need to be able to produce print output as well as digital formats. But while the latter are new enough to be generally XML-friendly (e-book formats being largely XML based, for instance), there aren’t any straightforward, standard ways of moving XML content into the kind of print production environments we are used to seeing. This isn’t to say that there aren’t ways of getting print—even very high-quality print—output from XML, just that most of them involve replacing your prepress staff with Java programmers.
  • Why does this have to be so hard? It’s not that XML is new, or immature, or untested. Remember that the basics have been around, and in production, since the early 1980s at least. But we have to take account of a substantial and long-running cultural disconnect between traditional editorial and production processes (the ones most of us know intimately) and the ways computing people have approached things. Interestingly, this cultural divide looked rather different in the 1970s, when publishers were looking at how to move to digital typesetting. Back then, printers and software developers could speak the same language. But that was before the ascendancy of the Desktop Publishing paradigm, which computerized the publishing industry while at the same time isolating it culturally. Those of us who learned how to do things the Quark way or the Adobe way had little in common with people who programmed databases or document-management systems. Desktop publishing technology isolated us in a smooth, self-contained universe of toolbars, grid lines, and laser proofs. So, now that the reasons to get with this program, XML, loom large, how can we bridge this long-standing divide?
  • ...44 more annotations...
  • Using the Web as a Production Platform The answer, I think, is right in front of you. The bridge is the Web, a technology and platform that is fundamentally based on XML, and which many publishers are by now comfortably familiar with. Perhaps not entirely comfortably, but at least most publishers are already working with the Web; they already either know or have on staff people who understand it and can work with it. The foundation of our argument is this: rather than looking at jumping to XML in its full, industrial complexity, which seems to be what the O'Reilly-backed StartWithXML initiative[6] is suggesting, publishers instead leverage existing tools and technologies—starting with the Web—as a means of getting XML workflows in place. This means making small investments and working with known tools rather than spending tens of thousands of dollars on XML software and rarefied consultants. It means re-thinking how the existing pieces of the production toolchain fit together; re-thinking the existing roles of software components already in use. It means, fundamentally, taking the Web seriously as a content platform, rather than thinking of it as something you need to get content out to, somehow. If nothing else, the Web represents an opportunity to think about editorial and production from outside the shrink-wrapped Desktop Publishing paradigm.
  • Is the Web made of Real XML? At this point some predictable objections can be heard: wait a moment, the Web isn’t really made out of XML; the HTML that makes up most of the Web is at best the bastard child of SGML, and it is far too flaky/unstructured/underpowered to be taken seriously. We counter by arguing that although HTML on the Web exists in a staggering array of different incarnations, and that the majority of it is indeed an unstructured mess, this does not undermine the general principle that basic, ubiquitous Web technologies can make a solid platform for content management, editorial process, and production workflow.
  • With the advent of a published XML standard in the late 1990s came the W3C’s adoption of XHTML: the realization of the Web’s native content markup as a proper XML document type. Today, its acceptance is almost ubiquitous, even while the majority of actual content out there may not be strictly conforming. The more important point is that most contemporary Web software, from browsers to authoring tools to content management systems (from blogs to enterprise systems), are capable of working with clean, valid XHTML. Or, to put the argument the other way around, clean, valid XHTML content plays absolutely seamlessly with everything else on the Web.[7]
  • The objection which follows, then, will be that even if we grant that XHTML is a real XML document type, that it is underpowered for “serious” content because it is almost entirely presentation (formatting) oriented; it lacks any semantic depth. In XHTML, a paragraph is a paragraph is a paragraph, as opposed to a section or an epigraph or a summary.
  • In contrast, more “serious” XML document types like DocBook[8] or DITA-derived schemas[9] are capable of making semantic distinctions about content chunks at a fine level of granularity and with a high degree of specificity.
  • So there is an argument for recalling the 80:20 rule here. If XHTML can provide 80% of the value with just 20% of the investment, then what exactly is the business case for spending the other 80% to achieve that last 20% of value? We suspect the ratio is actually quite a bit steeper than 80:20 for most publishers.
  • Furthermore, just to get technical for a moment, XHTML is extensible in a fairly straightforward way, through the common “class” attribute on each element. Web developers have long leveraged this kind of extensibility in the elaboration of “microformats” for semantic-web applications.[10] There is no reason why publishers shouldn’t think to use XHTML’s simple extensibility in a similar way for their own ends.
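    As a purely illustrative sketch of that kind of class-based extension (the class names here are hypothetical, not taken from any published schema), a publisher might mark semantic roles on ordinary XHTML elements and let stylesheets or transforms key off them:

      <div class="chapter">
        <p class="epigraph">Everything should be made as simple as possible, but not simpler.</p>
        <p>An ordinary body paragraph needs no class at all.</p>
        <p class="summary">A closing summary, distinguished only by its class value.</p>
      </div>

    A CSS rule or an XSLT template can then select p.epigraph or p.summary and treat them differently, recovering much of the distinction that DocBook would express with dedicated elements.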
  • XHTML, on the other hand, is supported by a vast array of quotidian software, starting with the ubiquitous Web browser. For this very reason, XHTML is in fact employed as a component part of several more specialized document types (ONIX and ePub among them).
  • Why re-invent a general-purpose prose representation when XHTML already does the job?
  • It is worth pausing for a moment to consider the role of XHTML in the ePub standard for ebook content. An ePub file is, anatomically, a simply disguised zip archive. Inside the zip archive are a few standard component parts: there are specialized files that declare metadata about the book, and about the format of the book. And then there is the book’s content, represented in XHTML. An ePub book is a Web page in a wrapper.
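    For orientation, the one piece of fixed plumbing inside that zip archive is the container file, which points the reading system at the package metadata; a typical META-INF/container.xml (the OEBPS path is conventional, not required) looks like this:

      <?xml version="1.0" encoding="UTF-8"?>
      <container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
        <rootfiles>
          <rootfile full-path="OEBPS/content.opf" media-type="application/oebps-package+xml"/>
        </rootfiles>
      </container>

    Everything else (the package file, the navigation file, and the XHTML chapters themselves) lives wherever the package file says it does.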
  • To sum up the general argument: the Web as it already exists presents incredible value to publishers, as a platform for doing XML content management with existing (and often free) tools, and without having to go blindly into the unknown. At this point, we can offer a few design guidelines: prefer existing and/or ubiquitous tools over specialized ones wherever possible; prefer free software over proprietary systems where possible; prefer simple tools controlled and coordinated by human beings over fully automated (and therefore complex) systems; play to our strengths: use Web software for storing and managing content, use layout software for layout, and keep editors and production people in charge of their own domains.
  • Putting the Pieces Together: A Prototype
  • At the SFU Master of Publishing Program, we have been chipping away at this general line of thinking for a few years. Over that time, Web content management systems have been getting more and more sophisticated, all the while getting more streamlined and easier to use. (NB: if you have a blog, you have a Web content management system.) The Web is beginning to be recognized as a writing and editing environment used by millions of people. And the ways in which content is represented, stored, and exchanged online have become increasingly robust and standardized.
  • The missing piece of the puzzle has been print production: how can we move content from its malleable, fluid form on line into the kind of high-quality print production environments we’ve come to expect after two decades of Desktop Publishing?
  • Anyone who has tried to print Web content knows that the existing methods leave much to be desired (hyphenation and justification, for starters). In the absence of decent tools for this, most publishers quite naturally think of producing the print content first, and then think about how to get material onto the Web for various purposes. So we tend to export from Word, or from Adobe, as something of an afterthought.
  • While this sort of works, it isn’t elegant, and it completely ignores the considerable advantages of Web-based content management.
  • Content managed online is stored in one central location, accessible simultaneously to everyone in your firm, available anywhere you have an Internet connection, and usually exists in a much more fluid format than Word files. If only we could manage the editorial flow online, and then go to print formats at the end, instead of the other way around. At SFU, we made several attempts to make this work by way of the supposed “XML import” capabilities of various Desktop Publishing tools, without much success.[12]
  • In the winter of 2009, Adobe solved this part of the problem for us with the introduction of its Creative Suite 4. What CS4 offers is the option of a complete XML representation of an InDesign document: what Adobe calls IDML (InDesign Markup Language).
  • The IDML file format is—like ePub—a simply disguised zip archive that, when unpacked, reveals a cluster of XML files that represent all the different facets of an InDesign document: layout spreads, master pages, defined styles, colours, and of course, the content.
  • IDML is a well thought-out XML standard that achieves two very different goals simultaneously: it preserves all of the information that InDesign needs to do what it does; and it is broken up in a way that makes it possible for mere mortals (or at least our Master of Publishing students) to work with it.
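    To give a sense of that breakdown, unpacking an .idml file typically reveals a layout along these lines (the exact set of files varies with the document; this listing is indicative only):

      mimetype                          (application/vnd.adobe.indesign-idml-package)
      designmap.xml                     (the root map that ties the other parts together)
      Resources/Styles.xml              (paragraph and character style definitions)
      MasterSpreads/MasterSpread_*.xml  (master page layouts)
      Spreads/Spread_*.xml              (page geometry and frames)
      Stories/Story_*.xml               (the actual text content, one story per file)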
  • What this represented to us in concrete terms was the ability to take Web-based content and move it into InDesign in a straightforward way, thus bridging Web and print production environments using existing tools and skillsets, with a little added help from free software.
  • We would take clean XHTML content, transform it to IDML-marked content, and merge that with nicely designed templates in InDesign.
  • The result is an almost push-button publication workflow, which results in a nice, familiar InDesign document that fits straight into the way publishers actually do production.
  • Tracing the steps To begin with, we worked backwards, moving the book content back to clean XHTML.
  • The simplest method for this conversion—and if you want to create Web content, this is an excellent route—was to use Adobe’s “Export to Digital Editions” option, which creates an ePub file.
  • Recall that ePub is just XHTML in a wrapper, so within the ePub file was a relatively clean XHTML document. It was somewhat cleaner (that is, the XHTML tagging was simpler and less cluttered) than InDesign’s other Web-oriented exports, possibly because Digital Editions is a well understood target, compared with somebody’s website.
  • In order to achieve our target of clean XHTML, we needed to do some editing; the XHTML produced by InDesign’s “Digital Editions” export was presentation-oriented. For instance, bulleted list items were tagged as paragraphs, with a class attribute identifying them as list items. Using the search-and-replace function, we converted such structures to proper XHTML list and list-item elements. Our guiding principle was to make the XHTML as straightforward as possible, not dependent on any particular software to interpret it.
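    A hypothetical before-and-after pair shows the sort of cleanup involved; the class name is simply whatever the export happens to emit:

      <!-- As exported: list items disguised as styled paragraphs -->
      <p class="bullet">First point</p>
      <p class="bullet">Second point</p>

      <!-- After search-and-replace cleanup: structural XHTML lists -->
      <ul>
        <li>First point</li>
        <li>Second point</li>
      </ul>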
  • We broke the book’s content into individual chapter files; each chapter could then carry its own basic metadata, and the pages conveniently fit our Web content management system (which is actually just a wiki). We assembled a dynamically generated table of contents for the 12 chapters, and created a cover page. Essentially, the book was entirely Web-based at this point.
  • When the book chapters are viewed online, they are formatted via a CSS2 stylesheet that defines a main column for content as well as dedicating screen real estate for navigational elements. We then created a second template to render the content for exporting; this was essentially a bare-bones version of the book with no navigation and minimal styling. Pages (or even the entire book) can be exported (via the “Save As...” function in a Web browser) for use in either print production or ebook conversion. At this point, we required no skills beyond those of any decent Web designer.
  • Integrating with CS4 for Print Adobe’s IDML language defines elements specific to InDesign; there is nothing in the language that looks remotely like XHTML. So a mechanical transformation step is needed to convert the XHTML content into something InDesign can use. This is not as hard as it might seem.
  • Both XHTML and IDML are composed of straightforward, well-documented structures, and so transformation from one to the other is, as they say, “trivial.” We chose to use XSLT (Extensible Stylesheet Language Transforms) to do the work. XSLT is part of the overall XML specification, and thus is very well supported in a wide variety of tools. Our prototype used a scripting engine called xsltproc, a nearly ubiquitous piece of software that we found already installed as part of Mac OS X (contemporary Linux distributions also have this as a standard tool), though any XSLT processor would work.
  • In other words, we don’t need to buy InCopy, because we just replaced it with the Web. Our wiki is now plugged directly into our InDesign layout. It even automatically updates the InDesign document when the content changes. Credit is due at this point to Adobe: this integration is possible because of the open file format in the Creative Suite 4.
  • We wrote an XSLT transformation script[18] that converted the XHTML content from the Web into an InCopy ICML file. The script itself is less than 500 lines long, and was written and debugged over a period of about a week by amateurs (again, the people named at the start of this article). The script runs in a couple of seconds, and the resulting .icml file can then be “placed” directly into an InDesign template. The ICML file references an InDesign stylesheet, so the template file can be set up with a house-styled layout, master pages, and stylesheet definitions for paragraphs and character ranges.
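    The shape of that transformation, greatly abridged, looks roughly like the fragment below. The template and the style names are illustrative assumptions; the real script handles many more structures (headings, lists, emphasis, notes), but each of its templates follows the same pattern of matching an XHTML element and emitting the corresponding ICML markup. A chapter would then be converted with a command along the lines of "xsltproc xhtml-to-icml.xsl chapter01.xhtml > chapter01.icml" (file names invented for the example).

      <xsl:stylesheet version="1.0"
          xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
          xmlns:xhtml="http://www.w3.org/1999/xhtml">
        <xsl:output method="xml" indent="yes"/>

        <!-- Map an ordinary XHTML paragraph onto an ICML paragraph range,
             tagged with a paragraph style defined in the InDesign template -->
        <xsl:template match="xhtml:p">
          <ParagraphStyleRange AppliedParagraphStyle="ParagraphStyle/Body Text">
            <CharacterStyleRange AppliedCharacterStyle="CharacterStyle/$ID/[No character style]">
              <Content><xsl:value-of select="."/></Content>
            </CharacterStyleRange>
            <Br/>
          </ParagraphStyleRange>
        </xsl:template>
      </xsl:stylesheet>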
  • The result is very simple and easy to use. Our demonstration requires that a production editor run the XSLT transformation script manually, but there is no reason why this couldn’t be built directly into the Web content management system so that exporting the content to print ran the transformation automatically. The resulting file would then be “placed” in InDesign and proofed.
  • It should be noted that the Book Publishing 1 proof-of-concept was artificially complex; we began with a book laid out in InDesign and ended up with a look-alike book laid out in InDesign. But next time—for instance, when we publish Book Publishing 2—we can begin the process with the content on the Web, and keep it there throughout the editorial process. The book’s content could potentially be written and edited entirely online, as Web content, and then automatically poured into an InDesign template at proof time. “Just in time,” as they say. This represents an entirely new way of thinking of book production. With a Web-first orientation, it makes little sense to think of the book as “in print” or “out of print”—the book is simply available, in the first place online; in the second place in derivative digital formats; and third, but really not much more difficult, in print-ready format, via the usual InDesign CS print production system publishers are already familiar with.
  • Creating Ebook Files: Creating electronic versions from XHTML source is vastly simpler than trying to generate these out of the existing print process. The ePub version is extremely easy to generate; so is online marketing copy or excerpts for the Web, since the content begins life Web-native.
  • Since an ePub file is essentially XHTML content in a special wrapper, all that is required is that we properly “wrap” our XHTML content. Ideally, the content in an ePub file is broken into chapters (as ours was) and a table of contents file is generated in order to allow easy navigation within an ebook reader. We used Julian Smart’s free tool eCub[19] to simply and automatically generate the ePub wrapper and the table of contents. The only custom development we did was to create a CSS stylesheet for the ebook so that headings and paragraph indents looked the way we wanted. Starting with XHTML content, creating ePub is almost too easy.
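    For orientation, the “wrapper” is a zip archive holding the XHTML chapters plus a few small XML files that a tool like eCub generates automatically; a minimal package looks roughly like this (titles, identifiers, and file names are placeholders for illustration):

      <!-- META-INF/container.xml: points the reading system at the package file -->
      <container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
        <rootfiles>
          <rootfile full-path="OEBPS/content.opf" media-type="application/oebps-package+xml"/>
        </rootfiles>
      </container>

      <!-- OEBPS/content.opf: book metadata, the list of files, and the reading order -->
      <package xmlns="http://www.idpf.org/2007/opf" version="2.0" unique-identifier="bookid">
        <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Book Title</dc:title>
          <dc:language>en</dc:language>
          <dc:identifier id="bookid">urn:uuid:00000000-0000-0000-0000-000000000000</dc:identifier>
        </metadata>
        <manifest>
          <item id="ch01" href="chapter01.xhtml" media-type="application/xhtml+xml"/>
          <item id="css" href="book.css" media-type="text/css"/>
          <item id="ncx" href="toc.ncx" media-type="application/x-dtbncx+xml"/>
        </manifest>
        <spine toc="ncx">
          <itemref idref="ch01"/>
        </spine>
      </package>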
  • Such a workflow—beginning with the Web and exporting to print—is surely more in line with the way we will do business in the 21st century, where the Web is the default platform for reaching audiences, developing content, and putting the pieces together. It is time, we suggest, for publishers to re-orient their operations and start with the Web.
  • Our project demonstrates that Web technologies are indeed good enough to use in an XML-oriented workflow; more specialized and expensive options are not necessarily required. For massive-scale enterprise publishing, this approach may not offer enough flexibility, and the challenge of adding and extracting extra semantic richness may prove more trouble than it's worth.
  • But for smaller firms who are looking at the straightforward benefits of XML-based processes—single source publishing, online content and workflow management, open and accessible archive formats, greater online discoverability—here is a way forward.
  • Rather than a public-facing website, our system relies on the Web as a content management platform—of course a public face could easily be added.
  • The final piece of our puzzle, the ability to integrate print production, was made possible by Adobe's release of InDesign with an open XML file format. Since the Web's XHTML is also XML, is can be easily and confidently transformed to the InDesign format.
  • today, we are able to put the process together using nothing but standard, relatively ubiquitous Web tools: the Web itself as an editing and content management environment, standard Web scripting tools for the conversion process, and the well-documented IDML file format to integrate the layout tool.
  • Using the Web as a Production Platform
  •  
    I was looking for an answer to a problem Marbux had presented, and found this interesting article. The issue was the upcoming conversion of the Note Case Pro (NCP) layout engine to the WebKit layout engine, and what to do about the NCP document format. My initial reaction was to encode the legacy NCP document format in XML, and run an XSLT to a universal pivot format like TEI-XML. From there, the TEI-XML community would provide all the XSLT transformation routines for conversion to ODF, OOXML, XHTML, ePub and HTML/CSS. Researching the problems one might encounter with this approach, I found this article. Fascinating stuff. My takeaway is that TEI-XML would not be as effective a "universal pivot point" as XHTML. Or perhaps, if NCP really wants to get aggressive, IDML - InDesign Markup Language. The important point, though, is that XHTML is a browser-specific version of XML, and compatible with the WebKit layout engine Miro wants to move NCP to. The concept of encoding an existing application-specific format in XML has been around since 1998, when XML was first introduced as a W3C standard, a "structured" subset of SGML. (HTML is likewise an application of SGML.) The multiplatform StarOffice productivity suite became "OpenOffice" after Sun purchased the company in 1999 and open-sourced the code base. The OpenOffice developer team came out with an XML encoding of their existing document formats in 2000. That application-specific encoding became an OASIS document format standard proposal in 2002 - also known as ODF. Microsoft followed OpenOffice with an XML encoding of their application-specific binary document formats, known as OOXML. Encoding the existing NCP format in XML, specifically targeting XHTML as a "universal pivot point", would put the NCP Outliner in the Web editor category, without breaking backwards compatibility. The trick is in the XSLT conversion process. But I think that is something much easier to handle than trying to
Paul Merrell

New Leak Of Final TPP Text Confirms Attack On Freedom Of Expression, Public Health - 0 views

  • Offering a first glimpse of the secret 12-nation “trade” deal in its final form—and fodder for its growing ranks of opponents—WikiLeaks on Friday published the final negotiated text for the Trans-Pacific Partnership (TPP)’s Intellectual Property Rights chapter, confirming that the pro-corporate pact would harm freedom of expression by bolstering monopolies and injure public health by blocking patient access to lifesaving medicines. The document is dated October 5, the same day it was announced in Atlanta, Georgia that the member states to the treaty had reached an accord after more than five years of negotiations. Aside from the WikiLeaks publication, the vast majority of the mammoth deal’s contents are still being withheld from the public—which a WikiLeaks press statement suggests is a strategic move by world leaders to forestall public criticism until after the Canadian election on October 19. Initial analyses suggest that many of the chapter’s more troubling provisions, such as broader patent and data protections that pharmaceutical companies use to delay generic competition, have stayed in place since draft versions were leaked in 2014 and 2015. Moreover, it codifies a crackdown on freedom of speech with rules allowing widespread internet censorship.
Paul Merrell

Prepare to Hang Up the Phone, Forever - WSJ.com - 0 views

  • At decade's end, the trusty landline telephone could be nothing more than a memory. Telecom giants AT&T and Verizon Communications …
  • The two providers want to lay the crumbling POTS to rest and replace it with Internet Protocol-based systems that use the same wired and wireless broadband networks that bring Web access, cable programming and, yes, even your telephone service, into your homes. You may think you have a traditional landline because your home phone plugs into a jack, but if you have bundled your phone with Internet and cable services, you're making calls over an IP network, not twisted copper wires. California, Florida, Texas, Georgia, North Carolina, Wisconsin and Ohio are among states that agree telecom resources would be better redirected into modern telephone technologies and innovations, and will kill copper-based technologies in the next three years or so. Kentucky and Colorado are weighing similar laws, which force people to go wireless whether they want to or not. In Mantoloking, N.J., Verizon wants to replace the landline system, which Hurricane Sandy wiped out, with its wireless Voice Link. That would make it the first entire town to go landline-less, a move that isn't sitting well with all residents.
  • New Jersey's legislature, worried about losing data applications such as credit-card processing and alarm systems that wireless systems can't handle, wants a one-year moratorium to block that switch. It will vote on the measure this month. (Verizon tried a similar change in Fire Island, N.Y., when its copper lines were destroyed, but public opposition persuaded Verizon to install fiber-optic cable.) It's no surprise that landlines are unfashionable, considering many of us already have or are preparing to ditch them. More than 38% of adults and 45.5% of children live in households without a landline telephone, says the Centers for Disease Control and Prevention. That means two in every five U.S. homes, or 39%, are wireless, up from 26.6% three years ago. Moreover, a scant 8.5% of households relied only on a landline, while 2% were phoneless in 2013. Metropolitan residents have few worries about the end of landlines. High-speed wire and wireless services are abundant and work well, despite occasional dropped calls. Those living in rural areas, where cell towers are few and 4G capability limited, face different issues.
  • ...2 more annotations...
  • Safety is one of them. Call 911 from a landline and the emergency operator pinpoints your exact address, down to the apartment number. Wireless phones lack those specifics, and even with GPS navigation aren't as precise. Matters are worse in rural and even suburban areas that signals don't reach, sometimes because they're blocked by buildings or the landscape. That's of concern to the Federal Communications Commission, which oversees all forms of U.S. communications services. Universal access is a tenet of its mission, and, despite the state-by-state degradation of the mandate, it's unwilling to let telecom companies simply drop geographically undesirable customers. Telecom firms need FCC approval to ax services completely, and can't do so unless there is a viable competitor to pick up the slack. Last year AT&T asked to turn off its legacy network, which could create gaps in universal coverage and will force people off the grid to get a wireless provider.
  • AT&T and the FCC will soon begin trials to explore life without copper-wired landlines. Consumers will voluntarily test IP-connected networks and their impact on towns like Carbon Hills, Ala., population 2,071. They want to know how households will reach 911, how small businesses will connect to customers, how people with medical-monitoring devices or home alarms know they will always be connected to a reliable network, and what the costs are. "We cannot be a nation of opportunity without networks of opportunity," said FCC Chairman Tom Wheeler in unveiling the plan. "This pilot program will help us learn how fiber might be deployed where it is not now deployed…and how new forms of wireless can reach deep into the interior of rural America."
Paul Merrell

Google, ACLU call to delay government hacking rule | TheHill - 0 views

  • A coalition of 26 organizations, including the American Civil Liberties Union (ACLU) and Google, signed a letter Monday asking lawmakers to delay a measure that would expand the government’s hacking authority. The letter asks Senate Majority Leader Mitch McConnell (R-Ky.) and Minority Leader Harry Reid (D-Nev.), plus House Speaker Paul Ryan (R-Wis.) and House Minority Leader Nancy Pelosi (D-Calif.), to further review proposed changes to Rule 41 and delay its implementation until July 1, 2017. The Department of Justice’s alterations to the rule would allow law enforcement to use a single warrant to hack multiple devices beyond the jurisdiction that the warrant was issued in. The FBI used such a tactic to apprehend users of the child pornography dark website, Playpen. It took control of the dark website for two weeks and, after securing two warrants, installed malware on Playpen users’ computers to acquire their identities. But the signatories of the letter — which include advocacy groups, companies and trade associations — are raising questions about the effects of the change.
  •  
    ".. no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." Fourth Amendment. The changes to Rule 41 ignore the particularity requirement by allowing the government to search computers that are not particularly identified in multiple locations not particularly identifed, in other words, a general warrant that is precisely the reason the particularity requirement was adopted to outlaw.
Paul Merrell

Senate narrowly rejects new FBI surveillance | TheHill - 0 views

  • The Senate narrowly rejected expanding the FBI's surveillance powers Wednesday in the wake of the worst mass shooting in U.S. history. Senators voted 58-38 on a procedural hurdle, with 60 votes needed to move forward. Majority Leader Mitch McConnell, who initially voted "yes," switched his vote, which allows him to potentially bring the measure back up. 
  • The Senate GOP proposal—being offered as an amendment to the Commerce, Justice and Science appropriations bill—would allow the FBI to use "national security letters" to obtain people's internet browsing history and other information without a warrant during a terrorism or federal intelligence probe.  It would also permanently extend a Patriot Act provision — currently set to expire in 2019 — meant to monitor "lone wolf" extremists.  Senate Republicans said they would likely be able to get enough votes if McConnell schedules a redo.
  • Asked if he anticipates supporters will be able to get 60 votes, Sen. John Cornyn (R-Texas) separately told reporters "that's certainly my expectation." McConnell urged support for the proposal earlier Wednesday, saying it would allow the FBI to "connect the dots" in terrorist investigations. "We can focus on defeating [the Islamic State in Iraq and Syria] or we can focus on partisan politics. Some of our colleagues may think this is all some game," he said. "I believe this is a serious moment that calls for serious solutions." But Democrats—and some Republicans—raised concerns that the changes didn't go far enough to ensure Americans' privacy. Sen. Ron Wyden (D-Ore.) blasted his colleagues for "hypocrisy" after a gunman killed 49 people and injured dozens more during the mass shooting in Orlando, Fla. "Due process ought to apply as it relates to guns, but due process wouldn't apply as it relates to the internet activity of millions of Americans," he said ahead of Wednesday's vote. "Supporters of this amendment...have suggested that Americans need to choose between protecting our security and protecting our constitutional right to privacy." 
  • ...1 more annotation...
  • The American Civil Liberties Union (ACLU) also came out in opposition to the Senate GOP proposal on Tuesday, warning it would urge lawmakers to vote against it. 
  •  
    Too close for comfort and coming around the bend again. 
Paul Merrell

Verizon Will Now Let Users Kill Previously Indestructible Tracking Code - ProPublica - 0 views

  • Verizon says it will soon offer customers a way to opt out from having their smartphone and tablet browsing tracked via a hidden un-killable tracking identifier. The decision came after a ProPublica article revealed that an online advertiser, Turn, was exploiting the Verizon identifier to respawn tracking cookies that users had deleted. Two days after the article appeared, Turn said it would suspend the practice of creating so-called "zombie cookies" that couldn't be deleted. But Verizon couldn't assure users that other companies might not also exploit the number - which was transmitted automatically to any website or app a user visited from a Verizon-enabled device - to build dossiers about people's behavior on their mobile devices. Verizon subsequently updated its website to note Turn's decision and declared that it would "work with other partners to ensure that their use of [the undeletable tracking number] is consistent with the purposes we intended." Previously, its website had stated: "It is unlikely that sites and ad entities will attempt to build customer profiles."
  • However, policing the hundreds of companies in the online tracking business was likely to be a difficult task for Verizon. And so, on Monday, Verizon followed in the footsteps of AT&T, which had already declared in November that it would stop inserting the hidden undeletable number in its users' Web traffic. In a statement emailed to reporters on Friday, Verizon said, "We have begun working to expand the opt-out to include the identifier referred to as the UIDH, and expect that to be available soon." Previously, users who opted out from Verizon's program were told that information about their demographics and Web browsing behavior would no longer be shared with advertisers, but that the tracking number would still be attached to their traffic. For more coverage, read ProPublica's previous reporting on Verizon's indestructible tracking and how one company used the tool to create zombie cookies.
  •  
    Good for Pro Publica!
Paul Merrell

YouTube To Censor "Controversial" Content, ADL On Board As Flagger - 0 views

  • Chief among the groups seeking to clamp down on independent media has been Google, the massive technology company with deep connections to the U.S. intelligence community, as well as to U.S. government and business elites.
  • Since 2015, Google has worked to become the Internet’s “Ministry of Truth,” first through its creation of the First Draft Coalition and more recently via major changes made to its search engine that curtail public access to news sites independent of the corporate media.
  • Google has now stepped up its war on free speech and the freedom of the press through its popular subsidiary, YouTube. On Tuesday, YouTube announced online that it is set to begin censoring content deemed “controversial,” even if that content does not break any laws or violate YouTube’s user agreement. Misleadingly dubbed as an effort “to fight terror content online,” the new program will flag content for review through a mix of machine algorithms and “human review,” guided by standards set up by “expert NGOs and institutions” that are part of YouTube’s “Trusted Flagger” program. YouTube stated that such organizations “bring expert knowledge of complex issues like hate speech, radicalization, and terrorism.” One of the leading institutions directing the course of the Trusted Flagger program is the Anti-Defamation League (ADL). The ADL was initially founded to “stop the defamation of the Jewish people and to secure justice and fair treatment to all” but has gained a reputation over the years for labeling any critic of Israel’s government as an “anti-Semite.” For instance, characterizing Israeli policies towards the Palestinians as “racist” or “apartheid-like” is considered “hate speech” by the ADL, as is accusing Israel of war crimes or attempted ethnic cleansing. The ADL has even described explicitly Jewish organizations who are critical of Israel’s government as being “anti-Semitic.”
Paul Merrell

Facebook's Marketplace Faces Antitrust Probes in EU, U.K. - WSJ - 1 views

  • The European Union and the U.K. opened formal antitrust investigations into Facebook Inc.’s classified-ads service Marketplace, ramping up regulatory scrutiny for the company in Europe. Both the European Commission—the EU’s top antitrust enforcer—and the U.K.’s Competition and Markets Authority said Friday they are investigating whether Facebook repurposes data it gathers from advertisers who buy ads in order to give illegal advantages to its own services, including its Marketplace online flea market. The U.K. added that it is also investigating whether Facebook uses advertiser data to give similar advantages to its online-dating service. The two competition watchdogs said they would coordinate their investigations.
  • Separately on Friday, Germany’s competition regulator announced that it is opening an investigation into Google’s News Showcase, in which the tech company pays to license certain content from news publishers. That probe, which is based on new powers Germany had granted the regulator, will look among other things at whether Google is imposing unfair conditions on publishers and how it selects participants, the Federal Cartel Office said.
  • The three newly opened cases are part of a new wave of antitrust enforcement in Europe. The European Commission filed formal charges last month against Apple Inc. for allegedly abusing its control over the distribution of music-streaming apps, including Spotify Technology SA. In November, it filed formal charges against Amazon.com Inc. for allegedly using nonpublic data it gathers from third-party sellers to unfairly compete against them. Both companies denied wrongdoing. At the same time, the U.K.’s CMA has opened investigations into Google’s announcement that it will retire third-party cookies, a technology advertisers use to track web users, and whether Apple imposes anticompetitive conditions on some app developers, including the use of Apple’s in-app payment system, which is also the subject of a lawsuit in the U.S. In the EU, the European Commission has been investigating Facebook for more than a year on multiple fronts. Facebook and the Commission have squabbled over access to internal documents as part of those investigations.
  • ...1 more annotation...
  • New York State Attorney General Letitia James outlined in December a sweeping antitrust suit against Facebook by the Federal Trade Commission and a bipartisan group of 46 state attorneys general, targeting the company’s tactics against competitors.
Paul Merrell

'Pardon Snowden' Campaign Takes Off As Sanders, Ellsberg, And Others Join - 0 views

  • Prominent activists, lawmakers, artists, academics, and other leading voices in civil society, including Sen. Bernie Sanders (I-Vt.), are joining the campaign to get a pardon for National Security Agency (NSA) whistleblower Edward Snowden. “The information disclosed by Edward Snowden has allowed Congress and the American people to understand the degree to which the NSA has abused its authority and violated our constitutional rights,” Sanders wrote for the Guardian on Wednesday. “Now we must learn from the troubling revelations Mr. Snowden brought to light. Our intelligence and law enforcement agencies must be given the tools they need to protect us, but that can be done in a way that does not sacrifice our rights.” Pentagon Papers whistleblower Daniel Ellsberg, who co-founded the public interest journalism advocacy group Freedom of the Press Foundation, where Snowden is a board member, also wrote, “Ed Snowden should be freed of the legal burden hanging over him. They should remove the indictment, pardon him if that’s the way to do it, so that he is no longer facing prison.” Snowden faces charges under the Espionage Act after he released classified NSA files to media outlets in 2013 exposing the U.S. government’s global mass surveillance operations. He fled to Hong Kong, then Russia, where he has been living under political asylum for the past three years.
  • The Pardon Snowden campaign, supported by the American Civil Liberties Union (ACLU), Amnesty International, and Human Rights Watch (HRW), urges people around the world to write to Obama throughout his last four months in the White House.
  •  
    If you want to take part, the action page is at https://www.pardonsnowden.org/