Home / Future of the Web / Group items matching "installing" in title, tags, annotations or url

Leaked docs show spyware used to snoop on US computers | Ars Technica - 0 views

  • Software created by the controversial UK-based Gamma Group International was used to spy on computers that appear to be located in the United States, the UK, Germany, Russia, Iran, and Bahrain, according to a leaked trove of documents analyzed by ProPublica. It's not clear whether the surveillance was conducted by governments or private entities. Customer e-mail addresses in the collection appeared to belong to a German surveillance company, an independent consultant in Dubai, the Bosnian and Hungarian Intelligence services, a Dutch law enforcement officer, and the Qatari government.
  • The leaked files—which were posted online by hackers—are the latest in a series of revelations about how state actors including repressive regimes have used Gamma's software to spy on dissidents, journalists, and activist groups. The documents, leaked last Saturday, could not be readily verified, but experts told ProPublica they believed them to be genuine. "I think it's highly unlikely that it's a fake," said Morgan Marquis-Boire, a security researcher who, while at The Citizen Lab at the University of Toronto, had analyzed Gamma Group's software and who authored an article about the leak on Thursday. The documents confirm many details that have already been reported about Gamma, such as that its tools were used to spy on Bahraini activists. Some documents in the trove contain metadata tied to e-mail addresses of several Gamma employees. Bill Marczak, another Gamma Group expert at the Citizen Lab, said that several dates in the documents correspond to publicly known events—such as the day that a particular Bahraini activist was hacked.
  • The leaked files contain more than 40 gigabytes of confidential technical material, including software code, internal memos, strategy reports, and user guides for the Gamma Group software suite called FinFisher. FinFisher enables customers to monitor secure Web traffic, Skype calls, webcams, and personal files. It is installed as malware on targets' computers and cell phones. A price list included in the trove lists a license of the software at almost $4 million. The documents reveal that Gamma uses technology from a French company called Vupen Security that sells so-called computer "exploits." Exploits include techniques called "zero days" for "popular software like Microsoft Office, Internet Explorer, Adobe Acrobat Reader, and many more." Zero days are exploits that have not yet been detected by the software maker and therefore are not blocked.
  • ...2 more annotations...
  • Many of Gamma's product brochures have previously been published by the Wall Street Journal and Wikileaks, but the latest trove shows how the products are getting more sophisticated. In one document, engineers at Gamma tested a product called FinSpy, which inserts malware onto a user's machine, and found that it could not be blocked by most antivirus software. Documents also reveal that Gamma had been working to bypass encryption tools including a mobile phone encryption app, Silent Circle, and were able to bypass the protection given by hard-drive encryption products TrueCrypt and Microsoft's Bitlocker.
  • The documents also describe a "country-wide" surveillance product called FinFly ISP which promises customers the ability to intercept Internet traffic and masquerade as ordinary websites in order to install malware on a target's computer. The most recent date-stamp found in the documents is August 2, coinciding with the first tweet by a parody Twitter account, @GammaGroupPR, which first announced the hack and may be run by the hacker or hackers responsible for the leak. On Reddit, a user called PhineasFisher claimed responsibility for the leak. "Two years ago their software was found being widely used by governments in the middle east, especially Bahrain, to hack and spy on the computers and phones of journalists and dissidents," the user wrote. The name on the @GammaGroupPR Twitter account is also "Phineas Fisher." GammaGroup, the surveillance company whose documents were released, is no stranger to the spotlight. The security firm F-Secure first reported the purchase of FinFisher software by the Egyptian State Security agency in 2011. In 2012, Bloomberg News and The Citizen Lab showed how the company's malware was used to target activists in Bahrain. In 2013, the software company Mozilla sent a cease-and-desist letter to the company after a report by The Citizen Lab showed that a spyware-infected version of the Firefox browser manufactured by Gamma was being used to spy on Malaysian activists.

Help:CirrusSearch - MediaWiki - 0 views

  • CirrusSearch is a new search engine for MediaWiki. The Wikimedia Foundation is migrating to CirrusSearch since it features key improvements over the previously used search engine, LuceneSearch. This page describes the features that are new or different compared to the past solutions.
  • Contents: 1 Frequently asked questions; 1.1 What's improved?; 2 Updates; 3 Search suggestions; 4 Full text search; 4.1 Stemming; 4.2 Filters (intitle:, incategory: and linksto:); 4.3 prefix:; 4.4 Special prefixes; 4.5 Did you mean; 4.6 Prefer phrase matches; 4.7 Fuzzy search; 4.8 Phrase search and proximity; 4.9 Quotes and exact matches; 4.10 prefer-recent:; 4.11 hastemplate:; 4.12 boost-templates:; 4.13 insource:; 4.14 Auxiliary Text; 4.15 Lead Text; 4.16 Commons Search; 5 See also
  • Stemming: In search terminology, support for "stemming" means that a search for "swim" will also include "swimming" and "swimmed", but not "swam". There is support for dozens of languages, but support for all languages is wanted. There is a list of currently supported languages at elasticsearch.org; see their documentation on contributing to submit requests or patches.
  • ...1 more annotation...
  • See also Full specifications in the browser tests
  •  
    Lots of new tricks to learn on sites using MediaWiki as folks update their installations. I'm not a big fan of programs written in PHP and Javascript, but they're impossible to avoid on the Web. So is MediaWiki, so any real improvements help. A sketch of exercising the new search filters through the MediaWiki API follows this item.
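
Since CirrusSearch filters such as intitle:, incategory: and linksto: are plain query-string syntax, they can be exercised programmatically through the standard MediaWiki search API. A minimal sketch in Python, assuming any CirrusSearch-backed wiki (Wikipedia is used as an example endpoint) and the third-party requests library; the queries themselves are illustrative only:

    # Hypothetical sketch: driving a CirrusSearch-backed wiki through the
    # standard MediaWiki search API. Endpoint and queries are examples only.
    import requests

    API = "https://en.wikipedia.org/w/api.php"

    def search(query, limit=5):
        """Run a full-text search and return matching page titles."""
        params = {
            "action": "query",
            "list": "search",
            "srsearch": query,
            "srlimit": limit,
            "format": "json",
        }
        hits = requests.get(API, params=params).json()["query"]["search"]
        return [hit["title"] for hit in hits]

    print(search('intitle:installing'))                         # title filter
    print(search('incategory:"Software" linksto:"MediaWiki"'))  # category + links
    print(search('swim'))  # stemming: also matches "swimming"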

Is Linux dead for the desktop? - 1 views

  • Linux never had the apps
  • Charles King, an IT analyst who follows enterprise trends, says the big change is in IT. At one time, executives in charge of computing services were mostly concerned with operating systems and applications for a massive throng of traditional business users. Those users have now flocked to mobile computing devices, but they still have a Windows PC sitting on their desk.
  • "Today, Microsoft's lock (on the desktop, anyway) remains secure, even in the face of Apple's surge," King says. "Ironically enough, though, the open source model remains alive and well but mostly in the development of new standards and development platforms."
  • ...5 more annotations...
  • David Johnson
  • "What corporate end users really need is familiarity, consistency and compatibility - something Apple, Microsoft and Google seem more adept at offering."
  • Can desktop Linux OS be saved? Johnson says the best example of how to save Linux OS is the Chrome OS, an all-in-one laptop and desktop offering available through major consumer electronics companies such as LG (with their Chromebase all-in-one) and the Samsung Chromebook 2.
  • The problem is that Chrome OS and Android aren't the same as Linux OS on the desktop. It's a complete reinvention. There are few Windows-like productivity apps and no knowledge worker apps designed for keyboard and mouse.
  • All of the experts agree: Windows won every battle for the business user.
  •  
    "For executives in charge of desktop deployments in a large company, Linux OS was once hailed as a saviour for corporate end users. With incredibly low pricing - free, with fee-based support plans, for example - distributions such as Ubuntu Desktop and SUSE Linux Enterprise offered a "good enough" user interface, along with plenty of powerful apps and a rich browser. A few years ago, both Dell and HP jumped on the bandwagon; today, they still offer "developer" and "workstation" models that come pre-loaded with a Linux install. Plus, anyone who follows the Linux market knows that Google has reimagined Linux as a user-friendly tablet interface (the wildly popular Android OS) and a browser-only desktop variant (Chrome OS). Linux also shows up on countless connected home gadgets, fitness trackers, watches and other low-cost devices, mostly because OS costs are so low. The desktop computing OS for end users has failed to capture any attention lately, though. Al Gillen, the programme vice president for servers and system software at IDC, says the Linux OS as a computing platform for end users is at least comatose - and probably dead. Yes, it has reemerged on Android and other devices, but it has gone almost completely silent as a competitor to Windows for mass deployment. As they say, you can hear the crickets chirping."

OS.js Is A New Javascript Based Open Source Operating System Running In Your Browser - 2 views

  •  
    "hort Bytes: OS.js is a free and open source operating system that runs in your web browser. Based on Javascript, this operating system comes with a fully-fledged window manager, ability to install applications, access to virtual filesystems and a lot more. Read more to know about the OS in detail."

Windows 10 November Update mysteriously pulled, as concerns about bugs grow | Ars Techn... - 0 views

  •  
    "Clean installs of the new version of Windows 10 are no longer possible. by Peter Bright (US) - Nov 24, 2015 11:05am CET Share Tweet 115 Downloadable versions of Windows 10 version 1511, the November 2015 update, appear to have been removed after their release earlier this month."

The Million Dollar Dissident: NSO Group's iPhone Zero-Days used against a UAE Human Rig... - 0 views

  • 1. Executive Summary Ahmed Mansoor is an internationally recognized human rights defender, based in the United Arab Emirates (UAE), and recipient of the Martin Ennals Award (sometimes referred to as a “Nobel Prize for human rights”).  On August 10 and 11, 2016, Mansoor received SMS text messages on his iPhone promising “new secrets” about detainees tortured in UAE jails if he clicked on an included link. Instead of clicking, Mansoor sent the messages to Citizen Lab researchers.  We recognized the links as belonging to an exploit infrastructure connected to NSO Group, an Israel-based “cyber war” company that sells Pegasus, a government-exclusive “lawful intercept” spyware product.  NSO Group is reportedly owned by an American venture capital firm, Francisco Partners Management. The ensuing investigation, a collaboration between researchers from Citizen Lab and from Lookout Security, determined that the links led to a chain of zero-day exploits (“zero-days”) that would have remotely jailbroken Mansoor’s stock iPhone 6 and installed sophisticated spyware.  We are calling this exploit chain Trident.  Once infected, Mansoor’s phone would have become a digital spy in his pocket, capable of employing his iPhone’s camera and microphone to snoop on activity in the vicinity of the device, recording his WhatsApp and Viber calls, logging messages sent in mobile chat apps, and tracking his movements.   We are not aware of any previous instance of an iPhone remote jailbreak used in the wild as part of a targeted attack campaign, making this a rare find.
  • The Trident exploit chain: CVE-2016-4657 (visiting a maliciously crafted website may lead to arbitrary code execution); CVE-2016-4655 (an application may be able to disclose kernel memory); CVE-2016-4656 (an application may be able to execute arbitrary code with kernel privileges). Once we confirmed the presence of what appeared to be iOS zero-days, Citizen Lab and Lookout quickly initiated a responsible disclosure process by notifying Apple and sharing our findings. Apple responded promptly, and notified us that they would be addressing the vulnerabilities. We are releasing this report to coincide with the availability of the iOS 9.3.5 patch, which blocks the Trident exploit chain by closing the vulnerabilities that NSO Group appears to have exploited and sold to remotely compromise iPhones. Recent Citizen Lab research has shown that many state-sponsored spyware campaigns against civil society groups and human rights defenders use “just enough” technical sophistication, coupled with carefully planned deception. This case demonstrates that not all threats follow this pattern. The iPhone has a well-deserved reputation for security. As the iPhone platform is tightly controlled by Apple, technically sophisticated exploits are often required to enable the remote installation and operation of iPhone monitoring tools. These exploits are rare and expensive. Firms that specialize in acquiring zero-days often pay handsomely for iPhone exploits. One such firm, Zerodium, acquired an exploit chain similar to the Trident for one million dollars in November 2015. The high cost of iPhone zero-days, the apparent use of NSO Group’s government-exclusive Pegasus product, and prior known targeting of Mansoor by the UAE government provide indicators that point to the UAE government as the likely operator behind the targeting. Remarkably, this case marks the third commercial “lawful intercept” spyware suite employed in attempts to compromise Mansoor. In 2011, he was targeted with FinFisher’s FinSpy spyware, and in 2012 he was targeted with Hacking Team’s Remote Control System. Both Hacking Team and FinFisher have been the object of several years of revelations highlighting the misuse of spyware to compromise civil society groups, journalists, and human rights workers.

F-Droid - Free and Open Source Android App Repository - 1 views

  • What is F-Droid? F-Droid is an installable catalogue of FOSS (Free and Open Source Software) applications for the Android platform. The client makes it easy to browse, install, and keep track of updates on your device.

DLL Files Fixer Crack 2017 Serial Key Full Activator Is Here - 1 views

  •  
    DLL Files Fixer Crack is software for correcting Dynamic Link Library (DLL) file errors. It has been available longer than other DLL file fixers, which has given its developers time to gain experience. DLL Files Fixer Crack gives access to a large database of DLL files; you can download and install the DLL files with a few clicks and a simple search.

Google's blazingly fast Internet goes live in Kansas City - CNN.com - 0 views

  • After months of fanfare and anticipation, gigabit home Internet service Google Fiber finally went live on Tuesday in Kansas City. The search giant is offering 1 Gbps speeds for just $70 per month -- significantly faster and cheaper than what any traditional American ISP is offering.
  • Meanwhile, Demarais said that on an Ethernet connection, he's seen consistent Google Fiber speeds of 600 to 700 Mbps, with Wi-Fi topping out around 200 Mbps. Even at the slower wireless speeds, that's more than an order of magnitude faster than what most Americans have at home. "The first thing I did was BitTorrent Ubuntu," he said. "I think that took two minutes, let me try it again right now."

Google, ACLU call to delay government hacking rule | TheHill - 0 views

  • A coalition of 26 organizations, including the American Civil Liberties Union (ACLU) and Google, signed a letter Monday asking lawmakers to delay a measure that would expand the government’s hacking authority. The letter asks Senate Majority Leader Mitch McConnell (R-Ky.) and Minority Leader Harry Reid (D-Nev.), plus House Speaker Paul Ryan (R-Wis.) and House Minority Leader Nancy Pelosi (D-Calif.), to further review proposed changes to Rule 41 and delay their implementation until July 1, 2017. The Department of Justice’s alterations to the rule would allow law enforcement to use a single warrant to hack multiple devices beyond the jurisdiction in which the warrant was issued. The FBI used such a tactic to apprehend users of the child pornography dark website Playpen: it took control of the site for two weeks and, after securing two warrants, installed malware on Playpen users’ computers to acquire their identities. But the signatories of the letter — which include advocacy groups, companies and trade associations — are raising questions about the effects of the change.
  •  
    ".. no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." Fourth Amendment. The changes to Rule 41 ignore the particularity requirement by allowing the government to search computers that are not particularly identified in multiple locations not particularly identifed, in other words, a general warrant that is precisely the reason the particularity requirement was adopted to outlaw.

How Secret Partners Expand NSA's Surveillance Dragnet - The Intercept - 0 views

  • Huge volumes of private emails, phone calls, and internet chats are being intercepted by the National Security Agency with the secret cooperation of more foreign governments than previously known, according to newly disclosed documents from whistleblower Edward Snowden. The classified files, revealed today by the Danish newspaper Dagbladet Information in a reporting collaboration with The Intercept, shed light on how the NSA’s surveillance of global communications has expanded under a clandestine program, known as RAMPART-A, that depends on the participation of a growing network of intelligence agencies.
  • It has already been widely reported that the NSA works closely with eavesdropping agencies in the United Kingdom, Canada, New Zealand, and Australia as part of the so-called Five Eyes surveillance alliance. But the latest Snowden documents show that a number of other countries, described by the NSA as “third-party partners,” are playing an increasingly important role – by secretly allowing the NSA to install surveillance equipment on their fiber-optic cables. The NSA documents state that under RAMPART-A, foreign partners “provide access to cables and host U.S. equipment.” This allows the agency to covertly tap into “congestion points around the world” where it says it can intercept the content of phone calls, faxes, e-mails, internet chats, data from virtual private networks, and calls made using Voice over IP software like Skype.
  • The secret documents reveal that the NSA has set up at least 13 RAMPART-A sites, nine of which were active in 2013. Three of the largest – codenamed AZUREPHOENIX, SPINNERET and MOONLIGHTPATH – mine data from some 70 different cables or networks. The precise geographic locations of the sites and the countries cooperating with the program are among the most carefully guarded of the NSA’s secrets, and these details are not contained in the Snowden files. However, the documents point towards some of the countries involved – Denmark and Germany among them. An NSA memo prepared for a 2012 meeting between the then-NSA director, Gen. Keith Alexander, and his Danish counterpart noted that the NSA had a longstanding partnership with the country’s intelligence service on a special “cable access” program. Another document, dated from 2013 and first published by Der Spiegel on Wednesday, describes a German cable access point under a program that was operated by the NSA, the German intelligence service BND, and an unnamed third partner.
  • ...2 more annotations...
  • The program, which the secret files show cost U.S. taxpayers about $170 million between 2011 and 2013, sweeps up a vast amount of communications at lightning speed. According to the intelligence community’s classified “Black Budget” for 2013, RAMPART-A enables the NSA to tap into three terabits of data every second as the data flows across the compromised cables – the equivalent of being able to download about 5,400 uncompressed high-definition movies every minute. In an emailed statement, the NSA declined to comment on the RAMPART-A program. “The fact that the U.S. government works with other nations, under specific and regulated conditions, mutually strengthens the security of all,” said NSA spokeswoman Vanee’ Vines. “NSA’s efforts are focused on ensuring the protection of the national security of the United States, its citizens, and our allies through the pursuit of valid foreign intelligence targets only.”
  • The Danish and German operations appear to be associated with RAMPART-A because it is the only NSA cable-access initiative that depends on the cooperation of third-party partners. Other NSA operations tap cables without the consent or knowledge of the countries that host the cables, or are operated from within the United States with the assistance of American telecommunications companies that have international links. One secret NSA document notes that most of the RAMPART-A projects are operated by the partners “under the cover of an overt comsat effort,” suggesting that the tapping of the fiber-optic cables takes place at Cold War-era eavesdropping stations in the host countries, usually identifiable by their large white satellite dishes and radomes. A shortlist of other countries potentially involved in the RAMPART-A operation is contained in the Snowden archive. A classified presentation dated 2013, published recently in Intercept editor Glenn Greenwald’s book No Place To Hide, revealed that the NSA had top-secret spying agreements with 33 third-party countries, including Denmark, Germany, and 15 other European Union member states:
  •  
    Don't miss the slide with the names of the NSA-partner nations. Lots of E.U. member nations.

Marriott fined $600,000 for jamming guest hotspots - SlashGear - 0 views

  • Marriott will cough up $600,000 in penalties after being caught blocking mobile hotspots so that guests would have to pay for its own WiFi services, the FCC has confirmed today. The fine comes after staff at the Gaylord Opryland Hotel and Convention Center in Nashville, Tennessee were found to be jamming individual hotspots and then charging people up to $1,000 per device to get online. Marriott has been operating the center since 2012, and is believed to have been running its interruption scheme since then. The first complaint to the FCC, however, wasn't until March 2013, when one guest warned the Commission that they suspected their hardware had been jammed. An investigation by the FCC's Enforcement Bureau revealed that was, in fact, the case. A WiFi monitoring system installed at the Gaylord Opryland would target access points with de-authentication packets, disconnecting users so that their browsing was interrupted.
  • The FCC deemed Marriott's behaviors as contravening Section 333 of the Communications Act, which states that "no person shall willfully or maliciously interfere with or cause interference to any radio communications of any station licensed or authorized by or under this chapter or operated by the United States Government." In addition to the $600,000 civil penalty, Marriott will have to cease blocking guests, hand over details of any access point containment features to the FCC across its entire portfolio of owned or managed properties, and finally file compliance and usage reports each quarter for the next three years.
  • Update:&nbsp;Marriott has issued the following statement on the FCC ruling: "Marriott has a strong interest in ensuring that when our guests use our Wi-Fi service, they will be protected from rogue wireless hotspots that can cause degraded service, insidious cyber-attacks and identity theft. Like many other institutions and companies in a wide variety of industries, including hospitals and universities, the Gaylord Opryland protected its Wi-Fi network by using FCC-authorized equipment provided by well-known, reputable manufacturers. We believe that the Gaylord Opryland's actions were lawful. We will continue to encourage the FCC to pursue a rulemaking in order to eliminate the ongoing confusion resulting from today's action and to assess the merits of its underlying policy."

New York City to turn phone booths into Wi-Fi hot spots | Al Jazeera America - 0 views

  • New York City Mayor Bill de Blasio is fielding proposals to transform the city’s largely forgotten phone booths into Wi-Fi hot spots, an ambitious project that would create one of the largest public Wi-Fi networks in the country. The team with the winning proposal will be charged with the installation, operation and maintenance of up to 10,000 hot spots distributed across the five boroughs, according to a statement released Thursday by the mayor’s office.

Deutsche Telekom to follow Vodafone in revealing surveillance | World news | The Guardian - 0 views

  • Germany's biggest telecoms company is to follow Vodafone in disclosing for the first time the number of surveillance requests it receives from governments around the world. Deutsche Telekom, which owns half of Britain's EE mobile network and operates in 14 countries including the US, Spain and Poland, has already published surveillance data for its home nation – one of the countries that have reacted most angrily to the Edward Snowden revelations. In the wake of Vodafone's disclosures, first published in the Guardian on Friday, it announced that it would extend its disclosures to every other market where it operates and where it is legal. A spokeswoman for Deutsche Telekom, which has 140 million customers worldwide, said: "Deutsche Telekom has initially focused on Germany when it comes to disclosure of government requests. We are currently checking if and to what extent our national companies can disclose information. We intend to publish something similar to Vodafone."
  • Bosses of the world's biggest mobile networks, many of which have headquarters in Europe, are gathering for an industry conference in Shanghai this weekend, and the debate is expected to centre on whether they should join Deutsche and Vodafone in using transparency to push back against the use of their technology for government surveillance. Mobile companies, unlike social networks, cannot operate without a government-issued licence, and have previously been reluctant to discuss the extent of their cooperation with national security and law enforcement agencies. But Vodafone broke cover on Friday by confirming that in around half a dozen of the markets in which it operates, governments in Europe and outside have installed their own secret listening equipment on its network and those of other operators.
  •  
    Looks like Vodafone broke a government transparency logjam on government surveillance via digital communications, as to disclosure of raw totals of search warrants by nations other than the U.S. 

Internet Giants Erect Barriers to Spy Agencies - NYTimes.com - 0 views

  • As fast as it can, Google is sealing up cracks in its systems that Edward J. Snowden revealed the N.S.A. had brilliantly exploited. It is encrypting more data as it moves among its servers and helping customers encode their own emails. Facebook, Microsoft and Yahoo are taking similar steps.
  • After years of cooperating with the government, the immediate goal now is to thwart Washington — as well as Beijing and Moscow. The strategy is also intended to preserve business overseas in places like Brazil and Germany that have threatened to entrust data only to local providers. Google, for example, is laying its own fiber optic cable under the world’s oceans, a project that began as an effort to cut costs and extend its influence, but now has an added purpose: to assure that the company will have more control over the movement of its customer data.
  • A year after Mr. Snowden’s revelations, the era of quiet cooperation is over. Telecommunications companies say they are denying requests to volunteer data not covered by existing law. A.T.&amp;T., Verizon and others say that compared with a year ago, they are far more reluctant to cooperate with the United States government in “gray areas” where there is no explicit requirement for a legal warrant.
  • ...8 more annotations...
  • Eric Grosse, Google’s security chief, suggested in an interview that the N.S.A.'s own behavior invited the new arms race.“I am willing to help on the purely defensive side of things,” he said, referring to Washington’s efforts to enlist Silicon Valley in cybersecurity efforts. “But signals intercept is totally off the table,” he said, referring to national intelligence gathering.“No hard feelings, but my job is to make their job hard,” he added.
  • In Washington, officials acknowledge that covert programs are now far harder to execute because American technology companies, fearful of losing international business, are hardening their networks and saying no to requests for the kind of help they once quietly provided. Robert S. Litt, the general counsel of the Office of the Director of National Intelligence, which oversees all 17 American spy agencies, said on Wednesday that it was “an unquestionable loss for our nation that companies are losing the willingness to cooperate legally and voluntarily” with American spy agencies.
  • Many point to an episode in 2012, when Russian security researchers uncovered a state espionage tool, Flame, on Iranian computers. Flame, like the Stuxnet worm, is believed to have been produced at least in part by American intelligence agencies. It was created by exploiting a previously unknown flaw in Microsoft’s operating systems. Companies argue that others could have later taken advantage of this defect. Worried that such an episode undercuts confidence in its wares, Microsoft is now fully encrypting all its products, including Hotmail and Outlook.com, by the end of this year with 2,048-bit encryption, a stronger protection that would take a government far longer to crack. The software is protected by encryption both when it is in data centers and when data is being sent over the Internet, said Bradford L. Smith, the company’s general counsel.
  • Mr. Smith also said the company was setting up “transparency centers” abroad so that technical experts of foreign governments could come in and inspect Microsoft’s proprietary source code. That will allow foreign governments to check to make sure there are no “back doors” that would permit snooping by United States intelligence agencies. The first such center is being set up in Brussels. Microsoft has also pushed back harder in court. In a Seattle case, the government issued a “national security letter” to compel Microsoft to turn over data about a customer, along with a gag order to prevent Microsoft from telling the customer it had been compelled to provide its communications to government officials. Microsoft challenged the gag order as violating the First Amendment. The government backed down.
  • Hardware firms like Cisco, which makes routers and switches, have found their products a frequent subject of Mr. Snowden’s disclosures, and their business has declined steadily in places like Asia, Brazil and Europe over the last year. The company is still struggling to convince foreign customers that their networks are safe from hackers — and free of “back doors” installed by the N.S.A. The frustration, companies here say, is that it is nearly impossible to prove that their systems are N.S.A.-proof.
  • In one slide from the disclosures, N.S.A. analysts pointed to a sweet spot inside Google’s data centers, where they could catch traffic in unencrypted form. Next to a quickly drawn smiley face, an N.S.A. analyst, referring to an acronym for a common layer of protection, had noted, “SSL added and removed here!”
  • Facebook and Yahoo have also been encrypting traffic among their internal servers. And Facebook, Google and Microsoft have been moving to more strongly encrypt consumer traffic with so-called Perfect Forward Secrecy, specifically devised to make it more labor intensive for the N.S.A. or anyone to read stored encrypted communications. One of the biggest indirect consequences from the Snowden revelations, technology executives say, has been the surge in demands from foreign governments that saw what kind of access to user information the N.S.A. received — voluntarily or surreptitiously. Now they want the same.
  • The latest move in the war between intelligence agencies and technology companies arrived this week, in the form of a new Google encryption tool. The company released a user-friendly, email encryption method to replace the clunky and often mistake-prone encryption schemes the N.S.A. has readily exploited.But the best part of the tool was buried in Google’s code, which included a jab at the N.S.A.'s smiley-face slide. The code included the phrase: “ssl-added-and-removed-here-; - )”

How To Keep NSA Computers From Turning Your Phone Conversations Into Searchable Text - ... - 0 views

  • As soon as my article about how NSA computers can now turn phone conversations into searchable text came out on Tuesday, people started asking me: What should I do if I don’t want them doing that to mine? The solution, as it is to so many other outrageously invasive U.S. government tactics exposed by NSA whistleblower Edward Snowden, is, of course, Congressional legislation. I kid, I kid. No, the real solution is end-to-end encryption, preferably of the unbreakable kind. And as luck would have it, you can have exactly that on your mobile phone, for the price of zero dollars and zero cents.
  • The Intercept’s Micah Lee wrote about this&nbsp;in March, in an article titled: “You Should Really Consider Installing Signal, an Encrypted Messaging App for iPhone.” (Signal is for iPhone and iPads, and encrypts both voice and texts; RedPhone is the Android version of the voice product; TextSecure&nbsp;is the Android version of the text product.) As Lee explains, the open source software group known as Open Whisper Systems, which makes all three, is gaining a reputation for combining trustworthy encryption with ease of use and mobile convenience. Nobody – not your mobile provider, your ISP or the phone manufacturer — can promise you that your phone conversations won’t be intercepted in transit. That leaves end-to-end encryption – using a trustworthy app whose makers themselves literally cannot break the encryption — your best play.

XML Production Workflows? Start with the Web and XHTML - 0 views

  • Challenges: Some Ugly Truths. The challenges of building—and living with—an XML workflow are clear enough. The return on investment is a long-term proposition. Regardless of the benefits XML may provide, the starting reality is that it represents a very different way of doing things than the one we are familiar with. The Word Processing and Desktop Publishing paradigm, based on the promise of onscreen, WYSIWYG layout, is so dominant as to be practically inescapable. It has proven really hard to get from here to there, no matter how attractive XML might be on paper. A considerable amount of organizational effort and labour must be expended up front in order to realize the benefits. This is why XML is often referred to as an “investment”: you sink a bunch of time and money up front, and realize the benefits—greater flexibility, multiple output options, searching and indexing, and general futureproofing—later, over the long haul. It is not a short-term return proposition. And, of course, the returns you are able to realize from your XML investment are commensurate with what you put in up front: fine-grained, semantically rich tagging is going to give you more potential for searchability and recombination than a looser, more general-purpose approach, but it sure costs more. For instance, the Text Encoding Initiative (TEI) is the grand example of pouring enormous amounts of energy into the up-front tagging, with a very open-ended set of possibilities down the line. TEI helpfully defines a level to which most of us do not have to aspire.[5] But understanding this on a theoretical level is only part of the challenge. There are many practical issues that must be addressed. Software and labour are two of the most critical. How do you get the content into XML in the first place? Unfortunately, despite two decades of people doing SGML and XML, this remains an ugly question.
  • Practical Challenges: In 2009, there is still no truly likeable—let alone standard—editing and authoring software for XML. For many (myself included), the high-water mark here was Adobe’s FrameMaker, substantially developed by the late 1990s. With no substantial market for it, it is relegated today mostly to the tech writing industry, unavailable for the Mac, and just far enough afield from the kinds of tools we use today that its adoption represents a significant hurdle. And FrameMaker was the best of the breed; most of the other software in decent circulation are programmers’ tools—the sort of things that, as Michael Tamblyn pointed out, encourage editors to drink at their desks. The labour question represents a stumbling block as well. The skill-sets and mind-sets that effective XML editors need have limited overlap with those needed by literary and more traditional production editors. The need to think of documents as machine-readable databases is not something that comes naturally to folks steeped in literary culture. In combination with the sheer time and effort that rich tagging requires, many publishers simply outsource the tagging to India, drawing a division of labour that spans oceans, to put it mildly. Once you have XML content, then what do you do with it? How do you produce books from it? Presumably, you need to be able to produce print output as well as digital formats. But while the latter are new enough to be generally XML-friendly (e-book formats being largely XML based, for instance), there aren’t any straightforward, standard ways of moving XML content into the kind of print production environments we are used to seeing. This isn’t to say that there aren’t ways of getting print—even very high-quality print—output from XML, just that most of them involve replacing your prepress staff with Java programmers.
  • Why does this have to be so hard? It’s not that XML is new, or immature, or untested. Remember that the basics have been around, and in production, since the early 1980s at least. But we have to take account of a substantial and long-running cultural disconnect between traditional editorial and production processes (the ones most of us know intimately) and the ways computing people have approached things. Interestingly, this cultural divide looked rather different in the 1970s, when publishers were looking at how to move to digital typesetting. Back then, printers and software developers could speak the same language. But that was before the ascendancy of the Desktop Publishing paradigm, which computerized the publishing industry while at the same time isolating it culturally. Those of us who learned how to do things the Quark way or the Adobe way had little in common with people who programmed databases or document-management systems. Desktop publishing technology isolated us in a smooth, self-contained universe of toolbars, grid lines, and laser proofs. So, now that the reasons to get with this program, XML, loom large, how can we bridge this long-standing divide?
  • ...44 more annotations...
  • Using the Web as a Production Platform: The answer, I think, is right in front of you. The bridge is the Web, a technology and platform that is fundamentally based on XML, and which many publishers are by now comfortably familiar with. Perhaps not entirely comfortably, but at least most publishers are already working with the Web; they already either know or have on staff people who understand it and can work with it. The foundation of our argument is this: rather than looking at jumping to XML in its full, industrial complexity, which seems to be what the O'Reilly-backed StartWithXML initiative[6] is suggesting, publishers instead leverage existing tools and technologies—starting with the Web—as a means of getting XML workflows in place. This means making small investments and working with known tools rather than spending tens of thousands of dollars on XML software and rarefied consultants. It means re-thinking how the existing pieces of the production toolchain fit together; re-thinking the existing roles of software components already in use. It means, fundamentally, taking the Web seriously as a content platform, rather than thinking of it as something you need to get content out to, somehow. If nothing else, the Web represents an opportunity to think about editorial and production from outside the shrink-wrapped Desktop Publishing paradigm.
  • Is the Web made of Real XML? At this point some predictable objections can be heard: wait a moment, the Web isn’t really made out of XML; the HTML that makes up most of the Web is at best the bastard child of SGML, and it is far too flaky/unstructured/underpowered to be taken seriously. We counter by arguing that although HTML on the Web exists in a staggering array of different incarnations, and that the majority of it is indeed an unstructured mess, this does not undermine the general principle that basic, ubiquitous Web technologies can make a solid platform for content management, editorial process, and production workflow.
  • With the advent of a published XML standard in the late 1990s came the W3C’s adoption of XHTML: the realization of the Web’s native content markup as a proper XML document type. Today, its acceptance is almost ubiquitous, even while the majority of actual content out there may not be strictly conforming. The more important point is that most contemporary Web software, from browsers to authoring tools to content management systems (from blogs to enterprise systems), are capable of working with clean, valid XHTML. Or, to put the argument the other way around, clean, valid XHTML content plays absolutely seamlessly with everything else on the Web.[7]
  • The objection which follows, then, will be that even if we grant that XHTML is a real XML document type, it is underpowered for “serious” content because it is almost entirely presentation (formatting) oriented; it lacks any semantic depth. In XHTML, a paragraph is a paragraph is a paragraph, as opposed to a section or an epigraph or a summary.
  • In contrast, more “serious” XML document types like DocBook[8] or DITA-derived schemas[9] are capable of making semantic distinctions about content chunks at a fine level of granularity and with a high degree of specificity.
  • So there is an argument for recalling the 80:20 rule here. If XHTML can provide 80% of the value with just 20% of the investment, then what exactly is the business case for spending the other 80% to achieve that last 20% of value? We suspect the ratio is actually quite a bit steeper than 80:20 for most publishers.
  • Furthermore, just to get technical for a moment, XHTML is extensible in a fairly straightforward way, through the common “class” attribute on each element. Web developers have long leveraged this kind of extensibility in the elaboration of “microformats” for semantic-web applications.[10] There is no reason why publishers shouldn’t think to use XHTML’s simple extensibility in a similar way for their own ends.
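
As an illustration of that extensibility, semantic roles carried on the class attribute can be recovered mechanically. A minimal sketch in Python with the lxml library; the XHTML fragment and the "epigraph"/"summary" role names are invented for the example:

    # Hypothetical sketch: reading microformat-style semantics out of XHTML
    # class attributes. The role names here are invented examples.
    from lxml import etree

    xhtml = b"""<html xmlns="http://www.w3.org/1999/xhtml"><body>
      <p class="epigraph">Publishing is a button.</p>
      <p>Ordinary body text.</p>
      <p class="summary">Chapter recap goes here.</p>
    </body></html>"""

    ns = {"x": "http://www.w3.org/1999/xhtml"}
    for p in etree.fromstring(xhtml).findall(".//x:p", ns):
        # Untagged paragraphs stay plain prose; tagged ones carry a role.
        print(p.get("class") or "plain", "->", p.text)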
  • XHTML, on the other hand, is supported by a vast array of quotidian software, starting with the ubiquitous Web browser. For this very reason, XHTML is in fact employed as a component part of several more specialized document types (ONIX and ePub among them).
  • Why re-invent a general-purpose prose representation when XHTML already does the job?
  • It is worth pausing for a moment to consider the role of XHTML in the ePub standard for ebook content. An ePub file is, anatomically, a simply disguised zip archive. Inside the zip archive are a few standard component parts: there are specialized files that declare metadata about the book, and about the format of the book. And then there is the book’s content, represented in XHTML. An ePub book is a Web page in a wrapper.
  • To sum up the general argument: the Web as it already exists presents incredible value to publishers, as a platform for doing XML content management with existing (and often free) tools, and without having to go blindly into the unknown. At this point, we can offer a few design guidelines: prefer existing and/or ubiquitous tools over specialized ones wherever possible; prefer free software over proprietary systems where possible; prefer simple tools controlled and coordinated by human beings over fully automated (and therefore complex) systems; play to our strengths: use Web software for storing and managing content, use layout software for layout, and keep editors and production people in charge of their own domains.
  • Putting the Pieces Together: A Prototype
  • At the SFU Master of Publishing Program, we have been chipping away at this general line of thinking for a few years. Over that time, Web content management systems have been getting more and more sophisticated, all the while getting more streamlined and easier to use. (NB: if you have a blog, you have a Web content management system.) The Web is beginning to be recognized as a writing and editing environment used by millions of people. And the ways in which content is represented, stored, and exchanged online have become increasingly robust and standardized.
  • The missing piece of the puzzle has been print production: how can we move content from its malleable, fluid form on line into the kind of high-quality print production environments we’ve come to expect after two decades of Desktop Publishing?
  • Anyone who has tried to print Web content knows that the existing methods leave much to be desired (hyphenation and justification, for starters). In the absence of decent tools for this, most publishers quite naturally think of producing the print content first, and then think about how to get material onto the Web for various purposes. So we tend to export from Word, or from Adobe, as something of an afterthought.
  • While this sort of works, it isn’t elegant, and it completely ignores the considerable advantages of Web-based content management.
  • Content managed online is stored in one central location, accessible simultaneously to everyone in your firm, available anywhere you have an Internet connection, and usually exists in a much more fluid format than Word files. If only we could manage the editorial flow online, and then go to print formats at the end, instead of the other way around. At SFU, we made several attempts to make this work by way of the supposed “XML import” capabilities of various Desktop Publishing tools, without much success.[12]
  • In the winter of 2009, Adobe solved this part of the problem for us with the introduction of its Creative Suite 4. What CS4 offers is the option of a complete XML representation of an InDesign document: what Adobe calls IDML (InDesign Markup Language).
  • The IDML file format is—like ePub—a simply disguised zip archive that, when unpacked, reveals a cluster of XML files that represent all the different facets of an InDesign document: layout spreads, master pages, defined styles, colours, and of course, the content.
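
Because an IDML file is an ordinary zip archive, its anatomy can be confirmed with nothing more than the standard library. A quick sketch; the file name is an example, and the entries named in the comment reflect the packaging described above:

    # Hypothetical sketch: listing the XML files inside an IDML package.
    import zipfile

    with zipfile.ZipFile("book.idml") as idml:  # file name is an example
        for name in idml.namelist():
            print(name)
    # Typical entries include designmap.xml, Spreads/, MasterSpreads/,
    # Stories/Story_*.xml (the actual content), and Resources/Styles.xml.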
  • IDML is a well thought-out XML standard that achieves two very different goals simultaneously: it preserves all of the information that InDesign needs to do what it does; and it is broken up in a way that makes it possible for mere mortals (or at least our Master of Publishing students) to work with it.
  • What this represented to us in concrete terms was the ability to take Web-based content and move it into InDesign in a straightforward way, thus bridging Web and print production environments using existing tools and skillsets, with a little added help from free software.
  • We would take clean XHTML content, transform it to IDML-marked content, and merge that with nicely designed templates in InDesign.
  • The result is an almost push-button publication workflow, which results in a nice, familiar InDesign document that fits straight into the way publishers actually do production.
  • Tracing the steps: To begin with, we worked backwards, moving the book content back to clean XHTML.
  • The simplest method for this conversion—and if you want to create Web content, this is an excellent route—was to use Adobe’s “Export to Digital Editions” option, which creates an ePub file.
  • Recall that ePub is just XHTML in a wrapper, so within the ePub file was a relatively clean XHTML document. It was somewhat cleaner (that is, the XHTML tagging was simpler and less cluttered) than InDesign’s other Web-oriented exports, possibly because Digital Editions is a well understood target, compared with somebody’s website.
  • In order to achieve our target of clean XHTML, we needed to do some editing; the XHTML produced by InDesign’s “Digital Editions” export was presentation-oriented. For instance, bulleted list items were tagged as paragraphs, with a class attribute identifying them as list items. Using the search-and-replace function, we converted such structures to proper XHTML list and list-item elements. Our guiding principle was to make the XHTML as straightforward as possible, not dependent on any particular software to interpret it.
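
Clean-up of this kind is mechanical enough to script rather than do by hand. A rough sketch of the list-item conversion in Python; the class name InDesign emits depends on the document's styles, so "bullet-list-item" is an assumption to check against the actual export:

    # Hypothetical sketch: promoting presentation-tagged paragraphs to real
    # XHTML list markup. "bullet-list-item" is an assumed class name.
    import re

    src = '''<p class="bullet-list-item">First point</p>
    <p class="bullet-list-item">Second point</p>'''

    items = re.sub(r'<p class="bullet-list-item">(.*?)</p>', r'<li>\1</li>', src)
    # Wrapping the run in one <ul> suffices when items are contiguous; nested
    # lists would need a real parser instead of search-and-replace.
    print("<ul>\n" + items + "\n</ul>")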
  • We broke the book’s content into individual chapter files; each chapter could then carry its own basic metadata, and the pages conveniently fit our Web content management system (which is actually just a wiki). We assembled a dynamically generated table of contents for the 12 chapters, and created a cover page. Essentially, the book was entirely Web-based at this point.
  • When the book chapters are viewed online, they are formatted via a CSS2 stylesheet that defines a main column for content as well as dedicating screen real estate for navigational elements. We then created a second template to render the content for exporting; this was essentially a bare-bones version of the book with no navigation and minimal styling. Pages (or even the entire book) can be exported (via the “Save As...” function in a Web browser) for use in either print production or ebook conversion. At this point, we required no skills beyond those of any decent Web designer.
  • Integrating with CS4 for Print Adobe’s IDML language defines elements specific to InDesign; there is nothing in the language that looks remotely like XHTML. So a mechanical transformation step is needed to convert the XHTML content into something InDesign can use. This is not as hard as it might seem.
  • Both XHTML and IDML are composed of straightforward, well-documented structures, and so transformation from one to the other is, as they say, “trivial.” We chose to use XSLT (Extensible Stylesheet Language Transforms) to do the work. XSLT is part of the overall XML specification, and thus is very well supported in a wide variety of tools. Our prototype used a scripting engine called xsltproc, a nearly ubiquitous piece of software that we found already installed as part of Mac OS X (contemporary Linux distributions also have this as a standard tool), though any XSLT processor would work.
  • In other words, we don’t need to buy InCopy, because we just replaced it with the Web. Our wiki is now plugged directly into our InDesign layout. It even automatically updates the InDesign document when the content changes. Credit is due at this point to Adobe: this integration is possible because of the open file format in the Creative Suite 4.
  • We wrote an XSLT transformation script[18] that converted the XHTML content from the Web into an InCopy ICML file. The script itself is less than 500 lines long, and was written and debugged over a period of about a week by amateurs (again, the people named at the start of this article). The script runs in a couple of seconds, and the resulting .icml file can then be “placed” directly into an InDesign template. The ICML file references an InDesign stylesheet, so the template file can be set up with a house-styled layout, master pages, and stylesheet definitions for paragraphs and character ranges.
  • The result is very simple and easy to use. Our demonstration requires that a production editor run the XSLT transformation script manually, but there is no reason why this couldn’t be built directly into the Web content management system so that exporting the content to print ran the transformation automatically. The resulting file would then be “placed” in InDesign and proofed.
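
For concreteness, the transformation step can be driven from the shell (xsltproc xhtml-to-icml.xsl chapter1.xhtml > chapter1.icml) or from a few lines of Python via lxml, which wraps the same libxslt engine that xsltproc uses. A sketch; the stylesheet and chapter file names are invented:

    # Hypothetical sketch: applying an XHTML-to-ICML stylesheet with lxml.
    # Equivalent to: xsltproc xhtml-to-icml.xsl chapter1.xhtml > chapter1.icml
    # Both file names are examples, not the project's real artifacts.
    from pathlib import Path
    from lxml import etree

    transform = etree.XSLT(etree.parse("xhtml-to-icml.xsl"))
    icml = transform(etree.parse("chapter1.xhtml"))
    Path("chapter1.icml").write_text(str(icml))
    # The resulting .icml file can then be "placed" into an InDesign template.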
  • It should be noted that the Book Publishing 1 proof-of-concept was artificially complex; we began with a book laid out in InDesign and ended up with a look-alike book laid out in InDesign. But next time—for instance, when we publish Book Publishing 2—we can begin the process with the content on the Web, and keep it there throughout the editorial process. The book’s content could potentially be written and edited entirely online, as Web content, and then automatically poured into an InDesign template at proof time. “Just in time,” as they say. This represents an entirely new way of thinking of book production. With a Web-first orientation, it makes little sense to think of the book as “in print” or “out of print”—the book is simply available, in the first place online; in the second place in derivative digital formats; and third, but really not much more difficult, in print-ready format, via the usual InDesign CS print production system publishers are already familiar with.
  • Creating Ebook Files: Creating electronic versions from XHTML source is vastly simpler than trying to generate these out of the existing print process. The ePub version is extremely easy to generate; so is online marketing copy or excerpts for the Web, since the content begins life Web-native.
  • Since an ePub file is essentially XHTML content in a special wrapper, all that is required is that we properly “wrap” our XHTML content. Ideally, the content in an ePub file is broken into chapters (as ours was) and a table of contents file is generated in order to allow easy navigation within an ebook reader. We used Julian Smart’s free tool eCub[19] to simply and automatically generate the ePub wrapper and the table of contents. The only custom development we did was to create a CSS stylesheet for the ebook so that headings and paragraph indents looked the way we wanted. Starting with XHTML content, creating ePub is almost too easy.
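
The wrapper really is that thin. A minimal sketch of the packaging step using Python's standard zipfile module; the OPF and chapter strings are trimmed placeholders, since a real package needs the full metadata, manifest, and spine that tools like eCub generate:

    # Hypothetical sketch: the bare anatomy of an ePub wrapper. The OPF and
    # chapter bodies are placeholders; eCub generates complete ones.
    import zipfile

    CONTAINER_XML = """<?xml version="1.0"?>
    <container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
      <rootfiles>
        <rootfile full-path="OEBPS/content.opf"
                  media-type="application/oebps-package+xml"/>
      </rootfiles>
    </container>"""

    CONTENT_OPF = "<package/>"  # placeholder: real OPF = metadata + manifest + spine
    CHAPTER1_XHTML = ("<html xmlns='http://www.w3.org/1999/xhtml'>"
                      "<body><p>Chapter one.</p></body></html>")

    with zipfile.ZipFile("book.epub", "w", zipfile.ZIP_DEFLATED) as epub:
        # The mimetype entry must come first and must be stored uncompressed.
        epub.writestr("mimetype", "application/epub+zip",
                      compress_type=zipfile.ZIP_STORED)
        epub.writestr("META-INF/container.xml", CONTAINER_XML)
        epub.writestr("OEBPS/content.opf", CONTENT_OPF)
        epub.writestr("OEBPS/chapter1.xhtml", CHAPTER1_XHTML)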
  • Such a workflow—beginning with the Web and exporting to print—is surely more in line with the way we will do business in the 21st century, where the Web is the default platform for reaching audiences, developing content, and putting the pieces together. It is time, we suggest, for publishers to re-orient their operations and start with the Web.
  • Our project demonstrates that Web technologies are indeed good enough to use in an XML-oriented workflow; more specialized and expensive options are not necessarily required. For massive-scale enterprise publishing, this approach may not offer enough flexibility, and the challenge of adding and extracting extra semantic richness may prove more trouble than it's worth.
  • But for smaller firms who are looking at the straightforward benefits of XML-based processes—single source publishing, online content and workflow management, open and accessible archive formats, greater online discoverability—here is a way forward.
  • Our system relies on the Web not as a public-facing website but as a content management platform; a public face could, of course, easily be added.
  • The final piece of our puzzle, the ability to integrate print production, was made possible by Adobe's release of InDesign with an open XML file format. Since the Web's XHTML is also XML, it can be easily and confidently transformed to the InDesign format.
  • Today, we are able to put the process together using nothing but standard, relatively ubiquitous Web tools: the Web itself as an editing and content management environment, standard Web scripting tools for the conversion process, and the well-documented IDML file format to integrate the layout tool.
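To make the conversion step concrete, here is a minimal sketch, in Python with lxml, of how an XHTML-to-ICML transformation of this kind could be driven. This is not the authors' actual script: the inline stylesheet is a toy that maps only XHTML paragraphs onto simplified paragraph ranges, the file names (chapter.xhtml, chapter.icml) and the ParagraphStyle/Body style name are illustrative assumptions, and real ICML additionally requires an <?aid ...?> processing instruction and a much richer Document structure.

    import lxml.etree as etree

    # Toy XSLT: map XHTML <p> elements onto ICML-style paragraph ranges.
    # Illustrative only; not the full ICML schema.
    XSLT_SOURCE = b"""<?xml version="1.0"?>
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
        xmlns:h="http://www.w3.org/1999/xhtml">
      <xsl:template match="/">
        <Document>
          <xsl:apply-templates select="//h:p"/>
        </Document>
      </xsl:template>
      <xsl:template match="h:p">
        <!-- The style name must match a paragraph style defined
             in the InDesign template the ICML is placed into -->
        <ParagraphStyleRange AppliedParagraphStyle="ParagraphStyle/Body">
          <CharacterStyleRange>
            <Content><xsl:value-of select="."/></Content>
          </CharacterStyleRange>
        </ParagraphStyleRange>
      </xsl:template>
    </xsl:stylesheet>"""

    transform = etree.XSLT(etree.fromstring(XSLT_SOURCE))
    # Assumes the source file declares the XHTML namespace,
    # so the h: prefix above matches its elements.
    result = transform(etree.parse("chapter.xhtml"))
    with open("chapter.icml", "wb") as f:
        f.write(bytes(result))

A script like this is exactly what a CMS export hook could run, so that, as suggested above, the transformation fires automatically whenever content is sent to print.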
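Similarly, the “special wrapper” around an ePub's XHTML content is just a ZIP archive with a fixed internal layout. The sketch below is a hedged illustration of that layout rather than what eCub actually emits: it builds a hypothetical one-chapter book, the title, identifier, and file names are placeholders, and the NCX table of contents that ePub 2 readers expect is omitted for brevity.

    import zipfile

    # OCF container pointer: tells readers where the package file lives.
    CONTAINER_XML = """<?xml version="1.0"?>
    <container version="1.0"
        xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
      <rootfiles>
        <rootfile full-path="OEBPS/content.opf"
            media-type="application/oebps-package+xml"/>
      </rootfiles>
    </container>"""

    # Minimal ePub 2 package: metadata, manifest, and reading order.
    CONTENT_OPF = """<?xml version="1.0"?>
    <package version="2.0" unique-identifier="bookid"
        xmlns="http://www.idpf.org/2007/opf">
      <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
        <dc:title>Book Publishing 1</dc:title>
        <dc:language>en</dc:language>
        <dc:identifier id="bookid">urn:uuid:PLACEHOLDER</dc:identifier>
      </metadata>
      <manifest>
        <item id="ch1" href="chapter1.xhtml" media-type="application/xhtml+xml"/>
        <item id="css" href="book.css" media-type="text/css"/>
      </manifest>
      <spine>
        <itemref idref="ch1"/>
      </spine>
    </package>"""

    with zipfile.ZipFile("book.epub", "w", zipfile.ZIP_DEFLATED) as z:
        # The OCF spec requires the mimetype entry first, uncompressed.
        z.writestr("mimetype", "application/epub+zip",
                   compress_type=zipfile.ZIP_STORED)
        z.writestr("META-INF/container.xml", CONTAINER_XML)
        z.writestr("OEBPS/content.opf", CONTENT_OPF)
        z.write("chapter1.xhtml", "OEBPS/chapter1.xhtml")  # Web-native chapter
        z.write("book.css", "OEBPS/book.css")              # house ebook styles

The one hand-written piece in the real workflow, the ebook CSS stylesheet, corresponds to book.css here.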
  • Using the Web as a Production Platform
  •  
    I was looking for an answer to a problem Marbux had presented, and found this interesting article. The issue was the upcoming conversion of the Note Case Pro (NCP) layout engine to the WebKit layout engine, and what to do about the NCP document format. My initial reaction was to encode the legacy NCP document format in XML and run an XSLT to a universal pivot format like TEI-XML. From there, the TEI-XML community would provide all the XSLT transformation routines for conversion to ODF, OOXML, XHTML, ePub and HTML/CSS. Researching the problems one might encounter with this approach, I found this article. Fascinating stuff.

    My takeaway is that TEI-XML would not be as effective a "universal pivot point" as XHTML. Or perhaps, if NCP really wants to get aggressive, IDML (InDesign Markup Language). The important point, though, is that XHTML is an XML reformulation of HTML, and compatible with the WebKit layout engine Miro wants to move NCP to.

    The concept of encoding an existing application-specific format in XML has been around since 1998, when XML was first introduced as a W3C standard, a "structured" subset of SGML (HTML is an application of SGML). The multiplatform StarOffice productivity suite became "OpenOffice" when Sun purchased the company and open-sourced the code base. The OpenOffice developer team came out with an XML encoding of their existing document formats in 2000. That application-specific encoding became an OASIS document format standard proposal in 2002, now known as ODF. Microsoft followed OpenOffice with an XML encoding of its application-specific binary document formats, known as OOXML.

    Encoding the existing NCP format in XML, specifically targeting XHTML as a "universal pivot point", would put the NCP Outliner in the Web editor category without breaking backwards compatibility. The trick is in the XSLT conversion process. But I think that is something much easier to handle than trying to

NSA Spying Relies on AT&T's 'Extreme Willingness to Help' - ProPublica - 0 views

  • The National Security Agency’s ability to spy on vast quantities of Internet traffic passing through the United States has relied on its extraordinary, decades-long partnership with a single company: the telecom giant AT&T. While it has long been known that American telecommunications companies worked closely with the spy agency, newly disclosed NSA documents show that the relationship with AT&T has been considered unique and especially productive. One document described it as “highly collaborative,” while another lauded the company’s “extreme willingness to help.”
  • AT&T’s cooperation has involved a broad range of classified activities, according to the documents, which date from 2003 to 2013. AT&T has given the NSA access, through several methods covered under different legal rules, to billions of emails as they have flowed across its domestic networks. It provided technical assistance in carrying out a secret court order permitting the wiretapping of all Internet communications at the United Nations headquarters, a customer of AT&T. The NSA’s top-secret budget in 2013 for the AT&T partnership was more than twice that of the next-largest such program, according to the documents. The company installed surveillance equipment in at least 17 of its Internet hubs on American soil, far more than its similarly sized competitor, Verizon. And its engineers were the first to try out new surveillance technologies invented by the eavesdropping agency. One document reminds NSA officials to be polite when visiting AT&T facilities, noting: “This is a partnership, not a contractual relationship.” The documents, provided by the former agency contractor Edward Snowden, were jointly reviewed by The New York Times and ProPublica.
  • It is not clear if the programs still operate in the same way today. Since the Snowden revelations set off a global debate over surveillance two years ago, some Silicon Valley technology companies have expressed anger at what they characterize as NSA intrusions and have rolled out new encryption to thwart them. The telecommunications companies have been quieter, though Verizon unsuccessfully challenged a court order for bulk phone records in 2014. At the same time, the government has been fighting in court to keep the identities of its telecom partners hidden. In a recent case, a group of AT&T customers claimed that the NSA’s tapping of the Internet violated the Fourth Amendment protection against unreasonable searches. This year, a federal judge dismissed key portions of the lawsuit after the Obama administration argued that public discussion of its telecom surveillance efforts would reveal state secrets, damaging national security.