
Future of the Web — Group items tagged image

Paul Merrell

Trump administration pulls back curtain on secretive cybersecurity process - The Washin... - 0 views

  • The White House on Wednesday made public for the first time the rules by which the government decides to disclose or keep secret software flaws that can be turned into cyberweapons — whether by U.S. agencies hacking for foreign intelligence, money-hungry criminals or foreign spies seeking to penetrate American computers. The move to publish an unclassified charter responds to years of criticism that the process was unnecessarily opaque, fueling suspicion that it cloaked a stockpile of software flaws that the National Security Agency was hoarding to go after foreign targets but that put Americans’ cybersecurity at risk.
  • The rules are part of the “Vulnerabilities Equities Process,” which the Obama administration revamped in 2014 as a multiagency forum to debate whether and when to inform companies such as Microsoft and Juniper that the government has discovered or bought a software flaw that, if weaponized, could affect the security of their product. The Trump administration has mostly not altered the rules under which the government reaches a decision but is disclosing its process. Under the VEP, an “equities review board” of at least a dozen national security and civilian agencies will meet monthly — or more often, if a need arises — to discuss newly discovered vulnerabilities. Besides the NSA, the CIA and the FBI, the list includes the Treasury, Commerce and State departments, and the Office of Management and Budget. The priority is on disclosure, the policy states, to protect core Internet systems, the U.S. economy and critical infrastructure, unless there is “a demonstrable, overriding interest” in using the flaw for intelligence or law enforcement purposes. The government has long said that it discloses the vast majority — more than 90 percent — of the vulnerabilities it discovers or buys in products from defense contractors or other sellers. In recent years, that has amounted to more than 100 a year, according to people familiar with the process. But because the process was classified, the National Security Council, which runs the discussion, was never able to reveal any numbers. Now, [White House cybersecurity coordinator Rob] Joyce said, the number of flaws disclosed and the number retained will be made public in an annual report. A classified version will be sent to Congress, he said.
Paul Merrell

The People and Tech Behind the Panama Papers - Features - Source: An OpenNews project - 0 views

  • Then we put the data up, but the problem with Solr was it didn’t have a user interface, so we used Project Blacklight, which is open source software normally used by librarians. We used it for the journalists. It’s simple because it allows you to do faceted search—so, for example, you can facet by the folder structure of the leak, by years, by type of file. There were more complex things—it supports queries in regular expressions, so the more advanced users were able to search for documents with a certain pattern of numbers that, for example, passports use. You could also preview and download the documents. ICIJ open-sourced the code of our document processing chain, created by our web developer Matthew Caruana Galizia. We also developed a batch-searching feature. So say you were looking for politicians in your country—you just run it through the system, and you upload your list to Blacklight and you would get a CSV back saying yes, there are matches for these names—not only exact matches, but also matches based on proximity. So you would say “I want Mar Cabra proximity 2” and that would give you “Mar Cabra,” “Mar whatever Cabra,” “Cabra, Mar,”—so that was good, because very quickly journalists were able to see… I have this list of politicians and they are in the data!
  • Last Sunday, April 3, the first stories emerging from the leaked dataset known as the Panama Papers were published by a global partnership of news organizations working in coordination with the International Consortium of Investigative Journalists, or ICIJ. As we begin the second week of reporting on the leak, Iceland’s Prime Minister has been forced to resign, Germany has announced plans to end anonymous corporate ownership, governments around the world launched investigations into wealthy citizens’ participation in tax havens, the Russian government announced that the investigation was an anti-Putin propaganda operation, and the Chinese government banned mentions of the leak in Chinese media. As the ICIJ-led consortium prepares for its second major wave of reporting on the Panama Papers, we spoke with Mar Cabra, editor of ICIJ’s Data & Research unit and lead coordinator of the data analysis and infrastructure work behind the leak. In our conversation, Cabra reveals ICIJ’s years-long effort to build a series of secure communication and analysis platforms in support of genuinely global investigative reporting collaborations.
  • For communication, we have the Global I-Hub, which is a platform based on open source software called Oxwall. Oxwall is a social network, like Facebook, which has a wall when you log in with the latest in your network—it has forum topics, links, you can share files, and you can chat with people in real time.
  • We had the data in a relational database format in SQL, and thanks to ETL (Extract, Transform, and Load) software Talend, we were able to easily transform the data from SQL to Neo4j (the graph-database format we used). Once the data was transformed, it was just a matter of plugging it into Linkurious, and in a couple of minutes, you have it visualized—in a networked way, so anyone can log in from anywhere in the world. That was another reason we really liked Linkurious and Neo4j—they’re very quick when representing graph data, and the visualizations were easy to understand for everybody. The not-very-tech-savvy reporter could expand the docs like magic, and more technically expert reporters and programmers could use the Neo4j query language, Cypher, to do more complex queries, like show me everybody within two degrees of separation of this person, or show me all the connected dots…
  • We believe in open source technology and try to use it as much as possible. We used Apache Solr for the indexing and Apache Tika for document processing, and it’s great because it processes dozens of different formats and it’s very powerful. Tika interacts with Tesseract, so we did the OCRing on Tesseract. To OCR the images, we created an army of 30–40 temporary servers in Amazon that allowed us to process the documents in parallel and do parallel OCR-ing. If it was very slow, we’d increase the number of servers—if it was going fine, we would decrease because of course those servers have a cost.
  • For the visualization of the Mossack Fonseca internal database, we worked with another tool called Linkurious. It’s not open source, it’s licensed software, but we have an agreement with them, and they allowed us to work with it. It allows you to represent data in graphs. We had a version of Linkurious on our servers, so no one else had the data. It was pretty intuitive—journalists had to click on dots that expanded, basically, and could search the names.
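The proximity matching Cabra describes ("Mar Cabra proximity 2" matching "Mar Cabra", "Mar whatever Cabra", or "Cabra, Mar") can be sketched in a few lines of Python. This is a minimal illustration of the idea (all tokens of a name occurring within a small sliding window, in any order), not ICIJ's actual batch-search code; the function name and the window arithmetic are assumptions.

```python
def proximity_match(name, text, proximity=2):
    """Return True if every token of `name` occurs in `text` within a
    window of len(name tokens) + `proximity` words, in any order.
    A sketch of the batch-search idea described by ICIJ, not their code."""
    tokens = [t.strip(",.").lower() for t in name.split()]
    words = [w.strip(",.").lower() for w in text.split()]
    window = len(tokens) + proximity
    for start in range(len(words)):
        segment = words[start:start + window]
        if all(t in segment for t in tokens):
            return True
    return False

# Matches exactly, reversed with a comma, and with an intervening word:
for hit in ("a story by Mar Cabra", "Cabra, Mar", "by Mar whatever Cabra"):
    assert proximity_match("Mar Cabra", hit)
```

A real implementation would also normalize accents and handle duplicate tokens, and ICIJ returned matches across the whole corpus as a CSV; this sketch only answers yes or no for one document.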
Gonzalo San Gil, PhD.

The Failure Of Social Media | Six Pixels of Separation - Marketing and Communications B... - 0 views

  •  
    "Social Media doesn't work for the vast majority of small businesses. That was the main message in the USA Today article titled, Study: Social media a bust for small businesses, published on April 17th, 2013. From the news item: "About 61% of small businesses don't see any return on investment on their social-media activities, according to a survey released Tuesday from Manta, a social network for small businesses. Yet, almost 50% say they've increased their time spent on social media, and only 7% have decreased their time. What businesses are trying to get out of social media: 36% said their goal was to acquire and engage new customers, 19% said to gain leads and referrals, and 17% said to boost awareness. Facebook was most cited as the hardest to maintain social-media platform, according to the survey." There is a big lesson in this data..."
Paul Merrell

Tim Berners-Lee, W3C Approve Work On DRM For HTML 5.1 - Slashdot - 1 views

  • "Danny O'Brien from the EFF has a weblog post about how the Encrypted Media Extension (EME) proposal will continue to be part of HTML Work Group's bailiwick and may make it into a future HTML revision." From O'Brien's post: "A Web where you cannot cut and paste text; where your browser can't 'Save As...' an image; where the 'allowed' uses of saved files are monitored beyond the browser; where JavaScript is sealed away in opaque tombs; and maybe even where we can no longer effectively 'View Source' on some sites, is a very different Web from the one we have today. It's a Web where user agents—browsers—must navigate a nest of enforced duties every time they visit a page. It's a place where the next Tim Berners-Lee or Mozilla, if they were building a new browser from scratch, couldn't just look up the details of all the 'Web' technologies. They'd have to negotiate and sign compliance agreements with a raft of DRM providers just to be fully standards-compliant and interoperable."
  •  
    From the Dept. of YouGottaBeKiddingMe. 
Paul Merrell

Google Acquires Titan Aerospace, The Drone Company Pursued By Facebook | TechCrunch - 0 views

  • Google has acquired Titan Aerospace, the drone startup that makes high-flying robots and that was previously scoped by Facebook as a potential acquisition target (as first reported by TechCrunch), the WSJ reports. The details of the purchase weren’t disclosed, but the deal comes after Facebook disclosed its own purchase of a Titan Aerospace competitor in U.K.-based Ascenta for its globe-spanning Internet plans. Both Ascenta and Titan Aerospace are in the business of high altitude drones, which cruise nearer the edge of the earth’s atmosphere and provide tech that could be integral to blanketing the globe in cheap, omnipresent Internet connectivity to help bring remote areas online. According to the WSJ, Google will be using Titan Aerospace’s expertise and tech to contribute to Project Loon, the balloon-based remote Internet delivery project it’s currently working on along these lines. That’s not all the Titan drones can help Google with, however. The company’s robots also take high-quality images in real-time that could help with Maps initiatives, as well as contribute to things like “disaster relief” and addressing “deforestation,” a Google spokesperson tells WSJ. The main goal, however, is likely spreading the potential reach of Google and its network, which is Facebook’s aim, too. When you saturate your markets and you’re among the world’s most wealthy companies, you don’t go into maintenance mode; you build new ones.
  • As for why an exit to Google looked appealing to a company like Titan, Sarah Perez outlines how Titan had sparked early interest from VCs thanks to its massive drones, which were capable of flying at a reported altitude of 65,000 feet for up to three years, but how there was also a lot of risk involved that would’ve made it difficult to find sustained investment while remaining independent. Google had just recently demonstrated how its Loon prototype balloons could traverse the globe in a remarkably short period of time, but the use of drones could conceivably make a network of Internet-providing automatons even better at globe-trotting, with a higher degree of control and ability to react to changing conditions. Some kind of hybrid system might also be in the pipeline that marries both technologies.
Paul Merrell

This project aims to make '404 not found' pages a thing of the past - 0 views

  • The Internet is always changing. Sites are rising and falling, content is deleted, and bad URLs can lead to '404 Not Found' errors that are as helpful as a brick wall. A new project proposes to do away with dead 404 errors by implementing new HTML code that will help access prior versions of hyperlinked content. With any luck, that means that you’ll never have to run into a dead link again. The “404-No-More” project is backed by a formidable coalition including members from organizations like the Harvard Library Innovation Lab, Los Alamos National Laboratory, Old Dominion University, and the Berkman Center for Internet & Society. Part of the Knight News Challenge, which seeks to strengthen the Internet for free expression and innovation through a variety of initiatives, 404-No-More recently reached the semifinal stage. The project aims to cure so-called link rot, the process by which hyperlinks become useless over time because they point to addresses that are no longer available. If implemented, websites such as Wikipedia and other reference documents would be vastly improved. The new feature would also give Web authors a way to provide links that contain both archived copies of content and specific dates of reference, the sort of information that diligent readers have to hunt down on a website like Archive.org.
  • While it may sound trivial, link rot can actually have real ramifications. Nearly 50 percent of the hyperlinks in Supreme Court decisions no longer work, a 2013 study revealed. Losing footnotes and citations in landmark legal decisions can mean losing crucial information and context about the laws that govern us. The same study found that 70 percent of URLs within the Harvard Law Review and similar journals didn’t link to the originally cited information, considered a serious loss surrounding the discussion of our laws. The project’s proponents have come up with more potential uses as well. Activists fighting censorship will have an easier time combatting government takedowns, for instance. Journalists will be much more capable of researching dynamic Web pages. “If every hyperlink was annotated with a publication date, you could automatically view an archived version of the content as the author intended for you to see it,” the project’s authors explain. The ephemeral nature of the Web could no longer be used as a weapon. Roger Macdonald, a director at the Internet Archive, called the 404-No-More project “an important contribution to preservation of knowledge.”
  • The new feature would come in the form of introducing the mset attribute to the <a> element in HTML, which would allow users of the code to specify multiple dates and copies of content as an external resource. For instance, if both the date of reference and the location of a copy of targeted content is known by an author, the new code would look like this: The 404-No-More project’s goals are numerous, but the ultimate goal is to have mset become a new HTML standard for hyperlinks. “An HTML standard that incorporates archives for hyperlinks will loop in these efforts and make the Web better for everyone,” project leaders wrote, “activists, journalists, and regular ol’ everyday web users.”
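The markup sample itself was lost from the excerpt above. Based on the description, an mset attribute on the <a> element carrying an archived copy and a date of reference, a hypothetical instance might look like the following; the attribute name comes from the article, but the value format shown is purely illustrative:

```html
<!-- Hypothetical syntax: "mset" is named in the article, but this
     value format is an illustration, not the proposal's actual spec. -->
<a href="http://example.com/report"
   mset="2014-04-01 http://archive.example.org/20140401/report">
  the cited report
</a>
```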
Paul Merrell

Another judge upholds NSA call tracking - POLITICO.com - 0 views

  • A federal judge in Idaho has upheld the constitutionality of the National Security Agency's program that gathers massive quantities of data on the telephone calls of Americans. The ruling Tuesday from U.S. District Court Judge B. Lynn Winmill leaves the federal government with two wins in lawsuits decided since the program was revealed about a year ago by ex-NSA contractor Edward Snowden. In addition, one judge handling a criminal case ruled that the surveillance did not violate the Constitution. Opponents of the program have only one win: U.S. District Court Judge Richard Leon's ruling in December that the program likely violates the Fourth Amendment. In the new decision, Winmill said binding precedent in the Ninth Circuit holds that call and email metadata are not protected by the Constitution and no warrant is needed to obtain it.
  • "The weight of the authority favors the NSA," wrote Winmill, an appointee of President Bill Clinton. Winmill took note of Leon's contrary decision and called it eloquent, but concluded it departs from current Supreme Court precedent — though perhaps not for long. "Judge Leon’s decision should serve as a template for a Supreme Court opinion. And it might yet," Winmill wrote as he threw out the lawsuit brought by an Idaho registered nurse who objected to the gathering of data on her phone calls. Winmill's opinion (posted here) does not address an argument put forward by some critics of the program, including some lawmakers: that the metadata program violates federal law because it does not fit squarely within the language of the statute used to authorize it.
  •  
    A partial win for the public. The judge makes plain that he disagrees with pre-Snowden-disclosure precedent and recommends that the Supreme Court adopt the reasoning of Judge Richard Leon's ruling that finds the NSA call-metadata program violative of the Fourth Amendment. The judge says his hands are tied by prior decisions in the Ninth Circuit Court of Appeals that gave an expansive reading to Smith v. Maryland.
Gonzalo San Gil, PhD.

Pirate Bay Helps Put Sweden on the Map, Govt. Agency Says | TorrentFreak [# Note] - 0 views

    • Gonzalo San Gil, PhD.
       
      # ! 'Thank God' The Pirate Bay Helps Some@ne. # ! (... more than is told...)
  •  
    [ By Andy on May 18, 2015. According to a government agency responsible for promoting Sweden overseas, the country has several major brands to thank when it comes to being recognized on the world stage. In addition to car makers Volvo and furniture store IKEA, interest in Sweden has been boosted thanks to the notorious Pirate Bay. But the file-sharing fun doesn't end there. ...]
Gary Edwards

Chris Dixon Explains Why He Loves Paper - Business Insider - 0 views

  • Steve Jobs predicted that tablet computers would become so dominant that “PCs would become like trucks” – special-purpose industrial devices. Skeptics replied that tablets were only useful for consumption and not creation and therefore couldn’t replace PCs, to which Jobs said:
  • We are just scratching the surface on the kinds of apps for the iPad…I think there are lots of kinds of content that can be created on the iPad. When I am going to write that 35-page analyst report, I am going to want my Bluetooth keyboard. That’s 1 percent of the time. The software will get more powerful. I think your vision would have to be pretty short to think these can’t grow into machines that can do more things, like editing video, graphic arts, productivity. You can imagine all of these content creation possibilities on these kind of things. Time takes care of lots of these things.
  • History supports Jobs’ argument. In the past, new user interfaces led to new categories of creation applications. Back in the 70s and 80s, when computers had text-based interfaces, word processors and spreadsheets were invented. In the 80s and 90s, when computers had graphical interfaces, presentation and image editors proliferated. Jobs was simply predicting that historical patterns would repeat.
  • Today we are announcing that Andreessen Horowitz is leading a $15M Series A investment in FiftyThree, a company whose goal is to build the essential suite of mobile tools for creativity. You might know FiftyThree as the company behind the iPad app Paper. Paper has been embraced by millions of everyday creators, and has won dozens of awards (including Apple’s App of the Year). It is also one of the top grossing iPad productivity apps ever. But this is only the beginning of FiftyThree’s ambitious plans.
Gonzalo San Gil, PhD.

US, China reach cyberespionage agreement | ITworld - 1 views

  •  
    "The U.S. and China have reached their first ever cybercrime and cyberespionage agreement, but the deal is quite general and how it will translate into actions is still unclear."
Paul Merrell

Facebook's Deepface Software Has Gotten Them in Deep Trouble | nsnbc international - 0 views

  • In a Chicago court, several Facebook users filed a class-action lawsuit against the social media giant for allegedly violating its users’ privacy rights to acquire the largest privately held stash of biometric face-recognition data in the world. The court documents reveal claims that “Facebook began violating the Illinois Biometric Information Privacy Act (IBIPA) of 2008 in 2010, in a purported attempt to make the process of tagging friends easier.”
  • This was accomplished through the “tag suggestions” feature provided by Facebook which “scans all pictures uploaded by users and identifies any Facebook friends they may want to tag.” The Facebook users maintain that this feature is a “form of data mining [that] violates user’s privacy”. One plaintiff said this is a “brazen disregard for its users’ privacy rights,” through which Facebook has “secretly amassed the world’s largest privately held database of consumer biometrics data.” Because “Facebook actively conceals” their protocol using “faceprint databases” to identify Facebook users in photos, and “doesn’t disclose its wholesale biometrics data collection practices in its privacy policies, nor does it even ask users to acknowledge them.”
  • This would be a violation of the IBIPA which states it is “unlawful to collect biometric data without written notice to the subject stating the purpose and length of the data collection, and without obtaining the subject’s written release.” Because all users are automatically part of the “faceprint’ facial recognition program, this is an illegal act in the state of Illinois, according to the complaint. Jay Edelson, attorney for the plaintiffs, asserts the opt-out ability to prevent other Facebook users from tagging them in photos is “insufficient”.
  • Deepface is the name of the new technology researchers at Facebook created in order to identify people in pictures; mimicking the way humans recognize the differences in each other’s faces. Facebook has already implemented facial recognition software (FRS) to suggest names for tagging photos; however Deepface can “identify faces from a side view” as well as when the person is directly facing the camera in the picture. In 2013, Erin Egan, chief privacy officer for Facebook, said that this upgrade “would give users better control over their personal information, by making it easier to identify posted photos in which they appear.” Egan explained: “Our goal is to facilitate tagging so that people know when there are photos of them on our service.” Facebook has stated that they retain information from their users that is syphoned from all across the web. This data is used to increase Facebook’s profits with the information being sold for marketing purposes. This is the impressive feature of Deepface; as previous FRS can only decipher faces in images that are frontal views of people. Shockingly, Deepface displays 97.25% accuracy in identifying faces in photos. That is quite a feat considering humans have a 97.53% accuracy rate. In order to ensure accuracy, Deepface “conducts its analysis based on more than 120 million different parameters.”
Matteo Spreafico

TWEET IDEAS: 13 Things to Do on Twitter Besides Tweet - 1 views

Paul Merrell

Sun's Advanced Datacenter (Santa Clara, CA) - System News - 0 views

  • To run Sun’s award-winning data centers, a modular design containing many "pods" was implemented to save power and time. The modular design aids the building of any sized datacenter. Inside of each pod, there are 24 racks. Each of these 24 racks has a common cooling system as does every other modular building block. The number of pods is limited by the size of the datacenters. Large and small datacenters can benefit from using the pod approach. The module design makes it easy to configure a datacenter to meet a client's requirements. As the datacenter grows over time, adding pods is convenient. The module and pod designs make it easy to adapt to new technology such as blade servers. Some of the ways that Sun’s datacenter modules are designed with the future in mind are as follows:
  • An updated 58-page Sun BluePrint covers Sun's approach to designing datacenters. (Authors - Dean Nelson, Michael Ryan, Serena DeVito, Ramesh KV, Petr Vlasaty, Brett Rucker, and Brian Day): ENERGY EFFICIENT DATACENTERS: THE ROLE OF MODULARITY IN DATACENTER DESIGN. More information: "Sun saves $1 million/year with new datacenter"; "Take a Virtual Tour."
  • Other articles in the Hardware section of Volume 125, Issue 1: "Sun's Advanced Datacenter (Santa Clara, CA)"; "Modular Approach Is Key to Datacenter Design for Sun"; "Sun Datacenter Switch 3x24."
  •  
    This page seems to be the hub for information about the Sun containerized data centers. I've highlighted links as well as text, but not all the text on the page. Info gathered in the process of surfing the linked pages: [i] the 3x24 data switch page recommends redundant Solaris instances; [ii] x64 blade servers are the design target; [iii] there is specific mention of other Sun-managed data centers being erected in Indiana and in Bangalore, India; [iv] the whiff is that Sun might not only be supplying the data centers for the Microsoft cloud but also managing them; and [v] the visual tour is very impressive; clearly some very brilliant people put a lot of hard and creative work into this.
Paul Merrell

Hakia Retools Semantic Search Engine to Better Battle Google, Yahoo - 0 views

  • Semantic search engine startup Hakia has retooled its Web site, adding tabs for news, images and "credible" site searches as a way to differentiate between its search approach and what it calls the "10 blue links" approach search incumbents Google, Yahoo and Microsoft have used in the first era of search engines. Hakia employs semantic search technologies, leveraging natural language processing to derive broader meaning from search queries.
  • Hakia began hawking "credible" Web sites, vetted by librarians and information professionals, in April for health and medical searches drawing from sites examined by the Medical Library Association. These sites have a peer review process or strict editorial controls to ensure the accuracy of the information and zero commercial bias. The idea is to clearly define sites users can trust in an age when do-it-yourself chronicling via Wikipedia and other sites that enable crowdsourcing activities has led to some questionable results.
Paul Merrell

Goggles turns Android into pocket translator - Google 24/7 - Fortune Tech - 2 views

  • Google's Goggles mobile application has always been a fun tool. The idea is that if you snap a picture and upload it to Google (as well as your location/time), Google could present more about that object, and by extension, your surroundings. It isn't always terribly accurate in identifying what is in the picture, but the results are sometimes helpful, if not amusing. Today, Goggles got a very specific new feature that will help travelers and readers of foreign-language texts immensely. Now you can point your Android camera at a sign, book, or any sort of foreign word, snap a picture, and get a translation. Google uses optical character recognition, or OCR, to turn the image into words, and then uses its translation services to turn those words into a language you recognize.
Paul Merrell

NSA contractors use LinkedIn profiles to cash in on national security | Al Jazeera America - 0 views

  • NSA spies need jobs, too. And that is why many covert programs could be hiding in plain sight. Job websites such as LinkedIn and Indeed.com contain hundreds of profiles that reference classified NSA efforts, posted by everyone from career government employees to low-level IT workers who served in Iraq or Afghanistan. They offer a rare glimpse into the intelligence community's projects and how they operate. Now some researchers are using the same kinds of big-data tools employed by the NSA to scrape public LinkedIn profiles for classified programs. But the presence of so much classified information in public view raises serious concerns about security — and about the intelligence industry as a whole. “I’ve spent the past couple of years searching LinkedIn profiles for NSA programs,” said Christopher Soghoian, the principal technologist with the American Civil Liberties Union’s Speech, Privacy and Technology Project.
  • On Aug. 3, The Wall Street Journal published a story about the FBI’s growing use of hacking to monitor suspects, based on information Soghoian provided. The next day, Soghoian spoke at the Defcon hacking conference about how he uncovered the existence of the FBI’s hacking team, known as the Remote Operations Unit (ROU), using the LinkedIn profiles of two employees at James Bimen Associates, with which the FBI contracts for hacking operations. “Had it not been for the sloppy actions of a few contractors updating their LinkedIn profiles, we would have never known about this,” Soghoian said in his Defcon talk. Those two contractors were not the only ones being sloppy.
  • And there are many more. A quick search of Indeed.com using three code names unlikely to return false positives — Dishfire, XKeyscore and Pinwale — turned up 323 résumés. The same search on LinkedIn turned up 48 profiles mentioning Dishfire, 18 mentioning XKeyscore and 74 mentioning Pinwale. Almost all these people appear to work in the intelligence industry.
  • Network-mapping the data: Fabio Pietrosanti of the Hermes Center for Transparency and Digital Human Rights noticed all the code names on LinkedIn last December. While sitting with M.C. McGrath at the Chaos Communication Congress in Hamburg, Germany, Pietrosanti began searching the website for classified program names — and getting serious results. McGrath was already developing Transparency Toolkit, a Web application for investigative research, and knew he could improve on Pietrosanti’s off-the-cuff methods.
  • “I was, like, huh, maybe there’s more we can do with this — actually get a list of all these profiles that have these results and use that to analyze the structure of which companies are helping with which programs, which people are helping with which programs, try to figure out in what capacity, and learn more about things that we might not know about,” McGrath said. He set up a computer program called a scraper to search LinkedIn for public profiles that mention known NSA programs, contractors or jargon — such as SIGINT, the agency’s term for “signals intelligence” gleaned from intercepted communications. Once the scraper found the name of an NSA program, it searched nearby for other words in all caps. That allowed McGrath to find the names of unknown programs, too. Once McGrath had the raw data — thousands of profiles in all, with 70 to 80 different program names — he created a network graph that showed the relationships between specific government agencies, contractors and intelligence programs. Of course, the data are limited to what people are posting on their LinkedIn profiles. Still, the network graph gives a sense of which contractors work on several NSA programs, which ones work on just one or two, and even which programs military units in Iraq and Afghanistan are using. And that is just the beginning.
  • Click on the image to view an interactive network illustration of the relationships between specific national security surveillance programs in red, and government organizations or private contractors in blue.
    What a giggle, public spying on NSA and its contractors using Big Data. The interactive network graph with its sidebar display of relevant data derived from LinkedIn profiles is just too delightful. 
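McGrath's heuristic — find a known code name, then treat nearby all-caps tokens as candidate program names, and link employers to programs — can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not Transparency Toolkit's actual code: the profile texts and employer names ("Contractor A", "MARINA") are invented samples, the seed terms are just the handful mentioned in the article, and no scraping is performed here (real input would come from collected profile pages).

```python
import re
from collections import defaultdict

# Seed terms from the article: known program code names plus jargon like SIGINT.
SEED_TERMS = {"DISHFIRE", "XKEYSCORE", "PINWALE", "SIGINT"}

def find_candidate_programs(text, window=60):
    """Find all-caps tokens that appear near a seed term.

    Mirrors the heuristic described above: once a known code name is
    spotted, nearby words in all caps are treated as candidate
    (possibly previously unknown) program names.
    """
    candidates = set()
    for match in re.finditer(r"\b[A-Z][A-Z0-9]{3,}\b", text):
        token = match.group()
        if token in SEED_TERMS:
            continue
        # Keep the token only if a seed term occurs within `window`
        # characters on either side of it.
        nearby = text[max(0, match.start() - window):match.end() + window]
        if any(seed in nearby for seed in SEED_TERMS):
            candidates.add(token)
    return candidates

def build_graph(profiles):
    """Map each employer to the set of program names its profiles mention."""
    graph = defaultdict(set)
    for employer, text in profiles:
        graph[employer] |= {s for s in SEED_TERMS if s in text}
        graph[employer] |= find_candidate_programs(text)
    return graph

# Hypothetical profile excerpts; MARINA stands in for any unknown code name.
profiles = [
    ("Contractor A", "Analyst supporting DISHFIRE and the MARINA database."),
    ("Contractor B", "SIGINT developer with experience writing XKEYSCORE queries."),
]

graph = build_graph(profiles)
# graph now links each employer to both seed terms and newly discovered
# candidates, ready to be drawn as a bipartite company-program network.
```

The employer-to-programs mapping is exactly the bipartite structure the article's interactive graph visualizes: contractors in one column, programs in the other, edges wherever a profile mentions a term.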
Paul Merrell

A Short Guide to the Internet's Biggest Enemies | Electronic Frontier Foundation - 1 views

  • Reporters Without Borders (RSF) released its annual “Enemies of the Internet” index this week—a ranking first launched in 2006 intended to track countries that repress online speech, intimidate and arrest bloggers, and conduct surveillance of their citizens.  Some countries have been mainstays on the annual index, while others have been able to work their way off the list.  Two countries particularly deserving of praise in this area are Tunisia and Myanmar (Burma), both of which have stopped censoring the Internet in recent years and are headed in the right direction toward Internet freedom. In the former category are some of the world’s worst offenders: Cuba, North Korea, China, Iran, Saudi Arabia, Vietnam, Belarus, Bahrain, Turkmenistan, Syria.  Nearly every one of these countries has amped up its online repression in recent years, from implementing sophisticated surveillance (Syria) to utilizing targeted surveillance tools (Vietnam) to increasing crackdowns on online speech (Saudi Arabia).  These are countries where, despite advocacy efforts by local and international groups, no progress has been made. The newcomers  A third, perhaps even more disheartening, category is the list of countries new to this year's index.  A motley crew, these nations have all taken new, harsh approaches to restricting speech or monitoring citizens:
  • United States: This is the first time the US has made it onto RSF’s list.  While the US government doesn’t censor online content, and pours money into promoting Internet freedom worldwide, the National Security Agency’s unapologetic dragnet surveillance and the government’s treatment of whistleblowers have earned it a spot on the index. United Kingdom: The European nation has been dubbed by RSF as the “world champion of surveillance” for its recently-revealed depraved strategies for spying on individuals worldwide.  The UK also joins countries like Ethiopia and Morocco in using terrorism laws to go after journalists.  Not noted by RSF, but also important, is the fact that the UK is also cracking down on legal pornography, forcing Internet users to opt-in with their ISP if they wish to view it and creating a slippery slope toward overblocking.  This is in addition to the government’s use of an opaque, shadowy NGO to identify child sexual abuse images, sometimes resulting instead in censorship of legitimate speech.
Paul Merrell

Most Agencies Falling Short on Mandate for Online Records - 1 views

  • Nearly 20 years after Congress passed the Electronic Freedom of Information Act Amendments (E-FOIA), only 40 percent of agencies have followed the law's instruction for systematic posting of records released through FOIA in their electronic reading rooms, according to a new FOIA Audit released today by the National Security Archive at www.nsarchive.org to mark Sunshine Week. The Archive team audited all federal agencies with Chief FOIA Officers as well as agency components that handle more than 500 FOIA requests a year — 165 federal offices in all — and found only 67 with online libraries populated with significant numbers of released FOIA documents and regularly updated.
  • Congress called on agencies to embrace disclosure and the digital era nearly two decades ago, with the passage of the 1996 "E-FOIA" amendments. The law mandated that agencies post key sets of records online, provide citizens with detailed guidance on making FOIA requests, and use new information technology to post online proactively records of significant public interest, including those already processed in response to FOIA requests and "likely to become the subject of subsequent requests." Congress believed then, and openness advocates know now, that this kind of proactive disclosure, publishing online the results of FOIA requests as well as agency records that might be requested in the future, is the only tenable solution to FOIA backlogs and delays. Thus the National Security Archive chose to focus on the e-reading rooms of agencies in its latest audit. Even though the majority of federal agencies have not yet embraced proactive disclosure of their FOIA releases, the Archive E-FOIA Audit did find that some real "E-Stars" exist within the federal government, serving as examples to lagging agencies that technology can be harnessed to create state-of-the-art FOIA platforms. Unfortunately, our audit also found "E-Delinquents" whose abysmal web performance recalls the teletype era.
  • E-Delinquents include the Office of Science and Technology Policy at the White House, which, despite being mandated to advise the President on technology policy, does not embrace 21st century practices by posting any frequently requested records online. Another E-Delinquent, the Drug Enforcement Administration, insults its website's viewers by claiming that it "does not maintain records appropriate for FOIA Library at this time."
  • "The presumption of openness requires the presumption of posting," said Archive director Tom Blanton. "For the new generation, if it's not online, it does not exist." The National Security Archive has conducted fourteen FOIA Audits since 2002. Modeled after the California Sunshine Survey and subsequent state "FOI Audits," the Archive's FOIA Audits use open-government laws to test whether or not agencies are obeying those same laws. Recommendations from previous Archive FOIA Audits have led directly to laws and executive orders which have: set explicit customer service guidelines, mandated FOIA backlog reduction, assigned individualized FOIA tracking numbers, forced agencies to report the average number of days needed to process requests, and revealed the (often embarrassing) ages of the oldest pending FOIA requests. The surveys include:
  • The federal government has made some progress moving into the digital era. The National Security Archive's last E-FOIA Audit in 2007, " File Not Found," reported that only one in five federal agencies had put online all of the specific requirements mentioned in the E-FOIA amendments, such as guidance on making requests, contact information, and processing regulations. The new E-FOIA Audit finds the number of agencies that have checked those boxes is now much higher — 100 out of 165 — though many (66 in 165) have posted just the bare minimum, especially when posting FOIA responses. An additional 33 agencies even now do not post these types of records at all, clearly thwarting the law's intent.
  • The FOIAonline Members (Department of Commerce, Environmental Protection Agency, Federal Labor Relations Authority, Merit Systems Protection Board, National Archives and Records Administration, Pension Benefit Guaranty Corporation, Department of the Navy, General Services Administration, Small Business Administration, U.S. Citizenship and Immigration Services, and Federal Communications Commission) won their "E-Star" by making past requests and releases searchable via FOIAonline. FOIAonline also allows users to submit their FOIA requests digitally.
  • THE E-DELINQUENTS: WORST OVERALL AGENCIES In alphabetical order
  • Key Findings
  • Excuses Agencies Give for Poor E-Performance
  • Justice Department guidance undermines the statute. Currently, the FOIA stipulates that documents "likely to become the subject of subsequent requests" must be posted by agencies somewhere in their electronic reading rooms. The Department of Justice's Office of Information Policy defines these records as "frequently requested records… or those which have been released three or more times to FOIA requesters." Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice. The National Security Archive believes the addition of this "three or more times" language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse ("not requested three times yet!") not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post "all records, regardless of form or format that have been released in response to a FOIA request."
  • Disabilities Compliance. Despite the E-FOIA Act, many government agencies do not embrace the idea of posting their FOIA responses online. The most common reason agencies give is that it is difficult to post documents in a format that complies with the Americans with Disabilities Act, also referred to as being "508 compliant," and the 1998 Amendments to the Rehabilitation Act that require federal agencies "to make their electronic and information technology (EIT) accessible to people with disabilities." E-Star agencies, however, have proven that 508 compliance is no barrier when the agency has a will to post. All documents posted on FOIAonline are 508 compliant, as are the documents posted by the Department of Defense and the Department of State. In fact, every document created electronically by the US government after 1998 should already be 508 compliant. Even old paper records that are scanned to be processed through FOIA can be made 508 compliant with just a few clicks in Adobe Acrobat, according to this Department of Homeland Security guide (essentially OCRing the text, and including information about where non-textual fields appear). Even if agencies are insistent it is too difficult to OCR older documents that were scanned from paper, they cannot use that excuse with digital records.
  • Privacy. Another commonly articulated concern about posting FOIA releases online is that doing so could inadvertently disclose private information from "first person" FOIA requests. This is a valid concern, and this subset of FOIA requests should not be posted online. (The Justice Department identified "first party" requester rights in 1989. Essentially agencies cannot use the b(6) privacy exemption to redact information if a person requests it for him or herself. An example of a "first person" FOIA would be a person's request for his own immigration file.) Cost and Waste of Resources. There is also a belief that there is little public interest in the majority of FOIA requests processed, and hence it is a waste of resources to post them. This thinking runs counter to the governing principle of the Freedom of Information Act: that government information belongs to US citizens, not US agencies. As such, the reason that a person requests information is immaterial as the agency processes the request; the "interest factor" of a document should also be immaterial when an agency is required to post it online. Some think that posting FOIA releases online is not cost effective. In fact, the opposite is true. It's not cost effective to spend tens (or hundreds) of person-hours to search for, review, and redact a FOIA release only to mail it to the requester and have them slip it into their desk drawer and forget about it. That is a waste of resources. The released document should be posted online for any interested party to utilize. This will only become easier as FOIA processing systems evolve to automatically post the documents they track. The State Department earned its "E-Star" status by demonstrating this very principle: it spent no new funds and did not hire contractors to build its Electronic Reading Room; instead, it built a self-sustaining platform that will save the agency time and money going forward.
Paul Merrell

Are processors pushing up against the limits of physics? | Ars Technica - 0 views

  • When I first started reading Ars Technica, performance of a processor was measured in megahertz, and the major manufacturers were rushing to squeeze as many of them as possible into their latest silicon. Shortly thereafter, however, the energy needs and heat output of these beasts brought that race crashing to a halt. More recently, the number of processing cores rapidly scaled up, but they quickly reached the point of diminishing returns. Now, getting the most processing power for each Watt seems to be the key measure of performance. None of these things happened because the companies making processors ran up against hard physical limits. Rather, computing power ended up being constrained because progress in certain areas—primarily energy efficiency—was slow compared to progress in others, such as feature size. But could we be approaching physical limits in processing power? In this week's edition of Nature, The University of Michigan's Igor Markov takes a look at the sorts of limits we might face.