
Future of the Web: Group items matching "libraries" in title, tags, annotations or url

Paul Merrell

Tech firms and privacy groups press for curbs on NSA surveillance powers - The Washington Post - 0 views

  • The nation’s top technology firms and a coalition of privacy groups are urging Congress to place curbs on government surveillance in the face of a fast-approaching deadline for legislative action. A set of key Patriot Act surveillance authorities expires June 1, but the effective date is May 21 — the last day before Congress breaks for a Memorial Day recess. In a letter to be sent Wednesday to the Obama administration and senior lawmakers, the coalition vowed to oppose any legislation that, among other things, does not ban the “bulk collection” of Americans’ phone records and other data.
  • “We know that there are some in Congress who think that they can get away with reauthorizing the expiring provisions of the Patriot Act without any reforms at all,” said Kevin Bankston, policy director of New America Foundation’s Open Technology Institute, a privacy group that organized the effort. “This letter draws a line in the sand that makes clear that the privacy community and the Internet industry do not intend to let that happen without a fight.” At issue is the bulk collection of Americans’ data by intelligence agencies such as the National Security Agency. The NSA’s daily gathering of millions of records logging phone call times, lengths and other “metadata” stirred controversy when it was revealed in June 2013 by former NSA contractor Edward Snowden. The records are placed in a database that can, with a judge’s permission, be searched for links to foreign terrorists. They do not include the content of conversations.
  • That program, placed under federal surveillance court oversight in 2006, was authorized by the court in secret under Section 215 of the Patriot Act — one of the expiring provisions. The public outcry that ensued after the program was disclosed forced President Obama in January 2014 to call for an end to the NSA’s storage of the data. He also appealed to Congress to find a way to preserve the agency’s access to the data for counterterrorism information.
  • Despite growing opposition in some quarters to ending the NSA’s program, a “clean” authorization — one that would enable its continuation without any changes — is unlikely, lawmakers from both parties say. Sen. Ron Wyden (D-Ore.), a leading opponent of the NSA’s program in its current format, said he would be “surprised if there are 60 votes” in the Senate for that. In the House, where there is bipartisan support for reining in surveillance, it’s a longer shot still. “It’s a toxic vote back in your district to reauthorize the Patriot Act, if you don’t get some reforms” with it, said Rep. Thomas Massie (R-Ky.). The House last fall passed the USA Freedom Act, which would have ended the NSA program, but the Senate failed to advance its own version. The House and Senate judiciary committees are working to come up with new bipartisan legislation to be introduced soon.
  • The tech firms and privacy groups’ demands are a baseline, they say. Besides ending bulk collection, they want companies to have the right to be more transparent in reporting on national security requests and greater declassification of opinions by the Foreign Intelligence Surveillance Court.
  • Some legal experts have pointed to a little-noticed clause in the Patriot Act that would appear to allow bulk collection to continue even if the authority is not renewed. Administration officials have conceded privately that a legal case probably could be made for that, but politically it would be a tough sell. On Tuesday, a White House spokesman indicated the administration would not seek to exploit that clause. “If Section 215 sunsets, we will not continue the bulk telephony metadata program,” National Security Council spokesman Edward Price said in a statement first reported by Reuters. Price added that allowing Section 215 to expire would result in the loss of a “critical national security tool” used in investigations that do not involve the bulk collection of data. “That is why we have underscored the imperative of Congressional action in the coming weeks, and we welcome the opportunity to work with lawmakers on such legislation,” he said.
  •  
    I omitted some stuff about opposition to sunsetting the provisions. They seem to forget, as does Obama, that the proponents of the FISA Court's expansive reading of section 215 have not yet come up with a single instance where 215-derived data caught a single terrorist or prevented a single act of terrorism. Which means that if that data is of some use, it ain't in fighting terrorism, the purpose of the section. Patriot Act § 215 is codified as 50 USCS § 1861, https://www.law.cornell.edu/uscode/text/50/1861 That section authorizes the FBI to obtain an order from the FISA Court "requiring the production of *any tangible things* (including books, records, papers, documents, and other items)." Specific examples (a non-exclusive list) include "the production of library circulation records, library patron lists, book sales records, book customer lists, firearms sales records, tax return records, educational records, or medical records containing information that would identify a person." The Court can order that the recipient of the order tell no one of its receipt of the order or its response to it. In other words, this is about way more than your telephone metadata. Do you trust the NSA with your medical records?
Paul Merrell

Most Agencies Falling Short on Mandate for Online Records - 1 views

  • Nearly 20 years after Congress passed the Electronic Freedom of Information Act Amendments (E-FOIA), only 40 percent of agencies have followed the law's instruction for systematic posting of records released through FOIA in their electronic reading rooms, according to a new FOIA Audit released today by the National Security Archive at www.nsarchive.org to mark Sunshine Week. The Archive team audited all federal agencies with Chief FOIA Officers as well as agency components that handle more than 500 FOIA requests a year — 165 federal offices in all — and found only 67 with online libraries populated with significant numbers of released FOIA documents and regularly updated.
  • Congress called on agencies to embrace disclosure and the digital era nearly two decades ago, with the passage of the 1996 "E-FOIA" amendments. The law mandated that agencies post key sets of records online, provide citizens with detailed guidance on making FOIA requests, and use new information technology to post online proactively records of significant public interest, including those already processed in response to FOIA requests and "likely to become the subject of subsequent requests." Congress believed then, and openness advocates know now, that this kind of proactive disclosure, publishing online the results of FOIA requests as well as agency records that might be requested in the future, is the only tenable solution to FOIA backlogs and delays. Thus the National Security Archive chose to focus on the e-reading rooms of agencies in its latest audit. Even though the majority of federal agencies have not yet embraced proactive disclosure of their FOIA releases, the Archive E-FOIA Audit did find that some real "E-Stars" exist within the federal government, serving as examples to lagging agencies that technology can be harnessed to create state-of-the art FOIA platforms. Unfortunately, our audit also found "E-Delinquents" whose abysmal web performance recalls the teletype era.
  • E-Delinquents include the Office of Science and Technology Policy at the White House, which, despite being mandated to advise the President on technology policy, does not embrace 21st century practices by posting any frequently requested records online. Another E-Delinquent, the Drug Enforcement Administration, insults its website's viewers by claiming that it "does not maintain records appropriate for FOIA Library at this time."
  • "The presumption of openness requires the presumption of posting," said Archive director Tom Blanton. "For the new generation, if it's not online, it does not exist." The National Security Archive has conducted fourteen FOIA Audits since 2002. Modeled after the California Sunshine Survey and subsequent state "FOI Audits," the Archive's FOIA Audits use open-government laws to test whether or not agencies are obeying those same laws. Recommendations from previous Archive FOIA Audits have led directly to laws and executive orders which have: set explicit customer service guidelines, mandated FOIA backlog reduction, assigned individualized FOIA tracking numbers, forced agencies to report the average number of days needed to process requests, and revealed the (often embarrassing) ages of the oldest pending FOIA requests. The surveys include:
  • The federal government has made some progress moving into the digital era. The National Security Archive's last E-FOIA Audit in 2007, "File Not Found," reported that only one in five federal agencies had put online all of the specific requirements mentioned in the E-FOIA amendments, such as guidance on making requests, contact information, and processing regulations. The new E-FOIA Audit finds the number of agencies that have checked those boxes is now much higher — 100 out of 165 — though many (66 of 165) have posted just the bare minimum, especially when posting FOIA responses. An additional 33 agencies even now do not post these types of records at all, clearly thwarting the law's intent.
  • The FOIAonline Members (Department of Commerce, Environmental Protection Agency, Federal Labor Relations Authority, Merit Systems Protection Board, National Archives and Records Administration, Pension Benefit Guaranty Corporation, Department of the Navy, General Services Administration, Small Business Administration, U.S. Citizenship and Immigration Services, and Federal Communications Commission) won their "E-Star" by making past requests and releases searchable via FOIAonline. FOIAonline also allows users to submit their FOIA requests digitally.
  • THE E-DELINQUENTS: WORST OVERALL AGENCIES (in alphabetical order)
  • Key Findings
  • Excuses Agencies Give for Poor E-Performance
  • Justice Department guidance undermines the statute. Currently, the FOIA stipulates that documents "likely to become the subject of subsequent requests" must be posted by agencies somewhere in their electronic reading rooms. The Department of Justice's Office of Information Policy defines these records as "frequently requested records… or those which have been released three or more times to FOIA requesters." Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice. The National Security Archive believes the addition of this "three or more times" language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse ("not requested three times yet!") not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post "all records, regardless of form or format that have been released in response to a FOIA request."
  • Disabilities Compliance. Despite the E-FOIA Act, many government agencies do not embrace the idea of posting their FOIA responses online. The most common reason agencies give is that it is difficult to post documents in a format that complies with the Americans with Disabilities Act, also referred to as being "508 compliant," and the 1998 Amendments to the Rehabilitation Act that require federal agencies "to make their electronic and information technology (EIT) accessible to people with disabilities." E-Star agencies, however, have proven that 508 compliance is no barrier when the agency has a will to post. All documents posted on FOIAonline are 508 compliant, as are the documents posted by the Department of Defense and the Department of State. In fact, every document created electronically by the US government after 1998 should already be 508 compliant. Even old paper records that are scanned to be processed through FOIA can be made 508 compliant with just a few clicks in Adobe Acrobat, according to this Department of Homeland Security guide (essentially OCRing the text, and including information about where non-textual fields appear). Even if agencies are insistent it is too difficult to OCR older documents that were scanned from paper, they cannot use that excuse with digital records. (A scripted OCR sketch follows these excerpts.)
  • Privacy. Another commonly articulated concern about posting FOIA releases online is that doing so could inadvertently disclose private information from "first person" FOIA requests. This is a valid concern, and this subset of FOIA requests should not be posted online. (The Justice Department identified "first party" requester rights in 1989. Essentially agencies cannot use the b(6) privacy exemption to redact information if a person requests it for him or herself. An example of a "first person" FOIA would be a person's request for his own immigration file.) Cost and Waste of Resources. There is also a belief that there is little public interest in the majority of FOIA requests processed, and hence it is a waste of resources to post them. This thinking runs counter to the governing principle of the Freedom of Information Act: that government information belongs to US citizens, not US agencies. As such, the reason that a person requests information is immaterial as the agency processes the request; the "interest factor" of a document should also be immaterial when an agency is required to post it online. Some think that posting FOIA releases online is not cost effective. In fact, the opposite is true. It's not cost effective to spend tens (or hundreds) of person hours to search for, review, and redact FOIA requests only to mail it to the requester and have them slip it into their desk drawer and forget about it. That is a waste of resources. The released document should be posted online for any interested party to utilize. This will only become easier as FOIA processing systems evolve to automatically post the documents they track. The State Department earned its "E-Star" status demonstrating this very principle, and spent no new funds and did not hire contractors to build its Electronic Reading Room, instead it built a self-sustaining platform that will save the agency time and money going forward.
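The OCR step described in the excerpts above is easy to script. A minimal sketch, assuming the third-party pytesseract and Pillow packages plus a locally installed Tesseract engine; the file names are hypothetical:

```python
from PIL import Image   # pip install Pillow
import pytesseract      # pip install pytesseract (needs the Tesseract binary)

# OCR a scanned page image into a searchable PDF page. Adding a text
# layer like this is what makes scanned FOIA releases searchable and
# friendlier to screen readers.
page = Image.open("scanned_foia_page.png")  # hypothetical scanned page
pdf_bytes = pytesseract.image_to_pdf_or_hocr(page, extension="pdf")
with open("searchable_page.pdf", "wb") as out:
    out.write(pdf_bytes)
```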
Gonzalo San Gil, PhD.

Seed Libraries Fight for the Right to Share - Shareable - 0 views

  •  
    "Amid government crackdown, seed libraries expand biodiversity and food access. Photo: Betsy Goodman of the Common Soil Seed Library. Credit: Associated Press"
Paul Merrell

Internet Archive: Scanning Services - 1 views

  • Digitizing Print Collections with the Internet Archive Open and free online access, permanent storage, unlimited downloads and lifetime file management. We can help digitize your collections in 4 simple steps:
  • In addition to permanent hosting on archive.org, your books will be integrated with Open Library, openlibrary.org, a page on the web for every book.
  • Non-destructive color scanning using our Scribe system at one of our scanning centers across the globe. Complete MARC records, Dublin Core & XML, just 10c USD per image and a small set up charge per item.
  • Create and upload high-quality JP2000 images; persistent identifiers, lifetime hosting of files, lifetime management of file system and file access.
  • Create high-quality PDF/A files; run OCR across texts to allow "search inside" all books. Add to Internet Archive search engine; display via our open source Book Reader
  • 2,000,000 books online; 600 million pages scanned; 1,500 books scanned each day; 15 million downloads each month; 33 scanning centers in 7 countries; 3.5 petabytes of storage; 8 Gb per second bandwidth
  • Library of Congress, Harvard University, The New York Public Library, Smithsonian Institution, The Getty Research Institute, University of California, University of Toronto, Biodiversity Heritage Library, Boston Library Consortium, C.A.R.L.I., Johns Hopkins University, Allen County Public Library, Lyrasis, Massachusetts Institute of Technology, State Library of Massachusetts . . . and over 1,000 other Open Content Alliance partners
  •  
    I've been looking for a permanent online home for a couple of historical works I co-authored. My guiding criterion has been the best chance of the works' long-term survival in a publicly-accessible form after my death. I think I may have just found my solution.
Paul Merrell

Tell Congress: My Phone Calls are My Business. Reform the NSA. | EFF Action Center - 3 views

  • The USA PATRIOT Act granted the government powerful new spying capabilities that have grown out of control—but the provision that the FBI and NSA have been using to collect the phone records of millions of innocent people expires on June 1. Tell Congress: it’s time to rethink out-of-control spying. A vote to reauthorize Section 215 is a vote against the Constitution.
  • On June 5, 2013, the Guardian published a secret court order showing that the NSA has interpreted Section 215 to mean that, with the help of the FBI, it can collect the private calling records of millions of innocent people. The government could even try to use Section 215 for bulk collection of financial records. The NSA’s defenders argue that invading our privacy is the only way to keep us safe. But the White House itself, along with the President’s Review Board has said that the government can accomplish its goals without bulk telephone records collection. And the Privacy and Civil Liberties Oversight Board said, “We have not identified a single instance involving a threat to the United States in which [bulk collection under Section 215 of the PATRIOT Act] made a concrete difference in the outcome of a counterterrorism investigation.” Since June of 2013, we’ve continued to learn more about how out of control the NSA is. But what has not happened since June is legislative reform of the NSA. There have been myriad bipartisan proposals in Congress—some authentic and some not—but lawmakers didn’t pass anything. We need comprehensive reform that addresses all the ways the NSA has overstepped its authority and provides the NSA with appropriate and constitutional tools to keep America safe. In the meantime, tell Congress to take a stand. A vote against reauthorization of Section 215 is a vote for the Constitution.
  •  
    EFF has launched an email campaign to press members of Congress not to renew section 215 of the Patriot Act when it expires on June 1, 2015. Section 215 authorizes FBI officials to "make an application for an order requiring the production of *any tangible things* (including books, records, papers, documents, and other items) for an investigation to obtain foreign intelligence information not concerning a United States person or to protect against international terrorism or clandestine intelligence activities, provided that such investigation of a United States person is not conducted solely upon the basis of activities protected by the first amendment to the Constitution." http://www.law.cornell.edu/uscode/text/50/1861 The section has been abused to obtain bulk collection of all telephone records for the NSA's storage and processing. But the section goes farther and lists as specific examples of records that can be obtained under section 215's authority, "library circulation records, library patron lists, book sales records, book customer lists, firearms sales records, tax return records, educational records, or medical records." Think of the NSA's voracious appetite for new "haystacks" it can store and search in its gigantic new data center in Utah. Then ask yourself, "do I want the NSA to obtain all of my personal data, store it, and search it at will?" If your answer is "no," you might consider visiting this page to send your Congress critters an email urging them to vote against renewal of section 215 and to vote for other NSA reforms listed in the EFF sample email text. Please do not procrastinate. Do it now, before you forget. Every voice counts.
Gonzalo San Gil, PhD.

Highly critical "Ghost" allowing code execution affects most Linux systems | Ars Technica - 1 views

  •  
    ""A lot of collateral damage on the Internet" The glibc is the most common code library used by Linux. It contains standard functions that programs written in the C and C++ languages use to carry out common tasks. The vulnerability also affects Linux programs written in Python, Ruby, and most other languages because they also rely on glibc."
Paul Merrell

The Latest Rules on How Long NSA Can Keep Americans' Encrypted Data Look Too Familiar | Just Security - 0 views

  • Does the National Security Agency (NSA) have the authority to collect and keep all encrypted Internet traffic for as long as is necessary to decrypt that traffic? That was a question first raised in June 2013, after the minimization procedures governing telephone and Internet records collected under Section 702 of the Foreign Intelligence Surveillance Act were disclosed by Edward Snowden. The issue quickly receded into the background, however, as the world struggled to keep up with the deluge of surveillance disclosures. The Intelligence Authorization Act of 2015, which passed Congress this last December, should bring the question back to the fore. It established retention guidelines for communications collected under Executive Order 12333 and included an exception that allows NSA to keep ‘incidentally’ collected encrypted communications for an indefinite period of time. This creates a massive loophole in the guidelines. NSA’s retention of encrypted communications deserves further consideration today, now that these retention guidelines have been written into law. It has become increasingly clear over the last year that surveillance reform will be driven by technological change—specifically by the growing use of encryption technologies. Therefore, any legislation touching on encryption should receive close scrutiny.
  • Section 309 of the intel authorization bill describes “procedures for the retention of incidentally acquired communications.” It establishes retention guidelines for surveillance programs that are “reasonably anticipated to result in the acquisition of [telephone or electronic communications] to or from a United States person.” Communications to or from a United States person are ‘incidentally’ collected because the U.S. person is not the actual target of the collection. Section 309 states that these incidentally collected communications must be deleted after five years unless they meet a number of exceptions. One of these exceptions is that “the communication is enciphered or reasonably believed to have a secret meaning.” This exception appears to be directly lifted from NSA’s minimization procedures for data collected under Section 702 of FISA, which were declassified in 2013. 
  • While Section 309 specifically applies to collection taking place under E.O. 12333, not FISA, several of the exceptions described in Section 309 closely match exceptions in the FISA minimization procedures. That includes the exception for “enciphered” communications. Those minimization procedures almost certainly served as a model for these retention guidelines and will likely shape how this new language is interpreted by the Executive Branch. Section 309 also asks the heads of each relevant member of the intelligence community to develop procedures to ensure compliance with new retention requirements. I expect those procedures to look a lot like the FISA minimization guidelines.
  • This language is broad, circular, and technically incoherent, so it takes some effort to parse appropriately. When the minimization procedures were disclosed in 2013, this language was interpreted by outside commentators to mean that NSA may keep all encrypted data that has been incidentally collected under Section 702 for at least as long as is necessary to decrypt that data. Is this the correct interpretation? I think so. It is important to realize that the language above isn’t just broad. It seems purposefully broad. The part regarding relevance seems to mirror the rationale NSA has used to justify its bulk phone records collection program. Under that program, all phone records were relevant because some of those records could be valuable to terrorism investigations and (allegedly) it isn’t possible to collect only those valuable records. This is the “to find a needle in a haystack, you first have to have the haystack” argument. The same argument could be applied to encrypted data and might be at play here.
  • This exception doesn’t just apply to encrypted data that might be relevant to a current foreign intelligence investigation. It also applies to cases in which the encrypted data is likely to become relevant to a future intelligence requirement. This is some remarkably generous language. It seems one could justify keeping any type of encrypted data under this exception. Upon close reading, it is difficult to avoid the conclusion that these procedures were written carefully to allow NSA to collect and keep a broad category of encrypted data under the rationale that this data might contain the communications of NSA targets and that it might be decrypted in the future. If NSA isn’t doing this today, then whoever wrote these minimization procedures wanted to at least ensure that NSA has the authority to do this tomorrow.
  • There are a few additional observations that are worth making regarding these nominally new retention guidelines and Section 702 collection. First, the concept of incidental collection as it has typically been used makes very little sense when applied to encrypted data. The way that NSA’s Section 702 upstream “about” collection is understood to work is that technology installed on the network does some sort of pattern match on Internet traffic; say that an NSA target uses example@gmail.com to communicate. NSA would then search the content of emails for references to example@gmail.com. This could notionally result in a lot of incidental collection of U.S. persons’ communications whenever the email that references example@gmail.com is somehow mixed together with emails that have nothing to do with the target. This type of incidental collection isn’t possible when the data is encrypted because it won’t be possible to search and find example@gmail.com in the body of an email. Instead, example@gmail.com will have been turned into some alternative, indecipherable string of bits on the network. Incidental collection shouldn’t occur because the pattern match can’t occur in the first place. This demonstrates that, when communications are encrypted, it will be much harder for NSA to search Internet traffic for a unique ID associated with a specific target. (A toy illustration of this point follows this item.)
  • This lends further credence to the conclusion above: rather than doing targeted collection against specific individuals, NSA is collecting, or plans to collect, a broad class of data that is encrypted. For example, NSA might collect all PGP encrypted emails or all Tor traffic. In those cases, NSA could search Internet traffic for patterns associated with specific types of communications, rather than specific individuals’ communications. This would technically meet the definition of incidental collection because such activity would result in the collection of communications of U.S. persons who aren’t the actual targets of surveillance. Collection of all Tor traffic would entail a lot of this “incidental” collection because the communications of NSA targets would be mixed with the communications of a large number of non-target U.S. persons. However, this “incidental” collection is inconsistent with how the term is typically used, which is to refer to over-collection resulting from targeted surveillance programs. If NSA were collecting all Tor traffic, that activity wouldn’t actually be targeted, and so any resulting over-collection wouldn’t actually be incidental. Moreover, greater use of encryption by the general public would result in an ever-growing amount of this type of incidental collection.
  • This type of collection would also be inconsistent with representations of Section 702 upstream collection that have been made to the public and to Congress. Intelligence officials have repeatedly suggested that search terms used as part of this program have a high degree of specificity. They have also argued that the program is an example of targeted rather than bulk collection. ODNI General Counsel Robert Litt, in a March 2014 meeting before the Privacy and Civil Liberties Oversight Board, stated that “there is either a misconception or a mischaracterization commonly repeated that Section 702 is a form of bulk collection. It is not bulk collection. It is targeted collection based on selectors such as telephone numbers or email addresses where there’s reason to believe that the selector is relevant to a foreign intelligence purpose.” The collection of Internet traffic based on patterns associated with types of communications would be bulk collection, more akin to NSA’s collection of phone records en masse than it is to targeted collection focused on specific individuals. Moreover, this type of collection would certainly fall within the definition of bulk collection provided just last week by the National Academy of Sciences: “collection in which a significant portion of the retained data pertains to identifiers that are not targets at the time of collection.”
  • The Section 702 minimization procedures, which will serve as a template for any new retention guidelines established for E.O. 12333 collection, create a large loophole for encrypted communications. With everything from email to Internet browsing to real-time communications moving to encrypted formats, an ever-growing amount of Internet traffic will fall within this loophole.
  •  
    Tucked into a budget authorization act in December without press notice. Section 309 (the Act is linked from the article) appears to be very broad authority for the NSA to intercept any form of telephone or other electronic information in bulk. There are far more exceptions from the five-year retention limitation than the encrypted information exception. When reading this, keep in mind that the U.S. intelligence community plays semantic games to obfuscate what it does. One of its word plays is that communications are not "collected" until an analyst looks at or listens to particular data, even though the data will be searched to find information countless times before it becomes "collected." That searching was the major basis for a decision by the U.S. District Court in Washington, D.C. that bulk collection of telephone communications was unconstitutional: Under the Fourth Amendment, a "search" or "seizure" requiring a judicial warrant occurs no later than when the information is intercepted. That case is on appeal, has been briefed and argued, and a decision could come any time now. Similar cases are pending in two other courts of appeals. Also, an important definition from the new Intelligence Authorization Act: "(a) DEFINITIONS.-In this section: (1) COVERED COMMUNICATION.-The term ''covered communication'' means any nonpublic telephone or electronic communication acquired without the consent of a person who is a party to the communication, including communications in electronic storage."
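A toy illustration of the pattern-matching point made in the annotations above: a selector can be found by substring search in plaintext traffic, but not in the ciphertext of an encrypted message. This is only a sketch; a hash stands in for real encryption purely to show that the byte pattern disappears:

```python
import hashlib

selector = b"example@gmail.com"
plaintext = b"To: example@gmail.com\nSubject: hello\n..."

# Stand-in for encryption: any transformation that hides the plaintext.
# (A hash is not encryption; it just shows that the selector no longer
# appears as a recognizable byte pattern on the wire.)
ciphertext = hashlib.sha256(plaintext).digest()

print(selector in plaintext)   # True  - pattern match works on plaintext
print(selector in ciphertext)  # False - nothing to match in ciphertext
```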
Gonzalo San Gil, PhD.

Outernet | Discussions [Outernet is NOT The Internet...] - 1 views

  •  
    [# ! ... nor does it pretend to be, # ! just an #alternative / #interactive #information #channel. # ! #stop #gratuitous #critics, by thxse [sic.] determined to keep on # ! #monopolizing #information, #opinion & #entertainment.] "Welcome to the official discussion forum for Outernet: Humanity's Public Library. If you are new to the forum, please look at the FAQ before posting questions. This forum is monitored regularly by Outernet staff and is a place to ask questions about the project or, even better, create discussion around various aspects of the project. "
Gonzalo San Gil, PhD.

The digital open source library of tomorrow Posted 20 Nov 2014 by Nicole C. Engard | Opensource.com - 0 views

  •  
    "Phil Shapiro's vision for the libraries of tomorrow Phil Shapiro, one of my fellow Opensource.com Community Moderators, gave a talk at All Things Open 2014 about open source and libraries. This is a recap of that talk."
Gonzalo San Gil, PhD.

Heartbleed Was Bad, but Shellshock Was Worse, Researcher Says - 0 views

  •  
    "Both the Heartbleed and Shellshock bugs were open-source flaws found in many Linux distributions, and both had the potential to impact OpenStack cloud users. Heartbleed is a flaw in the OpenSSL crytographic library for secure transport while Shellshock is a vulnerability in the Bash shell." [# ! At least... # ! … #OpenSource #community were #warned# ! and the #flaws were #solved…. among @ll. # ! #imagine how many flaws live in the #proprietary #closed #source# ! #unaware #users' #software…]
Gonzalo San Gil, PhD.

Thunderclap: Free Information from Space Outernet for Aug 11, 2014 - 0 views

  • Right now, only 40% of humanity can connect to the Internet. Even less than that have access to truly free, uncensored Internet. What this represents is an enormous gap in access to information. While the Internet is an amazing communication tool, it is also the largest library ever constructed. It grants access to anything from books, videos, courseware, news, and weather, to open source farm equipment or instructions on how to treat infection or prevent HIV from spreading. #ImagineIf everyone could have that information for free? On August 11, 2014, Outernet will make that library available from space for free for the first time. Help us tell the world. #ImagineIf everyone had any information they wanted - what would that world look like? What new inventions would be created or diseases cured? What would people read about if their governments no longer deprived them of their right to free information? Soon, we won't have to imagine.
  •  
    INFORMATION FOR THE WORLD FROM OUTER SPACE Unrestricted, globally accessible, broadcast data. Quality content from all over the Internet. Available to all of humanity. For free. Through satellite data broadcasting, Outernet is able to bypass censorship, ensure privacy, and offer a universally-accessible information service at no cost to global citizens. It's the modern version of shortwave radio, or BitTorrent from space.
  •  
    ""#ImagineIf every human had a free library at home... Information equality begins TODAY: Outernet is LIVE from space! http://thndr.it/1pazaP3" "
Paul Merrell

2nd Cir. Affirms That Creation of Full-Text Searchable Database of Works Is Fair Use | Bloomberg BNA - 0 views

  • The fair use doctrine permits the unauthorized digitization of copyrighted works in order to create a full-text searchable database, the U.S. Court of Appeals for the Second Circuit ruled June 10. Affirming summary judgment in favor of a consortium of university libraries, the court also ruled that the fair use doctrine permits the unauthorized conversion of those works into accessible formats for use by persons with disabilities, such as the blind.
  • The dispute is connected to the long-running conflict between Google Inc. and various authors of books that Google included in a mass digitization program. In 2004, Google began soliciting the participation of publishers in its Google Print for Publishers service, part of what was then called the Google Print project, aimed at making information available for free over the Internet. Subsequently, Google announced a new project, Google Print for Libraries. In 2005, Google Print was renamed Google Book Search and it is now known simply as Google Books. Under this program, Google made arrangements with several of the world's largest libraries to digitize the entire contents of their collections to create an online full-text searchable database. The announcement of this program triggered a copyright infringement action by the Authors Guild that continues to this day.
  • Part of the deal between Google and the libraries included an offer by Google to hand over to the libraries their own copies of the digitized versions of their collections. In 2011, a group of those libraries announced the establishment of a new service, called the HathiTrust digital library, to which the libraries would contribute their digitized collections. This database of copies is to be made available for full-text searching and preservation activities. Additionally, it is intended to offer free access to works to individuals who have “print disabilities.” For works under copyright protection, the search function would return only a list of page numbers that a search term appeared on and the frequency of such appearance. (A toy sketch of such a search index follows these excerpts.)
  • Turning to the fair use question, the court first concluded that the full-text search function of the Hathitrust Digital Library was a “quintessentially transformative use,” and thus constituted fair use. The court said: the result of a word search is different in purpose, character, expression, meaning, and message from the page (and the book) from which it is drawn. Indeed, we can discern little or no resemblance between the original text and the results of the HDL full-text search. There is no evidence that the Authors write with the purpose of enabling text searches of their books. Consequently, the full-text search function does not “supersede[ ] the objects [or purposes] of the original creation.” Turning to the fourth fair use factor—whether the use functions as a substitute for the original work—the court rejected the argument that such use represents lost sales to the extent that it prevents the future development of a market for licensing copies of works to be used in full-text searches. However, the court emphasized that the search function “does not serve as a substitute for the books that are being searched.”
  • The court also rejected the argument that the database represented a threat of a security breach that could result in the full text of all the books becoming available for anyone to access. The court concluded that Hathitrust's assertions of its security measures were unrebutted. Thus, the full-text search function was found to be protected as fair use.
  • The court also concluded that allowing those with print disabilities access to the full texts of the works collected in the Hathitrust database was protected as fair use. Support for this conclusion came from the legislative history of the Copyright Act's fair use provision, 17 U.S.C. §107.
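The in-copyright search behavior described above, returning only page numbers and match frequency rather than any text, maps onto a simple inverted index. A minimal sketch with hypothetical page data:

```python
from collections import defaultdict

# Inverted index: term -> {page_number: occurrence_count}.
# For in-copyright works, a query returns only pages and counts,
# never the surrounding text.
index: dict[str, dict[int, int]] = defaultdict(lambda: defaultdict(int))

def add_page(page_no: int, text: str) -> None:
    for word in text.lower().split():
        index[word][page_no] += 1

add_page(12, "the whale the whale")   # hypothetical page contents
add_page(98, "call me ishmael")

print(sorted(index["whale"].items()))  # [(12, 2)] -> page list plus frequency
```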
Gonzalo San Gil, PhD.

Sex Cams at NASA & Library of Congress, Anti-Piracy Outfit Says | TorrentFreak - 1 views

  •  
    " Andy on May 27, 2014 C: 27 Breaking An adult media company's hiring of an anti-piracy outfit to blitz the Internet for content infringing on its webcam copyrights has produced ridiculous results."
Paul Merrell

This project aims to make '404 not found' pages a thing of the past - 0 views

  • The Internet is always changing. Sites are rising and falling, content is deleted, and bad URLs can lead to '404 Not Found' errors that are as helpful as a brick wall. A new project proposes to do away with dead 404 errors by implementing new HTML code that will help access prior versions of hyperlinked content. With any luck, that means that you’ll never have to run into a dead link again. The “404-No-More” project is backed by a formidable coalition including members from organizations like the Harvard Library Innovation Lab, Los Alamos National Laboratory, Old Dominion University, and the Berkman Center for Internet & Society. Part of the Knight News Challenge, which seeks to strengthen the Internet for free expression and innovation through a variety of initiatives, 404-No-More recently reached the semifinal stage. The project aims to cure so-called link rot, the process by which hyperlinks become useless over time because they point to addresses that are no longer available. If implemented, websites such as Wikipedia and other reference documents would be vastly improved. The new feature would also give Web authors a way to provide links that contain both archived copies of content and specific dates of reference, the sort of information that diligent readers have to hunt down on a website like Archive.org.
  • While it may sound trivial, link rot can actually have real ramifications. Nearly 50 percent of the hyperlinks in Supreme Court decisions no longer work, a 2013 study revealed. Losing footnotes and citations in landmark legal decisions can mean losing crucial information and context about the laws that govern us. The same study found that 70 percent of URLs within the Harvard Law Review and similar journals didn’t link to the originally cited information, considered a serious loss surrounding the discussion of our laws. The project’s proponents have come up with more potential uses as well. Activists fighting censorship will have an easier time combatting government takedowns, for instance. Journalists will be much more capable of researching dynamic Web pages. “If every hyperlink was annotated with a publication date, you could automatically view an archived version of the content as the author intended for you to see it,” the project’s authors explain. The ephemeral nature of the Web could no longer be used as a weapon. Roger Macdonald, a director at the Internet Archive, called the 404-No-More project “an important contribution to preservation of knowledge.”
  • The new feature would come in the form of introducing the mset attribute to the <a> element in HTML, which would allow users of the code to specify multiple dates and copies of content as an external resource. For instance, if both the date of reference and the location of a copy of targeted content are known by an author, the new code would look like this (the article’s inline snippet did not survive; a hypothetical reconstruction follows below): The 404-No-More project’s goals are numerous, but the ultimate goal is to have mset become a new HTML standard for hyperlinks. “An HTML standard that incorporates archives for hyperlinks will loop in these efforts and make the Web better for everyone,” project leaders wrote, “activists, journalists, and regular ol’ everyday web users.”
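A hypothetical reconstruction based only on the description above; the exact syntax proposed by 404-No-More may differ:

```html
<!-- Hypothetical sketch: a hyperlink annotated with an archived copy
     and a reference date via the proposed mset attribute. -->
<a href="http://example.com/report"
   mset="https://web.archive.org/web/20140315000000/http://example.com/report 2014-03-15">
  the cited report
</a>
```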
Gary Edwards

Readium at the London Book Fair 2014: Open Source for an Open Publishing Ecosystem: Readium.org Turns One - 0 views

  •  
    excerpt/intro: Last month marked the one-year anniversary of the formation of the Readium Foundation (Readium.org), an independent nonprofit launched in March 2013 with the objective of developing commercial-grade open source publishing technology software. The overall goal of Readium.org is to accelerate adoption of ePub 3, HTML5, and the Open Web Platform by the digital publishing industry to help realize the full potential of open-standards-based interoperability. More specifically, the aim is to raise the bar for ePub 3 support across the industry so that ePub maintains its position as the standard distribution format for e-books and expands its reach to include other types of digital publications. In its first year, the Readium consortium added 15 organizations to its membership, including Adobe, Google, IBM, Ingram, KERIS (S. Korea Education Ministry), and the New York Public Library. The membership now boasts publishers, retailers, distributors and technology companies from around the world, including organizations based in France, Germany, Norway, U.S., Canada, China, Korea, and Japan. In addition, in February 2014 the first Readium.org board was elected by the membership and the first three projects being developed by members and other contributors are all nearing "1.0" status. The first project, Readium SDK, is a rendering "engine" enabling native apps to support ePub 3. Readium SDK is available on four platforms-Android, iOS, OS/X, and Windows- and the first product incorporating Readium SDK (by ACCESS Japan) was announced last October. Readium SDK is designed to be DRM-agnostic, and vendors Adobe and Sony have publicized plans to integrate their respective DRM solutions with Readium SDK. A second effort, Readium JS, is a pure JavaScript ePub 3 implementation, with configurations now available for cloud based deployment of ePub files, as well as Readium for Chrome, the successor to the original Readium Chrome extension developed by IDPF as the
Gary Edwards

Spritz reader: Getting words into your brain faster - 1 views

  • Static blocks of text like the one you’re looking at now are an antiquated and inefficient way to get words into your head. That’s the contention of Boston-based startup Spritz, which has developed a speed-reading text box that shows no more than 13 characters at a time. The Spritz box flashes words at you in quick succession so you don’t have to move your eyes around a page, and in my very quick testing it allowed me to read at more than double my usual reading pace. Spritz has teamed up with Samsung to integrate its speed reading functionality with the upcoming Galaxy S5 smartphone. The written word, after 8,000 or so years, is still an extremely effective way to get a message from one mind into the minds of others. But even with the advent of the digital age and decades of usability work, font and layout development, we’re still nowhere near optimal efficiency with it yet.
  • Take this article – I’ve written it in easily digestible chunks, and we’ve presented it in nice, thin, 10 to 14 word columns that should make it easy to scan. But pay attention to what your eyes are doing while you try to read it. Chances are, even if you’re a quick reader, your eyes are jumping around all over the place. In fact, according to Boston-based startup Spritz, you spend as little as 20 percent of your reading time actually taking in the words you’re looking at, and as much as 80 percent physically moving your eyes around to find the right spot to read each word from. So, the Spritz team decided, why not eliminate that time altogether? The Spritz reader is a simple, small box that streams text at the reader, one word at a time. The words are presented in a large, very reader-friendly font, and centered around the "optimal recognition point" of each word. In fact, the box will only display a maximum of 13 characters, so larger words are broken up. (A toy terminal sketch of this streaming approach appears after these excerpts.)
  • What’s really interesting is just how quickly this system can pipe information into your brain. I did a couple of online reading speed tests and found my average reading speed for regular blocks of text is around 330-350 words per minute. But I can comfortably follow a Spritz box at up to 500 words per minute without missing much, losing concentration or feeling any kind of eye strain. In short stints I can follow 800 words per minute, and the team says it’s easy to train yourself to go faster and retain more. Try it yourself. Here’s 250 words per minute:
  • Spritz claims that information retention rates on "spritzed" content are equal to or higher than that of traditional text block reading, and that some of its testers are now comfortably ingesting content at 1000 words per minute with no loss of information retention. That’s Tolstoy’s 1,440 page behemoth War and Peace dispatched in a single 10 hour sitting, if you had the concentration for it, or Stieg Larsson's Girl with a Dragon Tattoo in two and a bit hours. Spritz is also clearly developed to excel on mobile and handheld reading devices, and as such, the company has announced that Spritz will make its mobile debut on the upcoming Samsung Galaxy S5 release. Smartwatch and Google glass-type implementations are also on the radar. The mobile angle will have to be strong as there are numerous free tools for desktop browsers that can replicate a similar reading experience for free. If you’re using a Chrome browser, check out Spreed as an example. Perhaps the most significant move for Spritz will be bringing this speed reading technology to bear on your Android e-book library. Anything that can help me get through my reading backlog quicker will be most welcome!
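A toy sketch of the streaming display these excerpts describe: split text into chunks of at most 13 characters and flash them one at a time at a fixed words-per-minute rate. Terminal output only; Spritz's actual renderer, font work, and "optimal recognition point" alignment are far more sophisticated:

```python
import sys
import time

def spritz(text: str, wpm: int = 250, width: int = 13) -> None:
    """Flash one chunk at a time, breaking words longer than `width`."""
    delay = 60.0 / wpm
    for word in text.split():
        # Break long words into <= width-character chunks, as Spritz does.
        for i in range(0, len(word), width):
            chunk = word[i:i + width]
            sys.stdout.write("\r" + chunk.center(width))
            sys.stdout.flush()
            time.sleep(delay)
    sys.stdout.write("\n")

spritz("Static blocks of text are an antiquated way to read.", wpm=250)
```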
Gonzalo San Gil, PhD.

The death of patents and what comes after: Alicia Gibb at TEDxStockholm - YouTube - 0 views

  •  
    "Published on Dec 18, 2012 Alicia Gibb got her start as a technologist from her combination in backgrounds including: informatics and library science, a belief system of freedom of information, inspiration from art and design, and a passion for hardware hacking. Alicia has worked between the crossroads of art and electronics for the past nine years, and has worked for the open source hardware community for the past three. She currently founded and is running the Open Source Hardware Association, an organization to educate and promote building and using open source hardware of all types. In her spare time, Alicia is starting an open source hardware company specific to education. Previous to becoming an advocate and an entrepreneur, Alicia was a researcher and prototyper at Bug Labs where she ran the academic research program and the Test Kitchen, an open R&D Lab. Her projects centered around developing lightweight additions to the BUG platform, as well as a sensor-based data collection modules. She is a member of NYCResistor, co-chair of the Open Hardware Summit, and a member of the advisory board for Linux Journal. She holds a degree in art education, a M.S. in Art History and a M.L.I.S. in Information Science from Pratt Institute. She is self-taught in electronics. Her electronics work has appeared in Wired magazine, IEEE Spectrum, Hackaday and the New York Times. When Alicia is not researching at the crossroads of open technology and innovation she is prototyping artwork that twitches, blinks, and might even be tasty to eat. In the spirit of ideas worth spreading, TEDx is a program of local, self-organized events that bring people together to share a TED-like experience. At a TEDx event, TEDTalks video and live speakers combine to spark deep discussion and connection in a small group. These local, self-organized events are branded TEDx, where x = independently organized TED event. The TED Conference provides general guidance for the TEDx program, but individual
Paul Merrell

Cloud computing with Amazon Web Services, Part 1: Introduction - 0 views

  • Cloud computing is a paradigm shift in how we architect and deliver scalable applications. In the past, successful companies spent precious time and resources building an infrastructure that in turn provided them a competitive advantage. It was frequently a case of "You build it first and they will come." In most cases, this approach left large tracts of unused computing capacity that took up space in big data centers, required someone to babysit the servers, and had associated energy costs. The unused computing power wasted away, with no way to push it out to other companies or users who might be willing to pay for additional compute cycles. With cloud computing, excess computing capacity can be put to use and be profitably sold to consumers. This transformation of computing and IT infrastructure into a utility, which is available to all, somewhat levels the playing field.
  • According to Amazon’s estimates, businesses spend about 70 percent of their time on building and maintaining their infrastructures while using only 30 percent of their precious time actually working on the ideas that power their businesses.
Paul Merrell

Office Business Applications for Store Operations - 0 views

  • Service orientation addresses these challenges by centering on rapidly evolving XML and Web services standards that are revolutionizing how developers compose systems and integrate them over distributed networks. No longer are developers forced to make do with rigid and proprietary languages and object models that used to be the norm before service orientation came into play. The emergence of this new methodology is helping to develop new approaches specifically for Web-based distributed computing. This revolution is transforming the business by integrating disparate systems to establish a real-time enterprise. Making information available where it is needed to simplify merchandising processes requires a methodology that is based on loosely coupled integration between various in-store and back-end applications. This demand makes it critical for an architecture that is based on service orientation for integration between disparate applications. In addition, surfacing information at the right place requires the ability to compose dynamic applications using an array of underlying services. The Office Business Applications platform provides this ability to create composite applications, such as dashboards for the store, regional, and corporate managers.
  •  
    Summary: Changing market conditions require agility in business applications. Service orientation answers the challenge by centering on XML and Web services standards that revolutionize how developers compose systems and integrate them over distributed networks. Once integrated, how is the information presented to the decision makers? (36 printed pages)
Gary Edwards

Nokia and Google: Too much emphasis on the mobile OS? | ge TalkBack on ZDNet - 0 views

  • It's the document model! There is nothing wrong with RiA. Adobe is doing great stuff, and they fully support the WebKit flow document model in their RiA. Silverlight on the other hand is a true threat to the Open Web because it implements uniquely proprietary format, protocol and interface alternatives. The problem with AJAX is that it's difficult to scale. Advancing JavaScript libraries, structured WebKit, BrowserPlus and Google's GWT-Google Gears promise to tame that problem. At the end of the day though, I see AJAX as an important aspect of the browser as an RiA runtime engine. Here's what concerns me: 500 million MSOffice desktops that anchor most of the world's client/server business processes speak XAML "fixed/flow". These desktops are the information pumps for billions of business critical documents. And they do not speak the language of the Open Web. They speak the language of the Microsoft Web-Stack and Mesh services.
  •  
    Response to questions about RiA vs AJAX.