
Future of the Web: Group items tagged "do"


Gonzalo San Gil, PhD.

U.S. Net Neutrality Has a Massive Copyright Loophole | TorrentFreak - 0 views

  •  
    # ! [... Fingers crossed… ] "Ernesto on March 15, 2015: After years of debate, U.S. Internet subscribers now have government-regulated net neutrality. A huge step forward according to some, but the full order released a few days ago reveals some worrying caveats. While the rules prevent paid prioritization, they do very little to prevent BitTorrent blocking, the very issue that got the net neutrality debate started."
Paul Merrell

Most Agencies Falling Short on Mandate for Online Records - 1 views

  • Nearly 20 years after Congress passed the Electronic Freedom of Information Act Amendments (E-FOIA), only 40 percent of agencies have followed the law's instruction for systematic posting of records released through FOIA in their electronic reading rooms, according to a new FOIA Audit released today by the National Security Archive at www.nsarchive.org to mark Sunshine Week. The Archive team audited all federal agencies with Chief FOIA Officers as well as agency components that handle more than 500 FOIA requests a year — 165 federal offices in all — and found only 67 with online libraries populated with significant numbers of released FOIA documents and regularly updated.
  • Congress called on agencies to embrace disclosure and the digital era nearly two decades ago, with the passage of the 1996 "E-FOIA" amendments. The law mandated that agencies post key sets of records online, provide citizens with detailed guidance on making FOIA requests, and use new information technology to post online proactively records of significant public interest, including those already processed in response to FOIA requests and "likely to become the subject of subsequent requests." Congress believed then, and openness advocates know now, that this kind of proactive disclosure, publishing online the results of FOIA requests as well as agency records that might be requested in the future, is the only tenable solution to FOIA backlogs and delays. Thus the National Security Archive chose to focus on the e-reading rooms of agencies in its latest audit. Even though the majority of federal agencies have not yet embraced proactive disclosure of their FOIA releases, the Archive E-FOIA Audit did find that some real "E-Stars" exist within the federal government, serving as examples to lagging agencies that technology can be harnessed to create state-of-the-art FOIA platforms. Unfortunately, our audit also found "E-Delinquents" whose abysmal web performance recalls the teletype era.
  • E-Delinquents include the Office of Science and Technology Policy at the White House, which, despite being mandated to advise the President on technology policy, does not embrace 21st century practices by posting any frequently requested records online. Another E-Delinquent, the Drug Enforcement Administration, insults its website's viewers by claiming that it "does not maintain records appropriate for FOIA Library at this time."
  • "The presumption of openness requires the presumption of posting," said Archive director Tom Blanton. "For the new generation, if it's not online, it does not exist." The National Security Archive has conducted fourteen FOIA Audits since 2002. Modeled after the California Sunshine Survey and subsequent state "FOI Audits," the Archive's FOIA Audits use open-government laws to test whether or not agencies are obeying those same laws. Recommendations from previous Archive FOIA Audits have led directly to laws and executive orders which have: set explicit customer service guidelines, mandated FOIA backlog reduction, assigned individualized FOIA tracking numbers, forced agencies to report the average number of days needed to process requests, and revealed the (often embarrassing) ages of the oldest pending FOIA requests. The surveys include:
  • The federal government has made some progress moving into the digital era. The National Security Archive's last E-FOIA Audit in 2007, "File Not Found," reported that only one in five federal agencies had put online all of the specific requirements mentioned in the E-FOIA amendments, such as guidance on making requests, contact information, and processing regulations. The new E-FOIA Audit finds the number of agencies that have checked those boxes is now much higher — 100 out of 165 — though many (66 in 165) have posted just the bare minimum, especially when posting FOIA responses. An additional 33 agencies even now do not post these types of records at all, clearly thwarting the law's intent.
  • The FOIAonline Members (Department of Commerce, Environmental Protection Agency, Federal Labor Relations Authority, Merit Systems Protection Board, National Archives and Records Administration, Pension Benefit Guaranty Corporation, Department of the Navy, General Services Administration, Small Business Administration, U.S. Citizenship and Immigration Services, and Federal Communications Commission) won their "E-Star" by making past requests and releases searchable via FOIAonline. FOIAonline also allows users to submit their FOIA requests digitally.
  • THE E-DELINQUENTS: WORST OVERALL AGENCIES In alphabetical order
  • Key Findings
  • Excuses Agencies Give for Poor E-Performance
  • Justice Department guidance undermines the statute. Currently, the FOIA stipulates that documents "likely to become the subject of subsequent requests" must be posted by agencies somewhere in their electronic reading rooms. The Department of Justice's Office of Information Policy defines these records as "frequently requested records… or those which have been released three or more times to FOIA requesters." Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice. The National Security Archive believes the addition of this "three or more times" language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse ("not requested three times yet!") not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post "all records, regardless of form or format that have been released in response to a FOIA request."
  • Disabilities Compliance. Despite the E-FOIA Act, many government agencies do not embrace the idea of posting their FOIA responses online. The most common reason agencies give is that it is difficult to post documents in a format that complies with the Americans with Disabilities Act, also referred to as being "508 compliant," and the 1998 Amendments to the Rehabilitation Act that require federal agencies "to make their electronic and information technology (EIT) accessible to people with disabilities." E-Star agencies, however, have proven that 508 compliance is no barrier when the agency has a will to post. All documents posted on FOIAonline are 508 compliant, as are the documents posted by the Department of Defense and the Department of State. In fact, every document created electronically by the US government after 1998 should already be 508 compliant. Even old paper records that are scanned to be processed through FOIA can be made 508 compliant with just a few clicks in Adobe Acrobat, according to this Department of Homeland Security guide (essentially OCRing the text, and including information about where non-textual fields appear). Even if agencies are insistent it is too difficult to OCR older documents that were scanned from paper, they cannot use that excuse with digital records.
  • Privacy. Another commonly articulated concern about posting FOIA releases online is that doing so could inadvertently disclose private information from "first person" FOIA requests. This is a valid concern, and this subset of FOIA requests should not be posted online. (The Justice Department identified "first party" requester rights in 1989. Essentially agencies cannot use the b(6) privacy exemption to redact information if a person requests it for him or herself. An example of a "first person" FOIA would be a person's request for his own immigration file.) Cost and Waste of Resources. There is also a belief that there is little public interest in the majority of FOIA requests processed, and hence it is a waste of resources to post them. This thinking runs counter to the governing principle of the Freedom of Information Act: that government information belongs to US citizens, not US agencies. As such, the reason that a person requests information is immaterial as the agency processes the request; the "interest factor" of a document should also be immaterial when an agency is required to post it online. Some think that posting FOIA releases online is not cost effective. In fact, the opposite is true. It's not cost effective to spend tens (or hundreds) of person hours to search for, review, and redact FOIA requests only to mail it to the requester and have them slip it into their desk drawer and forget about it. That is a waste of resources. The released document should be posted online for any interested party to utilize. This will only become easier as FOIA processing systems evolve to automatically post the documents they track. The State Department earned its "E-Star" status demonstrating this very principle, and spent no new funds and did not hire contractors to build its Electronic Reading Room, instead it built a self-sustaining platform that will save the agency time and money going forward.
Paul Merrell

Information Warfare: Automated Propaganda and Social Media Bots | Global Research - 0 views

  • NATO has announced that it is launching an “information war” against Russia. The UK publicly announced a battalion of keyboard warriors to spread disinformation. It’s well-documented that the West has long used false propaganda to sway public opinion. Western military and intelligence services manipulate social media to counter criticism of Western policies. Such manipulation includes flooding social media with comments supporting the government and large corporations, using armies of sock puppets, i.e. fake social media identities. In 2013, the American Congress repealed the formal ban against the deployment of propaganda against U.S. citizens living on American soil. So there’s even less to constrain propaganda than before.
  • Information warfare for propaganda purposes also includes: the Pentagon, Federal Reserve and other government entities using software to track discussion of political issues, to try to nip dissent in the bud before it goes viral; “controlling, infiltrating, manipulating and warping” online discourse; and the use of artificial intelligence programs to try to predict how people will react to propaganda.
  • Some of the propaganda is spread by software programs. We pointed out 6 years ago that people were writing scripts to censor hard-hitting information from social media. One of America’s top cyber-propagandists – former high-level military information officer Joel Harding – wrote in December: I was in a discussion today about information being used in social media as a possible weapon.  The people I was talking with have a tool which scrapes social media sites, gauges their sentiment and gives the user the opportunity to automatically generate a persuasive response. Their tool is called a “Social Networking Influence Engine”. *** The implications seem to be profound for the information environment. *** The people who own this tool are in the civilian world and don’t even remotely touch the defense sector, so getting approval from the US Department of State might not even occur to them.
  • How Can This Be Real? Gizmodo reported in 2010: Software developer Nigel Leck got tired of rehashing the same 140-character arguments against climate change deniers, so he programmed a bot that does the work for him. With citations! Leck’s bot, @AI_AGW, doesn’t just respond to arguments directed at Leck himself, it goes out and picks fights. Every five minutes it trawls Twitter for terms and phrases that commonly crop up in Tweets that deny human-caused climate change. It then searches its database of hundreds of canned responses to find the counter-argument best suited for that tweet—usually a quick statement and a link to a scientific source. As can be the case with these sorts of things, many of the deniers don’t know they’ve been targeted by a robot and engage AI_AGW in debate. The bot will continue to fire back canned responses that best fit the interlocutor’s line of debate—Leck says this goes on for days, in some cases—and the bot’s been outfitted with a number of responses on the topic of religion, where the arguments unsurprisingly often end up. Technology has come a long way in the past 5 years. So if a lone programmer could do this 5 years ago, imagine what he could do now. And the big players have a lot more resources at their disposal than a lone climate activist/software developer does. For example, a government expert told the Washington Post that the government “quite literally can watch your ideas form as you type”. So if the lone programmer is doing it, it’s not unreasonable to assume that the big boys are widely doing it.
  • How Effective Are Automated Comments? Unfortunately, this is more effective than you might assume … Specifically, scientists have shown that name-calling and swearing breaks down people’s ability to think rationally … and intentionally sowing discord and posting junk comments to push down insightful comments are common propaganda techniques. Indeed, an automated program need not even be that sophisticated … it can copy a couple of words from the main post or a comment, and then spew back one or more radioactive labels such as “terrorist”, “commie”, “Russia-lover”, “wimp”, “fascist”, “loser”, “traitor”, “conspiratard”, etc. Given that Harding and his compadres consider anyone who questions any U.S. policies an enemy of the state – as does the Obama administration – many honest, patriotic writers and commenters may be targeted for automated propaganda comments.
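The trigger-phrase / canned-response mechanism described in the bullets above can be sketched in a few lines of Python. The trigger phrases and replies below are hypothetical stand-ins, not @AI_AGW's actual database, and the real bot would additionally poll a search API on a timer:

```python
from typing import Optional

# Hypothetical trigger phrases mapped to canned rebuttals (with citation links),
# mimicking the pattern the article describes; these are illustrative stand-ins.
CANNED_RESPONSES = {
    "climate has always changed": "Past changes had natural drivers; today's warming tracks CO2. [link]",
    "no consensus": "Surveys of the literature find overwhelming scientific agreement. [link]",
    "it's the sun": "Solar output has been flat while temperatures rose. [link]",
}

def pick_response(post: str) -> Optional[str]:
    """Return the canned reply whose trigger phrase appears in the post, if any."""
    text = post.lower()
    for trigger, reply in CANNED_RESPONSES.items():
        if trigger in text:
            return reply
    return None  # no trigger matched: the bot stays silent

print(pick_response("There is no consensus on this!"))
```

A loop like this, run every few minutes against a keyword search, is all the sophistication required; as the next annotation notes, the effect comes from volume rather than cleverness.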
Gonzalo San Gil, PhD.

Yes, the NSA Worried About Whether Spying Would Backfire | WIRED - 1 views

  •  
    ""For all the time I worked on all of these issues, this was a constant discussion," Olsen says. "How do we calibrate what we're trying to do for the country with how to protect civil liberties and privacy?""
  •  
    NSA can't credibly claim surprise at how people reacted to the Snowden disclosures. NSA's spying on U.S. citizens was first uncovered by the Senate's Church Committee in about 1976. Congress enacted legislation unequivocally telling NSA and the Defense Department that spying on Americans was not to happen again (and that the CIA was to immediately cease spying within the territorial boundaries of the U.S.). Then came the Total Information Awareness scandal, when Congress discovered that DoD was right back at it again, this time operating from under the cover of the Defense Advanced Research Projects Agency. Congress responded by abolishing the program and eliminating the job position of its director, former Admiral John Poindexter of Iran/Contra scandal fame. But rather than complying with the abolition order, most of the TIA program's staff, hardware, software, and data were simply transferred to NSA. NSA, of course, persuaded the Justice Department to secretly reinterpret key provisions of the Patriot Act more broadly than the statute's plain language allows, in order to continue spying on U.S. citizens. Indeed, anyone whose college education included the assignment to read and discuss George Orwell's 1984 would have known that NSA's program had drastically outgrown the limits of what a free society would tolerate. So this is really about deliberate defiance of the limits established by the Constitution and Congressional enactments, not about anything even remotely legal or morally acceptable. The fact that Congress did not react strongly after the Snowden disclosures, as it had after the Church Committee's report and discovery of the TIA program, raises a strong suspicion that members of Congress have been blackmailed into submission using information about them gathered via NSA surveillance. We know from whistleblowers Edward Snowden and Russell Tice that members of Congress were surveilled by NSA, yet not even that violation has been taken up by Congress. Instead …
Gonzalo San Gil, PhD.

Linux and Unix Port Scanning With netcat [nc] Command - 1 views

  •  
    "by Vivek Gite on July 12, 2007, last updated November 27, 2015, in Linux, Networking, UNIX: How do I find out which ports are open on my own server? How do I run a port scan using the nc command instead of the nmap command on Linux or Unix-like systems?"
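The `nc -z` check the article covers is a plain TCP connect test, which can be approximated with Python's standard library. This is a rough sketch of what the connect scan does, not a replacement for the article's nc commands:

```python
import socket
from contextlib import closing

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """TCP connect check for a single port - roughly what `nc -z host port` does."""
    with closing(socket.socket(socket.AF_INET, socket.SOCK_STREAM)) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0  # 0 means the connection succeeded

# Scan a small range, as `nc -zv 127.0.0.1 20-25` would:
for port in range(20, 26):
    print(f"port {port}: {'open' if port_open('127.0.0.1', port) else 'closed'}")
```

As with nc, a "closed" result can mean either an actively refused connection or a firewall silently dropping packets until the timeout expires.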
Paul Merrell

Rural America and the 5G Digital Divide. Telecoms Expanding Their "Toxic Infrastructure... - 0 views

  • While there is considerable telecom hubris regarding the 5G rollout and increasing speculation that the next generation of wireless is not yet ready for Prime Time, the industry continues to make promises to Rural America that it has no intention of fulfilling. Decades-long promises to deliver digital Utopia to rural America by T-Mobile, Verizon and AT&T have never materialized.  
  • In 2017, the USDA reported that 29% of American farms had no internet access. The FCC says that 14 million rural Americans and 1.2 million Americans living on tribal lands do not have 4G LTE on their phones, and that 30 million rural residents do not have broadband service, compared to 2% of urban residents. It’s beginning to sound like a Third World country. Despite an FCC $4.5 billion annual subsidy to carriers to provide broadband service in rural areas, the FCC reports that “over 24 million Americans do not have access to high-speed internet service, the bulk of them in rural areas,” while a Microsoft study found that “162 million people across the US do not have internet service at broadband speeds.” At the same time, only three cable companies have access to 70% of the market in a sweetheart deal to hike rates as they avoid competition and the FCC looks the other way. The FCC believes that it would cost $40 billion to bring broadband access to 98% of the country, with expansion in rural America even more expensive. While the FCC has pledged a $2 billion, ten-year plan to identify rural wireless locations, only 4 million rural American businesses and homes will be targeted, a mere drop in the bucket. Which brings us to rural mapping: since the advent of the digital age, there have been no accurate maps identifying where broadband service is available in rural America and where it is not. The FCC has a long history of promulgating unreliable and unverified carrier-provided numbers, as the Commission has repeatedly “bungled efforts to produce accurate broadband maps” that would have facilitated rural coverage. During the Senate Commerce Committee hearing on April 10th regarding broadband mapping, critical testimony questioned whether the FCC and/or the telecom industry have either the commitment or the proficiency to provide 5G to rural America.
  • Members of the Committee shared concerns that 5G might put rural America further behind the curve, so as to never catch up with the rest of the country.
Paul Merrell

Google starts watching what you do off the Internet too - RT - 1 views

  • The most powerful company on the Internet just got a whole lot creepier: a new service from Google merges offline consumer info with online intelligence, allowing advertisers to target users based on what they do at the keyboard and at the mall. Without much fanfare, Google announced news this week of a new advertising project, Conversions API, that will let businesses build all-encompassing user profiles based off of not just what users search for on the Web, but what they purchase outside of the home. In a blog post this week on Google’s DoubleClick Search site, the Silicon Valley giant says that targeting consumers based off online information only allows advertisers to learn so much. “Conversions,” tech-speak for the digital metric made by every action a user makes online, are incomplete until coupled with real life data, Google says.
  • Of course, there is always the possibility that all of this information can be decrypted and, in some cases, obtained by third parties that you might not want prying into your personal business. Edwards notes in his report that Google does not explicitly note that intelligence used in Conversions API will be anonymized, but the blowback from not doing as much would surely be enough to start a colossal uproar. Meanwhile, however, all of the information being collected by Google — estimated to be on millions of servers around the globe — is being handed over to more than just advertising companies. Last month Google reported that the US government requested personal information from roughly 8,000 individual users during just the first few months of 2012. “This is the sixth time we’ve released this data, and one trend has become clear: Government surveillance is on the rise,” Google admitted with their report.
Paul Merrell

Haavard - 300 million users strong, Opera moves to WebKit - 1 views

  • Today, we announced that Opera has reached 300 million active users. At the same time, we made the official announcement that Opera will move from Presto to WebKit as the engine at the core of the browser.
  • It was always a goal to be compatible with the real web while also supporting and promoting open standards. That turns out to be a bit of a challenge when you are faced with a web that is not as open as one might have wanted. Add to that the fact that it is constantly changing and that you don't get site compatibility for free (which some browsers are fortunate enough to do), and it ends up taking up a lot of resources - resources that could have been spent on innovation and polish instead.
  • Although I was skeptical at first when I started hearing about the switch, I am now fully convinced that it is the right thing to do. Not only will it free up significant engineering resources at Opera and allow us to do more innovation instead of constantly trying to adapt to the web, but our users should benefit from better site compatibility and more innovative features and polish. This move allows us to focus even more on the actual user experience.
  • If switching to WebKit allows us to accelerate our growth and become an important contributor to the project (we will contribute back to WebKit, and have already submitted our first patch (bug)), we may finally have a direct impact on the way web sites are coded. We want sites to be coded for open standards rather than specific browsers.
  • WebKit has matured enough that it is actually possible to make the switch, and we can help it mature even further. In return, we get to spend more resources on a better user experience, and less on chasing an ever-changing web. This move allows us to create a platform for future growth because it allows us to focus our resources on things that can actually differentiate Opera from the competition, and could help the web move in the right direction.
  •  
    And so there will be only three major web page rendering engines: WebKit, Mozilla's Gecko, and MSIE's Trident, with only WebKit in the ascendancy.
Gonzalo San Gil, PhD.

No one should have to use proprietary software to communicate with their government - F... - 0 views

  •  
    "by Donald Robertson - Published on May 04, 2016 12:36 PM: The Free Software Foundation (FSF) submitted a comment to the U.S. Copyright Office calling for a method to submit comments that does not require the use of proprietary JavaScript. Proprietary JavaScript is a threat to all users on the Web. When minified, the code can hide all sorts of nasty items, like spyware and other security risks. Savvy users can protect themselves by blocking scripts in their browser, or by installing the LibreJS browser extension and avoiding sites that require proprietary JavaScript in order to function."
Paul Merrell

The People and Tech Behind the Panama Papers - Features - Source: An OpenNews project - 0 views

  • Then we put the data up, but the problem with Solr was it didn’t have a user interface, so we used Project Blacklight, which is open source software normally used by librarians. We used it for the journalists. It’s simple because it allows you to do faceted search—so, for example, you can facet by the folder structure of the leak, by years, by type of file. There were more complex things—it supports queries in regular expressions, so the more advanced users were able to search for documents with a certain pattern of numbers that, for example, passports use. You could also preview and download the documents. ICIJ open-sourced the code of our document processing chain, created by our web developer Matthew Caruana Galizia. We also developed a batch-searching feature. So say you were looking for politicians in your country—you just run it through the system, and you upload your list to Blacklight and you would get a CSV back saying yes, there are matches for these names—not only exact matches, but also matches based on proximity. So you would say “I want Mar Cabra proximity 2” and that would give you “Mar Cabra,” “Mar whatever Cabra,” “Cabra, Mar,”—so that was good, because very quickly journalists were able to see… I have this list of politicians and they are in the data!
  • Last Sunday, April 3, the first stories emerging from the leaked dataset known as the Panama Papers were published by a global partnership of news organizations working in coordination with the International Consortium of Investigative Journalists, or ICIJ. As we begin the second week of reporting on the leak, Iceland’s Prime Minister has been forced to resign, Germany has announced plans to end anonymous corporate ownership, governments around the world launched investigations into wealthy citizens’ participation in tax havens, the Russian government announced that the investigation was an anti-Putin propaganda operation, and the Chinese government banned mentions of the leak in Chinese media. As the ICIJ-led consortium prepares for its second major wave of reporting on the Panama Papers, we spoke with Mar Cabra, editor of ICIJ’s Data & Research unit and lead coordinator of the data analysis and infrastructure work behind the leak. In our conversation, Cabra reveals ICIJ’s years-long effort to build a series of secure communication and analysis platforms in support of genuinely global investigative reporting collaborations.
  • For communication, we have the Global I-Hub, which is a platform based on open source software called Oxwall. Oxwall is a social network, like Facebook, which has a wall when you log in with the latest in your network—it has forum topics, links, you can share files, and you can chat with people in real time.
  • We had the data in a relational database format in SQL, and thanks to ETL (Extract, Transform, and Load) software Talend, we were able to easily transform the data from SQL to Neo4j (the graph-database format we used). Once the data was transformed, it was just a matter of plugging it into Linkurious, and in a couple of minutes, you have it visualized—in a networked way, so anyone can log in from anywhere in the world. That was another reason we really liked Linkurious and Neo4j—they’re very quick when representing graph data, and the visualizations were easy to understand for everybody. The not-very-tech-savvy reporter could expand the docs like magic, and more technically expert reporters and programmers could use the Neo4j query language, Cypher, to do more complex queries, like show me everybody within two degrees of separation of this person, or show me all the connected dots…
  • We believe in open source technology and try to use it as much as possible. We used Apache Solr for the indexing and Apache Tika for document processing, and it’s great because it processes dozens of different formats and it’s very powerful. Tika interacts with Tesseract, so we did the OCRing on Tesseract. To OCR the images, we created an army of 30–40 temporary servers in Amazon that allowed us to process the documents in parallel and do parallel OCR-ing. If it was very slow, we’d increase the number of servers—if it was going fine, we would decrease because of course those servers have a cost.
  • For the visualization of the Mossack Fonseca internal database, we worked with another tool called Linkurious. It’s not open source, it’s licensed software, but we have an agreement with them, and they allowed us to work with it. It allows you to represent data in graphs. We had a version of Linkurious on our servers, so no one else had the data. It was pretty intuitive—journalists had to click on dots that expanded, basically, and could search the names.
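The "two degrees of separation" query mentioned above might look like the following in Cypher; the node label and property name here are illustrative assumptions, since the actual Panama Papers graph schema is not shown in the interview:

```cypher
// Everybody within one or two relationship hops of a given person
// (the :Person label and `name` property are hypothetical)
MATCH (p:Person {name: "Mar Cabra"})-[*1..2]-(connected)
RETURN DISTINCT connected
```

The `[*1..2]` variable-length pattern is what makes "show me everybody within two degrees" a one-line query in Neo4j rather than a recursive SQL join.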
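The batch proximity search described above ("Mar Cabra proximity 2" matching "Mar Cabra", "Mar whatever Cabra", and "Cabra, Mar") amounts to checking the token distance between the two name parts. This is a simplified illustration of the idea, not ICIJ's actual Blacklight code:

```python
import re

def proximity_match(text: str, first: str, last: str, slop: int = 2) -> bool:
    """True if both name tokens occur within `slop` intervening words of each
    other, in either order - a simplified 'proximity 2' search."""
    words = re.findall(r"\w+", text.lower())
    hits_a = [i for i, w in enumerate(words) if w == first.lower()]
    hits_b = [i for i, w in enumerate(words) if w == last.lower()]
    # adjacent tokens have distance 1; `slop` extra words raises that to slop+1
    return any(0 < abs(i - j) <= slop + 1 for i in hits_a for j in hits_b)

for sample in ["Mar Cabra", "Mar Elena Cabra", "Cabra, Mar", "Mar is not near enough to Cabra"]:
    print(sample, "->", proximity_match(sample, "Mar", "Cabra"))
```

Run over a leak-sized index, a check like this is what lets a list of politicians' names come back as a CSV of exact and near matches.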
Gonzalo San Gil, PhD.

5 Ways to Repurpose an Old PC with Open Source Software - 1 views

  •  
    "Most small businesses refresh their desktops and laptops every three to five years, but that process brings up a thorny question: What should you do with the old equipment? Answer: learn how to repurpose old PCs and laptops."
Gonzalo San Gil, PhD.

How to Create and Run New Service Units in Systemd Using Shell Script - 0 views

  •  
    "A few days ago, I came across a CentOS 7 32-bit distro and I felt the desire to test it on an old 32-bit machine. After booting I realized that it had a bug and it was losing the network connection, which I had to bring "up" manually after every boot. So the question was: how could I set up a script to do this job, running every time I boot my machine?"
Gonzalo San Gil, PhD.

Blender: An Introduction for Final Cut Pro Users | FOSS Force - 0 views

  •  
    "Phil Shapiro Have you often considered quitting your day job to begin an exciting career as a filmmaker? You don't need the resources of a Hollywood studio anymore. In fact, you can do it all with free and open source software."
Gonzalo San Gil, PhD.

Contributing to an Open Source Project | FOSS Force - 0 views

  •  
    "Phil Shapiro There are many ways to contribute to an open source project. There are also many reasons for doing so. But before jumping in, you might want to know how things generally work within these projects."
Paul Merrell

In Hearing on Internet Surveillance, Nobody Knows How Many Americans Impacted in Data C... - 0 views

  • The Senate Judiciary Committee held an open hearing today on the FISA Amendments Act, the law that ostensibly authorizes the digital surveillance of hundreds of millions of people both in the United States and around the world. Section 702 of the law, scheduled to expire next year, is designed to allow U.S. intelligence services to collect signals intelligence on foreign targets related to our national security interests. However—thanks to the leaks of many whistleblowers including Edward Snowden, the work of investigative journalists, and statements by public officials—we now know that the FISA Amendments Act has been used to sweep up data on hundreds of millions of people who have no connection to a terrorist investigation, including countless Americans. What do we mean by “countless”? As became increasingly clear in the hearing today, the exact number of Americans impacted by this surveillance is unknown. Senator Franken asked the panel of witnesses, “Is it possible for the government to provide an exact count of how many United States persons have been swept up in Section 702 surveillance? And if not the exact count, then what about an estimate?”
  • The lack of information makes rigorous oversight of the programs all but impossible. As Senator Franken put it in the hearing today, “When the public lacks even a rough sense of the scope of the government’s surveillance program, they have no way of knowing if the government is striking the right balance, whether we are safeguarding our national security without trampling on our citizens’ fundamental privacy rights. But the public can’t know if we succeed in striking that balance if they don’t even have the most basic information about our major surveillance programs."  Senator Patrick Leahy also questioned the panel about the “minimization procedures” associated with this type of surveillance, the privacy safeguard that is intended to ensure that irrelevant data and data on American citizens is swiftly deleted. Senator Leahy asked the panel: “Do you believe the current minimization procedures ensure that data about innocent Americans is deleted? Is that enough?”  David Medine, who recently announced his pending retirement from the Privacy and Civil Liberties Oversight Board, answered unequivocally:
  • Elizabeth Goitein, the Brennan Center director whose articulate and thought-provoking testimony was the highlight of the hearing, noted that at this time an exact number would be difficult to provide. However, she asserted that an estimate should be possible for most if not all of the government’s surveillance programs. None of the other panel participants—which included David Medine and Rachel Brand of the Privacy and Civil Liberties Oversight Board as well as Matthew Olsen of IronNet Cybersecurity and attorney Kenneth Wainstein—offered an estimate. Today’s hearing reaffirmed that it is not only the American people who are left in the dark about how many people or accounts are impacted by the NSA’s dragnet surveillance of the Internet. Even vital oversight committees in Congress like the Senate Judiciary Committee are left to speculate about just how far-reaching this surveillance is. It's part of the reason why we urged the House Judiciary Committee to demand that the Intelligence Community provide the public with a number. 
  • Senator Leahy, they don’t. The minimization procedures call for the deletion of innocent Americans’ information upon discovery to determine whether it has any foreign intelligence value. But what the board’s report found is that in fact information is never deleted. It sits in the databases for 5 years, or sometimes longer. And so the minimization doesn’t really address the privacy concerns of incidentally collected communications—again, where there’s been no warrant at all in the process… In the United States, we simply can’t read people’s emails and listen to their phone calls without court approval, and the same should be true when the government shifts its attention to Americans under this program. One of the most startling exchanges from the hearing today came toward the end of the session, when Senator Dianne Feinstein—who also sits on the Intelligence Committee—seemed taken aback by Ms. Goitein’s mention of “backdoor searches.” 
  • Feinstein: Wow, wow. What do you call it? What's a backdoor search?
    Goitein: A backdoor search is when the FBI or any other agency targets a U.S. person for a search of data that was collected under Section 702, which is supposed to be targeted against foreigners overseas.
    Feinstein: Regardless of the minimization that was properly carried out.
    Goitein: Well, the data is searched in its unminimized form. So the FBI gets raw data, the NSA, the CIA get raw data. And they search that raw data using U.S. person identifiers. That's what I'm referring to as backdoor searches.
    It's deeply concerning that any member of Congress, much less a member of the Senate Judiciary Committee and the Senate Intelligence Committee, might not be aware of the problem surrounding backdoor searches. In April 2014, the Director of National Intelligence acknowledged the searches of this data, which Senators Ron Wyden and Mark Udall termed "the 'back-door search' loophole in section 702." The public was so incensed that the House of Representatives passed an amendment to that year's defense appropriations bill effectively banning the warrantless backdoor searches. Nonetheless, in the hearing today it seemed that Senator Feinstein might not recognize or appreciate the serious implications of allowing U.S. law enforcement agencies to query the raw data collected through these Internet surveillance programs. Hopefully today's testimony helped convince the Senator that there is more to this topic than what she's hearing in jargon-filled classified security briefings.
  •  
    The 4th Amendment: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and *particularly describing the place to be searched, and the* persons or *things to be seized."* So much for the particularized description of the place to be searched and the things to be seized. Fah! Who needs a Constitution, anyway....
Paul Merrell

Save Firefox! | Electronic Frontier Foundation - 0 views

  • The World Wide Web Consortium (W3C), once the force for open standards that kept browsers from locking publishers to their proprietary capabilities, has changed its mission. Since 2013, the organization has provided a forum where today's dominant browser companies and the dominant entertainment companies can collaborate on a system to let our browsers control our behavior, rather than the other way around. This system, "Encrypted Media Extensions" (EME), uses standards-defined code to funnel video into a proprietary container called a "Content Decryption Module." For a new browser to support this new video streaming standard -- which major studios and cable operators are pushing for -- it would have to convince those entertainment companies or one of their partners to let it have a CDM, or this part of the "open" Web would not display in the new browser. This is the opposite of every W3C standard to date: once, all you needed to do to render content sent by a server was follow the standard, not get permission. If browsers had needed permission to render a page at the launch of Mozilla, the publishers would have frozen out this new, pop-up-blocking upstart. Kiss Firefox goodbye, in other words.
  • The W3C didn't have to do this. No copyright law says that making a video gives you the right to tell people who legally watch it how they must configure their equipment. But because of the design of EME, copyright holders will be able to use the law to shut down any new browser that tries to render the video without their permission. That's because EME is designed to trigger liability under section 1201 of the Digital Millennium Copyright Act (DMCA), which says that removing a digital lock that controls access to a copyrighted work without permission is an offense, even if the person removing the lock has the right to the content it restricts. In other words, once a video is sent with EME, a new company that unlocks it for its users can be sued, even if the users do nothing illegal with that video. We proposed that the W3C could protect new browsers by making its members promise not to use the DMCA to attack new entrants in the market, an idea supported by a diverse group of W3C members, but the W3C executive overruled us, saying the work would go forward with no safeguards for future competition. It's even worse than it appears at first glance. The DMCA isn't limited to the USA: the US Trade Representative has spread DMCA-like rules to virtually every country that does business with America. Worse still: the DMCA is also routinely used by companies to threaten and silence security researchers who reveal embarrassing defects in their products. The W3C also declined to require its members to protect security researchers who discover flaws in EME, leaving every Web user exposed to vulnerabilities whose disclosure can only safely take place if the affected company decides to permit it.
  • The W3C needs credibility with people who care about the open Web and innovation in order to be viable. They are sensitive to this kind of criticism. We empathize. There are lots of good people working there, people who genuinely, passionately want the Web to stay open to everyone, and to be safe for its users. But the organization made a terrible decision when it opted to provide a home for EME, and an even worse one when it overruled its own members and declined protection for security research and new competitors. It needs to hear from you now. Please share this post, and spread the word. Help the W3C be the organization it is meant to be.
Gonzalo San Gil, PhD.

ISP Vows to Protect Users From a Piracy Witch Hunt - TorrentFreak - 0 views

  •  
    " By Ernesto on April 22, 2016 C: 14 Breaking Swedish Internet service provider Bahnhof says it will do everything in its power to prevent copyright holders from threatening its subscribers. The provider is responding to a recent case in which a competing ISP was ordered to expose alleged BitTorrent pirates, reportedly without any thorough evidence."
Gonzalo San Gil, PhD.

SunZilla provides portable open source electricity | Opensource.com - 0 views

  •  
    "Do-it-yourself electricity generation is still difficult and expensive. The inventors of the SunZilla project aim to make it easier, cleaner, portable, quiet, and completely open source."
Gonzalo San Gil, PhD.

Brexit clouds the trade deal between the EU and the United States | El Perió... - 0 views

  •  
    "VIKTORIA DENDRINOU, The Wall Street Journal BRUSELAS (EFE Dow Jones)--La decisión de Reino Unido de abandonar la Unión Europea ha arrojado más dudas sobre el futuro de un tratado comercial de gran alcance entre la UE y Estados Unidos. Los dos mayores bloques económicos del mundo han estado negociando la Asociación Transatlántica para el Comercio y la Inversión --o TTIP por sus siglas en inglés-- desde 2013, y todavía dicen que esperan finalizar las negociaciones antes de que finalice el mandato de la Administración Obama en enero."
Gonzalo San Gil, PhD.

Export - Support - WordPress.com (Backup) - 0 views

  •  
    "Export Your Content to Another Blog or Platform It's your content; you can do whatever you like with it. Go to Tools -> Export in your WordPress.com dashboard to download an XML file of your blog's content. This format, which we call WordPress eXtended RSS or WXR, will contain your posts, pages, comments, categories, and tags."