

Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant.
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead-up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH, its system for sifting through intercepted cookie data, in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying from three to 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
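The retention windows just described can be modeled in a few lines. This is an illustrative sketch only: the two ranges are taken from the annotation above, but the lookup structure and the open-ended "cyber defense" extension flag are my assumptions, since the documents give no concrete extension length.

```python
from datetime import timedelta

# Retention windows as described in the documents (illustrative model only).
RETENTION = {
    "metadata": (timedelta(days=30), timedelta(days=180)),  # 30 days to six months
    "content":  (timedelta(days=3),  timedelta(days=30)),   # three to 30 days
}

def max_age(kind, cyber_defense_extension=False):
    """Return the longest normal retention window for a record type.

    Periods "can be extended if deemed necessary" -- modeled here as an
    open-ended flag, since no concrete extension length is given.
    """
    low, high = RETENTION[kind]
    return None if cyber_defense_extension else high

print(max_age("metadata").days)  # 180
print(max_age("content").days)   # 30
```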
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
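The content/metadata distinction for web addresses described above can be sketched in a few lines. The URL is the article's own example; the `split_url` helper and its classification are mine, not GCHQ's:

```python
from urllib.parse import urlparse

def split_url(url):
    """Split a visited URL the way the article describes:
    the host alone is treated as metadata, the full address as content."""
    parsed = urlparse(url)
    metadata = parsed.netloc               # e.g. "www.gchq.gov.uk"
    content = parsed.netloc + parsed.path  # full address, treated as content
    return metadata, content

meta, content = split_url("https://www.gchq.gov.uk/what_we_do")
print(meta)     # www.gchq.gov.uk
print(content)  # www.gchq.gov.uk/what_we_do
```

The point Zuckerman makes follows directly from this split: even though only the left-hand value is kept as "metadata," a long series of such host records still traces a person's habits over time.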
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explains in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
Paul Merrell

Obama administration opts not to force firms to decrypt data - for now - The Washington...

  • After months of deliberation, the Obama administration has made a long-awaited decision on the thorny issue of how to deal with encrypted communications: It will not — for now — call for legislation requiring companies to decode messages for law enforcement. Rather, the administration will continue trying to persuade companies that have moved to encrypt their customers’ data to create a way for the government to still peer into people’s data when needed for criminal or terrorism investigations. “The administration has decided not to seek a legislative remedy now, but it makes sense to continue the conversations with industry,” FBI Director James B. Comey said at a Senate hearing Thursday of the Homeland Security and Governmental Affairs Committee.
  • To Amie Stepanovich, the U.S. policy manager for Access, one of the groups signing the petition, the status quo isn’t good enough. “It’s really crucial that even if the government is not pursuing legislation, it’s also not pursuing policies that will weaken security through other methods,” she said. The FBI and Justice Department have been talking with tech companies for months. On Thursday, Comey said the conversations have been “increasingly productive.” He added: “People have stripped out a lot of the venom.” He said the tech executives “are all people who care about the safety of America and also care about privacy and civil liberties.” Comey said the issue afflicts not just federal law enforcement but also state and local agencies investigating child kidnappings and car crashes — “cops and sheriffs . . . [who are] increasingly encountering devices they can’t open with a search warrant.”
  • The decision was made at a Cabinet meeting Oct. 1. “As the president has said, the United States will work to ensure that malicious actors can be held to account — without weakening our commitment to strong encryption,” National Security Council spokesman Mark Stroh said. “As part of those efforts, we are actively engaged with private companies to ensure they understand the public safety and national security risks that result from malicious actors’ use of their encrypted products and services.” But privacy advocates are concerned that the administration’s definition of strong encryption also could include a system in which a company holds a decryption key or can retrieve unencrypted communications from its servers for law enforcement. “The government should not erode the security of our devices or applications, pressure companies to keep and allow government access to our data, mandate implementation of vulnerabilities or backdoors into products, or have disproportionate access to the keys to private data,” said Savecrypto.org, a coalition of industry and privacy groups that has launched a campaign to petition the Obama administration.
  • The decision, which essentially maintains the status quo, underscores the bind the administration is in — balancing competing pressures to help law enforcement and protect consumer privacy. The FBI says it is facing an increasing challenge posed by the encryption of communications of criminals, terrorists and spies. A growing number of companies have begun to offer encryption in which the only people who can read a message, for instance, are the person who sent it and the person who received it. Or, in the case of a device, only the device owner has access to the data. In such cases, the companies themselves lack “backdoors” or keys to decrypt the data for government investigators, even when served with search warrants or intercept orders.
  • One senior administration official said the administration thinks it’s making enough progress with companies that seeking legislation now is unnecessary. “We feel optimistic,” said the official, who spoke on the condition of anonymity to describe internal discussions. “We don’t think it’s a lost cause at this point.” Legislation, said Rep. Adam Schiff (D-Calif.), is not a realistic option given the current political climate. He said he made a recent trip to Silicon Valley to talk to Twitter, Facebook and Google. “They quite uniformly are opposed to any mandate or pressure — and more than that, they don’t want to be asked to come up with a solution,” Schiff said. Law enforcement officials know that legislation is a tough sell now. But, one senior official stressed, “it’s still going to be in the mix.” On the other side of the debate, technology, diplomatic and commerce agencies were pressing for an outright statement by Obama to disavow a legislative mandate on companies. But their position did not prevail.
  • Daniel Castro, vice president of the Information Technology & Innovation Foundation, said absent any new laws, either in the United States or abroad, “companies are in the driver’s seat.” He said that if another country tried to require companies to retain an ability to decrypt communications, “I suspect many tech companies would try to pull out.”
Paul Merrell

Revealed: How DOJ Gagged Google over Surveillance of WikiLeaks Volunteer - The Intercept

  • The Obama administration fought a legal battle against Google to secretly obtain the email records of a security researcher and journalist associated with WikiLeaks. Newly unsealed court documents obtained by The Intercept reveal the Justice Department won an order forcing Google to turn over more than one year’s worth of data from the Gmail account of Jacob Appelbaum (pictured above), a developer for the Tor online anonymity project who has worked with WikiLeaks as a volunteer. The order also gagged Google, preventing it from notifying Appelbaum that his records had been provided to the government. The surveillance of Appelbaum’s Gmail account was tied to the Justice Department’s long-running criminal investigation of WikiLeaks, which began in 2010 following the transparency group’s publication of a large cache of U.S. government diplomatic cables. According to the unsealed documents, the Justice Department first sought details from Google about a Gmail account operated by Appelbaum in January 2011, triggering a three-month dispute between the government and the tech giant. Government investigators demanded metadata records from the account showing email addresses of those with whom Appelbaum had corresponded between the period of November 2009 and early 2011; they also wanted to obtain information showing the unique IP addresses of the computers he had used to log in to the account.
  • The Justice Department argued in the case that Appelbaum had “no reasonable expectation of privacy” over his email records under the Fourth Amendment, which protects against unreasonable searches and seizures. Rather than seeking a search warrant that would require it to show probable cause that he had committed a crime, the government instead sought and received an order to obtain the data under a lesser standard, requiring only “reasonable grounds” to believe that the records were “relevant and material” to an ongoing criminal investigation. Google repeatedly attempted to challenge the demand, and wanted to immediately notify Appelbaum that his records were being sought so he could have an opportunity to launch his own legal defense. Attorneys for the tech giant argued in a series of court filings that the government’s case raised “serious First Amendment concerns.” They noted that Appelbaum’s records “may implicate journalistic and academic freedom” because they could “reveal confidential sources or information about WikiLeaks’ purported journalistic or academic activities.” However, the Justice Department asserted that “journalists have no special privilege to resist compelled disclosure of their records, absent evidence that the government is acting in bad faith,” and refused to concede Appelbaum was in fact a journalist. It claimed it had acted in “good faith throughout this criminal investigation, and there is no evidence that either the investigation or the order is intended to harass the … subscriber or anyone else.” Google’s attempts to fight the surveillance gag order angered the government, with the Justice Department stating that the company’s “resistance to providing the records” had “frustrated the government’s ability to efficiently conduct a lawful criminal investigation.”
  • The Justice Department wanted to keep the surveillance secret largely because of an earlier public backlash over its WikiLeaks investigation. In January 2011, Appelbaum and other WikiLeaks volunteers – including Icelandic parliamentarian Birgitta Jonsdottir – were notified by Twitter that the Justice Department had obtained data about their accounts. This disclosure generated widespread news coverage and controversy; the government says in the unsealed court records that it “failed to anticipate the degree of damage that would be caused” by the Twitter disclosure and did not want to “exacerbate this problem” when it went after Appelbaum’s Gmail data. The court documents show the Justice Department said the disclosure of its Twitter data grab “seriously jeopardized the [WikiLeaks] investigation” because it resulted in efforts to “conceal evidence” and put public pressure on other companies to resist similar surveillance orders. It also claimed that officials named in the subpoena ordering Twitter to turn over information were “harassed” after a copy was published by Intercept co-founder Glenn Greenwald at Salon in 2011. (The only specific evidence of the alleged harassment cited by the government is an email that was sent to an employee of the U.S. Attorney’s office that purportedly said: “You guys are fucking nazis trying to controll [sic] the whole fucking world. Well guess what. WE DO NOT FORGIVE. WE DO NOT FORGET. EXPECT US.”)
  • Google accused the government of hyperbole and argued that the backlash over the Twitter order did not justify secrecy related to the Gmail surveillance. “Rather than demonstrating how unsealing the order will harm its well-publicized investigation, the government lists a parade of horribles that have allegedly occurred since it unsealed the Twitter order, yet fails to establish how any of these developments could be further exacerbated by unsealing this order,” wrote Google’s attorneys. “The proverbial toothpaste is out of the tube, and continuing to seal a materially identical order will not change it.” But Google’s attempt to overturn the gag order was denied by magistrate judge Ivan D. Davis in February 2011. The company launched an appeal against that decision, but this too was rebuffed, in March 2011, by District Court judge Thomas Selby Ellis, III.
  • The government agreed to unseal some of the court records on Apr. 1 this year, and they were apparently turned over to Appelbaum on May 14 through a notification sent to his Gmail account. The files were released on condition that they would contain some redactions, which are bizarre and inconsistent, in some cases censoring the name of “WikiLeaks” from cited public news reports. Not all of the documents in the case – such as the original surveillance orders contested by Google – were released as part of the latest disclosure. Some contain “specific and sensitive details of the investigation” and “remain properly sealed while the grand jury investigation continues,” according to the court records from April this year. Appelbaum, an American citizen who is based in Berlin, called the case “a travesty that continues at a slow pace” and said he felt it was important to highlight “the absolute madness in these documents.”
  • He told The Intercept: “After five years, receiving such legal documents is neither a shock nor a needed confirmation. … Will we ever see the full documents about our respective cases? Will we even learn the names of those signing so-called legal orders against us in secret sealed documents? Certainly not in a timely manner and certainly not in a transparent, just manner.” The 32-year-old, who has recently collaborated with Intercept co-founder Laura Poitras to report revelations about National Security Agency surveillance for German news magazine Der Spiegel, said he plans to remain in Germany “in exile, rather than returning to the U.S. to experience more harassment of a less than legal kind.”
  • “My presence in Berlin ensures that the cost of physically harassing me or politically harassing me is much higher than when I last lived on U.S. soil,” Appelbaum said. “This allows me to work as a journalist freely from daily U.S. government interference. It also ensures that any further attempts to continue this will be forced into the open through [a Mutual Legal Assistance Treaty] and other international processes. The German government is less likely to allow the FBI to behave in Germany as they do on U.S. soil.” The Justice Department’s WikiLeaks investigation is headed by prosecutors in the Eastern District of Virginia. Since 2010, the secretive probe has seen activists affiliated with WikiLeaks compelled to appear before a grand jury and the FBI attempting to infiltrate the group with an informant. Earlier this year, it was revealed that the government had obtained the contents of three core WikiLeaks staffers’ Gmail accounts as part of the investigation.
Paul Merrell

Activists send the Senate 6 million faxes to oppose cyber bill - CBS News

  • Activists worried about online privacy are sending Congress a message with some old-school technology: They're sending faxes -- more than 6.2 million, they claim -- to express opposition to the Cybersecurity Information Sharing Act (CISA). Why faxes? "Congress is stuck in 1984 and doesn't understand modern technology," according to the campaign Fax Big Brother. The week-long campaign was organized by the nonpartisan Electronic Frontier Foundation, the group Access, and Fight for the Future, the activist group behind the major Internet protests that helped derail a pair of anti-piracy bills in 2012. It also has the backing of a dozen groups like the ACLU, the American Library Association, the National Association of Criminal Defense Lawyers and others.
  • CISA aims to facilitate information sharing regarding cyberthreats between the government and the private sector. The bill gained more attention following the massive hack in which the records of nearly 22 million people were stolen from government computers. "The ability to easily and quickly share cyber attack information, along with ways to counter attacks, is a key method to stop them from happening in the first place," Sen. Dianne Feinstein, D-California, who helped introduce CISA, said in a statement after the hack. Senate leadership had planned to vote on CISA this week before leaving for its August recess. However, the bill may be sidelined for the time being as the Republican-led Senate gives precedence to a legislative effort to defund Planned Parenthood. Even as the bill was put on the back burner, the grassroots campaign to stop it gained steam. Fight for the Future started sending faxes to all 100 Senate offices on Monday, but the campaign really took off after it garnered attention on the website Reddit and on social media. The faxed messages are generated by Internet users who visit faxbigbrother.com or stopcyberspying.com -- or who simply send a message via Twitter with the hashtag #faxbigbrother. To send all those faxes, Fight for the Future set up a dedicated server and a dozen phone lines and modems they say are capable of sending tens of thousands of faxes a day.
  • Fight for the Future told CBS News that it has so many faxes queued up at this point that it may take months for Senate offices to receive them all, though the group is working on scaling up its capability to send them faster. They're also limited by the speed at which Senate offices can receive them.
  •  
    From a Fight for the Future mailing: "Here's the deal: yesterday the Senate delayed its expected vote on CISA, the Cybersecurity Information Sharing Act that would let companies share your private information--like emails and medical records--with the government. "The delay is good news; but it's a delay, not a victory. "We just bought some precious extra time to fight CISA, but we need to use it to go big like we did with SOPA or this bill will still pass. Even if we stop it in September, they'll try again after that. "The truth is that right now, things are looking pretty grim. Democrats and Republicans have been holding closed-door meetings to work out a deal to pass CISA quickly when they return from recess. "Right before the expected Senate vote on CISA, the Obama Administration endorsed the bill, which means if Congress passes it, the White House will definitely sign it. "We've stalled and delayed CISA and bills like it nearly half a dozen times, but this month could be our last chance to stop it for good." See also http://tumblr.fightforthefuture.org/post/125953876003/senate-fails-to-advance-cisa-before-recess-amid; http://www.cbsnews.com/news/activists-send-the-senate-6-million-faxes-to-oppose-cyber-bill/; http://www.npr.org/2015/08/04/429386027/privacy-advocates-to-senate-cyber-security-bill.
Gary Edwards

Google Chrome 5 WebKit - Firefox - Opera Comparisons - BusinessWeek - 0 views

  •  
    Chrome runs as close as any browser can to the bleeding edge of Web standards. Though it uses the same open source WebKit rendering engine as Safari, it doesn't reliably support the controversial, proprietary CSS3 transformation and animation tricks that Apple's built into Safari. However, like every browser I tested, it earned a perfect score in a compatibility test for CSS3 selectors, and it joined Safari and Opera with a flawless score of 100 in the Acid3 web standards benchmark. Chrome 5 also supports both Apple's H.264 codec and Mozilla's preferred open source Ogg Theora technology for plugin-free HTML5 video, and it beautifully played back HTML5 demo videos from YouTube and Brightcove. In XHTML and CSS tests, Chrome was surprisingly slower than Safari, despite their shared rendering engine -- but the race was close. Safari rendered a local XHTML test page in 0.58 seconds to Chrome's 0.78 seconds, and a local CSS test page in 33 milliseconds to Chrome's 51 milliseconds. Note that Chrome still rendered XHTML more than twice as fast as Opera (1.67 seconds) and left Firefox (12.42 seconds--no, that's not a typo) eating its dust. In CSS, it also beat the pants off Opera (193 milliseconds) and Firefox (342 milliseconds). But Chrome shines brightest when handling JavaScript. Its V8 engine zipped through the SunSpider Javascript benchmark in 448.6 milliseconds, narrowly beating Opera's 485.8 milliseconds, and absolutely plastering Firefox's 1,161.4 milliseconds. However, Safari 5's time of 376.3 milliseconds in the SunSpider test beat Chrome 5 handily.
Gary Edwards

Cloudy Battle in Los Angeles: Microturf vs. Googzilla -- Redmond Developer News - 0 views

  •  
    Talk about a game changer. Excerpt: An epic battle is brewing out West with much more than a lucrative technology contract at stake: Microsoft Office or Google's cloud? As the Los Angeles Times reported yesterday, Microsoft and Google are bidding for a $7.25 million contract to replace the city of Los Angeles' outdated email system. Los Angeles put out a call for bids in 2008. "Google Apps got the nod because city administrators believed it would be cheaper and less labor-intensive," writes LA Times reporter David Sarno. We all knew this day of reckoning was coming. For Microsoft, the fight to hold on to its Office base is on. Google Apps, the Web-based office suite that includes the viral Gmail, promises less overhead and potentially big savings to fiscally strapped cities, corporations and college campuses. In addition to dispatching teams of lobbyists, both Steve Ballmer and Eric Schmidt have offered to put in appearances at city hall, if city officials think it will help, according to a city councilman quoted in the article.
Paul Merrell

Google Gets Semantic - Google Blog - InformationWeek - 0 views

  • While most emerging technologies mature quickly, one that has been "emerging" for a very long time is the Semantic Web. However, Google's recent acquisition of Metaweb may be the signal that the Semantic Web has finally arrived.
  • Metaweb's Freebase is definitely semantic, and the fact that Google has acquired the company could signal an increased focus on the Semantic Web within Google, something they have not always been that interested in. If Google increases the profile of Freebase, and begins incorporating more semantic technologies within their other services, it could finally offer the deep deployment that has always been missing in order for the Semantic Web to take off.
Gary Edwards

Method for invoking UOML instructions - Patent application - Embodiments of the present... - 1 views

  •  
    Patent application filed on OASIS UOML access by API. [0002]The present invention relates to electronic document processing technologies, and particularly to a method for encapsulating Unstructured Operation Markup Language (UOML) into an Application Programming Interface (API).  BACKGROUND OF THE INVENTION  [0003]The UOML standard includes a series of docbase management system instructions defined according to a format of "action+object" in Extensible Markup Language (XML), which has been explained in detail in the UOML Standard published by the Organization for the Advancement of Structured Information Standards (OASIS). Since XML works across different platforms and with different languages, the UOML standard can enable the docbase management system instructions to be exchanged across the different platforms in the different languages. However, in practical applications, operations on a docbase are usually controlled by using programs written in programming languages, hence the programs need to parse and process UOML XML texts. If every application developer designs his/her own way of parsing and processing UOML XML texts in his/her programs, the workload of coding will increase significantly and the efficiency of coding will drop sharply.  SUMMARY OF THE INVENTION  [0004]The objective of the present invention is to provide a method for encapsulating Unstructured Operation Markup Language (UOML) into an Application Programming Interface (API) of a programming language so as to improve the development efficiency of docbase management system application developers.  [0005]The method provided by the present invention for encapsulating UOML into an API includes:  Read more: http://www.faqs.org/patents/app/20090187927#ixzz0xVS2ZUSr
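    The core idea in the application, hiding the generation of "action+object" XML behind ordinary function calls so each developer doesn't hand-roll UOML parsing, can be sketched in a few lines of Python. This is purely illustrative: the element names, attributes, and the `DocBase` wrapper below are hypothetical, not the actual OASIS UOML bindings.

```python
import xml.etree.ElementTree as ET

def uoml_instruction(action: str, obj_type: str, **attrs) -> str:
    """Serialize one 'action + object' UOML-style instruction as XML.

    Element and attribute names here are illustrative stand-ins; the
    real vocabulary is defined by the OASIS UOML standard.
    """
    root = ET.Element(action)
    ET.SubElement(root, obj_type, {k: str(v) for k, v in attrs.items()})
    return ET.tostring(root, encoding="unicode")

class DocBase:
    """A thin API wrapper: callers invoke methods, never touch raw XML."""

    def open_doc(self, path: str) -> str:
        return uoml_instruction("open", "doc", path=path)

    def insert_text(self, doc_id: str, text: str) -> str:
        return uoml_instruction("insert", "text", doc=doc_id, value=text)

db = DocBase()
print(db.open_doc("/docs/report.uof"))  # an <open> element wrapping a <doc>
```

    The point of the patent's encapsulation is visible in the wrapper: the XML serialization logic lives in one place, so every application gets consistent instructions without duplicating parsing code.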
Gary Edwards

Father of CSS plans for Web publishing future | Deep Tech - CNET News - 1 views

  • "You paint a layout with ASCII art," a sort of visual design made out of text directly in the CSS code, Lie said, "then fill content into that. It's an experimental specification, but one I think has that compactness and terseness and minimalism that's part of CSS but still allows you to do quite advanced layouts."
    • Gary Edwards
       
      What???  Why not use SVG!
  •  
    After years of relative obscurity, the Web formatting standard called CSS, or Cascading Style Sheets, has come into its own, taking a starring role as the mechanism for building a new generation of interactive, elaborate Web pages. CSS is growing in new directions now, and the technology's original creator believes its next direction for improvement will be dealing with more complicated Web page layout chores. "There is important work left to be done for layout," Håkon Wium Lie, who is also Opera's chief technology officer, said in an interview. The new CSS3 under development now can handle multi-column text arrangements, "but you couldn't replicate a printed newspaper in CSS."
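    Lie's "paint a layout with ASCII art" idea is easy to demonstrate outside CSS. Here is a toy Python parser, purely an illustration of the concept rather than his experimental specification, that reads a text template in which each letter names a region and maps it to the grid cells that region covers:

```python
def parse_template(template: str) -> dict:
    """Map each template character to the (row, col) cells it covers.

    Mimics the ASCII-art layout idea: one letter = one named region,
    painted directly as text.
    """
    regions: dict = {}
    rows = [line for line in template.splitlines() if line.strip()]
    for r, row in enumerate(rows):
        for c, ch in enumerate(row):
            if ch == " ":
                continue  # blank cells are gaps between regions
            regions.setdefault(ch, []).append((r, c))
    return regions

# 'h' = header spanning the top, 's' = sidebar, 'm' = main content
layout = parse_template("""
hhhh
smmm
smmm
""")
print(sorted(layout))  # -> ['h', 'm', 's']
print(layout["s"])     # -> [(1, 0), (2, 0)]
```

    The appeal Lie describes, compactness and minimalism while still expressing advanced layouts, comes from the template being its own picture: the shape of the text is the shape of the page.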
Gary Edwards

Ex-Apple Javascript Guru: HTML5 and Native Apps Can Live Together: Tech News « - 0 views

  •  
    Good interview with Charles Jolley - SproutCore - WebKit (met Charles at Web 2.0). He has left Apple and started a SproutCore Web App development company called "Strobe". Looking very good, Charles! The Blended Brew: Apps have become a preferred way of accessing information on mobile devices. But developers want to provide a unified experience, and that is why Jolley believes that we will soon have apps that use HTML5 inside a native app wrapper. "People are looking for an either/or solution, but it is not going to end up like that," he said. Think of Strobe's offerings as a way to create an experience that is a blend of HTML5 and native mobile apps. How this works is that an application is developed in HTML5 instead of proprietary formats. It is wrapped in a native app wrapper for, say, the iPhone, but when accessed through a web browser on a PC or any other device, like a tablet, it offers the same user experience. This is a good way to solve a problem that is only going to get compounded manyfold as multiple endpoints for content start to emerge. The co-existence of web and native apps also means content publishers need to think differently about content and how it is offered to consumers. The multiplicity of endpoints (iPhone, iPad, TV and PC) is going to force content producers to think differently about how they build the user experiences for different sets of screens. Jolley argues that the best way to do so is to stop taking a document-centric view that is part of the PC-era. In the touch-based mobile device era, folks need to think of ways to have a single technology stack married to the ability to create unique experiences for different devices. And if you do that, there is no doubt that HTML5 and native apps can live in harmony.
Gary Edwards

oEmbed: How New Twitter Could Help Combine Content From Different Sites - 0 views

  •  
    transclusion of hypertext documents. Transclusion is technically defined as "when you put that one thing in that other thing". In its current implementation, Twitter has declared that media which is shown within the Twitter interface comes from selected partners. But actually, the technology to allow embedding of rich media from almost any site already exists, using a system called oEmbed. Geeky stuff, but it's made by nice people who are pretty smart, and it lets any site say, "Hey, if you want to put our thing in your thing, do it like this". It works. Lots of sites do it. Nobody's getting rich off of it, but nobody's getting sued, and in between those two extremes lies most of what makes the Web great.
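    The oEmbed contract the excerpt alludes to is a simple HTTP exchange: the consumer calls a provider's endpoint with the URL it wants to embed, and gets back JSON whose html field holds the embed markup. A minimal sketch of both sides (the endpoint and the reply below are hypothetical; field names follow the oEmbed specification, and no network request is actually made here):

```python
import json
from urllib.parse import urlencode

def oembed_request_url(endpoint: str, target_url: str, maxwidth: int = 500) -> str:
    """Build a consumer request: the provider endpoint plus the URL to
    embed and an optional size hint, per the oEmbed spec's query format."""
    params = {"url": target_url, "format": "json", "maxwidth": maxwidth}
    return endpoint + "?" + urlencode(params)

# A provider's reply (canned here for illustration); 'type', 'version',
# and 'html' are standard oEmbed response fields.
reply = json.loads("""{
  "version": "1.0",
  "type": "video",
  "html": "<iframe src='https://example.com/embed/42'></iframe>",
  "width": 500, "height": 281
}""")

url = oembed_request_url("https://example.com/oembed",
                         "https://example.com/watch/42")
if reply["type"] in ("video", "rich"):
    embed_markup = reply["html"]  # transclude this fragment into the host page
```

    That is the whole of "do it like this": one well-known query format in, one small JSON document out, which is why so many sites could adopt it without lawyers in the room.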
Gary Edwards

Open Source Cloud Collaboration - Port 25: The Open Source Community at Microsoft - 1 views

  •  
    Today Microsoft announced an open source cloud collaboration that may surprise some people, but not our customers and partners who have relied on our interoperability solutions over the past few years. Today Microsoft announced that it has partnered with Cloud.com to provide integration and support of Windows Server 2008 R2 Hyper-V to the OpenStack project, an open source cloud infrastructure platform. The Hyper-V addition provides enterprise customers running a mix of Microsoft and non-Microsoft technologies greater flexibility when using OpenStack. Until today, OpenStack only supported several open source virtualization products. Comment: Microsoft needs to slow down Google and keep Apple's focus elsewhere. Contributing a Windows-only hypervisor to the OSS Cloud.com OpenStack project is one way Microsoft can hedge their own flailing Azure Cloud effort. Read the Ray Ozzie good-bye letter. The combination of Cloud, Web and Mobile Computing is the end of the Windows OS.
Maluvia Haseltine

Open Web Foundation - 0 views

  •  
    An independent non-profit dedicated to the development and protection of open, non-proprietary specifications for web technologies. Aimed at building a lightweight framework to help communities deal with the legal requirements necessary to create successful and widely adopted specifications. hoping to break the trend of creating separate legal entities to support individual specifications, coming out of the realization that we could come together and generalize our efforts.
Gary Edwards

How Sir Tim Berners-Lee cut the Gordian Knot of HTML5 | Technology | guardian.co.uk - 0 views

  •  
    Good article with excellent URL references.  Bottom line is that the W3C will support the advance of HTML5 and controversial components such as "canvas", HTML + RDFa, and HTML microdata. excerpt: The key question is: who's going to get their way with HTML5? The companies who want to keep the kitchen sink in? Or those which want it to be a more flexible format which might also be able to displace some rather comfortable organisations that are doing fine with things as they are? Adobe, it turned out, seemed to be trying to slow things down a little. It was accused of trying to put HTML5 "on hold". It strongly denied it. Others said it was using "procedural bullshit". Then Berners-Lee weighed in with a post on the W3 mailing list. First he noted the history: "Some in the community have raised questions recently about whether some work products of the HTML Working Group are within the scope of the Group's charter. Specifically in question were the HTML Canvas 2D API, and the HTML Microdata and HTML+RDFa Working Drafts." (Translation: Adobe seems to have been trying to slow things down on at least one of these points.) And then he pushes: "I agree with the WG [working group] chairs that these items -- data and canvas - are reasonable areas of work for the group. It is appropriate for the group to publish documents in this area." Chop! And that's it. There goes the Gordian Knot. With that simple message, Berners-Lee has probably created a fresh set of headaches for Adobe - but it means that we can also look forward to a web with open standards, rather than proprietary ones, and where commercial interests don't get to push it around.
Gary Edwards

Why You Should Upload Documents to Office Web Apps via SkyDrive - 0 views

  •  
    Here it comes - the "rich" Web experience based on integrated but proprietary 2010 technologies from Microsoft. Note the comparative "advantages" listed in this article describing Microsoft SkyDrive and comparing it to Google Docs. Excerpt: Do you use Microsoft Office programs for creating documents and then use Google Docs to edit these documents online or as an offsite backup? Well, now that Office 2010 and Office Web Apps are available under public beta for free, here are some reasons why you should consider uploading documents, presentations and spreadsheets into Office Web Apps via Windows Live SkyDrive in addition to your Google Docs account: 1. Windows Live SkyDrive supports larger files; 2. Document formatting is preserved; 3. Native OpenXML file formats; 4. Public Documents are in the Lifestream; 5. Content is not 'lost in translation'. When you upload a document in Office Web Apps, the application will automatically preserve all the data in that document even if a particular feature is not currently supported by the online applications. For instance, if your PowerPoint presentation contains a slide transition (e.g., Vortex) that is not supported in the online version of Office, the feature will be preserved in your presentation even if you upload it on to Office Web Apps via Windows Live SkyDrive. Later, when you download and open that presentation inside PowerPoint, it would be just like the original version. The content is not 'lost in translation' with Office Web Apps. Are you using Google Docs as a Document Backup Service? Office Web Apps won't just preserve all the original features of your documents but you can also download entire directories of Office documents as a ZIP file with a simple click.
Paul Merrell

WG Review: Internet Wideband Audio Codec (codec) - 0 views

  •  
    A new IETF working group has been proposed in the Real-time Applications and Infrastructure Area. The IESG has not made any determination as yet. The following draft charter was submitted, and is provided for informational purposes only. Please send your comments to the IESG mailing list (iesg at ietf.org) by January 20, 2010. ... According to reports from developers of Internet audio applications and operators of Internet audio services, there are no standardized, high-quality audio codecs that meet all of the following three conditions: 1. Are optimized for use in interactive Internet applications. 2. Are published by a recognized standards development organization (SDO) and therefore subject to clear change control. 3. Can be widely implemented and easily distributed among application developers, service operators, and end users. ... The goal of this working group is to develop a single high-quality audio codec that is optimized for use over the Internet and that can be widely implemented and easily distributed among application developers, service operators, and end users. Core technical considerations include, but are not necessarily limited to, the following: 1. Designing for use in interactive applications (examples include, but are not limited to, point-to-point voice calls, multi-party voice conferencing, telepresence, teleoperation, in-game voice chat, and live music performance) 2. Addressing the real transport conditions of the Internet as identified and prioritized by the working group 3. Ensuring interoperability with the Real-time Transport Protocol (RTP), including secure transport via SRTP 4. Ensuring interoperability with Internet signaling technologies such as Session Initiation Protocol (SIP), Session Description Protocol (SDP), and Extensible Messaging and Presence Protocol (XMPP); however, the result should not depend on the details of any particular signaling technology.
Gary Edwards

10 most useful Google Chrome experiments | ITworld - 1 views

  •  
    When it comes to presenting graphically oriented programs through a browser, the usual go-to development platforms have been Adobe Flash and -- to a lesser extent -- Microsoft Silverlight. But other, more open technologies are starting to show promise. The 10 best Chrome extensions for work and play |Watch a slideshow of this review. That's what Google aims to highlight on Chrome Experiments, a Web site that showcases JavaScript programs that deliver a rich user-graphics experience. Of the nearly 80 projects featured on Chrome Experiments, the majority are graphic demos. As impressive as such eye candy is, they're not good examples of how capable JavaScript can be for running graphically-oriented applications that are actually useful. But there are a few notable ones, which we present here. (Despite the site's name, these programs should run on any browser that supports JavaScript.)
Paul Merrell

Official Google Blog: Alis volat propriis: Oregon's bringing Google Apps to classrooms ... - 0 views

  • Things have changed since I was in middle school of course, and there are people working hard to bring technology into classrooms to help students learn and teachers teach. Today Oregon is taking a huge step in that direction — they're the first state to open up Google Apps for Education to public schools throughout the state. Starting today, the Oregon Department of Education will offer Google Apps to all the school districts in the state — helping teachers, staff and students use Gmail, Docs, Sites, Video, Groups and more within their elementary, middle and high schools. School funding has been hit hard over the past couple of years, and Oregon is no exception. This move is going to save the Department of Education $1.5 million per year — big bucks for a hurting budget. With Google Apps, students in Oregon can build websites or email teachers about a project. Their documents and email will live online in the cloud — so they'll be able to work from a classroom or a computer lab, at home or at the city (or county) library. And instead of just grading a paper at the end of the process, Oregonian teachers can help students with their docs in real time, coaching them along the way. It's critical that students learn how to use the kind of productivity technology they'll need throughout their lives, and Oregon is helping students across the state do just that.
Paul Merrell

Cover Pages: Content Management Interoperability Services (CMIS) - 0 views

  • On October 06, 2008, OASIS issued a public call for participation in a new technical committee chartered to define specifications for use of Web services and Web 2.0 interfaces to enable information sharing across content management repositories from different vendors. The OASIS Content Management Interoperability Services (CMIS) TC will build upon existing specifications to "define a domain model and bindings that are designed to be layered on top of existing Content Management systems and their existing programmatic interfaces. The TC will not prescribe how specific features should be implemented within those Enterprise Content Management (ECM) systems. Rather it will seek to define a generic/universal set of capabilities provided by an ECM system and a set of services for working with those capabilities." As of February 17, 2010, the CMIS technical work had received broad support through TC participation, industry analyst opinion, and declarations of interest from major companies. Some of these include Adobe, Adullact, AIIM, Alfresco, Amdocs, Anakeen, ASG Software Solutions, Booz Allen Hamilton, Capgemini, Citytech, Content Technologies, Day Software, dotCMS, Ektron, EMC, EntropySoft, ESoCE-NET, Exalead, FatWire, Fidelity, Flatirons, fme AG, Genus Technologies, Greenbytes GmbH, Harris, IBM, ISIS Papyrus, KnowledgeTree, Lexmark, Liferay, Magnolia, Mekon, Microsoft, Middle East Technical University, Nuxeo, Open Text, Oracle, Pearson, Quark, RSD, SAP, Saperion, Structured Software Systems (3SL), Sun Microsystems, Tanner AG, TIBCO Software, Vamosa, Vignette, and WeWebU Software. Early commentary from industry analysts and software engineers is positive about the value proposition in standardizing an enterprise content-centric management specification. The OASIS announcement of November 17, 2008 includes endorsements. 
Principal use cases motivating the CMIS technical work include collaborative content applications, portals leveraging content management repositories, mashups, and searching a content repository.
  •  
    I should have posted before about CMIS, an emerging standard with a very lot of buy-in by vendors large and small. I've been watching the buzz grow via Robin Cover's Daily XML links service. It's now on my "need to watch" list.
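    CMIS pairs its domain model with a SQL-like query language, so "working with those capabilities" in practice often means composing a CMIS QL statement and sending it over one of the bindings. A small sketch of that composition (the repository base URL is hypothetical; cmis:document, cmis:name, and cmis:objectId are standard names from the CMIS model):

```python
from urllib.parse import quote

def cmis_query(type_id: str, where: str = "") -> str:
    """Compose a CMIS QL statement against a repository type.

    CMIS QL is deliberately SQL-like: SELECT properties FROM a type,
    with an optional WHERE clause over property names.
    """
    stmt = f"SELECT cmis:name, cmis:objectId FROM {type_id}"
    if where:
        stmt += f" WHERE {where}"
    return stmt

stmt = cmis_query("cmis:document", "cmis:name LIKE 'Invoice%'")
# In the AtomPub binding the statement travels in the query URI's 'q'
# parameter; the base URL here is illustrative, not a real endpoint.
query_uri = ("https://ecm.example.com/cmis/query?q="
             + quote(stmt) + "&maxItems=10")
print(stmt)
```

    Because the query targets the generic model (cmis:document) rather than any vendor's schema, the same statement should run against any conforming repository, which is the interoperability point of the TC's charter.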