Future of the Web: Group items matching "profiles" in title, tags, annotations or url

Paul Merrell

Membership applications - 7 views

Folks, I apologize for the recent spate of spam bookmarks in the Future of the Web group. The only pattern I've been able to discern is that all of the spammers were people who had just registered ...

admin

started by Paul Merrell on 16 Jun 10 no follow-up yet
Paul Merrell

W3C Standards Make Mobile Web Experience More Inviting - 0 views

  • W3C today announced new standards that will make it easier for people to browse the Web on mobile devices. Mobile Web Best Practices 1.0, published as a W3C Recommendation, condenses the experience of many mobile Web stakeholders into practical advice on creating mobile-friendly content.
  • Until today, content developers faced an additional challenge: a variety of mobile markup languages to choose from. With the publication of the XHTML Basic 1.1 Recommendation today, the preferred format specification of the Best Practices, there is now a full convergence in mobile markup languages, including those developed by the Open Mobile Alliance (OMA). The W3C mobileOK checker (beta), when used with the familiar W3C validator, helps developers test mobile-friendly Web content.
  • W3C is also developing resources to help authors understand how to create content that is both mobile-friendly and accessible to people with disabilities. A draft of Relationship between Mobile Web Best Practices (MWBP) and Web Content Accessibility Guidelines (WCAG) is jointly published by the Mobile Web Best Practices Working Group and WAI's Education & Outreach Working Group (EOWG).
Paul Merrell

W3C releases Working Draft for Widgets 1.0: APIs and Events - 0 views

  • This specification defines a set of APIs and events for the Widgets 1.0 Family of Specifications that enable baseline functionality for widgets. The APIs and events defined by this specification provide, amongst other things, the means to: access the metadata declared in a widget's configuration document, receive events related to changes in the view state of a widget, determine the locale under which a widget is currently running, be notified of events relating to the widget being updated, invoke a widget to open a URL in the system's default browser, request the user's attention in a device-independent manner, and check whether any additional APIs requested via the configuration document's feature element have successfully loaded.
  • This specification defines a set of APIs and events for widgets that enable baseline functionality for widgets. Widgets are full-fledged client-side applications that are authored using Web standards. They are typically downloaded and installed on a client machine or device where they typically run as stand-alone applications outside of a Web browser. Examples range from simple clocks, stock tickers, news casters, games and weather forecasters, to complex applications that pull data from multiple sources to be "mashed-up" and presented to a user in some interesting and useful way
  • This specification is part of the Widgets 1.0 family of specifications, which together standardize widgets as a whole. The Widgets 1.0: Packaging and Configuration [Widgets-Packaging] specification standardizes a Zip-based packaging format, an XML-based configuration document format and a series of steps that user agents follow when processing and verifying various aspects of widgets. The Widgets 1.0: Digital Signature [Widgets-DigSig] specification defines a means for widgets to be digitally signed using a custom profile of the XML-Signature Syntax and Processing Specification. The Widgets 1.0: Automatic Updates [Widgets-Updates] specification defines a version control model that allows widgets to be kept up-to-date over [HTTP].
Paul Merrell

XHTML Modularization 1.1 Released as W3C Recommendation - 0 views

  • XHTML Modularization is a decomposition of XHTML 1.0, and by reference HTML 4, into a collection of abstract modules that provide specific types of functionality.
  • The modularization of XHTML refers to the task of specifying well-defined sets of XHTML elements that can be combined and extended by document authors, document type architects, other XML standards specifications, and application and product designers to make it economically feasible for content developers to deliver content on a greater number and diversity of platforms. Over the last couple of years, many specialized markets have begun looking to HTML as a content language. There is a great movement toward using HTML across increasingly diverse computing platforms. Currently there is activity to move HTML onto mobile devices (hand held computers, portable phones, etc.), television devices (digital televisions, TV-based Web browsers, etc.), and appliances (fixed function devices). Each of these devices has different requirements and constraints.
  • XHTML Modularization is a decomposition of XHTML 1.0, and by reference HTML 4, into a collection of abstract modules that provide specific types of functionality. These abstract modules are implemented in this specification using the XML Schema and XML Document Type Definition languages. The rules for defining the abstract modules, and for implementing them using XML Schemas and XML DTDs, are also defined in this document. These modules may be combined with each other and with other modules to create XHTML subset and extension document types that qualify as members of the XHTML-family of document types.
  • ...1 more annotation...
  • Modularizing XHTML provides a means for product designers to specify which elements are supported by a device using standard building blocks and standard methods for specifying which building blocks are used. These modules serve as "points of conformance" for the content community. The content community can now target the installed base that supports a certain collection of modules, rather than worry about the installed base that supports this or that permutation of XHTML elements. The use of standards is critical for modularized XHTML to be successful on a large scale. It is not economically feasible for content developers to tailor content to each and every permutation of XHTML elements. By specifying a standard, either software processes can autonomously tailor content to a device, or the device can automatically load the software required to process a module. Modularization also allows for the extension of XHTML's layout and presentation capabilities, using the extensibility of XML, without breaking the XHTML standard. This development path provides a stable, useful, and implementable framework for content developers and publishers to manage the rapid pace of technological change on the Web.
Gonzalo San Gil, PhD.

2011 Interested in Day Against DRM | Free Software Foundation - 0 views

  •  
    Take part in the Day Against DRM. Participate to liberate culture and technology from their imposed chains.
Ernest Rando

Google Analytics Reports are back concerning the results of using Twitter and blogging to increase KS Design Website awareness on the internet. - Smaller Indiana - 0 views

  •  
    The Google Analytics reports on the effects Twitter and blogging have had on KS Design, my company.
Paul Merrell

W3C Issues Report on Web and Television Convergence - 0 views

  • 28 March 2011 -- The Web and television convergence story was the focus of W3C's Second Web and TV Workshop, which took place in Berlin in February. Today, W3C publishes a report that summarizes the discussion among the 77 organizations that participated, including broadcasters, telecom companies, cable operators, OTT (over the top) companies, content providers, device vendors, software vendors, Web application providers, researchers, governments, and standardization organizations active in the TV space. Convergence priorities identified in the report include: adaptive streaming over HTTP; home networking and second-screen scenarios; the role of metadata and its relation to Semantic Web technology; ensuring that convergent solutions are accessible; profiling and testing; and possible extensions to HTML5 for television.
Paul Merrell

Why the Sony hack is unlikely to be the work of North Korea. | Marc's Security Ramblings - 0 views

  • Everyone seems to be eager to pin the blame for the Sony hack on North Korea. However, I think it’s unlikely. Here’s why: 1. The broken English looks deliberately bad and doesn’t exhibit any of the classic comprehension mistakes you actually expect to see in “Konglish”, i.e. it reads to me like an English speaker pretending to be bad at writing English. 2. The fact that the code was written on a PC with Korean locale & language actually makes it less likely to be North Korea. Not least because they don’t speak traditional “Korean” in North Korea, they speak their own dialect and traditional Korean is forbidden. This is one of the key things that has made communication with North Korean refugees difficult. I would find the presence of Chinese far more plausible.
  • 3. It’s clear from the hard-coded paths and passwords in the malware that whoever wrote it had extensive knowledge of Sony’s internal architecture and access to key passwords. While it’s plausible that an attacker could have built up this knowledge over time and then used it to make the malware, Occam’s razor suggests the simpler explanation of an insider. It also fits with the pure revenge tack that this started out as. 4. Whoever did this is in it for revenge. The info and access they had could have easily been used to cash out, yet, instead, they are making every effort to burn Sony down. Just think what they could have done with passwords to all of Sony’s financial accounts? With the competitive intelligence in their business documents? From simple theft, to the sale of intellectual property, or even extortion – the attackers had many ways to become rich. Yet, instead, they chose to dump the data, rendering it useless. Likewise, I find it hard to believe that a “Nation State” which lives by propaganda would be so willing to just throw away such an unprecedented level of access to the beating heart of Hollywood itself.
  • 5. The attackers only latched onto “The Interview” after the media did – the film was never mentioned by GOP right at the start of their campaign. It was only after a few people started speculating in the media that this and the communication from DPRK “might be linked” that suddenly it became linked. I think the attackers both saw this as an opportunity for “lulz” and as a way to misdirect everyone into thinking it was a nation state. After all, if everyone believes it’s a nation state, then the criminal investigation will likely die.
  • ...4 more annotations...
  • 6. Whoever is doing this is VERY net and social media savvy. That, and the sophistication of the operation, do not match the profile of the DPRK up until now. Grugq did an excellent analysis of this aspect; his findings are here – http://0paste.com/6875#md 7. Finally, blaming North Korea is the easy way out for a number of folks, including the security vendors and Sony management who are under the microscope for this. Let’s face it – most of today’s so-called “cutting edge” security defenses are either so specific, or so brittle, that they really don’t offer much meaningful protection against a sophisticated attacker or group of attackers.
  • 8. It probably also suits a number of political agendas to have something that justifies sabre-rattling at North Korea, which is why I’m not that surprised to see politicians starting to point their fingers at the DPRK also. 9. It’s clear from the leaked data that Sony has a culture which doesn’t take security very seriously. From plaintext password files, to using “password” as the password in business critical certificates, through to just the sheer volume of aging unclassified yet highly sensitive data left out in the open. This isn’t a simple slip-up or a “weak link in the chain” – this is a serious organization-wide failure to implement anything like a reasonable security architecture.
  • The reality is, as things stand, Sony has little choice but to burn everything down and start again. Every password, every key, every certificate is tainted now and that’s a terrifying place for an organization to find itself. This hack should be used as the definitive lesson in why security matters and just how bad things can get if you don’t take it seriously. 10. Who do I think is behind this? My money is on a disgruntled (possibly ex) employee of Sony.
  • EDIT: This appears (at least in part) to be substantiated by a conversation the Verge had with one of the alleged hackers – http://www.theverge.com/2014/11/25/7281097/sony-pictures-hackers-say-they-want-equality-worked-with-staff-to-break-in Finally for an EXCELLENT blow by blow analysis of the breach and the events that followed, read the following post by my friends from Risk Based Security – https://www.riskbasedsecurity.com/2014/12/a-breakdown-and-analysis-of-the-december-2014-sony-hack EDIT: Also make sure you read my good friend Krypt3ia’s post on the hack – http://krypt3ia.wordpress.com/2014/12/18/sony-hack-winners-and-losers/
  •  
    Seems that the FBI overlooked a few clues before it told Obama to go ahead and declare war against North Korea. 
Paul Merrell

Verizon Will Now Let Users Kill Previously Indestructible Tracking Code - ProPublica - 0 views

  • Verizon says it will soon offer customers a way to opt out from having their smartphone and tablet browsing tracked via a hidden un-killable tracking identifier. The decision came after a ProPublica article revealed that an online advertiser, Turn, was exploiting the Verizon identifier to respawn tracking cookies that users had deleted. Two days after the article appeared, Turn said it would suspend the practice of creating so-called "zombie cookies" that couldn't be deleted. But Verizon couldn't assure users that other companies might not also exploit the number - which was transmitted automatically to any website or app a user visited from a Verizon-enabled device - to build dossiers about people's behavior on their mobile devices. Verizon subsequently updated its website to note Turn's decision and declared that it would "work with other partners to ensure that their use of [the undeletable tracking number] is consistent with the purposes we intended." Previously, its website had stated: "It is unlikely that sites and ad entities will attempt to build customer profiles."
  • However, policing the hundreds of companies in the online tracking business was likely to be a difficult task for Verizon. And so, on Monday, Verizon followed in the footsteps of AT&T, which had already declared in November that it would stop inserting the hidden undeletable number in its users' Web traffic. In a statement emailed to reporters on Friday, Verizon said, "We have begun working to expand the opt-out to include the identifier referred to as the UIDH, and expect that to be available soon." Previously, users who opted out from Verizon's program were told that information about their demographics and Web browsing behavior would no longer be shared with advertisers, but that the tracking number would still be attached to their traffic. For more coverage, read ProPublica's previous reporting on Verizon's indestructible tracking and how one company used the tool to create zombie cookies.
  •  
    Good for Pro Publica!
Paul Merrell

Verizon Injecting Perma-Cookies to Track Mobile Customers, Bypassing Privacy Controls | Electronic Frontier Foundation - 0 views

  • Verizon users might want to start looking for another provider. In an effort to better serve advertisers, Verizon Wireless has been silently modifying its users' web traffic on its network to inject a cookie-like tracker. This tracker, included in an HTTP header called X-UIDH, is sent to every unencrypted website a Verizon customer visits from a mobile device. It allows third-party advertisers and websites to assemble a deep, permanent profile of visitors' web browsing habits without their consent. Verizon apparently created this mechanism to expand their advertising programs, but it has privacy implications far beyond those programs. Indeed, while we're concerned about Verizon's own use of the header, we're even more worried about what it allows others to find out about Verizon users. The X-UIDH header effectively reinvents the cookie, but does so in a way that is shockingly insecure and dangerous to your privacy. Worse still, Verizon doesn't let users turn off this "feature." In fact, it functions even if you use a private browsing mode or clear your cookies. You can test whether the header is injected in your traffic by visiting lessonslearned.org/sniff or amibeingtracked.com over a cell data connection (a sketch of the kind of header check such test pages perform appears after this item's annotations).
  • To compound the problem, the header also affects more than just web browsers. Mobile apps that send HTTP requests will also have the header inserted. This means that users' behavior in apps can be correlated with their behavior on the web, which would be difficult or impossible without the header. Verizon describes this as a key benefit of using their system. But Verizon bypasses the 'Limit Ad Tracking' settings in iOS and Android that are specifically intended to limit abuse of unique identifiers by mobile apps.
  • Because the header is injected at the network level, Verizon can add it to anyone using their towers, even those who aren't Verizon customers.
  • ...1 more annotation...
  • We're also concerned that Verizon's failure to permit its users to opt out of X-UIDH may be a violation of the federal law that requires phone companies to maintain the confidentiality of their customers' data. Only two months ago, the wireline sector of Verizon's business was hit with a $7.4 million fine by the Federal Communications Commission after it was caught using its "customers' personal information for thousands of marketing campaigns without even giving them the choice to opt out." With this header, it looks like Verizon lets its customers opt out of the marketing side of the program, but not from the disclosure of their browsing habits.
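
A minimal, hypothetical sketch of the kind of server-side check the test pages mentioned above perform: echo back the headers a request arrived with and flag whether a carrier-injected X-UIDH header is present. The handler name and port are illustrative assumptions; this is not Verizon's or the EFF's actual tooling.

```python
# Assumption: an illustrative echo server, not the code behind
# lessonslearned.org/sniff or amibeingtracked.com.
from http.server import BaseHTTPRequestHandler, HTTPServer


class HeaderEchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # If X-UIDH shows up here, it was added by the carrier in transit;
        # the phone's browser never sent it itself.
        uidh = self.headers.get("X-UIDH")
        body = (
            f"X-UIDH present: {uidh is not None}\n"
            f"Value: {uidh or '(none)'}\n\n"
            "All request headers:\n"
            + "".join(f"{name}: {value}\n" for name, value in self.headers.items())
        ).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Visit the server over plain HTTP on a cellular data connection; the
    # header was only injected into unencrypted traffic.
    HTTPServer(("0.0.0.0", 8080), HeaderEchoHandler).serve_forever()
```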
Paul Merrell

RIOT gear: your online trail just got way more visible - 0 views

  • The recent publication of a leaked video demonstrating American security firm Raytheon’s social media mining tool RIOT (Rapid Information Overlay Technology) has rightly incensed individuals and online privacy groups. In a nutshell, RIOT – already shared with US government and industry as part of a joint research and development effort in 2010 – uses social media traces to profile people’s activities, map their contacts, and predict their future activities. Yet the most surprising thing isn’t how RIOT works, but that the information it mines is what we’ve each already shared publicly.
  •  
    Privacy Act, 5 U.S.C. § 552a(e): "Each agency that maintains a system of records shall- ... "(7) maintain no record describing how any individual exercises rights guaranteed by the First Amendment unless expressly authorized by statute or by the individual about whom the record is maintained or unless pertinent to and within the scope of an authorized law enforcement activity"
Paul Merrell

Apple could use Brooklyn case to pursue details about FBI iPhone hack: source | Reuters - 0 views

  • If the U.S. Department of Justice asks a New York court to force Apple Inc to unlock an iPhone, the technology company could push the government to reveal how it accessed the phone which belonged to a shooter in San Bernardino, a source familiar with the situation said. The Justice Department will disclose over the next two weeks whether it will continue with its bid to compel Apple to help access an iPhone in a Brooklyn drug case, according to a court filing on Tuesday. The Justice Department this week withdrew a similar request in California, saying it had succeeded in unlocking an iPhone used by one of the shooters involved in a rampage in San Bernardino in December without Apple's help. The legal dispute between the U.S. government and Apple has been a high-profile test of whether law enforcement should have access to encrypted phone data.
  • Apple, supported by most of the technology industry, says anything that helps authorities bypass security features will undermine security for all users. Government officials say that all kinds of criminal investigations will be crippled without access to phone data. Prosecutors have not said whether the San Bernardino technique would work for other seized iPhones, including the one at issue in Brooklyn. Should the Brooklyn case continue, Apple could pursue legal discovery that would potentially force the FBI to reveal what technique it used on the San Bernardino phone, the source said. A Justice Department representative did not have immediate comment.
Paul Merrell

Civil Society Groups Ask Facebook To Provide Method To Appeal Censorship | PopularResistance.Org - 0 views

  • EFF, Human Rights Watch, and Over 70 Civil Society Groups Ask Mark Zuckerberg to Provide All Users with Mechanism to Appeal Content Censorship on Facebook World’s Freedom of Expression Is In Your Hands, Groups Tell CEO San Francisco—The Electronic Frontier Foundation (EFF) and more than 70 human and digital rights groups called on Mark Zuckerberg today to add real transparency and accountability to Facebook’s content removal process. Specifically, the groups demand that Facebook clearly explain how much content it removes, both rightly and wrongly, and provide all users with a fair and timely method to appeal removals and get their content back up. While Facebook is under enormous—and still mounting—pressure to remove material that is truly threatening, without transparency, fairness, and processes to identify and correct mistakes, Facebook’s content takedown policies too often backfire and silence the very people that should have their voices heard on the platform.  Politicians, museums, celebrities, and other high profile groups and individuals whose improperly removed content can garner media attention seem to have little trouble reaching Facebook to have content restored—they sometimes even receive an apology. But the average user? Not so much. Facebook only allows people to appeal content decisions in a limited set of circumstances, and in many cases, users have absolutely no option to appeal. Onlinecensorship.org, an EFF project for users to report takedown notices, has collected reports of hundreds of unjustified takedown incidents where appeals were unavailable. For most users, content Facebook removes is rarely restored, and some are banned from the platform for no good reason. EFF, Article 19, the Center for Democracy and Technology, and Ranking Digital Rights wrote directly to Mark Zuckerberg today demanding that Facebook implement common sense standards so that average users can easily appeal content moderation decisions, receive prompt replies and timely review by a human or humans, and have the opportunity to present evidence during the review process. The letter was co-signed by more than 70 human rights, digital rights, and civil liberties organizations from South America, Europe, the Middle East, Asia, Africa, and the U.S.
Paul Merrell

'I made Steve Bannon's psychological warfare tool': meet the data war whistleblower | News | The Guardian - 0 views

  • For more than a year we’ve been investigating Cambridge Analytica and its links to the Brexit Leave campaign in the UK and Team Trump in the US presidential election. Now, 28-year-old Christopher Wylie goes on the record to discuss his role in hijacking the profiles of millions of Facebook users in order to target the US electorate
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 1 views

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant.
  • ...17 more annotations...
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days and six months. It stores the content of communications for a shorter period of time, varying from three to 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated from January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata (a small illustration of this split appears after this item's annotations). In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explain in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
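
To make the content/metadata split described above concrete, here is a small illustrative Python sketch (my own, with no connection to any agency tooling): it separates a visited URL into the host portion the article says is treated as metadata and the full address treated as content.

```python
# Illustrative only: shows the split described in the article, nothing more.
from urllib.parse import urlsplit


def split_for_retention(url: str) -> dict:
    """Return the 'metadata' view (scheme + host) and the 'content' view (full URL)."""
    parts = urlsplit(url)
    return {
        "metadata": f"{parts.scheme}://{parts.netloc}",  # e.g. https://www.gchq.gov.uk
        "content": url,                                  # e.g. https://www.gchq.gov.uk/what_we_do
    }


if __name__ == "__main__":
    print(split_for_retention("https://www.gchq.gov.uk/what_we_do"))
```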
Paul Merrell

Google Concealed Data Breach Over Fear Of Repercussions; Shuts Down Google+ Service | Zero Hedge - 0 views

  • Google opted in the Spring not to disclose that the data of hundreds of thousands of Google+ users had been exposed because the company says they found no evidence of misuse, reports the Wall Street Journal. The Silicon Valley giant feared both regulatory scrutiny and reputational damage, according to documents reviewed by the Journal and people briefed on the incident. In response to being busted, Google parent Alphabet is set to announce broad privacy measures which include permanently shutting down all consumer functionality of Google+, a move which "effectively puts the final nail in the coffin of a product that was launched in 2011 to challenge Facebook, and is widely seen as one of Google's biggest failures." Shares in Alphabet fell as much as 2.1% following the Journal's report.
  • The software glitch gave outside developers access to private Google+ profile data between 2015 and March 2018, after Google internal investigators found the problem and fixed it. According to a memo prepared by Google's legal and policy staff and reviewed by the Journal, senior executives worried that disclosing the incident would probably trigger "immediate regulatory interest," while inviting comparisons to Facebook's massive data harvesting scandal. 
Paul Merrell

Belgian court finds Facebook guilty of breaching privacy laws - nsnbc international | nsnbc international - 0 views

  • A court in the Belgian capital Brussels, on Friday, found the social media company Facebook guilty of breaching Belgian privacy laws. Belgium’s Privacy Commission had taken Facebook to court and the judge agreed with the Commission’s view that Facebook had flouted the country’s privacy legislation. The company has been ordered to correct its practice right away or face fines. Facebook has lodged an appeal.
  • Facebook follows its users’ activities by means of so-called social plug-ins, cookies, and pixels. These digital technologies enable Facebook to follow users’ behavior when online. Cookies, for example, are small files that are attached to your internet browser when you go online and visit a particular site. They are used to collect information about the kind of things you like to read or look at while surfing the web. Facebook uses the data both for its own ends and to help advertisers send tailor-made advertising. In so doing Facebook also uses certain cookies to follow people that don’t even have a Facebook profile. The court ruled that it is “unclear what information Facebook is collecting about us” and “what it uses the information for”. Moreover, Facebook has not been given permission to keep tabs on internet users by a court of law.
  • The court has ordered Facebook to stop the practice straight away and to delete any data that it has obtained by means contrary to Belgian privacy legislation. If Facebook fails to comply it will face a penalty payment of 250,000 euro/day.
  • ...1 more annotation...
  • Facebook, for its part, has said that it is to appeal against the verdict.
Paul Merrell

HART: Homeland Security's Massive New Database Will Include Face Recognition, DNA, and Peoples' "Non-Obvious Relationships" | Electronic Frontier Foundation - 0 views

  • The U.S. Department of Homeland Security (DHS) is quietly building what will likely become the largest database of biometric and biographic data on citizens and foreigners in the United States. The agency’s new Homeland Advanced Recognition Technology (HART) database will include multiple forms of biometrics—from face recognition to DNA, data from questionable sources, and highly personal data on innocent people. It will be shared with federal agencies outside of DHS as well as state and local law enforcement and foreign governments. And yet, we still know very little about it. The records DHS plans to include in HART will chill and deter people from exercising their First Amendment protected rights to speak, assemble, and associate. Data like face recognition makes it possible to identify and track people in real time, including at lawful political protests and other gatherings. Other data DHS is planning to collect—including information about people’s “relationship patterns” and from officer “encounters” with the public—can be used to identify political affiliations, religious activities, and familial and friendly relationships. These data points are also frequently colored by conjecture and bias.
  • DHS currently collects a lot of data. Its legacy IDENT fingerprint database contains information on 220-million unique individuals and processes 350,000 fingerprint transactions every day. This is an exponential increase from 20 years ago when IDENT only contained information on 1.8-million people. Between IDENT and other DHS-managed databases, the agency manages over 10-billion biographic records and adds 10-15 million more each week.
  • DHS’s new HART database will allow the agency to vastly expand the types of records it can collect and store. HART will support at least seven types of biometric identifiers, including face and voice data, DNA, scars and tattoos, and a blanket category for “other modalities.” It will also include biographic information, like name, date of birth, physical descriptors, country of origin, and government ID numbers. And it will include data we know to be highly subjective, including information collected from officer “encounters” with the public and information about people’s “relationship patterns.”
  • ...1 more annotation...
  • DHS’s face recognition roll-out is especially concerning. The agency uses mobile biometric devices that can identify faces and capture face data in the field, allowing its ICE (immigration) and CBP (customs) officers to scan everyone with whom they come into contact, whether or not those people are suspected of any criminal activity or an immigration violation. DHS is also partnering with airlines and other third parties to collect face images from travelers entering and leaving the U.S. When combined with data from other government agencies, these troubling collection practices will allow DHS to build a database large enough to identify and track all people in public places, without their knowledge—not just in places the agency oversees, like airports, but anywhere there are cameras. Police abuse of facial recognition technology is not a theoretical issue: it’s happening today. Law enforcement has already used face recognition on public streets and at political protests. During the protests surrounding the death of Freddie Gray in 2015, Baltimore Police ran social media photos against a face recognition database to identify protesters and arrest them. Recent Amazon promotional videos encourage police agencies to acquire that company’s face “Rekognition” capabilities and use them with body cameras and smart cameras to track people throughout cities. At least two U.S. cities are already using Rekognition. DHS compounds face recognition’s threat to anonymity and free speech by planning to include “records related to the analysis of relationship patterns among individuals.” We don’t know where DHS or its external partners will be getting these “relationship pattern” records, but they could come from social media profiles and posts, which the government plans to track by collecting social media user names from all foreign travelers entering the country.
Paul Merrell

How a "location API" allows cops to figure out where we all are in real time | Ars Technica - 0 views

  • The digital privacy world was rocked late Thursday evening when The New York Times reported on Securus, a prison telecom company that has a service enabling law enforcement officers to locate most American cell phones within seconds. The company does this via a basic Web interface leveraging a location API—creating a way to effectively access a massive real-time database of cell-site records. Securus’ location ability relies on other data brokers and location aggregators that obtain that information directly from mobile providers, usually for the purposes of providing some commercial service like an opt-in product discount triggered by being near a certain location. ("You’re near a Carl’s Jr.! Stop in now for a free order of fries with purchase!") The Texas-based Securus reportedly gets its data from 3CInteractive, which in turn buys data from LocationSmart. Ars reached 3CInteractive's general counsel, Scott Elk, who referred us to a spokesperson. The spokesperson did not immediately respond to our query. But currently, anyone can get a sense of the power of a location API by trying out a demo from LocationSmart itself. Currently, the Supreme Court is set to rule on the case of Carpenter v. United States, which asks whether police can obtain more than 120 days' worth of cell-site location information of a criminal suspect without a warrant. In that case, as is common in many investigations, law enforcement presented a cell provider with a court order to obtain such historical data. But the ability to obtain real-time location data that Securus reportedly offers skips that entire process, and it's potentially far more invasive. Securus’ location service as used by law enforcement is also currently being scrutinized. The service is at the heart of an ongoing federal prosecution of a former Missouri sheriff’s deputy who allegedly used it at least 11 times against a judge and other law enforcement officers. On Friday, Sen. Ron Wyden (D-Ore.) publicly released his formal letters to AT&T and also to the Federal Communications Commission demanding detailed answers regarding these Securus revelations.
Paul Merrell

California's Attorney General joins the long list of people who have had it with Facebook * The Register - 0 views

  • California’s attorney general has gone to court to force Facebook to hand over documents as part of an investigation into the company. Xavier Becerra filed a “petition to enforce investigative subpoena” with the Superior Court of California in San Francisco on Wednesday morning, arguing that Facebook’s response to his subpoenas has been “patently inadequate.” Citing a “lack of cooperation” not just with his office but also the Federal Trade Commission (FTC), Xavier Becerra points out [PDF] that it took Facebook a year to respond to his initial inquiry to produce documents relating to the Cambridge Analytica scandal, where Facebook allowed a third party to access vast amounts of personal information through its systems.
  • Not only that but Facebook flat out refused to “search communications involving senior executives,” meaning that it refused to search for relevant information in the emails and other communications of CEO Mark Zuckerberg and COO Sheryl Sandberg, among others. “Facebook is not just continuing to drag its feet, it is failing to comply with lawfully issued subpoenas and interrogatories,” the filing states.
  • The filing comes the same day that 7,000 pages of internal Facebook files were published online. Those documents were obtained and leaked amid a lawsuit between Facebook and a third-party app developer and were labelled as “highly confidential” by the antisocial network. The main upshot of those files is that they show Facebook used the data it gathered on millions of its users as a business weapon: it provided people's profile information to companies that, for instance, agreed to spend hundreds of thousands of dollars on adverts within Facebook, and it cut off developers that posed a competitive threat to its ever-growing stable of companies and services (or developers that wouldn't pay up, or were just too sketchy for the internet giant.) This confirms earlier reporting. CEO Zuckerberg also continues to avoid visiting London, or anywhere in the UK, out of fear he will be arrested for repeatedly failing to comply with a request by Parliament to answer questions about Facebook’s actions, as revealed in the tranche of documents.