
Paul Merrell

Hackers Prove Fingerprints Are Not Secure, Now What? | nsnbc international - 0 views

  • The Office of Personnel Management (OPM) recently revealed that an estimated 5.6 million government employees were affected by the hack; and not 1.1 million as previously assumed.
  • Samuel Schumach, spokesman for the OPM, said: “As part of the government’s ongoing work to notify individuals affected by the theft of background investigation records, the Office of Personnel Management and the Department of Defense have been analyzing impacted data to verify its quality and completeness. Of the 21.5 million individuals whose Social Security Numbers and other sensitive information were impacted by the breach, the subset of individuals whose fingerprints have been stolen has increased from a total of approximately 1.1 million to approximately 5.6 million.” This verification effort involved the Department of Defense (DoD), the Department of Homeland Security (DHS), the National Security Agency (NSA), and the Pentagon. Schumach added that “if, in the future, new means are developed to misuse the fingerprint data, the government will provide additional information to individuals whose fingerprints may have been stolen in this breach.” However, we do not need to wait for the future for fingerprint data to be misused and coveted by hackers.
  • Look no further than the security flaws in Samsung’s new Galaxy S5 smartphone, as demonstrated by researchers at Security Research Labs (SRL), who showed how fingerprints and other biometric identifiers could be fabricated and still pass authentication — in this case on the S5’s fingerprint scanner. The shocking part of this demonstration is that the hack was achieved less than two days after the device was released to the public. Ben Schlabs, researcher for SRL, explained: “We expected we’d be able to spoof the S5’s Finger Scanner, but I hoped it would at least be a challenge. The S5 Finger Scanner feature offers nothing new except—because of the way it is implemented in this Android device—slightly higher risk than that already posed by previous devices.” Schlabs and other researchers discovered that “the S5 has no mechanism requiring a password when encountering a large number of incorrect finger swipes.” By rebooting the smartphone, Schlabs could force “the handset to accept an unlimited number of incorrect swipes without requiring users to enter a password [and] the S5 fingerprint authenticator [could] be associated with sensitive banking or payment apps such as PayPal.”
  • ...1 more annotation...
  • Schlabs said: “Perhaps most concerning is that Samsung does not seem to have learned from what others have done less poorly. Not only is it possible to spoof the fingerprint authentication even after the device has been turned off, but the implementation also allows for seemingly unlimited authentication attempts without ever requiring a password. Incorporation of fingerprint authentication into highly sensitive apps such as PayPal gives a would-be attacker an even greater incentive to learn the simple skill of fingerprint spoofing.” Last year, hackers from the Chaos Computer Club (CCC) proved Apple wrong when the corporation insisted that its new iPhone 5S fingerprint sensor was “a convenient and highly secure way to access your phone.” CCC stated that it is as easy as stealing a fingerprint from a drinking glass — and anyone can do it.
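The mechanism Schlabs says the S5 lacks, a mandatory password fallback after repeated failed scans, is a standard lockout pattern. A minimal sketch of the idea (hypothetical class and method names, not Samsung's or Apple's actual code):

```python
MAX_ATTEMPTS = 5  # hypothetical threshold; real devices pick their own

class FingerprintGate:
    """Toy model of a lockout policy: after too many failed scans,
    require the device password instead of another fingerprint try."""

    def __init__(self, max_attempts=MAX_ATTEMPTS):
        self.max_attempts = max_attempts
        self.failures = 0      # must be persisted so a reboot cannot reset it
        self.locked = False

    def try_fingerprint(self, scan_matches: bool) -> str:
        if self.locked:
            return "password_required"
        if scan_matches:
            self.failures = 0
            return "unlocked"
        self.failures += 1
        if self.failures >= self.max_attempts:
            self.locked = True
            return "password_required"
        return "retry"

    def try_password(self, password_ok: bool) -> str:
        if password_ok:
            self.failures = 0
            self.locked = False
            return "unlocked"
        return "password_required"
```

Note the comment on the failure counter: keeping it only in memory is precisely the flaw SRL exploited, since a reboot reset the count and allowed unlimited further swipes.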
Paul Merrell

From Radio to Porn, British Spies Track Web Users' Online Identities - 0 views

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant.
  • ...17 more annotations...
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputting a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for between 30 days and six months. It stores the content of communications for a shorter period, varying between three and 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements; your email, instant messenger, and social networking “buddy lists”; logs showing who you have communicated with by phone or email; the passwords you use to access “communications services” (such as an email account); and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The laxer British spying regulations are reflected in secret internal rules showing that greater restrictions apply to accessing NSA databases. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explains in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
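The content/metadata line described in these documents, where www.gchq.gov.uk/what_we_do counts as content but the bare address www.gchq.gov.uk counts as metadata, can be made concrete with standard URL parsing. A short illustrative sketch (illustrative only, not GCHQ's actual tooling):

```python
from urllib.parse import urlparse

def classify_visit(url: str) -> dict:
    """Split one browsing record the way the rules described above do:
    the bare host is treated as metadata, while the full address
    (host plus path) is treated as content."""
    parsed = urlparse(url)
    return {
        "metadata": parsed.netloc,               # e.g. www.gchq.gov.uk
        "content": parsed.netloc + parsed.path,  # e.g. www.gchq.gov.uk/what_we_do
    }

print(classify_visit("https://www.gchq.gov.uk/what_we_do"))
```

As Zuckerman's point above suggests, even the "metadata" half of each record becomes revealing once millions of such records are accumulated per person.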
Paul Merrell

HTML5: Getting to Last Call - W3C Blog - 0 views

  • We started work on HTML5 back in 2007 and have been working through issues since then. In November 2009, the HTML Chairs instituted a decision policy, which has allowed us to close around 20 issues. We now have around 200 bugs and 25 issues on the document. In order to drive the Group to Last Call, the HTML Chairs, following advice from the W3C Team, produced a timeline for reaching the initial Last Call for HTML5. The W3C Team expresses its strong support for the chairs of the HTML Working Group in their efforts to lead the group toward an initial Last Call according to the published timeline. All new bugs related to the HTML5 specification received after October 1, 2010 will be treated as Last Call comments, with possible exceptions granted by the Chairs. The intention is to reach the initial Last Call with a feature-complete document. The HTML Chairs will keep driving the Group forward after that date in order to resolve all the bugs received by October 1. The expectation is to issue the Last Call document at the end of May 2011. I encourage everyone to send bugs prior to October 1 and keep track of them in order to escalate them to the Working Group if necessary.
  •  
    Get your HTML 5 bug reports filed *before* October 1.  See http://lists.w3.org/Archives/Public/public-html/2010Sep/0074.html for more details.
Gary Edwards

GSA picks Google Apps: What it means | ZDNet - 0 views

  •  
    The General Services Administration made a bold decision to move its email and collaboration systems to the cloud. This is a huge win for cloud computing, but perhaps should have been expected, since last week the Feds announced a new requisition and purchase mandate that cloud computing had to be the FIRST consideration for federal agency purchases. Note that the General Services Administration oversees requisitions and purchases for all Federal agencies! This is huge: estimated to be worth $8 billion to cloud-computing providers. The cloud-computing market is estimated to be $30 billion, but Gartner did not anticipate that Federal agencies would embrace cloud computing, let alone issue a mandate for it.
    In the RFP issued last June, it was easy to see their goals in the statement of objectives: "This Statement of Objectives (SOO) describes the goals that GSA expects to achieve with regard to the 1. modernization of its e-mail system; 2. provision of an effective collaborative working environment; 3. reduction of the government's in-house system maintenance burden by providing related business, technical, and management functions; and 4. application of appropriate security and privacy safeguards."
    GSA announced yesterday that they chose Google Apps for email and collaboration and Unisys as the implementation partner. So what does this mean?
    What it means (WIM) #1: GSA employees will be using a next-generation information workplace. And that means mobile, device-agnostic, and location-agile. Gmail on an iPad? No problem. Email from a home computer? Yep. For GSA and for every other agency and most companies, it's important to give employees the tools to be productive and engage from every location on every device. "Work becomes a thing you do and not a place you go." [Thanks to Earl Newsome of Estee Lauder for that quote.]
    WIM #2: GSA will save 50% of the cost of email over five years. This is also what our research on the cost of email o
Gary Edwards

What to expect from HTML 5 | Developer World - InfoWorld - 0 views

  •  
    Neil McAllister provides a good intro to HTML5 and what it will mean for the future of the Web. It's just an intro, but the links he provides are excellent resources for a deeper dive. Excerpt: "Among Web developers, anticipation is mounting for HTML 5, the overhaul of the Web markup language currently under way at the Worldwide Web Consortium (W3C). For many, the revamping is long overdue. HTML hasn't had a proper upgrade in more than a decade. In fact, the last markup language to win W3C Recommendation status -- the final stage of the Web standards process -- was XHTML 1.1 in 2001. In the intervening years, Web developers have grown increasingly restless. Many claim the HTML and XHTML standards have become outdated, and that their document-centric focus does not adequately address the needs of modern Web applications. HTML 5 aims to change all that. When it is finalized, the new standard will include tags and APIs for improved interactivity, multimedia, and localization. As experimental support for HTML 5 features has crept into the current crop of Web browsers, some developers have even begun voicing hope that this new, modernized HTML will free them from reliance on proprietary plug-ins such as Flash, QuickTime, and Silverlight."
Gary Edwards

Where is there an end of it? | Thomas Jefferson on Patents | Marbux on Document Format ... - 1 views

  •  
    Whether a patent constitutes "property" in the U.S. is an issue on which the Supreme Court has apparently never ruled. However, there is no question that the nation's founders viewed it only as a government-granted privilege, not a "property" right. The U.S. Supreme Court quoted Thomas Jefferson on the topic: Stable ownership is the gift of social law, and is given late in the progress of society. It would be curious then, if an idea, the fugitive fermentation of an individual brain, could, of natural right, be claimed in exclusive and stable property. If nature has made any one thing less susceptible than all others of exclusive property, it is the action of the thinking power called an idea, which an individual may exclusively possess as long as he keeps it to himself; but the moment it is divulged, it forces itself into the possession of every one, and the receiver cannot dispossess himself of it. Its peculiar character, too, is that no one possesses the less, because every other possesses the whole of it. He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me. That ideas should freely spread from one to another over the globe, for the moral and mutual instruction of man, and improvement of his condition, seems to have been peculiarly and benevolently designed by nature, when she made them, like fire, expansible over all space, without lessening their density in any point, and like the air in which we breathe, move, and have our physical being, incapable of confinement or exclusive appropriation. Inventions then cannot, in nature, be a subject of property. Society may give an exclusive right to the profits arising from them, as an encouragement to men to pursue ideas which may produce utility, but this may or may not be done, according to the will and convenience of the society, without claim or complaint from any body. VI Writings of Thomas Jefferson, at 18
Gary Edwards

Cloud Computing and Mobile Devices a Hot Area for ICT in 2011 Says Frost & Sullivan - ... - 0 views

  •  
    Increasing adoption has created a US$1.1 billion Cloud Computing market in Asia Pacific. "With a 90% share of the market, SaaS is the dominant segment of the Cloud market in the Asia Pacific region. The APAC SaaS market is expected to grow at a CAGR of 39% for the 2010-2014 period," says Nitin. He continues, "Cloud Computing is to be an important driver of growth as Singapore establishes itself as one of the Cloud hubs in Asia Pacific. The Singapore Cloud Computing market is set to witness strong growth powered by CRM, Collaboration and HRM applications."
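The 39% CAGR cited for 2010-2014 compounds the US$1.1 billion base year over year; a quick back-of-the-envelope projection (my arithmetic, not Frost & Sullivan's published forecast figures):

```python
def project(base: float, cagr: float, years: int) -> float:
    """Compound a base market size forward at a constant annual growth rate."""
    return base * (1 + cagr) ** years

# Report's figures: US$1.1 billion in 2010, 39% CAGR through 2014
for year in range(2011, 2015):
    size = project(1.1, 0.39, year - 2010)
    print(f"{year}: ~US${size:.2f}B")
```

By this arithmetic, the regional market roughly quadruples over the forecast period, to about US$4.1 billion by 2014.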
Gary Edwards

2011 Will be the Year For Mobile in APAC - 0 views

  •  
    APAC = Asia-Pacific markets. Meanwhile, the popularity of smartphones and tablets is expected to give rise to mobile cloud applications. In effect, going to the cloud will help smartphones and tablets overcome inherent hardware limitations, such as small storage, inadequate processing speed and power-saving requirements. Mobile security will also be a prime concern, especially for enterprise users. This will include safety and privacy applications like remote wipe and virus protection. Nitin says that the market for cloud computing in APAC has grown to US$1.1 billion this year, mostly comprised of SaaS deployments. He highlights the role of Singapore as a cloud computing hub in the region, given its strong broadband infrastructure and the presence of a large base of multinational companies.
Paul Merrell

American Surveillance Now Threatens American Business - The Atlantic - 0 views

  • What does it look like when a society loses its sense of privacy? In the almost 18 months since the Snowden files first received coverage, writers and critics have had to guess at the answer. Does a certain trend, consumer complaint, or popular product epitomize some larger shift? Is trust in tech companies eroding—or is a subset just especially vocal about it? Polling would make those answers clear, but polling so far has been… confused. A new study, conducted by the Pew Internet Project last January and released last week, helps make the average American’s view of his or her privacy a little clearer. And their confidence in their own privacy is ... low. The study's findings—and the statistics it reports—stagger. Vast majorities of Americans are uncomfortable with how the government uses their data, how private companies use and distribute their data, and what the government does to regulate those companies. No summary can equal a recounting of the findings. Americans are displeased with government surveillance en masse:
  • A new study finds that a vast majority of Americans trust neither the government nor tech companies with their personal data.
  • ...3 more annotations...
  • According to the study, 70 percent of Americans are “at least somewhat concerned” with the government secretly obtaining information they post to social networking sites. Eighty percent of respondents agreed that “Americans should be concerned” with government surveillance of telephones and the web. They are also uncomfortable with how private corporations use their data: Ninety-one percent of Americans believe that “consumers have lost control over how personal information is collected and used by companies,” according to the study. Eighty percent of Americans who use social networks “say they are concerned about third parties like advertisers or businesses accessing the data they share on these sites.” And even though they’re squeamish about the government’s use of data, they want it to regulate tech companies and data brokers more strictly: 64 percent wanted the government to do more to regulate private data collection. Since June 2013, American politicians and corporate leaders have fretted over how much the leaks would cost U.S. businesses abroad.
  • “It’s clear the global community of Internet users doesn’t like to be caught up in the American surveillance dragnet,” Senator Ron Wyden said last month. At the same event, Google chairman Eric Schmidt agreed with him. “What occurred was a loss of trust between America and other countries,” he said, according to the Los Angeles Times. “It's making it very difficult for American firms to do business.” But never mind the world. Americans don’t trust American social networks. More than half of the poll’s respondents said that social networks were “not at all secure.” Only 40 percent of Americans believe email or texting is at least “somewhat” secure. Indeed, the communication technologies Americans trusted most are those where some protections have been enshrined into law (though the report didn’t ask about snail mail). That is: talking on the telephone, whether on a landline or cell phone, is the only kind of communication that a majority of adults believe to be “very secure” or “somewhat secure.”
  • (That may seem a bit incongruous, because making a telephone call is one area where you can be almost sure you are being surveilled: The government has requisitioned mass call records from phone companies since 2001. But Americans appear, when discussing security, to differentiate between the contents of the call and data about it.) Last month, Ramsey Homsany, the general counsel of Dropbox, said that one big thing could take down the California tech scene. “We have built this incredible economic engine in this region of the country,” said Homsany in the Los Angeles Times, “and [mistrust] is the one thing that starts to rot it from the inside out.” According to this poll, the mistrust has already begun corroding—and is already, in fact, well advanced. We’ve always assumed that the great hurt to American business will come globally—that citizens of other nations will stop using tech companies’ services. But the new Pew data shows that Americans suspect American businesses just as much. And while, unlike citizens of other nations, they may not have other places to turn, they may stop putting sensitive or delicate information online.
Paul Merrell

Rural America and the 5G Digital Divide. Telecoms Expanding Their "Toxic Infrastructure... - 0 views

  • While there is considerable telecom hubris regarding the 5G rollout and increasing speculation that the next generation of wireless is not yet ready for Prime Time, the industry continues to make promises to Rural America that it has no intention of fulfilling. Decades-long promises to deliver digital Utopia to rural America by T-Mobile, Verizon and AT&T have never materialized.  
  • In 2017, the USDA reported that 29% of American farms had no internet access. The FCC says that 14 million rural Americans and 1.2 million Americans living on tribal lands do not have 4G LTE on their phones, and that 30 million rural residents do not have broadband service, compared to 2% of urban residents. It’s beginning to sound like a Third World country. Despite an FCC $4.5 billion annual subsidy to carriers to provide broadband service in rural areas, the FCC reports that “over 24 million Americans do not have access to high-speed internet service, the bulk of them in rural areas,” while a Microsoft study found that “162 million people across the US do not have internet service at broadband speeds.” At the same time, only three cable companies have access to 70% of the market in a sweetheart deal to hike rates as they avoid competition while the FCC looks the other way. The FCC believes that it would cost $40 billion to bring broadband access to 98% of the country, with expansion in rural America even more expensive. While the FCC has pledged a $2 billion, ten-year plan to identify rural wireless locations, only 4 million rural American businesses and homes will be targeted, a mere drop in the bucket. Which brings us to rural mapping: since the advent of the digital age, there have been no accurate maps identifying where broadband service is available in rural America and where it is not. The FCC has a long history of promulgating unreliable and unverified carrier-provided numbers, as the Commission has repeatedly “bungled efforts to produce accurate broadband maps” that would have facilitated rural coverage. During the Senate Commerce Committee hearing on April 10th regarding broadband mapping, critical testimony questioned whether the FCC and/or the telecom industry have either the commitment or the proficiency to provide 5G to rural America.
Members of the Committee shared concerns that 5G might put rural America so far behind the curve that it never catches up with the rest of the country.
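The "drop in the bucket" claim can be checked with a rough back-of-the-envelope calculation using the article's own figures; the numbers below are those estimates, not independently verified data:

```python
# Rough per-location cost check using the figures cited above.
# All inputs are the article's own estimates; treat the result as
# order-of-magnitude only.

fcc_full_buildout_cost = 40e9      # FCC estimate: $40B to reach 98% of the country
rural_fund = 2e9                   # pledged ten-year rural wireless fund
rural_locations_targeted = 4e6     # businesses and homes targeted by that fund

per_location_subsidy = rural_fund / rural_locations_targeted
print(f"Pledged fund works out to ${per_location_subsidy:,.0f} per targeted location")
```

At $500 per location, against a $40 billion estimate for near-universal coverage, the scale of the gap between the pledge and the stated need is clear.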
emileybrown89

Dial Kaspersky Technical Support +1-855-676-2448 for multiple error's - 0 views

  •  
    Kaspersky, reachable via technical customer support at +1-855-676-2448, is one of the most trusted antivirus software products, used by billions of individual users personally and professionally all over the world. Kaspersky is designed to fulfil the range of antivirus needs faced by general users and professionals alike, including virus detection, inventory, and keeping data safe, along with various other transactions.
emileybrown89

Looking for Kaspersky support +1-855-676-2448 for titchy and voluminous solutions - 0 views

  •  
    The Kaspersky Technical Support desk was formed with the aim of extending help to all Kaspersky antivirus users with titchy or voluminous issues, such as setting up a Kaspersky account, configuring a device for the software, antivirus or malware concerns, and unwanted pop-up advertisements. There is no need to worry if you face such issues with your device; simply call our toll-free number +1-855-676-2448 for an immediate solution without investing additional capital.
emileybrown89

Kaspersky Antivirus Support Number +1-855-676-2448 for activation errors - Oregon, USA ... - 0 views

  •  
    Why do we offer free antivirus software? It's simple: First, it helps us fulfill our mission of saving the world, and second, it helps us make our products better. With the rise in the number of devices whose owners need only basic protection, we're expanding our database of threats and user requirements, enhancing client security and product quality with the help of Kaspersky Customer Support Number +1-855-676-2448. After installing Kaspersky Free Antivirus, you can switch to the trial version of Kaspersky Internet Security at any time, without having to download any additional files.
Paul Merrell

Wikileaks Releases "NightSkies 1.2": Proof CIA Bugs "Factory Fresh" iPhones | Zero Hedge - 0 views

  • The latest leaks from WikiLeaks' Vault 7 is titled “Dark Matter” and claims that the CIA has been bugging “factory fresh” iPhones since at least 2008 through suppliers.
  • And here is the full press release from WikiLeaks: Today, March 23rd 2017, WikiLeaks releases Vault 7 "Dark Matter", which contains documentation for several CIA projects that infect Apple Mac Computer firmware (meaning the infection persists even if the operating system is re-installed) developed by the CIA's Embedded Development Branch (EDB). These documents explain the techniques used by CIA to gain 'persistence' on Apple Mac devices, including Macs and iPhones and demonstrate their use of EFI/UEFI and firmware malware.   Among others, these documents reveal the "Sonic Screwdriver" project which, as explained by the CIA, is a "mechanism for executing code on peripheral devices while a Mac laptop or desktop is booting" allowing an attacker to boot its attack software for example from a USB stick "even when a firmware password is enabled". The CIA's "Sonic Screwdriver" infector is stored on the modified firmware of an Apple Thunderbolt-to-Ethernet adapter.   "DarkSeaSkies" is "an implant that persists in the EFI firmware of an Apple MacBook Air computer" and consists of "DarkMatter", "SeaPea" and "NightSkies", respectively EFI, kernel-space and user-space implants.   Documents on the "Triton" MacOSX malware, its infector "Dark Mallet" and its EFI-persistent version "DerStake" are also included in this release. While the DerStake1.4 manual released today dates to 2013, other Vault 7 documents show that as of 2016 the CIA continues to rely on and update these systems and is working on the production of DerStarke2.0.   Also included in this release is the manual for the CIA's "NightSkies 1.2" a "beacon/loader/implant tool" for the Apple iPhone. Noteworthy is that NightSkies had reached 1.2 by 2008, and is expressly designed to be physically installed onto factory fresh iPhones. i.e the CIA has been infecting the iPhone supply chain of its targets since at least 2008.   
While CIA assets are sometimes used to physically infect systems in the custody of a target it is likely that many CIA physical access attacks have infected the targeted organization's supply chain including by interdicting mail orders and other shipments (opening, infecting, and resending) leaving the United States or otherwise.
Gary Edwards

The Man Who Makes the Future: Wired Icon Marc Andreessen | Epicenter | Wired.com - 1 views

  •  
    Must read interview. Marc Andreessen explains his five big ideas, taking us from the beginning of the Web, into the Cloud and beyond. Great stuff! ... (1) 1992 - Everyone Will Have the Web ... (2) 1995 - The Browser will be the Operating System ... (3) 1999 - Web business will live in the Cloud ... (4) 2004 - Everything will be Social ... (5) 2009 - Software will Eat the World excerpt: Technology is like water; it wants to find its level. So if you hook up your computer to a billion other computers, it just makes sense that a tremendous share of the resources you want to use-not only text or media but processing power too-will be located remotely. People tend to think of the web as a way to get information or perhaps as a place to carry out ecommerce. But really, the web is about accessing applications. Think of each website as an application, and every single click, every single interaction with that site, is an opportunity to be on the very latest version of that application. Once you start thinking in terms of networks, it just doesn't make much sense to prefer local apps, with downloadable, installable code that needs to be constantly updated.

    "We could have built a social element into Mosaic. But back then the Internet was all about anonymity."
    Anderson: Assuming you have enough bandwidth.

    Andreessen: That's the very big if in this equation. If you have infinite network bandwidth, if you have an infinitely fast network, then this is what the technology wants. But we're not yet in a world of infinite speed, so that's why we have mobile apps and PC and Mac software on laptops and phones. That's why there are still Xbox games on discs. That's why everything isn't in the cloud. But eventually the technology wants it all to be up there.

    Anderson: Back in 1995, Netscape began pursuing this vision by enabling the browser to do more.

    Andreessen: We knew that you would need some pro
Paul Merrell

Supreme Court Will Hear Arguments On Section 101 Software Patent Eligibility | Bloomber... - 0 views

  • The Supreme Court granted a petition for writ of certiorari on Dec. 6 in a case challenging software method and system patent eligibility under 35 U.S.C. §101, in Alice Corp. Pty. Ltd. v. CLS Bank Int'l (U.S., No. 13-298, review granted, 12/6/13). The question presented by the patent owner in the case is: Whether claims to computer-implemented inventions--including claims to systems and machines, processes, and items of manufacture--are directed to patent-eligible subject matter within the meaning of 35 U.S.C. §101 as interpreted by this Court?
  • The CLS Bank case is controversial because the U.S. Court of Appeals for the Federal Circuit, sitting en banc, failed to reach enough agreement on patent eligibility of computer-related claims to supply precedential jurisprudence. CLS Bank Int'l v. Alice Corp. Pty. Ltd., 717 F.3d 1269, 2013 BL 124940, 106 U.S.P.Q.2d 1696 (Fed. Cir. 2013) (en banc) (92 PTD, 5/13/13). Alice Corp. asserted four patents (U.S. Patent Nos. 5,970,479; 6,912,510; 7,149,720; and 7,725,375) directed to the formulation and trading of risk management contracts against alleged infringer CLS Bank International. The en banc court was 7-3 against patent eligibility of the method claims and 5-5 as to the system claims. Since the lower court had ruled that the system claims were ineligible, that judgment stands and all of Alice's claims are ineligible unless the Supreme Court overturns the decision. Eight members of the en banc court said that method and system or media claims should rise or fall together, but not for the same reasons.
  •  
    U.S. Supreme Court finally to decide whether software patent claims are legal? It looks like this may finally be the case. 
Paul Merrell

W3C News Archive: 2010 W3C - 0 views

  • Today W3C, the International Standards Organization (ISO), and the International Electrotechnical Commission (IEC) took steps that will encourage greater international adoption of W3C standards. W3C is now an "ISO/IEC JTC 1 PAS Submitter" (see the application), bringing "de jure" standards communities closer to the Internet ecosystem. As national bodies refer increasingly to W3C's widely deployed standards, users will benefit from an improved Web experience based on W3C's standards for an Open Web Platform. W3C expects to use this process (1) to help avoid global market fragmentation; (2) to improve deployment within government use of the specification; and (3) when there is evidence of stability/market acceptance of the specification. Web Services specifications will likely constitute the first package W3C will submit, by the end of 2010. For more information, see the W3C PAS Submission FAQ.
Paul Merrell

InfoQ: WS-I closes its doors. What does this mean for WS-*? - 0 views

  • The Web Services Interoperability Organization (WS-I) has just announced that it has completed its mission and will be transitioning all further efforts to OASIS. As their recent press release states: The release of WS-I member approved final materials for Basic Profile (BP) 1.2 and 2.0, and Reliable Secure Profile (RSP) 1.0 fulfills WS-I’s last milestone as an organization. By publishing the final three profiles, WS-I marks the completion of its work. Stewardship over WS-I’s assets, operations and mission will transition to OASIS (Organization for the Advancement of Structured Information Standards), a group of technology vendors and customers that drive development and adoption of open standards. Now at any other time this kind of statement from a standards organization might pass without much comment. However, with the rise of REST, a range of non-WS approaches to SOA and the fact that most of the WS-* standards have not been covered by WS-I, is this a reflection of the new position Web Services finds itself in, over a decade after it began? Perhaps this was inevitable given that over the past few years there has been a lot more emphasis on interoperability within the various WS-* working groups? Or are the days of interactions across heterogeneous SOAP implementations in the past?
  • So the question remains: has interoperability pretty much been achieved for WS-* through WS-I and the improvements made with the way in which the specifications and standards are developed today, or has the real interoperability challenge moved elsewhere, still to be addressed?
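For readers who never worked with WS-*, the kind of message the profiles constrained can be sketched with a minimal SOAP 1.1 envelope, built here with Python's standard library. This is an illustrative sketch only: the `GetQuote` operation and its `example.com` namespace are made up, not taken from any profile or real service.

```python
# Build a minimal SOAP 1.1 envelope of the kind WS-I Basic Profile governed
# (SOAP 1.1 envelope namespace, literal payload in the Body).
# The "GetQuote" operation and its namespace are hypothetical.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 namespace
ET.register_namespace("soap", SOAP_NS)

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
op = ET.SubElement(body, "{http://example.com/stock}GetQuote")
ET.SubElement(op, "{http://example.com/stock}symbol").text = "ACME"

request = ET.tostring(envelope, encoding="unicode")
print(request)
```

Interoperability problems arose when different SOAP stacks serialized or interpreted such envelopes differently; the WS-I profiles existed to narrow those choices.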
Gary Edwards

10 Blogs By Entrepreneurs You Should Be Reading - 1 views

  •  
    Good slide show!  A keeper
Paul Merrell

No, Department of Justice, 80 Percent of Tor Traffic Is Not Child Porn | WIRED - 0 views

  • The debate over online anonymity, and all the whistleblowers, trolls, anarchists, journalists and political dissidents it enables, is messy enough. It doesn’t need the US government making up bogus statistics about how much that anonymity facilitates child pornography.
  • The debate over online anonymity, and all the whistleblowers, trolls, anarchists, journalists and political dissidents it enables, is messy enough. It doesn’t need the US government making up bogus statistics about how much that anonymity facilitates child pornography. At the State of the Net conference in Washington on Tuesday, US assistant attorney general Leslie Caldwell discussed what she described as the dangers of encryption and cryptographic anonymity tools like Tor, and how those tools can hamper law enforcement. Her statements are the latest in a growing drumbeat of federal criticism of tech companies and software projects that provide privacy and anonymity at the expense of surveillance. And as an example of the grave risks presented by that privacy, she cited a study she said claimed an overwhelming majority of Tor’s anonymous traffic relates to pedophilia. “Tor obviously was created with good intentions, but it’s a huge problem for law enforcement,” Caldwell said in comments reported by Motherboard and confirmed to me by others who attended the conference. “We understand 80 percent of traffic on the Tor network involves child pornography.” That statistic is horrifying. It’s also baloney.
  • In a series of tweets that followed Caldwell’s statement, a Department of Justice flack said Caldwell was citing a University of Portsmouth study WIRED covered in December. He included a link to our story. But I made clear at the time that the study claimed 80 percent of traffic to Tor hidden services related to child pornography, not 80 percent of all Tor traffic. That is a huge, and important, distinction. The vast majority of Tor’s users run the free anonymity software while visiting conventional websites, using it to route their traffic through encrypted hops around the globe to avoid censorship and surveillance. But Tor also allows websites to run Tor, something known as a Tor hidden service. This collection of hidden sites, which comprise what’s often referred to as the “dark web,” use Tor to obscure the physical location of the servers that run them. Visits to those dark web sites account for only 1.5 percent of all Tor traffic, according to the software’s creators at the non-profit Tor Project. The University of Portsmouth study dealt exclusively with visits to hidden services. In contrast to Caldwell’s 80 percent claim, the Tor Project’s director Roger Dingledine pointed out last month that the study’s pedophilia findings refer to something closer to a single percent of Tor’s overall traffic.
  • ...1 more annotation...
  • So to whoever at the Department of Justice is preparing these talking points for public consumption: Thanks for citing my story. Next time, please try reading it.
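The correction above comes down to simple arithmetic. Using the two figures the article cites (hidden services carry about 1.5 percent of all Tor traffic, and the Portsmouth study's 80 percent figure applies only to hidden-service visits):

```python
# Reproducing the article's correction: the Portsmouth figure applies only to
# hidden-service visits, not to Tor traffic as a whole.
hidden_service_share = 0.015   # hidden services ~1.5% of all Tor traffic (Tor Project)
cp_share_of_hidden = 0.80      # study: ~80% of hidden-service visits

cp_share_of_all_tor = hidden_service_share * cp_share_of_hidden
print(f"As a share of ALL Tor traffic: {cp_share_of_all_tor:.1%}")
```

That works out to roughly 1.2 percent of overall Tor traffic, which is what Dingledine means by "closer to a single percent" rather than the 80 percent the DOJ claimed.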