
Home / Open Web / Group items matching "web services" in title, tags, annotations or url


Constructing A SharePoint History: Microsoft SharePoint Team Blog - 0 views

  • it was clear customers wanted a more integrated and comprehensive solution from us. As just one example, they told us they liked the WYSIWYG HTML editing of SharePoint Team Services and the Web Part declarative and reusable editing of SharePoint Portal but wanted to use both models on the same site.
  • On the application side, we were hearing customers wanted Office to go beyond personal productivity to organizational productivity and we had to decide whether Microsoft would invest in content management, portals, unified communications, business intelligence and many other new scenarios.
  • we made sure SharePoint was an open platform and worked with vendors across the industry on a variety of integration approaches including published APIs and protocols.
  • ...1 more annotation...
  • to enable customers to build business process integration and business intelligence portals, we added Excel Services and InfoPath Forms Services. Besides being exciting features, we gained invaluable learning for the team how to have an architecture that worked in the rich Office client and on the server with web access with high fidelity, round tripping, etc.
  •  
    Wow.  Why fight over the editing of Wikiword when you can make up your own history?  The Microsoft Office - SharePoint Blog team is busy trying to reshape history from the inside out.   This bookmark is going to require a ton of highlights and comments.

Say hello to the new Mega: We go hands on. - The Next Web - 1 views

  •  
    50GB free.  $9.95/mo for 500GB + 1TB bandwidth.  WOW!! Full encryption with private key exchange using secure messaging.  Awesome Cloud Drive service, excerpt: "Right now, Mega is still a barebones file sharing service, but the company has massive plans for the future. Snooping around on their site reveals their plans for going much further than just file sharing. A post on their blog - dated January 18th - details the features that were cut from launch but will be added soon. There are also apps for mobile platforms already underway, with the company planning to support all major platforms in the near future and allow uploading from them. The blog details a secure email component (probably the part we can't get working) that will be added, secure instant messaging and the ability for non-Mega users to send large files to those with a Mega account (for example, for printing files at a print shop). It doesn't stop there, though, the company also says they're planning on-site word processing, calendar and spreadsheet applications (watch out, Google Docs!) as well as a Dropbox-esque client for Windows, Linux and Mac. They also say that there are plans in the works for allowing users to run Mega as an appliance on their own machine, though there aren't many details on that in the post. In another place on the site, Mega promises that in the near future they will be offering secure video calling and traditional calling as well. Talk about trying to take over the world."

UI designing Services in Hyderabad - 0 views

started by pranetorweb on 18 Jul 16 no follow-up yet

UI Designing Services in Hyderabad - 0 views

started by pranetorweb on 30 Jun 16 no follow-up yet


NoSQL Pioneers Are Driving the Web's Manifest Destiny - 1 views

  •  
    Good chart comparing four types of data stores: key-value, tabular/columnar, document store, and relational. Excerpt: The bottleneck is no longer around performance or the cost of computing - it's about quickly getting the information to thousands, or hundreds of thousands, of nodes trying to act as one computer delivering a service. Google and IBM both have written about the data center as a computer, and Facebook says it thinks of adding hardware at the rack level rather than at the server level. But the current means of storing and accessing data have not made this leap from a single server to a rack - let alone an entire data center. As programmers attempt this leap, they face several difficulties, which include working with existing software and programming languages and figuring out what problems and bottlenecks the new services built on these monolithic computer platforms will encounter. Plus, the IT world doesn't all move at once, which means plenty of jobs and workloads will continue with the old way of doing things - that is, relational databases such as Oracle's offerings and the open source MySQL, which Oracle now has a stake in thanks to its purchase of Sun. The result is not a steady movement to non-relational databases or other methods of storing data, but a back-and-forth as programmers and businesses figure out what kind of architecture they need and what problems they want to solve. For a closer look at the issue and a bunch of charts detailing how the landscape is currently laid out, analyst Matt Sarrel has penned a report over at GigaOM Pro (sub. req'd.) on the NoSQL movement called "NoSQL Databases - Providing Extreme Scale and Flexibility."
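A rough illustration (not from the article) of how the same record might be shaped in each of the four store types the chart compares; all names and fields below are invented for the example:

```typescript
// Key-value: an opaque value looked up by a single key.
const keyValue: Record<string, string> = {
  "user:42": JSON.stringify({ name: "Ada", city: "London" }),
};

// Tabular/columnar: values grouped by column family, addressed by row key + column.
const columnar = {
  rowKey: "user:42",
  columns: { "profile:name": "Ada", "profile:city": "London", "stats:logins": "17" },
};

// Document store: a self-describing, possibly nested document per record.
interface UserDoc {
  _id: string;
  name: string;
  address: { city: string; country: string };
  tags: string[];
}
const userDoc: UserDoc = {
  _id: "42",
  name: "Ada",
  address: { city: "London", country: "UK" },
  tags: ["admin", "beta"],
};

// Relational: fixed schema, normalized across tables, joined at query time.
const relationalSchema = `
  CREATE TABLE users (id INT PRIMARY KEY, name TEXT);
  CREATE TABLE addresses (user_id INT REFERENCES users(id), city TEXT, country TEXT);
`;
```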

Eucalyptus open-sources the cloud (Q&A) | The Open Road - CNET News - 0 views

  • The ideal customer is one with an IT organization that is tasked with supporting a heterogeneous set of user groups (each with its own technology needs, business logic, policies, etc.) using infrastructure that it must maintain across different phases of the technology lifecycle. There are two prevalent usage models that we observe regularly. The first is as a development and testing platform for applications that, ultimately, will be deployed in a public cloud. It is often easier, faster, and cheaper to use locally sited resources to develop and debug an application (particularly one that is designed to operate at scale) prior to its operational deployment in an externally hosted environment. The virtualization of machines makes cross-platform configuration easier to achieve and Eucalyptus' API compatibility makes the transition between on-premise resources and the public clouds simple. The second model is as an operational hybrid. It is possible to run the same image simultaneously both on-premise using Eucalyptus and in a public cloud thereby providing a way to augment local resources with those rented from a provider without modification to the application. For whom is this relevant technology today? Who are your customers? Wolski: We are seeing tremendous interest in several verticals. Banking/finance, big pharma, manufacturing, gaming, and the service provider market have been the early adopters to deploy and experiment with the Eucalyptus technology.
  • Eucalyptus is designed to be able to compose multiple technology platforms into a single "universal" cloud platform that exposes a common API, but that can at the same time support separate APIs for the individual technologies. Moreover, it is possible to export some of the specific and unique features of each technology through the common API as "quality-of-service" attributes.
  •  
    Eucalyptus, an open-source platform that implements "infrastructure as a service" (IaaS) style cloud computing, aims to take open source front and center in the cloud-computing craze. The project, founded by academics at the University of California at Santa Barbara, is now a Benchmark-funded company with an ambitious goal: become the universal cloud platform that everyone from Amazon to Microsoft to Red Hat to VMware ties into. [Eucalyptus] is architected to be compatible with such a wide variety of commonly installed data center technologies, [and hence] provides an easy and low-risk way of building private (i.e. on-premise or internal) clouds...Thus data center operators choosing Eucalyptus are assured of compatibility with the emerging application development and operational cloud ecosystem while attaining the security and IT investment amortization levels they desire without the "fear" of being locked into a single public cloud platform.
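The interview's central claim is API compatibility: the same machine image and EC2-style tooling can target either the on-premise Eucalyptus cloud or a public cloud. A minimal sketch of that idea, assuming a generic EC2-compatible SDK client; the endpoint URL, credentials, and image ID are placeholders, not real Eucalyptus values:

```typescript
import { EC2Client, RunInstancesCommand } from "@aws-sdk/client-ec2";

const onPremise = process.env.USE_PRIVATE_CLOUD === "1";

const client = new EC2Client({
  region: "us-east-1",
  // Eucalyptus exposes an EC2-compatible API at its own endpoint (placeholder URL).
  endpoint: onPremise ? "https://eucalyptus.internal.example:8773/services/compute" : undefined,
  credentials: { accessKeyId: "EXAMPLE_KEY", secretAccessKey: "EXAMPLE_SECRET" },
});

async function launch() {
  // The same image-launch request works against either cloud, per the
  // API-compatibility claim in the interview above.
  const result = await client.send(
    new RunInstancesCommand({
      ImageId: "emi-0123456789abcdef0", // placeholder; Eucalyptus images historically used an "emi-" prefix
      InstanceType: "m1.small",
      MinCount: 1,
      MaxCount: 1,
    })
  );
  console.log(result.Instances?.[0]?.InstanceId);
}

launch().catch(console.error);
```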

Obama wants to help make your Internet faster and cheaper. This is his plan. - The Wash... - 0 views

  • Frustrated over the number of Internet providers that are available to you? If so, you're like many who are limited to just a handful of broadband companies. But now President Obama wants to change that, arguing that choice and competition are lacking in the U.S. broadband market. On Wednesday, Obama will unveil a series of measures aimed at making high-speed Web connections cheaper and more widely available to millions of Americans. The announcement will focus chiefly on efforts by cities to build their own alternatives to major Internet providers such as Comcast, Verizon or AT&T — a public option for Internet access, you could say. He'll write to the Federal Communications Commission urging the agency to help neutralize laws, erected by states, that effectively protect large established Internet providers against the threat represented by cities that want to build and offer their own, municipal Internet service. He'll direct federal agencies to expand grants and loans for these projects and for smaller, rural Internet providers. And he'll draw attention to a new coalition of mayors from 50 cities who've committed to spurring choice in the broadband industry.
  • "When more companies compete for your broadband business, it means lower prices," Jeff Zients, director of Obama's National Economic Council, told reporters Tuesday. "Broadband is no longer a luxury. It's a necessity." The announcement highlights a growing chorus of small and mid-sized cities that say they've been left behind by some of the country's biggest Internet providers. In many of these places, incumbent companies have delayed network upgrades or offer what customers say is unsatisfactory service because it isn't cost-effective to build new infrastructure. Many cities, such as Cedar Falls, Iowa, have responded by building their own, publicly operated competitors. Obama will travel to Cedar Falls on Wednesday to roll out his initiative.

The punk rock internet - how DIY rebels are working to replace the tech g... - 0 views

  • What they are doing could be seen as the online world’s equivalent of punk rock: a scattered revolt against an industry that many now think has grown greedy, intrusive and arrogant – as well as governments whose surveillance programmes have fuelled the same anxieties. As concerns grow about an online realm dominated by a few huge corporations, everyone involved shares one common goal: a comprehensively decentralised internet.
  • In the last few months, they have started working with people in the Belgian city of Ghent – or, in Flemish, Gent – where the authorities own their own internet domain, complete with .gent web addresses. Using the blueprint of Heartbeat, they want to create a new kind of internet they call the indienet – in which people control their data, are not tracked and each own an equal space online. This would be a radical alternative to what we have now: giant “supernodes” that have made a few men in northern California unimaginable amounts of money thanks to the ocean of lucrative personal information billions of people hand over in exchange for their services.
  • His alternative is what he calls the Safe network: the acronym stands for “Safe Access for Everyone”. In this model, rather than being stored on distant servers, people’s data – files, documents, social-media interactions – will be broken into fragments, encrypted and scattered around other people’s computers and smartphones, meaning that hacking and data theft will become impossible. Thanks to a system of self-authentication in which a Safe user’s encrypted information would only be put back together and unlocked on their own devices, there will be no centrally held passwords. No one will leave data trails, so there will be nothing for big online companies to harvest. The financial lubricant, Irvine says, will be a cryptocurrency called Safecoin: users will pay to store data on the network, and also be rewarded for storing other people’s (encrypted) information on their devices. Software developers, meanwhile, will be rewarded with Safecoin according to the popularity of their apps. There is a community of around 7,000 interested people already working on services that will work on the Safe network, including alternatives to platforms such as Facebook and YouTube.
  • ...3 more annotations...
  • Once MaidSafe is up and running, there will be very little any government or authority can do about it: “We can’t stop the network if we start it. If anyone turned round and said: ‘You need to stop that,’ we couldn’t. We’d have to go round to people’s houses and switch off their computers. That’s part of the whole thing. The network is like a cyber-brain; almost a lifeform in itself. And once you start it, that’s it.” Before my trip to Scotland, I tell him, I spent whole futile days signing up to some of the decentralised social networks that already exist – Steemit, Diaspora, Mastodon – and trying to approximate the kind of experience I can easily get on, say, Twitter or Facebook.
  • And herein lie two potential breakthroughs. One, according to some cryptocurrency enthusiasts, is a means of securing and protecting people’s identities that doesn’t rely on remotely stored passwords. The other is a hope that we can leave behind intermediaries such as Uber and eBay, and allow buyers and sellers to deal directly with each other. Blockstack, a startup based in New York, aims to bring blockchain technology to the masses. Like MaidSafe, its creators aim to build a new internet, and a 13,000-strong crowd of developers are already working on apps that either run on the platform Blockstack has created, or use its features. OpenBazaar is an eBay-esque service, up and running since November last year, which promises “the world’s most private, secure, and liberating online marketplace”. Casa aims to be a decentralised alternative to Airbnb; Guild is a would-be blogging service that bigs up its libertarian ethos and boasts that its founders will have “no power to remove blogs they don’t approve of or agree with”.
  • An initial version of Blockstack is already up and running. Even if data is stored on conventional drives, servers and clouds, thanks to its blockchain-based “private key” system each Blockstack user controls the kind of personal information we currently blithely hand over to Big Tech, and has the unique power to unlock it. “That’s something that’s extremely powerful – and not just because you know your data is more secure because you’re not giving it to a company,” he says. “A hacker would have to hack a million people if they wanted access to their data.”
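The Safe network description above boils down to a simple pattern: split data into chunks, encrypt each chunk with a key that never leaves the owner's device, and scatter the ciphertext across other people's machines. A purely illustrative sketch of that pattern follows; it is not MaidSafe's actual protocol, and every name and size in it is invented:

```typescript
async function fragmentAndScatter(
  data: Uint8Array,
  peers: { store(id: string, blob: ArrayBuffer): Promise<void> }[],
  key: CryptoKey,            // AES-GCM key that never leaves the user's device
  chunkSize = 64 * 1024
): Promise<string[]> {
  const chunkIds: string[] = [];
  for (let offset = 0, i = 0; offset < data.length; offset += chunkSize, i++) {
    const chunk = data.slice(offset, offset + chunkSize);
    const iv = crypto.getRandomValues(new Uint8Array(12));
    // Encrypt locally; peers only ever see ciphertext.
    // (A real system would also persist the IV alongside each chunk.)
    const ciphertext = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, chunk);
    // Content-address the chunk so it can be retrieved later without a central index.
    const digest = await crypto.subtle.digest("SHA-256", ciphertext);
    const id = Array.from(new Uint8Array(digest), b => b.toString(16).padStart(2, "0")).join("");
    await peers[i % peers.length].store(id, ciphertext);
    chunkIds.push(id);
  }
  return chunkIds; // kept locally; only the owner can decrypt and reassemble
}
```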

How a "location API" allows cops to figure out where we all are in real time | Ars Tech... - 0 views

  • The digital privacy world was rocked late Thursday evening when The New York Times reported on Securus, a prison telecom company that has a service enabling law enforcement officers to locate most American cell phones within seconds. The company does this via a basic Web interface leveraging a location API—creating a way to effectively access a massive real-time database of cell-site records. Securus’ location ability relies on other data brokers and location aggregators that obtain that information directly from mobile providers, usually for the purposes of providing some commercial service like an opt-in product discount triggered by being near a certain location. ("You’re near a Carl’s Jr.! Stop in now for a free order of fries with purchase!") The Texas-based Securus reportedly gets its data from 3CInteractive, which in turn buys data from LocationSmart. Ars reached 3CInteractive's general counsel, Scott Elk, who referred us to a spokesperson. The spokesperson did not immediately respond to our query. But currently, anyone can get a sense of the power of a location API by trying out a demo from LocationSmart itself. Currently, the Supreme Court is set to rule on the case of Carpenter v. United States, which asks whether police can obtain more than 120 days' worth of cell-site location information of a criminal suspect without a warrant. In that case, as is common in many investigations, law enforcement presented a cell provider with a court order to obtain such historical data. But the ability to obtain real-time location data that Securus reportedly offers skips that entire process, and it's potentially far more invasive. Securus’ location service as used by law enforcement is also currently being scrutinized. The service is at the heart of an ongoing federal prosecution of a former Missouri sheriff’s deputy who allegedly used it at least 11 times against a judge and other law enforcement officers. On Friday, Sen. Ron Wyden (D-Ore.) publicly released his formal letters to AT&T and also to the Federal Communications Commission demanding detailed answers regarding these Securus revelations.
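For readers unfamiliar with what a "location API" looks like, here is an entirely hypothetical sketch of the kind of authenticated lookup the article describes. The endpoint, parameters, and response fields are invented for illustration and are not LocationSmart's or Securus' real interface:

```typescript
interface LocationResult {
  latitude: number;
  longitude: number;
  accuracyMeters: number;
  timestamp: string;
}

async function locatePhone(msisdn: string, apiKey: string): Promise<LocationResult> {
  const resp = await fetch("https://aggregator.example.com/v1/locate", {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
    // The "consent" assertion is the weak link the article highlights:
    // the aggregator largely takes the caller's word for it.
    body: JSON.stringify({ msisdn, consent: "asserted" }),
  });
  if (!resp.ok) throw new Error(`location lookup failed: ${resp.status}`);
  return resp.json();
}
```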

FCC Turns Itself into a Deregulatory Agency - WhoWhatWhy - 0 views

  • Since taking office, President Donald Trump has wasted no time in proposing rollbacks to Obama-era federal regulations. So, it should come as no surprise that the Federal Communications Commission (FCC) voted last month to propose changes to current regulations on Internet service providers. Spearheaded by Ajit Pai — the Trump-appointed FCC chairman and former lawyer for Verizon — the 2-1 vote is the first step in dismantling the Open Internet Order. The lone FCC Democrat, Mignon Clyburn, was overruled by Pai and fellow commissioner Michael O’Reilly. The 2015 order classified broadband internet as a utility under Title II of the Communications Act of 1934. Opponents of the current state of net neutrality argue that the rules are archaic and place unnecessary — even harmful — restrictions on internet service providers (ISPs), leading to a lack of innovation and investment. While it’s true that policies conceived in the 1930s could hardly anticipate the complexities of the modern Internet, a complete rollback of Title II protections would leave ISPs free to favor their own services and whichever company pays for upgraded service. Considering relaxed FCC rules on media ownership and lack of antitrust enforcement, some could argue that a rollback of net neutrality is even more toxic to innovation and affordable pricing. That is, fast lanes could be created for companies with deeper pockets, effectively giving them an advantage over companies and individuals who can’t pay extra. This approach effectively penalizes small businesses, nonprofits and innovative start-ups. Today’s Internet is so vast and so pervasive that it’s hard to grasp the impact that an abandonment of net neutrality would have on every aspect of our culture.
  • While the FCC’s proposed change will touch most Americans, net neutrality remains a mystifying concept to non-techies. To help our readers better understand the issue, we have compiled some videos that explain net neutrality and its importance. The FCC will be accepting comments from the public on their website until August 16, 2017.

M of A - Assad Says The "Boy In The Ambulance" Is Fake - This Proves It - 0 views

  • Re: Major net hack - its not necessarily off topic. .gov is herding web sites into its own little DNS animal farms so it can properly protect the public from that dangerous 'information' stuff in time of emergency. CloudFlare is the biggest abattoir... er, animal farm. CloudFlare is kind of like a protection racket. If you pay their outrageous fees, you will be 'protected' from DDoS attacks. Since CloudFlare is the preferred covert .gov tool of censorship and content control (when things go south), they are trying to drive as many sites as possible into their digital panopticons. Who the hell is Cloudflare? ISUCKER: BIG BROTHER INTERNET CULTURE On top of that, CloudFlare’s CEO Matthew Prince made a weird, glib admission that he decided to start the company only after the Department of Homeland Security gave him a call in 2007 and suggested he take the technology behind Project Honey Pot one step further… And that makes CloudFlare a whole different story: People who sign up for the service are allowing CloudFlare to monitor, observe and scrutinize all of their site’s traffic, which makes it much easier for intel or law enforcement agencies to collect info on websites and without having to hack or request the logs from each hosting company separately. But there’s more. Because CloudFlare doesn’t just passively monitor internet traffic but works like a dynamic firewall to selectively block traffic from sources it deems to be “hostile,” website operators are giving it a whole lotta power over who gets to see their content. The whole point of CloudFlare is to restrict access to websites from specific locations/IP addresses on the fly, without notifying or bothering the website owner with the details. It all boils down to a question of trust, as in: do you trust a shady company with known intel/law enforcement connections to make that decision?
  • And here is an added bonus for the paranoid: Because CloudFlare partially caches websites and delivers them to web surfers via its own servers, the company also has the power to serve up redacted versions of the content to specific users. CloudFlare is perfect: it can implement censorship on the fly, without anyone getting wise to it! Right now CloudFlare says it monitors nearly 1/5 of all Internet visits. [<-- this] An astounding claim for a company most people haven’t even heard of. And techie bloggers seem very excited about getting as much Internet traffic routed through them as possible! See? Plausible deniability. A couple of degrees of separation. Yet when the Borg Queen wants to start WWIII next year, she can order the DHS Stasi to order outfits like CloudFlare to do the proper 'shaping' of internet traffic to filter out unwanted information. How far is any expose of propaganda like Dusty Boy going to happen if nobody can get to sites like MoA? You'll be able to get to all kinds of tweets and NGO sites crying about Dusty Boy 2.0, but you won't see a tweet or a web site calling them out on their lies. Will you even know they interviewed Assad? Will you know the activist 'photographer' is a paid NGO shill or that he's pals with al Zenki? Nope, not if .gov can help it.

From Radio to Porn, British Spies Track Web Users' Online Identities - 0 views

  • THERE WAS A SIMPLE AIM at the heart of the top-secret program: Record the website browsing habits of “every visible user on the Internet.” Before long, billions of digital records about ordinary people’s online activities were being stored every day. Among them were details cataloging visits to porn, social media and news websites, search engines, chat forums, and blogs. The mass surveillance operation — code-named KARMA POLICE — was launched by British spies about seven years ago without any public debate or scrutiny. It was just one part of a giant global Internet spying apparatus built by the United Kingdom’s electronic eavesdropping agency, Government Communications Headquarters, or GCHQ. The revelations about the scope of the British agency’s surveillance are contained in documents obtained by The Intercept from National Security Agency whistleblower Edward Snowden. Previous reports based on the leaked files have exposed how GCHQ taps into Internet cables to monitor communications on a vast scale, but many details about what happens to the data after it has been vacuumed up have remained unclear.
  • Amid a renewed push from the U.K. government for more surveillance powers, more than two dozen documents being disclosed today by The Intercept reveal for the first time several major strands of GCHQ’s existing electronic eavesdropping capabilities.
  • The surveillance is underpinned by an opaque legal regime that has authorized GCHQ to sift through huge archives of metadata about the private phone calls, emails and Internet browsing logs of Brits, Americans, and any other citizens — all without a court order or judicial warrant
  • ...17 more annotations...
  • A huge volume of the Internet data GCHQ collects flows directly into a massive repository named Black Hole, which is at the core of the agency’s online spying operations, storing raw logs of intercepted material before it has been subject to analysis. Black Hole contains data collected by GCHQ as part of bulk “unselected” surveillance, meaning it is not focused on particular “selected” targets and instead includes troves of data indiscriminately swept up about ordinary people’s online activities. Between August 2007 and March 2009, GCHQ documents say that Black Hole was used to store more than 1.1 trillion “events” — a term the agency uses to refer to metadata records — with about 10 billion new entries added every day. As of March 2009, the largest slice of data Black Hole held — 41 percent — was about people’s Internet browsing histories. The rest included a combination of email and instant messenger records, details about search engine queries, information about social media activity, logs related to hacking operations, and data on people’s use of tools to browse the Internet anonymously.
  • Throughout this period, as smartphone sales started to boom, the frequency of people’s Internet use was steadily increasing. In tandem, British spies were working frantically to bolster their spying capabilities, with plans afoot to expand the size of Black Hole and other repositories to handle an avalanche of new data. By 2010, according to the documents, GCHQ was logging 30 billion metadata records per day. By 2012, collection had increased to 50 billion per day, and work was underway to double capacity to 100 billion. The agency was developing “unprecedented” techniques to perform what it called “population-scale” data mining, monitoring all communications across entire countries in an effort to detect patterns or behaviors deemed suspicious. It was creating what it said would be, by 2013, “the world’s biggest” surveillance engine “to run cyber operations and to access better, more valued data for customers to make a real world difference.”
  • A document from the GCHQ target analysis center (GTAC) shows the Black Hole repository’s structure.
  • The data is searched by GCHQ analysts in a hunt for behavior online that could be connected to terrorism or other criminal activity. But it has also served a broader and more controversial purpose — helping the agency hack into European companies’ computer networks. In the lead up to its secret mission targeting Netherlands-based Gemalto, the largest SIM card manufacturer in the world, GCHQ used MUTANT BROTH in an effort to identify the company’s employees so it could hack into their computers. The system helped the agency analyze intercepted Facebook cookies it believed were associated with Gemalto staff located at offices in France and Poland. GCHQ later successfully infiltrated Gemalto’s internal networks, stealing encryption keys produced by the company that protect the privacy of cell phone communications.
  • Similarly, MUTANT BROTH proved integral to GCHQ’s hack of Belgian telecommunications provider Belgacom. The agency entered IP addresses associated with Belgacom into MUTANT BROTH to uncover information about the company’s employees. Cookies associated with the IPs revealed the Google, Yahoo, and LinkedIn accounts of three Belgacom engineers, whose computers were then targeted by the agency and infected with malware. The hacking operation resulted in GCHQ gaining deep access into the most sensitive parts of Belgacom’s internal systems, granting British spies the ability to intercept communications passing through the company’s networks.
  • In March, a U.K. parliamentary committee published the findings of an 18-month review of GCHQ’s operations and called for an overhaul of the laws that regulate the spying. The committee raised concerns about the agency gathering what it described as “bulk personal datasets” being held about “a wide range of people.” However, it censored the section of the report describing what these “datasets” contained, despite acknowledging that they “may be highly intrusive.” The Snowden documents shine light on some of the core GCHQ bulk data-gathering programs that the committee was likely referring to — pulling back the veil of secrecy that has shielded some of the agency’s most controversial surveillance operations from public scrutiny. KARMA POLICE and MUTANT BROTH are among the key bulk collection systems. But they do not operate in isolation — and the scope of GCHQ’s spying extends far beyond them.
  • The agency operates a bewildering array of other eavesdropping systems, each serving its own specific purpose and designated a unique code name, such as: SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums. GCHQ has other programs that it uses to analyze the content of intercepted communications, such as the full written body of emails and the audio of phone calls. One of the most important content collection capabilities is TEMPORA, which mines vast amounts of emails, instant messages, voice calls and other communications and makes them accessible through a Google-style search tool named XKEYSCORE.
  • As of September 2012, TEMPORA was collecting “more than 40 billion pieces of content a day” and it was being used to spy on people across Europe, the Middle East, and North Africa, according to a top-secret memo outlining the scope of the program. The existence of TEMPORA was first revealed by The Guardian in June 2013. To analyze all of the communications it intercepts and to build a profile of the individuals it is monitoring, GCHQ uses a variety of different tools that can pull together all of the relevant information and make it accessible through a single interface. SAMUEL PEPYS is one such tool, built by the British spies to analyze both the content and metadata of emails, browsing sessions, and instant messages as they are being intercepted in real time. One screenshot of SAMUEL PEPYS in action shows the agency using it to monitor an individual in Sweden who visited a page about GCHQ on the U.S.-based anti-secrecy website Cryptome.
  • Partly due to the U.K.’s geographic location — situated between the United States and the western edge of continental Europe — a large amount of the world’s Internet traffic passes through its territory across international data cables. In 2010, GCHQ noted that what amounted to “25 percent of all Internet traffic” was transiting the U.K. through some 1,600 different cables. The agency said that it could “survey the majority of the 1,600” and “select the most valuable to switch into our processing systems.”
  • According to Joss Wright, a research fellow at the University of Oxford’s Internet Institute, tapping into the cables allows GCHQ to monitor a large portion of foreign communications. But the cables also transport masses of wholly domestic British emails and online chats, because when anyone in the U.K. sends an email or visits a website, their computer will routinely send and receive data from servers that are located overseas. “I could send a message from my computer here [in England] to my wife’s computer in the next room and on its way it could go through the U.S., France, and other countries,” Wright says. “That’s just the way the Internet is designed.” In other words, Wright adds, that means “a lot” of British data and communications transit across international cables daily, and are liable to be swept into GCHQ’s databases.
  • A map from a classified GCHQ presentation about intercepting communications from undersea cables. GCHQ is authorized to conduct dragnet surveillance of the international data cables through so-called external warrants that are signed off by a government minister. The external warrants permit the agency to monitor communications in foreign countries as well as British citizens’ international calls and emails — for example, a call from Islamabad to London. They prohibit GCHQ from reading or listening to the content of “internal” U.K. to U.K. emails and phone calls, which are supposed to be filtered out from GCHQ’s systems if they are inadvertently intercepted unless additional authorization is granted to scrutinize them. However, the same rules do not apply to metadata. A little-known loophole in the law allows GCHQ to use external warrants to collect and analyze bulk metadata about the emails, phone calls, and Internet browsing activities of British people, citizens of closely allied countries, and others, regardless of whether the data is derived from domestic U.K. to U.K. communications and browsing sessions or otherwise. In March, the existence of this loophole was quietly acknowledged by the U.K. parliamentary committee’s surveillance review, which stated in a section of its report that “special protection and additional safeguards” did not apply to metadata swept up using external warrants and that domestic British metadata could therefore be lawfully “returned as a result of searches” conducted by GCHQ.
  • Perhaps unsurprisingly, GCHQ appears to have readily exploited this obscure legal technicality. Secret policy guidance papers issued to the agency’s analysts instruct them that they can sift through huge troves of indiscriminately collected metadata records to spy on anyone regardless of their nationality. The guidance makes clear that there is no exemption or extra privacy protection for British people or citizens from countries that are members of the Five Eyes, a surveillance alliance that the U.K. is part of alongside the U.S., Canada, Australia, and New Zealand. “If you are searching a purely Events only database such as MUTANT BROTH, the issue of location does not occur,” states one internal GCHQ policy document, which is marked with a “last modified” date of July 2012. The document adds that analysts are free to search the databases for British metadata “without further authorization” by inputing a U.K. “selector,” meaning a unique identifier such as a person’s email or IP address, username, or phone number. Authorization is “not needed for individuals in the U.K.,” another GCHQ document explains, because metadata has been judged “less intrusive than communications content.” All the spies are required to do to mine the metadata troves is write a short “justification” or “reason” for each search they conduct and then click a button on their computer screen.
  • Intelligence GCHQ collects on British persons of interest is shared with domestic security agency MI5, which usually takes the lead on spying operations within the U.K. MI5 conducts its own extensive domestic surveillance as part of a program called DIGINT (digital intelligence).
  • GCHQ’s documents suggest that it typically retains metadata for periods of between 30 days to six months. It stores the content of communications for a shorter period of time, varying between three to 30 days. The retention periods can be extended if deemed necessary for “cyber defense.” One secret policy paper dated from January 2010 lists the wide range of information the agency classes as metadata — including location data that could be used to track your movements, your email, instant messenger, and social networking “buddy lists,” logs showing who you have communicated with by phone or email, the passwords you use to access “communications services” (such as an email account), and information about websites you have viewed.
  • Records showing the full website addresses you have visited — for instance, www.gchq.gov.uk/what_we_do — are treated as content. But the first part of an address you have visited — for instance, www.gchq.gov.uk — is treated as metadata. In isolation, a single metadata record of a phone call, email, or website visit may not reveal much about a person’s private life, according to Ethan Zuckerman, director of Massachusetts Institute of Technology’s Center for Civic Media. But if accumulated and analyzed over a period of weeks or months, these details would be “extremely personal,” he told The Intercept, because they could reveal a person’s movements, habits, religious beliefs, political views, relationships, and even sexual preferences. For Zuckerman, who has studied the social and political ramifications of surveillance, the most concerning aspect of large-scale government data collection is that it can be “corrosive towards democracy” — leading to a chilling effect on freedom of expression and communication. “Once we know there’s a reasonable chance that we are being watched in one fashion or another it’s hard for that not to have a ‘panopticon effect,’” he said, “where we think and behave differently based on the assumption that people may be watching and paying attention to what we are doing.”
  • When compared to surveillance rules in place in the U.S., GCHQ notes in one document that the U.K. has “a light oversight regime.” The more lax British spying regulations are reflected in secret internal rules that highlight greater restrictions on how NSA databases can be accessed. The NSA’s troves can be searched for data on British citizens, one document states, but they cannot be mined for information about Americans or other citizens from countries in the Five Eyes alliance. No such constraints are placed on GCHQ’s own databases, which can be sifted for records on the phone calls, emails, and Internet usage of Brits, Americans, and citizens from any other country. The scope of GCHQ’s surveillance powers explain in part why Snowden told The Guardian in June 2013 that U.K. surveillance is “worse than the U.S.” In an interview with Der Spiegel in July 2013, Snowden added that British Internet cables were “radioactive” and joked: “Even the Queen’s selfies to the pool boy get logged.”
  • In recent years, the biggest barrier to GCHQ’s mass collection of data does not appear to have come in the form of legal or policy restrictions. Rather, it is the increased use of encryption technology that protects the privacy of communications that has posed the biggest potential hindrance to the agency’s activities. “The spread of encryption … threatens our ability to do effective target discovery/development,” says a top-secret report co-authored by an official from the British agency and an NSA employee in 2011. “Pertinent metadata events will be locked within the encrypted channels and difficult, if not impossible, to prise out,” the report says, adding that the agencies were working on a plan that would “(hopefully) allow our Internet Exploitation strategy to prevail.”
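One concrete detail worth pulling out of the annotations above is how GCHQ reportedly draws the line between "content" and "metadata" for web addresses: the full address including the path is treated as content, while the host name alone is treated as metadata. A tiny sketch of that distinction (illustration only, not anyone's actual code):

```typescript
// Split a visited address into the "metadata" part (the site) and the
// "content" part (the full address with its path), per the article's description.
function classifyVisit(visited: string): { metadata: string; content: string } {
  const url = new URL(visited);
  return {
    metadata: url.origin,                 // e.g. "https://www.gchq.gov.uk"
    content: url.origin + url.pathname,   // e.g. "https://www.gchq.gov.uk/what_we_do"
  };
}

console.log(classifyVisit("https://www.gchq.gov.uk/what_we_do"));
// -> { metadata: "https://www.gchq.gov.uk", content: "https://www.gchq.gov.uk/what_we_do" }
```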

W3C News Archive: 2010 W3C - 0 views

  • Today W3C, the International Standards Organization (ISO), and the International Electrotechnical Commission (IEC) took steps that will encourage greater international adoption of W3C standards. W3C is now an "ISO/IEC JTC 1 PAS Submitter" (see the application), bringing "de jure" standards communities closer to the Internet ecosystem. As national bodies refer increasingly to W3C's widely deployed standards, users will benefit from an improved Web experience based on W3C's standards for an Open Web Platform. W3C expects to use this process (1) to help avoid global market fragmentation; (2) to improve deployment within government use of the specification; and (3) when there is evidence of stability/market acceptance of the specification. Web Services specifications will likely constitute the first package W3C will submit, by the end of 2010. For more information, see the W3C PAS Submission FAQ.

Google building Skype-alike software into Chrome | Deep Tech - CNET News - 0 views

  • Shortly after releasing its software for audio and video chat as an open-source project called WebRTC, Google is beginning to build it into its Chrome browser. The real-time chat software originated from Google's 2010 acquisition of Global IP Solutions (GIPS), a company specializing in Internet telephony and videoconferencing.
  • If Google and allies succeed in establishing the technology and building support into multiple browsers, that would mean anybody building a Web site or Web application could draw upon the communications technology. In other words, anyone could build a rival to services, such as Skype, with just a Web application.
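To make the "Skype rival in a Web application" point concrete, here is a minimal, hedged sketch of the browser-side WebRTC flow: capture audio/video, create a peer connection, and produce an offer. Signaling is omitted, and the element ID and STUN server are just common placeholder choices:

```typescript
async function startCall(): Promise<RTCSessionDescriptionInit> {
  // Capture the local camera and microphone.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  const pc = new RTCPeerConnection({ iceServers: [{ urls: "stun:stun.l.google.com:19302" }] });

  // Send our local tracks to the remote peer.
  stream.getTracks().forEach(track => pc.addTrack(track, stream));

  // Render whatever the remote peer sends back.
  pc.ontrack = event => {
    const video = document.querySelector<HTMLVideoElement>("#remote");
    if (video) video.srcObject = event.streams[0];
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  return offer; // hand this to your signaling channel of choice
}
```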

Mozilla partners with Panasonic to bring Firefox OS to the TV, details progress on tabl... - 0 views

  • At CES 2014 in Las Vegas today, Mozilla announced its plans for Firefox OS this year. Having launched Firefox OS for smartphones in 2013, the company has now partnered with Panasonic to bring its operating system to TVs, and also detailed the progress that has been made around the tablet and desktop versions.
  • Mereby elaborated that current options are controlled by either Google or Apple, two major corporations that “hold all the strings.” As such, Android and iOS are not viable options for Panasonic, as the ecosystem is tightly controlled. With Firefox OS, however, Mereby argues that “anyone can compete”, as you can operate your own marketplace. Not only can Panasonic open up its own marketplace for apps and content, but those who want to build apps and sell content can bypass marketplaces and make their offerings directly to Firefox OS users.
  • While the partnership is not exclusive, Panasonic will be the first to release next-generation smart TVs powered by Firefox OS. Mozilla and Panasonic will work together to promote Firefox OS and its open ecosystem on the big screen. The plan is to leverage existing HTML5 and Web technologies used on PCs, smartphones, and tablets, to provide TVs with more personalized and optimized access to content and services through the Internet. Mozilla’s Web APIs for hardware control and operation will allow TVs to monitor and operate devices, such as emerging smart home appliances, inside and outside of the home. Basic functions such as menus and programming guides, which are currently written as embedded programs, will be written in HTML5, letting developers easily create applications for smartphones or tablets to remotely access and operate TVs. Mozilla also envisions personalized user interfaces with users’ favorites and new functions for multiple users sharing the same screen.
  • ...1 more annotation...
  • Last but not least, Mozilla wanted to underline how Firefox OS was coming to the desktop. Since the operating system is open source, anyone can modify it. VIA is doing just that: it’s making its own changes to create a more suitable version for the desktop, and Mozilla is bringing those commits back to its own repository. Furthermore, VIA today announced the availability of APC Paper and Rock, two new devices that offer a preview of Firefox OS running in a desktop environment. Rock is a motherboard which can be inserted into any barebone PC chassis while Paper is a standalone computer with its own case. Both are targeted at early adopters and developers wanting to help find, file, and fix bugs for VIA’s desktop version of Firefox OS. Paper and Rock are available with the same buildable source codes currently available on GitHub.

HTML5 vs Flash - 0 views

  •  
    HTML5 and Flash are two technologies that are constantly measured against each other. HTML5 vs Flash is like comparing oranges and apples.

Feds use keylogger to thwart PGP, Hushmail | News Blogs - CNET News - 0 views

  •  
    The more I learn about the Government's illegal and unconstitutional surveillance activities, the worse it gets.  As I read this article I couldn't help but wonder why the Government would want to disclose the warrantless activities as evidence in court.  Clearly the Government wants to have their violations of carefully enumerated Constitutional protections of individual rights validated by the nation's courts.  Scary stuff. excerpt: A recent court case provides a rare glimpse into how some federal agents deal with encryption: by breaking into a suspect's home or office, implanting keystroke-logging software, and spying on what happens from afar. An agent with the Drug Enforcement Administration persuaded a federal judge to authorize him to sneak into an Escondido, Calif., office believed to be a front for manufacturing the drug MDMA, or Ecstasy. The DEA received permission to copy the hard drives' contents and inject a keystroke logger into the computers. That was necessary, according to DEA Agent Greg Coffey, because the suspects were using PGP and the encrypted Web e-mail service Hushmail.com. Coffey asserted that the DEA needed "real-time and meaningful access" to "monitor the keystrokes" for PGP and Hushmail passphrases. The aggressive surveillance techniques employed by the DEA were part of a case that resulted in a ruling on Friday (PDF) by the 9th Circuit Court of Appeals, which primarily dealt with Internet surveillance through a wiretap conducted on a PacBell (now AT&T) business DSL line used by the defendants.

Combining the Best of Gmail and Zoho CRM Produces Amazing Results By James Kimmons of A... - 0 views

  •  
    ZOHO has demonstrated some very effective and easy to use data merging. They have also released a ZOHO Writer extension for Chrome that is awesome. The problem with "merge" is that, while full featured, the only usable data source is ZOHO CRM. Not good, but zCRM does fully integrate with ZOHO eMail, which enables the full two way transparent integration with zCRM. Easier to do than explain. Real Estate example excerpt: Zoho is smart, allowing you to integrate Gmail: The best of both worlds is available, because Zoho had the foresight to allow you to use Gmail and integrate your emails with the Zoho CRM system. Once you've set it up, you use Gmail the way you've always used it. I get to continue using all of the things I love about Gmail. But, every email, in or out of Gmail, attaches itself to the appropriate contact in the Zoho CRM system. When I send or receive an email in Gmail that is to or from one of my Zoho contacts or leads, the email automatically is picked up by Zoho and becomes a part of that contact/prospect's record, even though I never opened Zoho. If you've wondered about backing up Gmail, let Zoho do it: A bonus benefit in using Zoho mail is that you can set it up to receive all of your Gmail, sent and received, as well. It's a ready-made backup for your Gmail. So, if CRM isn't something you want to do with Zoho, at least set up the free email to copy all of your Gmail. And, if you're still using Outlook...why? The Internet is Improving Our Business at a Lower Cost: Here we have two free email systems that give you amazing flexibility and backup. Then the Zoho CRM system, with the email module installed, is only $15/month. You can do mass marketing emails, auto-responders, and take in new contacts and prospects with Web forms. Once you tie Gmail and Zoho together, your email and CRM will be top-notch, at a very low cost. Though you may wish for one, there isn't a reasonably priced "does it all" solution out there. This is an

Telax Unveils HTML5 Software for Mac OS Contact Centers - 0 views

  •  
    Interesting development in the world of real-time Web Apps.  Looks like business processes and services in the Cloud are embracing HTML5, and moving fast to replace legacy client/server.  Note this is not Flash or Silverlight RIA.   excerpt: Telax Hosted Call Center, a leader in cloud contact center solutions, announced the release of its HTML5-based Call Center Agent (CCA) today. Key to the development of the browser-based CCA was WebSocket, a component of HTML5 that provides a bi-directional, full-duplex communication channel over a single Transmission Control Protocol (TCP) socket. WebSocket is currently supported by the latest versions of Google Chrome, Apple Safari, and Firefox, making Telax's new CCA compatible with the most popular browsers in Mac environments. Before HTML5, real-time unified communication software was typically deployed as a local client because its browser-based counterparts were unable to deliver an acceptable user experience. Some browser-based clients use 3rd party software such as Adobe Flash or Silverlight to operate adequately, but both solutions require software installation and are not mobile friendly.
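A minimal sketch of the kind of persistent, bi-directional WebSocket channel the excerpt describes for a browser-based call-center agent; the URL and message shapes are invented for illustration:

```typescript
const socket = new WebSocket("wss://cca.example.com/agent");

socket.onopen = () => {
  // Agent signs in once the full-duplex channel is established.
  socket.send(JSON.stringify({ type: "login", agentId: "agent-42" }));
};

socket.onmessage = event => {
  // The server can push call events at any time over the same connection.
  const msg = JSON.parse(event.data);
  if (msg.type === "incoming_call") {
    console.log(`Call from ${msg.callerId} on queue ${msg.queue}`);
    socket.send(JSON.stringify({ type: "accept", callId: msg.callId }));
  }
};

socket.onclose = () => console.log("Channel closed; agent should reconnect.");
```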