Open Web / Group items tagged internet-infrastructure

Paul Merrell

The Internet May Be Underwater in 15 Years - 0 views

  • When the internet goes down, life as the modern American knows it grinds to a halt. Gone are the cute kitten photos and the Facebook status updates—but also gone are the signals telling stoplights to change from green to red, and doctors’ access to online patient records. A vast web of physical infrastructure undergirds the internet connections that touch nearly every aspect of modern life. Delicate fiber optic cables, massive data transfer stations, and power stations create a patchwork of literal nuts and bolts that facilitates the flow of zeros and ones. Now, research shows that a whole lot of that infrastructure sits squarely in the path of rising seas. Scientists mapped out the threads and knots of internet infrastructure in the U.S. and layered that on top of maps showing future sea level rise. What they found was ominous: Within 15 years, thousands of miles of fiber optic cable—and hundreds of pieces of other key infrastructure—are likely to be swamped by the encroaching ocean. And while some of that infrastructure may be water resistant, little of it was designed to live fully underwater. “So much of the infrastructure that's been deployed is right next to the coast, so it doesn't take much more than a few inches or a foot of sea level rise for it to be underwater,” says study coauthor Paul Barford, a computer scientist at the University of Wisconsin, Madison. “It was all deployed 20ish years ago, when no one was thinking about the fact that sea levels might come up.”
  • “This will be a big problem,” says Rae Zimmerman, an expert on urban adaptation to climate change at NYU. Large parts of internet infrastructure soon “will be underwater, unless they're moved back pretty quickly.”
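The study's overlay method (map the infrastructure, map the projected seas, intersect the two) can be sketched in a few lines. The site names, elevations, and the roughly one-foot (0.3 m) threshold below are illustrative stand-ins, not the researchers' data:

```python
# Toy version of the overlay analysis: flag any coastal asset whose ground
# elevation is at or below a projected sea level rise. Real GIS work uses
# polygon inundation maps rather than single point elevations; this is
# only a sketch of the intersection step.

def submerged(sites, rise_m):
    """Return names of sites whose elevation is at or below rise_m meters."""
    return [name for name, elev_m in sites if elev_m <= rise_m]

# Hypothetical coastal assets with made-up elevations in meters.
coastal_sites = [
    ("fiber-landing-A", 0.2),
    ("data-center-B", 4.0),
    ("cable-conduit-C", 0.1),
]

# "A few inches or a foot of sea level rise" is all it takes, per Barford:
print(submerged(coastal_sites, rise_m=0.3))
```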
Paul Merrell

Canadian Spies Collect Domestic Emails in Secret Security Sweep - The Intercept - 0 views

  • Canada’s electronic surveillance agency is covertly monitoring vast amounts of Canadians’ emails as part of a sweeping domestic cybersecurity operation, according to top-secret documents. The surveillance initiative, revealed Wednesday by CBC News in collaboration with The Intercept, is sifting through millions of emails sent to Canadian government agencies and departments, archiving details about them on a database for months or even years. The data mining operation is carried out by the Communications Security Establishment, or CSE, Canada’s equivalent of the National Security Agency. Its existence is disclosed in documents obtained by The Intercept from NSA whistleblower Edward Snowden. The emails are vacuumed up by the Canadian agency as part of its mandate to defend against hacking attacks and malware targeting government computers. It relies on a system codenamed PONY EXPRESS to analyze the messages in a bid to detect potential cyber threats.
  • Last year, CSE acknowledged it collected some private communications as part of cybersecurity efforts. But it refused to divulge the number of communications being stored or to explain for how long any intercepted messages would be retained. Now, the Snowden documents shine a light for the first time on the huge scope of the operation — exposing the controversial details the government withheld from the public. Under Canada’s criminal code, CSE is not allowed to eavesdrop on Canadians’ communications. But the agency can be granted special ministerial exemptions if its efforts are linked to protecting government infrastructure — a loophole that the Snowden documents show is being used to monitor the emails. The latest revelations will trigger concerns about how Canadians’ private correspondence with government employees are being archived by the spy agency and potentially shared with police or allied surveillance agencies overseas, such as the NSA. Members of the public routinely communicate with government employees when, for instance, filing tax returns, writing a letter to a member of parliament, applying for employment insurance benefits or submitting a passport application.
  • Chris Parsons, an internet security expert with the Toronto-based internet think tank Citizen Lab, told CBC News that “you should be able to communicate with your government without the fear that what you say … could come back to haunt you in unexpected ways.” Parsons said that there are legitimate cybersecurity purposes for the agency to keep tabs on communications with the government, but he added: “When we collect huge volumes, it’s not just used to track bad guys. It goes into data stores for years or months at a time and then it can be used at any point in the future.” In a top-secret CSE document on the security operation, dated from 2010, the agency says it “processes 400,000 emails per day” and admits that it is suffering from “information overload” because it is scooping up “too much data.” The document outlines how CSE built a system to handle a massive 400 terabytes of data from Internet networks each month — including Canadians’ emails — as part of the cyber operation. (A single terabyte of data can hold about a billion pages of text, or about 250,000 average-sized mp3 files.)
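The scale figures in the document are easy to sanity-check with back-of-envelope arithmetic. The per-page and per-mp3 sizes below (roughly 1 KB of plain text per page, 4 MB per song) are our own assumptions, not the article's:

```python
TB = 10**12  # one terabyte in bytes, decimal convention

pages_per_tb = TB // 10**3        # assuming ~1 KB per page of plain text
mp3s_per_tb = TB // (4 * 10**6)   # assuming ~4 MB per average mp3

print(f"{pages_per_tb:,} pages per TB")   # 1,000,000,000: "about a billion"
print(f"{mp3s_per_tb:,} mp3s per TB")     # 250,000, matching the article

# 400 TB of network data ingested per month works out to a sustained rate of:
gbit_per_s = (400 * TB * 8) / (30 * 86400) / 10**9
print(f"~{gbit_per_s:.1f} Gbit/s of passively tapped traffic")  # ~1.2
```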
  • The agency notes in the document that it is storing large amounts of “passively tapped network traffic” for “days to months,” encompassing the contents of emails, attachments and other online activity. It adds that it stores some kinds of metadata — data showing who has contacted whom and when, but not the content of the message — for “months to years.” The document says that CSE has “excellent access to full take data” as part of its cyber operations and is receiving policy support on “use of intercepted private communications.” The term “full take” is surveillance-agency jargon that refers to the bulk collection of both content and metadata from Internet traffic. Another top-secret document on the surveillance dated from 2010 suggests the agency may be obtaining at least some of the data by covertly mining it directly from Canadian Internet cables. CSE notes in the document that it is “processing emails off the wire.”
  •  
    " CANADIAN SPIES COLLECT DOMESTIC EMAILS IN SECRET SECURITY SWEEP, BY RYAN GALLAGHER AND GLENN GREENWALD, THE INTERCEPT"
Paul Merrell

Report: Verizon Claimed Public Utility Status To Get Government Perks - Slashdot - 0 views

  • Research for the Public Utility Law Project (PULP) has been released which details 'how Verizon deliberately moves back and forth between regulatory regimes, classifying its infrastructure either like a heavily regulated telephone network or a deregulated information service depending on its needs. The chicanery has allowed Verizon to raise telephone rates, all the while missing commitments for high-speed internet deployment' (PDF). In short, Verizon pushed for the government to give it common carrier privileges under Title II in order to build out its fiber network with taxpayer money. Result: increased service rates on telephone users to subsidize Verizon's 'infrastructure investment.' When it comes to regulations on Verizon's fiber network, however, Verizon has been pushing the government to classify its services as that of information only — i.e., beyond Title II. Verizon has made about $4.4 billion in additional revenue in New York City alone, 'money that's funneled directly from a Title II service to an array of services that currently lie beyond Title II's reach.' And it's all legal. An attorney at advocacy group Public Knowledge said it best: 'To expect that you can come in and use public infrastructure and funds to build a network and then be free of any regulation is absurd.... When Verizon itself is describing these activities as a Title II common carrier, how can the FCC look at broadband internet and continue acting as though it's not a telecommunication network?'
  •  
    Let's also not forget that what is now named "Verizon" used to be named Bell Atlantic, one of the seven Baby Bells that were spun off by AT&T by government order during antitrust proceedings. In other words, this is one of the companies rate-payers financed through a heavily-regulated analog telephony absolute monopoly. But Verizon wants to spread its wings and escape the chains of regulation as a telecommunications carrier. While having its cake and eating it too, according to this article. The FCC, through a proposed rule, has positioned itself with the flexibility to postpone a decision on net neutrality. AT&T famously was allowed to keep its R&D arm while being freed of the expense of upgrading the U.S. telephony network from analog to digital and from copper wire to fibre optic. And pay for those Baby Bells to make that transition we did. I remember monthly bills for a two-person office running as high as $1,100 a month for calls all carried from Baby Bell to AT&T and back to another Baby Bell. All at state-regulated rates with the FCC looking the other way. But now Verizon, Comcast (originally municipally regulated cable television monopolies), and the few other "competing" survivors of that broadband rollout, having had their infrastructure paid for by the ratepayers, want to fly off and begin charging us at the other end of the pipe, via charges to content providers that will be passed on to us. Leading to the squeezing out of Mom and Pop internet businesses by the big content providers that can afford the charges and pass them on to us. This is looking more and more like another massive rip-off of the customers who already paid for that infrastructure. Is that banksters I smell, privatizing an enormous public utility in the name of free markets?
Gary Edwards

Mary Meeker: Mobile Internet Will Soon Overtake Fixed Internet: Tech News and... - 0 views

  •  
    what does Meeker see in her crystal ball this year? Two overwhelming trends that will affect consumers, the hardware/infrastructure industry and the commercial potential of the web: mobile and social networking. Such a conclusion is hardly earth-shattering news to GigaOM readers, for we have been following these trends over the past year or two, but Meeker puts some pretty large numbers next to those trends, and looks at the shifts that will (or are likely to) take place in related industries such as communications hardware. She also compares where the rest of the developed world is in terms of mobile communications and social networking with Japan. Again, not a radically different approach to the one many tech forecasters take, but Meeker has the weight of some considerable research chops on her side. The Morgan Stanley analyst says that the world is currently in the midst of the fifth major technology cycle of the past half a century. The previous four were the mainframe era of the 1950s and 60s, the mini-computer era of the 1970s, the personal computer era of the 1980s and the desktop Internet era of the 90s. The current cycle is the era of the mobile Internet, she says - predicting that within the next five years "more users will connect to the Internet over mobile devices than desktop PCs." As she puts it on one of the slides in the report: "Rapid Ramp of Mobile Internet Usage Will be a Boon to Consumers and Some Companies Will Likely Win Big (Potentially Very Big) While Many Will Wonder What Just Happened."
Paul Merrell

'Nice Internet You've Got There... You Wouldn't Want Something To Happen To It...' | Te... - 0 views

  • Last month, we wrote about Bruce Schneier's warning that certain unknown parties were carefully testing ways to take down the internet. They were doing carefully configured DDoS attacks, testing core internet infrastructure, focusing on key DNS servers. And, of course, we've also been talking about the rise of truly massive DDoS attacks, thanks to poorly secured Internet of Things (IoT) devices, and ancient, unpatched bugs. That all came to a head this morning when large chunks of the internet went down for about two hours, thanks to a massive DDoS attack targeting managed DNS provider Dyn. Most of the down sites are back (I'm still having trouble reaching Twitter), but it was pretty widespread, and lots of big name sites all went down. Just check out this screenshot from Downdetector showing the outages on a bunch of sites:
  • You'll see not all of them have downtime (and the big ISPs, as always, show lots of complaints about downtimes), but a ton of those sites show a giant spike in downtime for a few hours. So, once again, we'd like to point out that this is a problem that the internet community needs to start solving now. There's been a theoretical threat for a while, but it's no longer so theoretical. Yes, some people point out that this is a difficult thing to deal with. If you're pointing people to websites, even if we were to move to a more distributed system, there are almost always some kinds of chokepoints, and those with malicious intent will always, eventually, target those chokepoints. But there has to be a better way -- because if there isn't, this kind of thing is going to become a lot worse.
Paul Merrell

Obama wants to help make your Internet faster and cheaper. This is his plan. - The Wash... - 0 views

  • Frustrated over the number of Internet providers that are available to you? If so, you're like many who are limited to just a handful of broadband companies. But now President Obama wants to change that, arguing that choice and competition are lacking in the U.S. broadband market. On Wednesday, Obama will unveil a series of measures aimed at making high-speed Web connections cheaper and more widely available to millions of Americans. The announcement will focus chiefly on efforts by cities to build their own alternatives to major Internet providers such as Comcast, Verizon or AT&T — a public option for Internet access, you could say. He'll write to the Federal Communications Commission urging the agency to help neutralize laws, erected by states, that effectively protect large established Internet providers against the threat represented by cities that want to build and offer their own, municipal Internet service. He'll direct federal agencies to expand grants and loans for these projects and for smaller, rural Internet providers. And he'll draw attention to a new coalition of mayors from 50 cities who've committed to spurring choice in the broadband industry.
  • "When more companies compete for your broadband business, it means lower prices," Jeff Zients, director of Obama's National Economic Council, told reporters Tuesday. "Broadband is no longer a luxury. It's a necessity." The announcement highlights a growing chorus of small and mid-sized cities that say they've been left behind by some of the country's biggest Internet providers. In many of these places, incumbent companies have delayed network upgrades or offer what customers say is unsatisfactory service because it isn't cost-effective to build new infrastructure. Many cities, such as Cedar Falls, Iowa, have responded by building their own, publicly operated competitors. Obama will travel to Cedar Falls on Wednesday to roll out his initiative.
Paul Merrell

UN Report Finds Mass Surveillance Violates International Treaties and Privacy Rights - ... - 0 views

  • The United Nations’ top official for counter-terrorism and human rights (known as the “Special Rapporteur”) issued a formal report to the U.N. General Assembly today that condemns mass electronic surveillance as a clear violation of core privacy rights guaranteed by multiple treaties and conventions. “The hard truth is that the use of mass surveillance technology effectively does away with the right to privacy of communications on the Internet altogether,” the report concluded. Central to the Rapporteur’s findings is the distinction between “targeted surveillance” — which “depend[s] upon the existence of prior suspicion of the targeted individual or organization” — and “mass surveillance,” whereby “states with high levels of Internet penetration can [] gain access to the telephone and e-mail content of an effectively unlimited number of users and maintain an overview of Internet activity associated with particular websites.” In a system of “mass surveillance,” the report explained, “all of this is possible without any prior suspicion related to a specific individual or organization. The communications of literally every Internet user are potentially open for inspection by intelligence and law enforcement agencies in the States concerned.”
  • Mass surveillance thus “amounts to a systematic interference with the right to respect for the privacy of communications,” it declared. As a result, “it is incompatible with existing concepts of privacy for States to collect all communications or metadata all the time indiscriminately.” In concluding that mass surveillance impinges core privacy rights, the report was primarily focused on the International Covenant on Civil and Political Rights, a treaty enacted by the General Assembly in 1966, to which all of the members of the “Five Eyes” alliance are signatories. The U.S. ratified the treaty in 1992, albeit with various reservations that allowed for the continuation of the death penalty and which rendered its domestic law supreme. With the exception of the U.S.’s Persian Gulf allies (Saudi Arabia, UAE and Qatar), virtually every major country has signed the treaty. Article 17 of the Covenant guarantees the right of privacy, the defining protection of which, the report explained, is “that individuals have the right to share information and ideas with one another without interference by the State, secure in the knowledge that their communication will reach and be read by the intended recipients alone.”
  • The report’s key conclusion is that this core right is impinged by mass surveillance programs: “Bulk access technology is indiscriminately corrosive of online privacy and impinges on the very essence of the right guaranteed by article 17. In the absence of a formal derogation from States’ obligations under the Covenant, these programs pose a direct and ongoing challenge to an established norm of international law.” The report recognized that protecting citizens from terrorism attacks is a vital duty of every state, and that the right of privacy is not absolute, as it can be compromised when doing so is “necessary” to serve “compelling” purposes. It noted: “There may be a compelling counter-terrorism justification for the radical re-evaluation of Internet privacy rights that these practices necessitate. ” But the report was adamant that no such justifications have ever been demonstrated by any member state using mass surveillance: “The States engaging in mass surveillance have so far failed to provide a detailed and evidence-based public justification for its necessity, and almost no States have enacted explicit domestic legislation to authorize its use.”
  • ...5 more annotations...
  • Instead, explained the Rapporteur, states have relied on vague claims whose validity cannot be assessed because of the secrecy behind which these programs are hidden: “The arguments in favor of a complete abrogation of the right to privacy on the Internet have not been made publicly by the States concerned or subjected to informed scrutiny and debate.” About the ongoing secrecy surrounding the programs, the report explained that “states deploying this technology retain a monopoly of information about its impact,” which is “a form of conceptual censorship … that precludes informed debate.” A June report from the High Commissioner for Human Rights similarly noted “the disturbing lack of governmental transparency associated with surveillance policies, laws and practices, which hinders any effort to assess their coherence with international human rights law and to ensure accountability.” The rejection of the “terrorism” justification for mass surveillance as devoid of evidence echoes virtually every other formal investigation into these programs. A federal judge last December found that the U.S. Government was unable to “cite a single case in which analysis of the NSA’s bulk metadata collection actually stopped an imminent terrorist attack.” Later that month, President Obama’s own Review Group on Intelligence and Communications Technologies concluded that mass surveillance “was not essential to preventing attacks” and information used to detect plots “could readily have been obtained in a timely manner using conventional [court] orders.”
  • Three Democratic Senators on the Senate Intelligence Committee wrote in The New York Times that “the usefulness of the bulk collection program has been greatly exaggerated” and “we have yet to see any proof that it provides real, unique value in protecting national security.” A study by the centrist New America Foundation found that mass metadata collection “has had no discernible impact on preventing acts of terrorism” and, where plots were disrupted, “traditional law enforcement and investigative methods provided the tip or evidence to initiate the case.” It labeled the NSA’s claims to the contrary as “overblown and even misleading.” While worthless in counter-terrorism policies, the UN report warned that allowing mass surveillance to persist with no transparency creates “an ever present danger of ‘purpose creep,’ by which measures justified on counter-terrorism grounds are made available for use by public authorities for much less weighty public interest purposes.” Citing the UK as one example, the report warned that, already, “a wide range of public bodies have access to communications data, for a wide variety of purposes, often without judicial authorization or meaningful independent oversight.”
  • The report was most scathing in its rejection of a key argument often made by American defenders of the NSA: that mass surveillance is justified because Americans are given special protections (the requirement of a FISA court order for targeted surveillance) which non-Americans (95% of the world) do not enjoy. Not only does this scheme fail to render mass surveillance legal, but it itself constitutes a separate violation of international treaties (emphasis added): The Special Rapporteur concurs with the High Commissioner for Human Rights that where States penetrate infrastructure located outside their territorial jurisdiction, they remain bound by their obligations under the Covenant. Moreover, article 26 of the Covenant prohibits discrimination on grounds of, inter alia, nationality and citizenship. The Special Rapporteur thus considers that States are legally obliged to afford the same privacy protection for nationals and non-nationals and for those within and outside their jurisdiction. Asymmetrical privacy protection regimes are a clear violation of the requirements of the Covenant.
  • That principle — that the right of internet privacy belongs to all individuals, not just Americans — was invoked by NSA whistleblower Edward Snowden when he explained in a June 2013 interview at The Guardian why he disclosed documents showing global surveillance rather than just the surveillance of Americans: “More fundamentally, the ‘US Persons’ protection in general is a distraction from the power and danger of this system. Suspicionless surveillance does not become okay simply because it’s only victimizing 95% of the world instead of 100%.” The U.N. Rapporteur was clear that these systematic privacy violations are the result of a union between governments and tech corporations: “States increasingly rely on the private sector to facilitate digital surveillance. This is not confined to the enactment of mandatory data retention legislation. Corporates [sic] have also been directly complicit in operationalizing bulk access technology through the design of communications infrastructure that facilitates mass surveillance.”
  • The latest finding adds to the growing number of international formal rulings that the mass surveillance programs of the U.S. and its partners are illegal. In January, the European parliament’s civil liberties committee condemned such programs in “the strongest possible terms.” In April, the European Court of Justice ruled that European legislation on data retention contravened EU privacy rights. A top secret memo from the GCHQ, published last year by The Guardian, explicitly stated that one key reason for concealing these programs was fear of a “damaging public debate” and specifically “legal challenges against the current regime.” The report ended with a call for far greater transparency along with new protections for privacy in the digital age. Continuation of the status quo, it warned, imposes “a risk that systematic interference with the security of digital communications will continue to proliferate without any serious consideration being given to the implications of the wholesale abandonment of the right to online privacy.” The urgency of these reforms is underscored, explained the Rapporteur, by a conclusion of the United States Privacy and Civil Liberties Oversight Board that “permitting the government to routinely collect the calling records of the entire nation fundamentally shifts the balance of power between the state and its citizens.”
Paul Merrell

USA, USA, USA: America's 4G Network Is Ranked 62nd 'Best' In The World (Behind Macedoni... - 0 views

  • The United States takes pride in being a technological leader in the world. Companies such as Apple, Alphabet, IBM, Amazon and Microsoft have shaped our (digital) lives for many years and there is little indication of that changing anytime soon. But, as Statista's Felix Richter notes, when it comes to IT infrastructure however, the U.S. is lagging behind the world’s best (and many of its not-so-best), be it in terms of home broadband or wireless broadband speeds. According to OpenSignal's latest State of LTE report, the average 4G download speed in the United States was 16.31 Mbps in Q4 2017.
  • That’s little more than a third of the speed that mobile device users in Singapore enjoy and ranks the U.S. at a disappointing 62nd place in the global ranking.
  • While U.S. mobile networks appear to lack in speed, they are on par with the best in terms of 4G availability. According to OpenSignal's findings, LTE was available to U.S. smartphone users 90 percent of the time, putting the United States in fifth place.
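To put the quoted 16.31 Mbps in concrete terms, a small worked example (the file size is our own choice, not OpenSignal's):

```python
def download_seconds(size_bytes, mbps):
    """Seconds to transfer size_bytes at a sustained rate of mbps megabits/s."""
    return size_bytes * 8 / (mbps * 10**6)

US_AVG_MBPS = 16.31  # OpenSignal's Q4 2017 U.S. average, quoted above

one_gb = 10**9
secs = download_seconds(one_gb, US_AVG_MBPS)
print(f"1 GB at the U.S. average: {secs / 60:.1f} minutes")  # about 8.2 minutes

# The article puts Singapore at roughly three times the U.S. speed, so the
# same transfer there would take about a third as long.
```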
Gary Edwards

How To Win The Cloud Wars - Forbes - 0 views

  •  
    Byron Deeter is right, but perhaps he's holding back on his reasoning.  Silicon Valley is all about platform, and platform plays only come about once every ten to twenty years.  They come like great waves of change, not replacing the previous waves as much as taking away and running with the future.   Cloud Computing is the fourth great wave.  It will replace the PC and Network Computing waves as the future.  It is the target of all developers and entrepreneurs.   The four great waves are mainframe, workstation, pc and networked pc, and the Internet.  Cloud Computing takes the Internet to such a high level of functionality that it will now replace the pc-networking wave.  It's going to be enormous.  Especially as enterprises move their business productivity and data / content apps from the desktop/workgroup to the Cloud.  Enormous. The key was the perfect storm of 2008, where mobility (iPhone) converged with the standardization of tagged PDF, which converged with the Cloud Computing application and data model, which all happened at the time of the great financial collapse.   The financial collapse of 2008 caused a tectonic shift in productivity.  Survival meant doing more with less.  Particularly less labor, since the cost of labor was and continues to be a great uncertainty.  But that's also the definition of productivity and automation.  To survive, companies were compelled to reduce labor and invest in software/hardware systems based productivity.  The great leap to a new platform had its fuel: survival. Social applications and services are just the simplest manifestation of productivity through managed connectivity in the Cloud.  Wait until this new breed of productivity reaches business apps!  The platform wars have begun, and it's for all the marbles. One last thought.  The Internet was always going to win as the next computing platform wave.  It's the first time communications have been combined and integrated into content, and vast dat
Paul Merrell

LocalOrg: Decentralizing Telecom - 0 views

  • SOPA, ACTA, the criminalization of sharing, and a myriad of other measures taken to perpetuate antiquated business models propping up enduring monopolies - all have become increasingly taxing on the tech community and informed citizens alike. When the storm clouds gather and torrential rain begins to fall, the people have managed to stave off the flood waters through collective effort and well organized activism - stopping, or at least delaying SOPA and ACTA. However, is it really sustainable to mobilize each and every time multi-billion dollar corporations combine their resources and attempt to pass another series of draconian rules and regulations? Instead of manning the sandbags during each storm, wouldn't it suit us all better to transform the surrounding landscape in such a way as to harmlessly divert the floods, or better yet, harness them to our advantage? In many ways the transformation has already begun.
  • While open source software and hardware, as well as innovative business models built around collaboration and crowd-sourcing have done much to build a paradigm independent of current centralized proprietary business models, large centralized corporations and the governments that do their bidding, still guard all the doors and carry all the keys. The Internet, the phone networks, radio waves, and satellite systems still remain firmly in the hands of big business. As long as they do, they retain the ability to not only reassert themselves in areas where gains have been made, but can impose preemptive measures to prevent any future progress. With the advent of hackerspaces, increasingly we see projects that hold the potential of replacing, at least on a local level, much of the centralized infrastructure we take for granted until disasters or greed-driven rules and regulations upset the balance. It is with the further developing of our local infrastructure that we can leave behind the sandbags of perpetual activism and enjoy a permanently altered landscape that favors our peace and prosperity. Decentralizing Telecom
  • As impressive as a hydroelectric dam may be and as overwhelming as it may seem as a project to undertake, it will always start with but a single shovelful of dirt. The work required becomes in its own way part of the payoff - with experience gained and with a magnificent accomplishment to aspire toward. In the same way, a communication network that runs parallel to existing networks, with global coverage, but locally controlled, may seem an impossible, overwhelming objective - and for one individual, or even a small group of individuals, it is. However, the paradigm has shifted. In the age of digital collaboration made possible by existing networks, the building of such a network can be done in parallel. In an act of digital judo, we can use the system's infrastructure as a means of supplanting and replacing it with something superior in both function and in form.
Paul Merrell

The BRICS "Independent Internet" Cable. In Defiance of the "US-Centric Internet" | Glob... - 0 views

  • The President of Brazil, Dilma Rousseff, announces publicly the creation of a world internet system INDEPENDENT from the US and Britain (the “US-centric internet”). Not many understand that, while the immediate trigger for the decision (coupled with the cancellation of a summit with the US president) was the revelations on NSA spying, the reason why Rousseff can take such a historic step is that the alternative infrastructure, the BRICS cable from Vladivostok, Russia to Shantou, China to Chennai, India to Cape Town, South Africa to Fortaleza, Brazil, is being built and is, actually, in its final phase of implementation. No amount of provocation and attempted “Springs” destabilizations and Color Revolutions in the Middle East, Russia or Brazil can stop this process.  The huge submerged part of the BRICS plan is not yet known by the broader public.
  • Nonetheless it is very real and extremely effective. So real that international investors are now jumping with both feet on this unprecedented real economy opportunity. The change… has already happened. Brazil plans to divorce itself from the U.S.-centric Internet over Washington’s widespread online spying, a move that many experts fear will be a potentially dangerous first step toward politically fracturing a global network built with minimal interference by governments. President Dilma Rousseff has ordered a series of measures aimed at greater Brazilian online independence and security following revelations that the U.S. National Security Agency intercepted her communications, hacked into the state-owned Petrobras oil company’s network and spied on Brazilians who entrusted their personal data to U.S. tech companies such as Facebook and Google.
  • BRICS Cable… a 34,000 km, 2 fibre pair, 12.8 Tbit/s capacity, fibre optic cable system. For any global investor, there is no crisis – there is plenty of growth; it's just not in the old world. BRICS is ~45% of the world’s population and ~25% of the world’s GDP. BRICS together create an economy the size of Italy every year… that’s the 8th largest economy in the world. The BRICS presents profound opportunities in global geopolitics and commerce. The cable links Russia, China, India, South Africa, Brazil – the BRICS economies – and the United States. It will interconnect with regional and other continental cable systems in Asia, Africa and South America for improved global coverage, and will give immediate access to 21 African countries while giving those African countries access to the BRICS economies. Projected ready-for-service date is mid to second half of 2015.
  •  
    Undoubtedly, construction was under way well before the Edward Snowden leaked documents began to be published. But that did give the new BRICS Cable an excellent hook for the announcement. With 12.8 Tbps throughput, it looks like this may divert considerable traffic now routed through the UK. But it still connects with the U.S., in Miami. 
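The cable specifications quoted above lend themselves to a quick back-of-envelope check. The sketch below uses the article's 34,000 km and 12.8 Tbit/s figures; the ~200,000 km/s signal speed in glass fibre is a standard rule of thumb, not a number from the article, and it assumes the full rated capacity is usable end to end.

```python
# Back-of-envelope figures for the BRICS cable described above.
# Assumptions (ours, not the article's): signals travel ~200,000 km/s
# in optical fibre, and the full rated capacity is usable end to end.
CABLE_KM = 34_000
CAPACITY_TBPS = 12.8
FIBRE_KM_PER_S = 200_000  # roughly 2/3 the speed of light in vacuum

one_way_latency_s = CABLE_KM / FIBRE_KM_PER_S
gigabytes_per_second = CAPACITY_TBPS * 1e12 / 8 / 1e9

print(f"Full-route one-way latency: ~{one_way_latency_s:.2f} s")
print(f"Throughput at rated capacity: ~{gigabytes_per_second:,.0f} GB/s")
```

In practice, traffic between two adjacent landing points traverses only a fraction of the route, so real latencies would be far lower than the full-route figure.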
Paul Merrell

Europe and Japan Aiming to Build 100Gbps Fibre Optic Internet - ISPreview UK - 0 views

  • The European Commission (EC) and Japan have announced the launch of six joint research projects, supported by £15.3m+ (€18m) in funding, that aim to build networks which are “5000 times faster than today’s average European broadband ISP speed (100Gbps compared to 19.7Mbps)“. The telecoms experts among you will know that 100Gbps+ (Gigabits per second) fibre optic links are nothing new but most of these are major submarine or national cable links. The new effort appears to be looking further ahead, with a view to improving the efficiency of such networks and perhaps even bringing them closer to homes. It’s frequently noted that demand for data is putting a growing strain on broadband connections (the EU expects data traffic to grow 12-fold by 2018), which is partly fuelled by ever faster fixed line ISP and mobile broadband connectivity. But technology is always evolving to keep pace.
  • A quick glance at each of the projects reveals that this seems to be more about improving what already exists, yet in some circles even 100Gbps is beginning to look old-hat. Nevertheless, many of the improvements mentioned above will, if ever adopted, eventually filter down to benefit everybody. After all, several UK ISPs are already offering 1Gbps home connections (e.g. Hyperoptic, CityFibre / Fibreband in Bournemouth, Gigaclear etc.) and that’s only 100-fold slower than a 100Gbps link. In the realm of evolving internet access services that’s only a short hop, unless your infrastructure is still limited by a copper last mile. But there’s little point in having a 100Gbps link (don’t worry, we won’t see this in homes for a fair few years) if the ISP can’t supply the capacity for it, and that’s another part of the new effort. It’s important to stress that this is not about tackling today’s needs; it’s all about the future. Not so long ago we were still stuck on 50Kbps dialup.
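The headline "5000 times faster" ratio quoted above is easy to verify from the two speeds the article itself gives (the arithmetic below is ours, not the EC's):

```python
# Sanity check of the "5000 times faster" figure quoted above.
research_bps = 100e9  # 100 Gbps research-network target
average_bps = 19.7e6  # quoted average European broadband speed

ratio = research_bps / average_bps
print(f"100 Gbps is ~{ratio:,.0f}x the 19.7 Mbps average")
```

So "5000 times" is a slight round-down; the exact ratio of the quoted figures is closer to 5,076.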
Gary Edwards

Ericom Launches Pure HTML5 RDP Client -- Campus Technology - 0 views

  •  
    Wow!  This reads like a premature press release, but if true it's breakthru technology.  I wonder though why Ericom is targeting education?  Seems this innovation would be of immediate importance to enterprise and SMB businesses struggling with the great transition from desktop/workgroup productivity systems to Web Productivity Platforms. excerpt: Ericom has released AccessNow, a pure HTML5 remote desktop (RDP) client that runs within a Web browser without the need to install anything on the client device. AccessNow provides accelerated remote access to applications and desktops running on Windows Terminal Services, remote desktop services (RDS), and virtual desktop infrastructure (VDI), including applications, remote desktops, VMware View desktops, virtual desktops running on Microsoft Hyper-V, and other hypervisors. AccessNow works on any device with an HTML5-capable browser, such as Chrome, Safari, Firefox, Opera, and others, without the use of browser plugins, Java, Flash, ActiveX, Silverlight, or other underlying technology. Internet Explorer is also supported, although it does require the Chrome Frame plugin. AccessNow uses only the standard Web technologies: HTML, CSS, and JavaScript. This approach helps IT administrators maintain centralized control of school resources. It also enables students and staff to use any Internet-enabled device, including smartphones, tablets, and Chromebooks, to do their work anywhere and anytime.
Paul Merrell

Russia gears up to build its own 'independent internet' | The Times of Israel - 0 views

  • The Russian government is reportedly considering building an “independent internet infrastructure” that it can use as an alternative to the global Domain Name System, or DNS system. Last month, Russia’s Security Council asked the government to start building a backup DNS system citing “the increased capabilities of Western nations to conduct offensive operations.”
  • However, some defense experts say the move could “have more to do with Moscow’s own plans for offensive cyber operations,” according to the Defense One website. The alternative DNS would also serve the BRICS nations — Brazil, Russia, India, China, and South Africa — and would operate independently of international organizations.
  • Russian president Vladimir Putin set a deadline of August 2018 to complete the infrastructure.
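Whatever infrastructure Russia builds, an "alternative DNS" would still speak the same wire protocol; only which servers get consulted changes. As an illustrative sketch (ours, not anything from the article), the standard-library code below builds an ordinary RFC 1035 A-record query; the same packet can be sent over UDP port 53 to any resolver, and which root that resolver chains up to is purely a configuration choice. Real resolvers add EDNS, retries, and DNSSEC on top of this.

```python
import struct

def build_dns_query(name: str, txid: int = 0x1234) -> bytes:
    """Build a standard RFC 1035 A-record query in DNS wire format."""
    # Header: id, flags (RD=1, recursion desired), QDCOUNT=1, rest zero
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # QNAME: length-prefixed labels, terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in name.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

packet = build_dns_query("example.com")
# Sending `packet` to 8.8.8.8:53 or to a hypothetical alternative-root
# resolver is the same operation; the protocol itself does not change.
```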
Paul Merrell

WG Review: Internet Wideband Audio Codec (codec) - 0 views

  •  
    A new IETF working group has been proposed in the Real-time Applications and Infrastructure Area. The IESG has not made any determination as yet. The following draft charter was submitted, and is provided for informational purposes only. Please send your comments to the IESG mailing list (iesg at ietf.org) by January 20, 2010. ... According to reports from developers of Internet audio applications and operators of Internet audio services, there are no standardized, high-quality audio codecs that meet all of the following three conditions:
    1. Are optimized for use in interactive Internet applications.
    2. Are published by a recognized standards development organization (SDO) and therefore subject to clear change control.
    3. Can be widely implemented and easily distributed among application developers, service operators, and end users.
    ... The goal of this working group is to develop a single high-quality audio codec that is optimized for use over the Internet and that can be widely implemented and easily distributed among application developers, service operators, and end users. Core technical considerations include, but are not necessarily limited to, the following:
    1. Designing for use in interactive applications (examples include, but are not limited to, point-to-point voice calls, multi-party voice conferencing, telepresence, teleoperation, in-game voice chat, and live music performance)
    2. Addressing the real transport conditions of the Internet as identified and prioritized by the working group
    3. Ensuring interoperability with the Real-time Transport Protocol (RTP), including secure transport via SRTP
    4. Ensuring interoperability with Internet signaling technologies such as Session Initiation Protocol (SIP), Session Description Protocol (SDP), and Extensible Messaging and Presence Protocol (XMPP); however, the result should not depend on the details of any particular signaling technology.
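The charter's emphasis on interactive use comes down to latency budgets: a codec cannot emit a frame until it has buffered that frame's worth of samples, so frame length and look-ahead add directly to mouth-to-ear delay. A small sketch of that arithmetic, using values typical of interactive wideband codecs (the 20 ms / 48 kHz / 5 ms numbers are illustrative, not from the charter):

```python
def frame_samples(sample_rate_hz: int, frame_ms: float) -> int:
    """Samples a codec must buffer before it can emit one frame."""
    return int(sample_rate_hz * frame_ms / 1000)

# Illustrative interactive settings: 20 ms frames at 48 kHz with
# 5 ms of encoder look-ahead (typical values, not from the charter).
samples = frame_samples(48_000, 20)   # samples buffered per frame
algorithmic_delay_ms = 20 + 5         # frame length + look-ahead

print(samples, "samples/frame,", algorithmic_delay_ms, "ms codec delay")
```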
Paul Merrell

Rural America and the 5G Digital Divide. Telecoms Expanding Their "Toxic Infrastructure... - 0 views

  • While there is considerable telecom hubris regarding the 5G rollout and increasing speculation that the next generation of wireless is not yet ready for Prime Time, the industry continues to make promises to Rural America that it has no intention of fulfilling. Decades-long promises to deliver digital Utopia to rural America by T-Mobile, Verizon and AT&T have never materialized.  
  • In 2017, the USDA reported that 29% of American farms had no internet access. The FCC says that 14 million rural Americans and 1.2 million Americans living on tribal lands do not have 4G LTE on their phones, and that 30 million rural residents do not have broadband service, compared to 2% of urban residents. It’s beginning to sound like a Third World country. Despite an FCC $4.5 billion annual subsidy to carriers to provide broadband service in rural areas, the FCC reports that “over 24 million Americans do not have access to high-speed internet service, the bulk of them in rural areas,” while a Microsoft study found that “162 million people across the US do not have internet service at broadband speeds.” At the same time, only three cable companies have access to 70% of the market in a sweetheart deal to hike rates as they avoid competition and the FCC looks the other way. The FCC believes that it would cost $40 billion to bring broadband access to 98% of the country, with expansion in rural America even more expensive. While the FCC has pledged a $2 billion, ten-year plan to identify rural wireless locations, only 4 million rural American businesses and homes will be targeted, a mere drop in the bucket. Which brings us to rural mapping: since the advent of the digital age, there have been no accurate maps identifying where broadband service is available in rural America and where it is not. The FCC has a long history of promulgating unreliable and unverified carrier-provided numbers, as the Commission has repeatedly “bungled efforts to produce accurate broadband maps” that would have facilitated rural coverage. During the Senate Commerce Committee hearing on April 10th regarding broadband mapping, critical testimony questioned whether the FCC and/or the telecom industry have either the commitment or the proficiency to provide 5G to rural America.
Members of the Committee shared concerns that 5G might put rural America so far behind the curve that it never catches up with the rest of the country.
Paul Merrell

Can Dweb Save The Internet? 06/03/2019 - 0 views

  • On a mysterious farm just above the Pacific Ocean, the group who built the internet is inviting a small number of friends to a semi-secret gathering. They describe it as a camp “where diverse people can freely exchange ideas about the technologies, laws, markets, and agreements we need to move forward.” Forward indeed. It wasn’t that long ago that the internet was an open network of computers, blogs, sites, and posts. But then something happened -- and the open web was taken over by private, for-profit, closed networks. Facebook isn’t the web. YouTube isn’t the web. Google isn’t the web. They’re for-profit businesses that are looking to sell audiences to advertisers. Brewster Kahle is one of the early web innovators who built the Internet Archive as a public storehouse to protect the web’s history. Along with web luminaries such as Sir Tim Berners-Lee and Vint Cerf, he is working to protect and rebuild the open nature of the web. “We demonstrated that the web had failed instead of served humanity, as it was supposed to have done,” Berners-Lee told Vanity Fair. The web has “ended up producing -- [through] no deliberate action of the people who designed the platform -- a large-scale emergent phenomenon which is anti-human.”
  • So, they’re out to fix it, working on what they call the Dweb. The “d” in Dweb stands for distributed. In distributed systems, no one entity has control over the participation of any other entity. Berners-Lee is building a platform called Solid, designed to give people control over their own data. Other global projects also have the goal of taking back the public web. Mastodon is decentralized Twitter. Peertube is a decentralized alternative to YouTube. This July 18 - 21, web activists plan to convene at the Decentralized Web Summit in San Francisco. Back in 2016, Kahle convened an early group of builders, archivists, policymakers, and journalists. He issued a challenge to use decentralized technologies to “Lock the Web Open.” It’s hard to imagine he knew then how quickly the web would become a closed network. Last year's Dweb gathering convened more than 900 developers, activists, artists, researchers, lawyers, and students. Kahle opened the gathering by reminding attendees that the web used to be a place where everyone could play. “Today, I no longer feel like a player, I feel like I’m being played. Let’s build a decentralized web, let’s build a system we can depend on, a system that doesn’t feel creepy,” he said, according to IEEE Spectrum. With the rising tide of concerns about how social networks have hacked our democracy, Kahle and his Dweb community will gather with increasing urgency around their mission. The internet began with an idealist mission to connect people and information for good. Today's web has yet to achieve that goal, but just maybe Dweb will build an internet more robust and open than the current infrastructure allows. That’s a mission worth fighting for.
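A recurring building block in Dweb projects (IPFS is the best-known example) is content addressing: naming a resource by a hash of its bytes, so that no single host owns the name and any node can serve it. A minimal sketch of the idea follows; the `sha256-` prefix is our own illustrative format, not any project's actual address scheme.

```python
import hashlib

def content_address(data: bytes) -> str:
    """Name a resource by what it is, not by where it is hosted."""
    return "sha256-" + hashlib.sha256(data).hexdigest()

page = b"<h1>Hello, open web</h1>"
addr = content_address(page)

# Any node holding bytes that hash to `addr` can serve the page, and
# any client can verify integrity locally; no central host is needed.
assert content_address(page) == addr           # deterministic
assert content_address(b"tampered") != addr    # tamper-evident
```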
Paul Merrell

EU Officials Propose Internet Cops On Patrol, No Anonymity & No Obscure Languages (Beca... - 0 views

  • Back in February we wrote about the ominously named "Clean IT" project in Europe, designed to combat the use of the Internet by terrorists. At that time, we suspected that this would produce some seriously bad ideas, but a leaked document obtained by EDRI shows that these are actually much worse than feared (pdf), amounting to a system of continuous surveillance, extrajudicial removal of content and some new proposals that can only be described as deranged.
  • And where there are laws, it must be OK for law enforcement agencies (LEAs) to ignore them and have content taken down on demand: It must be legal for LEAs to make Internet companies aware of terrorist content on their infrastructure ('flagging') that should be removed, without following the more labour intensive and formal procedures for 'notice and take action'
  • Social media companies must allow only real pictures of users Presumably you're not allowed to smile, either. Talking of social media, the Clean IT plans include the introduction of friendly "virtual police officers", constantly spying on, er, watching over Europeans online: Virtual police officers must be used to show law enforcement is present, is watchful, in order to prevent terrorist use of the Internet and make regular users feel more secure. The idea is that "virtual police officers" will be keeping an eye on you -- for your own safety, you understand. Other ways in which users will be protected from themselves is through the use of filters:
  • ...2 more annotations...
  • Among the even more interesting proposals in the leaked document is the idea that the authorities can order encryption to be turned off, presumably to allow eavesdropping: In some cases notice and take action procedures must lead to security certificates of sites to be downgraded.
  • The use of platforms in languages abuse specialists or abuse systems do not master should be unacceptable and preferably technically impossible. Incredible though it might sound, that seems to suggest that less common foreign languages would be banned from the European Internet entirely in case anybody discusses naughty stuff without the authorities being able to spy on them (haven't they heard of Google Translate?) You could hardly hope for a better symbol of the paranoid and xenophobic thinking that lies behind this crazy scheme.
Paul Merrell

It's Time to Nationalize the Internet - 0 views

  • Such profiteering tactics have disproportionately affected low-income and rural communities. ISPs have long redlined these demographic groups, creating what’s commonly known as the “digital divide.” Thirty-nine percent of Americans lack access to service fast enough to meet the federal definition of broadband. More than 50 percent of adults with household incomes below $30,000 have home broadband—a problem plaguing users of color most acutely. In contrast, internet access is near-universal for households with an annual income of $100,000 or more. The reason for such chasms is simple: Private network providers prioritize only those they expect to provide a return on investment, thus excluding poor and sparsely populated areas.
  • Chattanooga, Tennessee, has seen more success in addressing redlining. Since 2010, the city has offered public broadband via its municipal power organization, Electric Power Board (EPB). The project has become a rousing success: At half the price, its service is approximately 85 percent faster than that of Comcast, the region’s primary ISP prior to EPB’s inception. Coupled with a discounted program for low-income residents, Chattanooga’s publicly run broadband reaches about 82,000 residents—more than half of the area’s Internet users—and is only expected to grow. Chattanooga’s achievements have radiated to other locales. More than 450 communities have introduced publicly-owned broadband. And more than 110 communities in 24 states have access to publicly owned networks with one gigabit-per-second (Gbps) service. (AT&T, for example, has yet to introduce speeds this high.) Seattle City Councilmember Kshama Sawant proposed a pilot project in 2015 and has recently urged her city to invest in municipal broadband. Hawaii congressperson Kaniela Ing is drafting a bill for publicly-owned Internet for the state legislature to consider next year. In November, residents of Fort Collins, Colo. voted to authorize the city to build municipal broadband infrastructure.
Gary Edwards

Office to finally fully support ODF, Open XML, and PDF formats | ZDNet - 0 views

  •  
    The king of clicks returns!  No doubt there was a time when the mere mention of ODF and the now legendary XML "document" format wars with Microsoft could drive click counts into the stratosphere.  Sorry to say though, those times are long gone. It's still a good story though.  Even if the fate of mankind and the future of the Internet no longer hinges on the outcome.  There is that question that continues to defy answer; "Did Microsoft win or lose?"  So the mere announcement of supported formats in MSOffice XX is guaranteed to rev the clicks somewhat. Veteran ODF clickmeister SVN does make an interesting observation though: "The ironic thing is that, while this was as hotly debated an issue in the mid-2000s as are mobile patents and cloud implementation today, this news was barely noticed. That's a mistake. Updegrove points out, "document interoperability and vendor neutrality matter more now than ever before as paper archives disappear and literally all of human knowledge is entrusted to electronic storage." He concluded, "Only if documents can be easily exchanged and reliably accessed on an ongoing basis will competition in the present be preserved, and the availability of knowledge down through the ages be assured. Without robust, universally adopted document formats, both of those goals will be impossible to attain." Updegrove's right of course. Don't believe me? Go into your office's archives and try to bring up documents you wrote in the 90s in WordPerfect or papers your staff created in the 80s with WordStar. If you don't want to lose your institutional memory, open document standards support is more important than ever." Sorry but Updegrove is wrong.  Woefully wrong. The Web is the future.  Sure interoperability matters, but only as far as the Web and the future of Cloud Computing is concerned.  Sadly neither ODF nor Open XML are Web ready.  The language of the Web is famously HTML, now HTML5+