
Future of the Web: Group items tagged "achievements"


Paul Merrell

Carakan - By Opera Core Concerns - 0 views

  • Over the past few months, a small team of developers and testers has been working on implementing a new ECMAScript/JavaScript engine for Opera. When Opera's current ECMAScript engine, called Futhark, was first released in a public version, it was the fastest engine on the market. That engine was developed to minimize code footprint and memory usage rather than to achieve maximum execution speed, which has traditionally been the right trade-off on many of the platforms Opera runs on. The Web is a changing environment, however, and tomorrow's advanced web applications will require faster ECMAScript execution, so we have now taken on the challenge to once again develop the fastest ECMAScript engine on the market.
  • So how fast is Carakan? Using a regular cross-platform switch dispatch mechanism (without any generated native code), Carakan is currently about two and a half times faster on the SunSpider benchmark than the ECMAScript engine in Presto 2.2 (Opera 10 Alpha). Since Opera is ported to many different hardware architectures, this cross-platform improvement is on its own very important. The native code generation in Carakan is not yet ready for full-scale testing, but the few individual benchmark tests it is already compatible with run between 5 and 50 times faster, so it is looking promising so far.
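The "switch dispatch" technique the post mentions can be sketched as a portable bytecode interpreter loop: each opcode is matched to a handler written in ordinary host-language code, with no platform-specific generated machine code. This is an illustrative toy in Python, not Carakan's actual implementation; the opcode names and instruction format are invented for the example.

```python
def run(bytecode):
    """Interpret a tiny stack-based bytecode given as (op, arg) tuples.

    Dispatch is a plain conditional chain, so the interpreter runs
    unchanged on any architecture -- the trade-off the post describes
    versus faster but platform-specific native code generation.
    """
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":          # push a constant operand
            stack.append(arg)
        elif op == "ADD":         # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":         # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack.pop()

# Evaluate (2 + 3) * 4
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 4), ("MUL", None)]
print(run(program))  # 20
```

A native-code backend would instead translate each opcode sequence into machine instructions once, eliminating the per-instruction dispatch overhead, which is where the 5x to 50x figures come from.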
Gary Edwards

Can Cloud Computing Achieve Interoperable Platforms? - 0 views

  • the fact is that today if a customer has heavily invested in either platform then there isn't a straightforward way for customers to extricate themselves from the platform and switch to another vendor. In addition there is not a competitive marketplace of vendors providing standard/interoperable platforms as there are with email hosting or Web hosting providers.
  •  
    Response from Microsoft's Dare Obasanjo to Tim Bray's blog post "Get in the Cloud": "When it comes to cloud computing platforms, you have all of the same problems described above and a few extra ones. The key wrinkle with cloud computing platforms is that there is no standardization of the APIs and platform technologies that underlie these services. The APIs provided by Amazon's cloud computing platform (EC2/S3/EBS/etc) are radically different from those provided by Google App Engine (Datastore API/Python runtime/Images API/etc). For zero lock-in to occur in this space, there need to be multiple providers of the same underlying APIs. Otherwise, migrating between cloud computing platforms will be more like switching your application from Ruby on Rails and MySQL to Django and PostgreSQL (i.e. a complete rewrite)...."
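One common way applications mitigate the API lock-in Obasanjo describes is to hide each vendor's storage API behind a small interface of their own, so a migration means writing one new adapter rather than rewriting the application. The sketch below is illustrative: the `BlobStore` interface and its method names are hypothetical, and the in-memory backend stands in for real adapters that would wrap, say, S3 or the App Engine Datastore.

```python
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """Vendor-neutral interface the application codes against."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(BlobStore):
    """Stand-in backend; a real adapter would wrap one vendor's API."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

# Application code only sees BlobStore; swapping vendors means
# swapping the concrete class passed in here.
store: BlobStore = InMemoryStore()
store.put("greeting", b"hello")
print(store.get("greeting"))  # b'hello'
```

This reduces, but does not eliminate, the switching cost: features unique to one platform (e.g. consistency guarantees or query capabilities) still leak through any abstraction.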
Gary Edwards

Is the Apps Marketplace just playing catchup to Microsoft? | Googling Google | ZDNet.com - 2 views

shared by Gary Edwards on 12 Mar 10 - Cached
  • Take the basic communication, calendaring, and documentation enabled for free by Apps Standard Edition, add a few slick applications from the Marketplace and the sky was the limit. Or at least the clouds were.
  • He also pointed to all of the applications in the Marketplace that seek to improve connectivity with Microsoft Office.
  • Payne predicted that we would see Microsoft accelerate ahead of Google in terms of productivity in the cloud while Google is still trying to achieve more desktop-application-style fidelity.
  • ...2 more annotations...
  • Where Payne saw the Apps Marketplace as a validation of Microsoft’s dominance in the enterprise, Chris Vander Mey saw it as a validation of the open Web (and Google’s applications) as a powerful platform for development.
  • Microsoft is pushing hard to extend its productivity and enterprise dominance into the cloud while Google is looking to leverage its extensive, native web platforms.
Matteo Spreafico

Knowledge Genes® - Welcome Page - 4 views

  • Better knowledge, process and control coded WHAT, HOW and WHY
  • Better information in terms of WHAT you need to achieve, HOW and WHY
Paul Merrell

Remaining Snowden docs will be released to avert 'unspecified US war' - ‪Cryp... - 1 views

  • All the remaining Snowden documents will be released next month, according to whistle-blowing site Cryptome, which said in a tweet that the release of the info by unnamed third parties would be necessary to head off an unnamed "war". Cryptome said it would "aid and abet" the release of "57K to 1.7M" new documents that had been "withheld for national security-public debate [sic]". The site clarified that it will not be publishing the documents itself. Transparency activists would welcome such a release, but such a move would be heavily criticised by intelligence agencies and military officials, who argue that Snowden's dump of secret documents has set US and allied (especially British) intelligence efforts back by years.
  • As things stand, the flow of Snowden disclosures is controlled by those who have access to the Snowden archive, which might possibly include Snowden confidants such as Glenn Greenwald and Laura Poitras. In some cases, even when these people release information to mainstream media organisations, it is then suppressed by these organisations after negotiation with the authorities. (In one such case, some key facts were later revealed by the Register.) "July is when war begins unless headed off by Snowden full release of crippling intel. After war begins not a chance of release," Cryptome tweeted on its official feed. "Warmongerers are on a rampage. So, yes, citizens holding Snowden docs will do the right thing," it said.
  • "For more on Snowden docs release in July watch for Ellsberg, special guest and others at HOPE, July 18-20: http://www.hope.net/schedule.html," it added. HOPE (Hackers On Planet Earth) is a well-regarded and long-running hacking conference organised by 2600 magazine. Previous speakers at the event have included Kevin Mitnick, Steve Wozniak and Jello Biafra. In other developments, Cryptome has started a Kickstarter fund to release its entire archive in the form of a USB stick archive. It wants to raise $100,000 to help it achieve its goal. More than $14,000 has already been raised. The funding drive follows a dispute between Cryptome and its host Network Solutions, which is owned by web.com. Access to the site was blocked following a malware infection last week. Cryptome founder John Young criticised the host, claiming it had over-reacted and had been slow to restore access to the site, which Cryptome criticised as a form of censorship. In response, Cryptome plans to more widely distribute its content across multiple sites as well as releasing the planned USB stick archive.
  •  
    Can't happen soon enough. 
Paul Merrell

Microsoft to host data in Germany to evade US spying | Naked Security - 0 views

  • Microsoft's new plan to keep the US government's hands off its customers' data: Germany will be a safe harbor in the digital privacy storm. Microsoft on Wednesday announced that beginning in the second half of 2016, it will give foreign customers the option of keeping data in new European facilities that, at least in theory, should shield customers from US government surveillance. It will cost more, according to the Financial Times, though pricing details weren't forthcoming. Microsoft Cloud - including Azure, Office 365 and Dynamics CRM Online - will be hosted from new datacenters in the German regions of Magdeburg and Frankfurt am Main. Access to data will be controlled by what the company called a German data trustee: T-Systems, a subsidiary of the independent German company Deutsche Telekom. Without the permission of Deutsche Telekom or customers, Microsoft won't be able to get its hands on the data. If it does get permission, the trustee will still control and oversee Microsoft's access.
  • Microsoft CEO Satya Nadella dropped the word "trust" into the company's statement: Microsoft's mission is to empower every person and every organization on the planet to achieve more. Our new datacenter regions in Germany, operated in partnership with Deutsche Telekom, will not only spur local innovation and growth, but offer customers choice and trust in how their data is handled and where it is stored.
  • On Tuesday, at the Future Decoded conference in London, Nadella also announced that Microsoft would, for the first time, be opening two UK datacenters next year. The company's also expanding its existing operations in Ireland and the Netherlands. Officially, none of this has anything to do with the long-drawn-out squabbling over the transatlantic Safe Harbor agreement, which the EU's highest court struck down last month, calling the agreement "invalid" because it didn't protect data from US surveillance. No, Nadella said, the new datacenters and expansions are all about giving local businesses and organizations "transformative technology they need to seize new global growth." But as Diginomica reports, Microsoft EVP of Cloud and Enterprise Scott Guthrie followed up his boss’s comments by saying that yes, the driver behind the new datacenters is to let customers keep data close: We can guarantee customers that their data will always stay in the UK. Being able to very concretely tell that story is something that I think will accelerate cloud adoption further in the UK.
  • ...2 more annotations...
  • Microsoft and T-Systems' lawyers may well think that storing customer data in a German trustee data center will protect it from the reach of US law, but for all we know, that could be wishful thinking. Forrester cloud computing analyst Paul Miller: To be sure, we must wait for the first legal challenge. And the appeal. And the counter-appeal. As with all new legal approaches, we don’t know it is watertight until it is challenged in court. Microsoft and T-Systems’ lawyers are very good and say it's watertight. But we can be sure opposition lawyers will look for all the holes. By keeping data offshore - particularly in Germany, which has strong data privacy laws - Microsoft could avoid the situation it's now facing with the US demanding access to customer emails stored on a Microsoft server in Dublin. The US has argued that Microsoft, as a US company, comes under US jurisdiction, regardless of where it keeps its data.
  • Running away to Germany isn't a groundbreaking move; other US cloud services providers have already pledged expansion of their EU presences, including Amazon's plan to open a UK datacenter in late 2016 that will offer what CTO Werner Vogels calls "strong data sovereignty to local users." Other big data operators that have followed suit: Salesforce, which has already opened datacenters in the UK and Germany and plans to open one in France next year, as well as new EU operations pledged for the new year by NetSuite and Box. Can Germany keep the US out of its datacenters? Can Ireland? Time, and court cases, will tell.
  •  
    The European Community's Court of Justice decision in the Safe Harbor case --- and Edward Snowden --- are now officially downgrading the U.S. as a cloud data center location. NSA is good business for Europeans looking to displace American cloud service providers, as evidenced by Microsoft's decision. The legal test is whether Microsoft has "possession, custody, or control" of the data. From the info given in the article, it seems that Microsoft has done its best to dodge that bullet by moving data centers to Germany and placing their data under the control of a European company. Do ownership of the hardware and profits from their rent mean that Microsoft still has "possession, custody, or control" of the data? The fine print of the agreement with Deutsche Telekom and the customer EULAs will get a thorough going over by the Dept. of Justice for evidence of Microsoft "control" of the data. That will be the crucial legal issue. The data centers in Germany may pass the test. But the notion that data centers in the UK can offer privacy is laughable; the UK's legal authority for GCHQ makes it even easier to get the data than the NSA can in the U.S.  It doesn't even require a court order. 
Paul Merrell

The De-Americanization of Internet Freedom - Lawfare - 0 views

  • Why did the internet freedom agenda fail? Goldsmith’s essay tees up, but does not fully explore, a range of explanatory hypotheses. The most straightforward have to do with unrealistic expectations and unintended consequences. The idea that a minimally regulated internet would usher in an era of global peace, prosperity, and mutual understanding, Goldsmith tells us, was always a fantasy. As a project of democracy and human rights promotion, the internet freedom agenda was premised on a wildly overoptimistic view about the capacity of information flows, on their own, to empower oppressed groups and effect social change. Embracing this market-utopian view led the United States to underinvest in cybersecurity, social media oversight, and any number of other regulatory tools. In suggesting this interpretation of where U.S. policymakers and their civil society partners went wrong, Goldsmith’s essay complements recent critiques of the neoliberal strains in the broader human rights and transparency movements. Perhaps, however, the internet freedom agenda has faltered not because it was so naïve and unrealistic, but because it was so effective at achieving its realist goals. The seeds of this alternative account can be found in Goldsmith’s concession that the commercial non-regulation principle helped companies like Apple, Google, Facebook, and Amazon grab “huge market share globally.” The internet became an increasingly valuable cash cow for U.S. firms and an increasingly potent instrument of U.S. soft power over the past two decades; foreign governments, in due course, felt compelled to fight back. If the internet freedom agenda is understood as fundamentally a national economic project, rather than an international political or moral crusade, then we might say that its remarkable early success created the conditions for its eventual failure. 
Goldsmith’s essay also points to a third set of possible explanations for the collapse of the internet freedom agenda, involving its internal contradictions. Magaziner’s notion of a completely deregulated marketplace, if taken seriously, is incoherent. As Goldsmith and Tim Wu have discussed elsewhere, it takes quite a bit of regulation for any market, including markets related to the internet, to exist and to work. And indeed, even as Magaziner proposed “complete deregulation” of the internet, he simultaneously called for new legal protections against computer fraud and copyright infringement, which were soon followed by extensive U.S. efforts to penetrate foreign networks and to militarize cyberspace. Such internal dissonance was bound to invite charges of opportunism, and to render the American agenda unstable.
Paul Merrell

Superiority in Cyberspace Will Remain Elusive - Federation Of American Scientists - 0 views

  • Military planners should not anticipate that the United States will ever dominate cyberspace, the Joint Chiefs of Staff said in a new doctrinal publication. The kind of supremacy that might be achievable in other domains is not a realistic option in cyber operations. “Permanent global cyberspace superiority is not possible due to the complexity of cyberspace,” the DoD publication said. In fact, “Even local superiority may be impractical due to the way IT [information technology] is implemented; the fact US and other national governments do not directly control large, privately owned portions of cyberspace; the broad array of state and non-state actors; the low cost of entry; and the rapid and unpredictable proliferation of technology.” Nevertheless, the military has to make do under all circumstances. “Commanders should be prepared to conduct operations under degraded conditions in cyberspace.” This sober assessment appeared in a new edition of Joint Publication 3-12, Cyberspace Operations, dated June 8, 2018. (The 100-page document updates and replaces a 70-page version from 2013.) The updated DoD doctrine presents a cyber concept of operations, describes the organization of cyber forces, outlines areas of responsibility, and defines limits on military action in cyberspace, including legal limits.
  • The new cyber doctrine reiterates the importance and the difficulty of properly attributing cyber attacks against the US to their source. “The ability to hide the sponsor and/or the threat behind a particular malicious effect in cyberspace makes it difficult to determine how, when, and where to respond,” the document said. “The design of the Internet lends itself to anonymity and, combined with applications intended to hide the identity of users, attribution will continue to be a challenge for the foreseeable future.”
Paul Merrell

The Supreme Court's Groundbreaking Privacy Victory for the Digital Age | American Civil... - 0 views

  • The Supreme Court on Friday handed down what is arguably the most consequential privacy decision of the digital age, ruling that police need a warrant before they can seize people's sensitive location information stored by cellphone companies. The case specifically concerns the privacy of cellphone location data, but the ruling has broad implications for government access to all manner of information collected about people and stored by the purveyors of popular technologies. In its decision, the court rejects the government's expansive argument that people lose their privacy rights merely by using those technologies. Carpenter v. U.S., which was argued by the ACLU, involves Timothy Carpenter, who was convicted in 2013 of a string of burglaries in Detroit. To tie Carpenter to the burglaries, FBI agents obtained — without seeking a warrant — months' worth of his location information from Carpenter's cellphone company. They got almost 13,000 data points tracking Carpenter's whereabouts during that period, revealing where he slept, when he attended church, and much more. Indeed, as Chief Justice John Roberts wrote in Friday's decision, "when the Government tracks the location of a cell phone it achieves near perfect surveillance, as if it had attached an ankle monitor to the phone's user."
  • The ACLU argued the agents had violated Carpenter’s Fourth Amendment rights when they obtained such detailed records without a warrant based on probable cause. In a decision written by Chief Justice John Roberts, the Supreme Court agreed, recognizing that the Fourth Amendment must apply to records of such unprecedented breadth and sensitivity: Mapping a cell phone’s location over the course of 127 days provides an all-encompassing record of the holder’s whereabouts. As with GPS information, the timestamped data provides an intimate window into a person’s life, revealing not only his particular movements, but through them his ‘familial, political, professional, religious, and sexual associations.’
  • The government’s argument that it needed no warrant for these records extends far beyond cellphone location information, to any data generated by modern technologies and held by private companies rather than in our own homes or pockets. To make their case, government lawyers relied on an outdated, 1970s-era legal doctrine that says that once someone shares information with a “third party” — in Carpenter’s case, a cellphone company — that data is no longer protected by the Fourth Amendment. The Supreme Court made abundantly clear that this doctrine has its limits and cannot serve as a carte blanche for the government seizure of any data of its choosing without judicial oversight.
  • ...1 more annotation...
  • While the decision extends in the immediate term only to historical cellphone location data, the Supreme Court’s reasoning opens the door to the protection of the many other kinds of data generated by popular technologies. Today’s decision provides a groundbreaking update to privacy rights that the digital age has rendered vulnerable to abuse by the government’s appetite for surveillance. It recognizes that “cell phones and the services they provide are ‘such a pervasive and insistent part of daily life’ that carrying one is indispensable to participation in modern society.” And it helps ensure that we don’t have to give up those rights if we want to participate in modern life. 
Paul Merrell

Can Dweb Save The Internet? 06/03/2019 - 0 views

  • On a mysterious farm just above the Pacific Ocean, the group who built the internet is inviting a small number of friends to a semi-secret gathering. They describe it as a camp "where diverse people can freely exchange ideas about the technologies, laws, markets, and agreements we need to move forward." Forward indeed. It wasn't that long ago that the internet was an open network of computers, blogs, sites, and posts. But then something happened, and the open web was taken over by private, for-profit, closed networks. Facebook isn't the web. YouTube isn't the web. Google isn't the web. They're for-profit businesses that are looking to sell audiences to advertisers. Brewster Kahle is one of the early web innovators who built the Internet Archive as a public storehouse to protect the web's history. Along with web luminaries such as Sir Tim Berners-Lee and Vint Cerf, he is working to protect and rebuild the open nature of the web. "We demonstrated that the web had failed instead of served humanity, as it was supposed to have done," Berners-Lee told Vanity Fair. The web has "ended up producing -- [through] no deliberate action of the people who designed the platform -- a large-scale emergent phenomenon which is anti-human."
  • So, they're out to fix it, working on what they call the Dweb. The "d" in Dweb stands for distributed. In distributed systems, no one entity has control over the participation of any other entity. Berners-Lee is building a platform called Solid, designed to give people control over their own data. Other global projects also have the goal of taking back the public web. Mastodon is decentralized Twitter. Peertube is a decentralized alternative to YouTube. This July 18 - 21, web activists plan to convene at the Decentralized Web Summit in San Francisco. Back in 2016, Kahle convened an early group of builders, archivists, policymakers, and journalists. He issued a challenge to use decentralized technologies to "Lock the Web Open." It's hard to imagine he knew then how quickly the web would become a closed network. Last year's Dweb gathering convened more than 900 developers, activists, artists, researchers, lawyers, and students. Kahle opened the gathering by reminding attendees that the web used to be a place where everyone could play. "Today, I no longer feel like a player, I feel like I'm being played. Let's build a decentralized web, let's build a system we can depend on, a system that doesn't feel creepy," he said, according to IEEE Spectrum. With the rising tide of concerns about how social networks have hacked our democracy, Kahle and his Dweb community will gather with increasing urgency around their mission. The internet began with an idealist mission to connect people and information for good. Today's web has yet to achieve that goal, but just maybe Dweb will build an internet more robust and open than the current infrastructure allows. That's a mission worth fighting for.
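A building block behind several of the decentralized projects mentioned above is content addressing: data is named by a hash of its bytes, so any participant can serve or verify it without trusting a central host. The sketch below is a minimal illustration of the idea only; real systems such as those discussed at the Decentralized Web Summit layer multihash formats, chunking, and Merkle DAGs on top of it.

```python
import hashlib

def address(data: bytes) -> str:
    """Derive a location-independent name from the content alone."""
    return hashlib.sha256(data).hexdigest()

def verify(addr: str, data: bytes) -> bool:
    """Check that bytes match their name, regardless of which peer served them."""
    return address(data) == addr

blob = b"open web"
addr = address(blob)

# The same bytes fetched from any peer verify against the same address,
# while tampered bytes fail -- no central authority is needed.
assert verify(addr, blob)
assert not verify(addr, b"tampered")
```

Because the name commits to the content, "who hosts it" and "what it is" are decoupled, which is what lets a distributed network route around any single gatekeeper.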
Paul Merrell

It's Time to Nationalize the Internet - 0 views

  • Such profiteering tactics have disproportionately affected low-income and rural communities. ISPs have long redlined these demographic groups, creating what’s commonly known as the “digital divide.” Thirty-nine percent of Americans lack access to service fast enough to meet the federal definition of broadband. More than 50 percent of adults with household incomes below $30,000 have home broadband—a problem plaguing users of color most acutely. In contrast, internet access is near-universal for households with an annual income of $100,000 or more. The reason for such chasms is simple: Private network providers prioritize only those they expect to provide a return on investment, thus excluding poor and sparsely populated areas.
  • Chattanooga, Tennessee, has seen more success in addressing redlining. Since 2010, the city has offered public broadband via its municipal power organization, Electric Power Board (EPB). The project has become a rousing success: At half the price, its service is approximately 85 percent faster than that of Comcast, the region’s primary ISP prior to EPB’s inception. Coupled with a discounted program for low-income residents, Chattanooga’s publicly run broadband reaches about 82,000 residents—more than half of the area’s Internet users—and is only expected to grow. Chattanooga’s achievements have radiated to other locales. More than 450 communities have introduced publicly-owned broadband. And more than 110 communities in 24 states have access to publicly owned networks with one gigabit-per-second (Gbps) service. (AT&T, for example, has yet to introduce speeds this high.) Seattle City Councilmember Kshama Sawant proposed a pilot project in 2015 and has recently urged her city to invest in municipal broadband. Hawaii congressperson Kaniela Ing is drafting a bill for publicly-owned Internet for the state legislature to consider next year. In November, residents of Fort Collins, Colo. voted to authorize the city to build municipal broadband infrastructure.
Paul Merrell

Homepage - Contract for the Web - 0 views

  • The Web was designed to bring people together and make knowledge freely available. It has changed the world for good and improved the lives of billions. Yet, many people are still unable to access its benefits and, for others, the Web comes with too many unacceptable costs. Everyone has a role to play in safeguarding the future of the Web. The Contract for the Web was created by representatives from over 80 organizations, representing governments, companies and civil society, and sets out commitments to guide digital policy agendas. To achieve the Contract’s goals, governments, companies, civil society and individuals must commit to sustained policy development, advocacy, and implementation of the Contract text.