Future of the Web / Group items tagged "become"

Gonzalo San Gil, PhD.

How to use GitHub organizations to grow your code club | Opensource.com - 0 views

  •  
    "For anything involving code, programming clubs often turn to GitHub, which has become the standard for open source project hosting for thousands of projects all over the world."
Gonzalo San Gil, PhD.

Doing for User Space What We Did for Kernel Space | Linux Journal - 0 views

  •  
    Doc Searls, Linux Journal, July 6, 2016: "I believe the best and worst thing about Linux is its hard distinction between kernel space and user space. Without that distinction, Linux never would have become the most leveraged operating system in the world. Today, Linux has the largest range of uses for the largest number of users, most of whom have no idea they are using Linux when they search for something on Google or poke at their Android phones. Even Apple stuff wouldn't be what it is (for example, using BSD in its computers) were it not for Linux's success."
Gonzalo San Gil, PhD.

Canonical's and Red Hat's Shameful War Against One Another… and Against the A... - 0 views

    • Gonzalo San Gil, PhD.
       
# ! Guess who (& why) can be behind... # ! feeding a warped conflict in the core of # ! the #FreeSoftware environment... [# ! Note: Look how 'some' treat their 'friends'... http://fossbytes.com/microsoft-buys-canonical-kills-ubuntu-linux-forever/ # ! and guess how they can behave with their rivals... # ! ...and what all this conflict represents for the Digital Community, as a whole...]
  •  
    "Summary: In an effort to trip each other up and in order to become the 'industry standard', Canonical and Red Hat hurt each other and alienate the media (what's left of it)"
Alexandra IcecreamApps

Google Search Alternatives - Icecream Tech Digest - 1 views

  •  
    When it comes to searching for something online, the first website we turn to is Google. This king of the search engines has become so major that plenty of people say "Google it" instead of "Look this up online." However, …
Gonzalo San Gil, PhD.

GNU's Framework for Secure Peer-to-Peer Networking - 0 views

  •  
    "Philosophy: The foremost goal of the GNUnet project is to become a widely used, reliable, open, non-discriminating, egalitarian, unfettered and censorship-resistant system of free information exchange. We value free speech above state secrets, law-enforcement or intellectual property. GNUnet is supposed to be an anarchistic network, where the only limitation for peers is that they must contribute enough back to the network such that their resource consumption does not have a significant impact on other users. GNUnet should be more than just another file-sharing network. The plan is to offer many other services and in particular to serve as a development platform for the next generation of decentralized Internet protocols."
Gonzalo San Gil, PhD.

The "Internet Governance" Farce and its "Multi-stakeholder" Illusion | La Quadrature du... - 0 views

  •  
    by Jérémie Zimmermann: For almost 15 years, "Internet Governance" meetings have been drawing attention and driving our imaginaries towards believing that consensual rules for the Internet could emerge from global "multi-stakeholder" discussions. A few days ahead of the "NETmundial" Forum in Sao Paulo, it has become obvious that "Internet Governance" is a farcical way of keeping us busy and hiding a sad reality: nothing concrete in these 15 years, not a single action, ever emerged from "multi-stakeholder" meetings, while at the same time, technology as a whole has been turned against its users, as a tool for surveillance, control and oppression.
Gonzalo San Gil, PhD.

How to Use the Free YouTube Video Editor - 1 views

  •  
    "Believe it or not, the YouTube video editor has been around since September 2011. When it was first released it was somewhat limited, but over the years it has grown to become a versatile, feature-packed video editor. It works great on Macs, PCs and Chromebooks, but is best avoided if you are working on a mobile device like an iPad or Android phone."
Gonzalo San Gil, PhD.

The Art of Unblocking Websites Without Committing Crimes | TorrentFreak - 1 views

  •  
    Andy, TorrentFreak, September 23, 2014: "Last month UK police took down several torrent site proxies and arrested their owner. Now a UK developer has created a new & free service that not only silently unblocks any website without falling foul of the law, but one that will eventually become available to all under a GPL 3.0 license."
Gonzalo San Gil, PhD.

How to configure peer-to-peer VPN on Linux - Xmodulo - 0 views

  •  
    Dan Nanni, Xmodulo, October 9, 2014: "A traditional VPN (e.g., OpenVPN, PPTP) is composed of a VPN server and one or more VPN clients connected to the server. When any two VPN clients talk to each other, the VPN server needs to relay VPN traffic between them. The problem of such a hub-and-spoke type of VPN topology is that the VPN server can easily become a performance bottleneck as the number of connected clients increases."
Gonzalo San Gil, PhD.

Patent Reform Bill A Good Step, But Still Falls Way Short Of Fixing A Broken System | T... - 0 views

  •  
    "from the it's-a-start dept As was widely expected, earlier this week, a bunch of high-profile Senators introduced a big patent reform bill, known as the Protecting American Talent and Entrepreneurship (PATENT) Act. It's backed by Senators Chuck Grassley, Patrick Leahy, Chuck Schumer and John Cornyn, and has a decent chance of becoming law."
Paul Merrell

WikiLeaks' Julian Assange warns: Google is not what it seems - 0 views

  • Back in 2011, Julian Assange met up with Eric Schmidt for an interview that he considers the best he's ever given. That doesn't change, however, the opinion he now has about Schmidt and the company he represents, Google. In fact, the WikiLeaks leader doesn't believe in the famous "Don't Be Evil" mantra that Google has been preaching for years. Assange thinks both Schmidt and Google sit at the exact opposite end of the spectrum. "Nobody wants to acknowledge that Google has grown big and bad. But it has. Schmidt's tenure as CEO saw Google integrate with the shadiest of US power structures as it expanded into a geographically invasive megacorporation. But Google has always been comfortable with this proximity," Assange writes in an opinion piece for Newsweek.
  • "Long before company founders Larry Page and Sergey Brin hired Schmidt in 2001, their initial research upon which Google was based had been partly funded by the Defense Advanced Research Projects Agency (DARPA). And even as Schmidt's Google developed an image as the overly friendly giant of global tech, it was building a close relationship with the intelligence community," Assange continues. Throughout the lengthy article, Assange goes on to explain how the 2011 meeting came to be and talks about the people the Google executive chairman brought along: Lisa Shields, then vice president of the Council on Foreign Relations; Jared Cohen, who would later become the director of Google Ideas; and Scott Malcomson, the book's editor, who would later become the speechwriter and principal advisor to Susan Rice. "At this point, the delegation was one part Google, three parts US foreign-policy establishment, but I was still none the wiser." Assange goes on to explain the work Cohen was doing for the government prior to his appointment at Google and just how Schmidt himself plays a bigger role than previously thought. In fact, he says that his original image of Schmidt, as a politically unambitious Silicon Valley engineer, "a relic of the good old days of computer science graduate culture on the West Coast," was wrong.
  • However, Assange concedes that such a person is not the sort who attends Bilderberg conferences, regularly visits the White House, or delivers speeches at the Davos Economic Forum. He claims that Schmidt's emergence as Google's "foreign minister" did not come out of nowhere, but was "presaged by years of assimilation within US establishment networks of reputation and influence." Assange makes further accusations that, well before Prism had even been dreamed of, the NSA was already systematically violating the Foreign Intelligence Surveillance Act under its director at the time, Michael Hayden. He states, however, that during the same period, namely around 2003, Google was accepting NSA money to provide the agency with search tools for its rapidly growing database of information. Assange continues by saying that in 2008, Google helped launch the NGA spy satellite, the GeoEye-1, into space, and that the search giant shares the photographs from the satellite with the US military and intelligence communities. Later on, in 2010, after the Chinese government was accused of hacking Google, the company entered into a "formal information-sharing" relationship with the NSA, which would allow the NSA's experts to evaluate the vulnerabilities in Google's hardware and software.
  • "Around the same time, Google was becoming involved in a program known as the 'Enduring Security Framework' (ESF), which entailed the sharing of information between Silicon Valley tech companies and Pentagon-affiliated agencies at network speed. Emails obtained in 2014 under Freedom of Information requests show Schmidt and his fellow Googler Sergey Brin corresponding on first-name terms with NSA chief General Keith Alexander about ESF," Assange writes. Assange seems to have a lot of backing for his statements, providing links left and right, which people can go check on their own.
  •  
    The "opinion piece for Newsweek" is an excerpt from Assange's new book, When Google met Wikileaks.  The chapter is well worth the read. http://www.newsweek.com/assange-google-not-what-it-seems-279447
Gonzalo San Gil, PhD.

How to become an (online) activist: It's time to break out the digital pitchforks | Ars... - 1 views

  •  
    Free, Web-based tools let you use Freedom of Information requests to dig up political dirt.
Gary Edwards

Brendan's Roadmap Updates: Open letter to Microsoft's Chris Wilson and their fight to s... - 0 views

  • The history of ECMAScript since its beginnings in November 1996 shows that when Microsoft was behind in the market (against Netscape in 1996-1997), it moved aggressively in the standards body to evolve standards starting with ES1 through ES3. Once Microsoft dominated the market, the last edition of the standard was left to rot -- ES3 was finished in 1999 -- and even easy-to-fix standards conformance bugs in IE JScript went unfixed for eight years (so three years to go from Edition 1 to 3, then over eight to approach Edition 4). Now that the proposed 4th edition looks like a competitive threat, the world suddenly hears in detail about all those bugs, spun as differences afflicting "JavaScript" that should inform a new standard.
  • In my opinion the notion that we need to add features so that ajax programming would be easier is plain wrong. ajax is a hack and also the notion of a webapp is a hack. the web was created in a document centric view. All w3c standards are also based on the same document notion. The heart of the web, the HTTP protocol is designed to support a web of documents and as such is stateless. the proper solution, IMO, is not to evolve ES for the benefit of ajax and webapps, but rather generalize the notion of a document browser that connects to a web of documents to a general purpose client engine that connects to a network of internet applications. thus the current web (document) browser just becomes one such internet application.
  •  
    the obvious conflict of interest between the standards-based web and proprietary platforms advanced by Microsoft, and the rationales for keeping the web's client-side programming language small while the proprietary platforms rapidly evolve support for large languages, does not help maintain the fiction that only clashing high-level philosophies are involved here. Readers may not know that Ecma has no provision for "minor releases" of its standards, so any ES3.1 that was approved by TG1 would inevitably be given a whole edition number, presumably becoming the 4th Edition of ECMAScript. This is obviously contentious given all the years that the majority of TG1, sometimes even apparently including Microsoft representatives, has worked on ES4, and the developer expectations set by this long-standing effort. A history of Microsoft's post-ES3 involvement in the ECMAScript standard group, leading up to the overt split in TG1 in March, is summarized here. The history of ECMAScript since its beginnings in November 1996 shows that when Microsoft was behind in the market (against Netscape in 1996-1997), it moved aggressively in the standards body to evolve standards starting with ES1 through ES3. Once Microsoft dominated the market, the last edition of the standard was left to rot -- ES3 was finished in 1999 -- and even easy-to-fix standards conformance bugs in IE JScript went unfixed for eight years (so three years to go from Edition 1 to 3, then over eight to approach Edition 4). Now that the proposed 4th edition looks like a competitive threat, the world suddenly hears in detail about all those bugs, spun as differences afflicting "JavaScript" that should inform a new standard.
Gonzalo San Gil, PhD.

4 gui applications for installing Linux from USB key | LinuxBSDos.com - 0 views

  •  
    "The traditional and most common method of installing Linux is by burning the installation ISO image to a CD or DVD. But with many laptops, notebooks, ultra-notebooks and subnotebooks shipping without an optical drive, installation via USB flash stick has become the most common method for installing Linux on these types of computers." # ! #Freedom to #Go.
Paul Merrell

Most Agencies Falling Short on Mandate for Online Records - 1 views

  • Nearly 20 years after Congress passed the Electronic Freedom of Information Act Amendments (E-FOIA), only 40 percent of agencies have followed the law's instruction for systematic posting of records released through FOIA in their electronic reading rooms, according to a new FOIA Audit released today by the National Security Archive at www.nsarchive.org to mark Sunshine Week. The Archive team audited all federal agencies with Chief FOIA Officers as well as agency components that handle more than 500 FOIA requests a year — 165 federal offices in all — and found only 67 with online libraries populated with significant numbers of released FOIA documents and regularly updated.
  • Congress called on agencies to embrace disclosure and the digital era nearly two decades ago, with the passage of the 1996 "E-FOIA" amendments. The law mandated that agencies post key sets of records online, provide citizens with detailed guidance on making FOIA requests, and use new information technology to post online proactively records of significant public interest, including those already processed in response to FOIA requests and "likely to become the subject of subsequent requests." Congress believed then, and openness advocates know now, that this kind of proactive disclosure, publishing online the results of FOIA requests as well as agency records that might be requested in the future, is the only tenable solution to FOIA backlogs and delays. Thus the National Security Archive chose to focus on the e-reading rooms of agencies in its latest audit. Even though the majority of federal agencies have not yet embraced proactive disclosure of their FOIA releases, the Archive E-FOIA Audit did find that some real "E-Stars" exist within the federal government, serving as examples to lagging agencies that technology can be harnessed to create state-of-the-art FOIA platforms. Unfortunately, our audit also found "E-Delinquents" whose abysmal web performance recalls the teletype era.
  • E-Delinquents include the Office of Science and Technology Policy at the White House, which, despite being mandated to advise the President on technology policy, does not embrace 21st century practices by posting any frequently requested records online. Another E-Delinquent, the Drug Enforcement Administration, insults its website's viewers by claiming that it "does not maintain records appropriate for FOIA Library at this time."
  • "The presumption of openness requires the presumption of posting," said Archive director Tom Blanton. "For the new generation, if it's not online, it does not exist." The National Security Archive has conducted fourteen FOIA Audits since 2002. Modeled after the California Sunshine Survey and subsequent state "FOI Audits," the Archive's FOIA Audits use open-government laws to test whether or not agencies are obeying those same laws. Recommendations from previous Archive FOIA Audits have led directly to laws and executive orders which have: set explicit customer service guidelines, mandated FOIA backlog reduction, assigned individualized FOIA tracking numbers, forced agencies to report the average number of days needed to process requests, and revealed the (often embarrassing) ages of the oldest pending FOIA requests. The surveys include:
  • The federal government has made some progress moving into the digital era. The National Security Archive's last E-FOIA Audit in 2007, "File Not Found," reported that only one in five federal agencies had put online all of the specific requirements mentioned in the E-FOIA amendments, such as guidance on making requests, contact information, and processing regulations. The new E-FOIA Audit finds the number of agencies that have checked those boxes is now much higher — 100 out of 165 — though many (66 in 165) have posted just the bare minimum, especially when posting FOIA responses. An additional 33 agencies even now do not post these types of records at all, clearly thwarting the law's intent.
  • The FOIAonline Members (Department of Commerce, Environmental Protection Agency, Federal Labor Relations Authority, Merit Systems Protection Board, National Archives and Records Administration, Pension Benefit Guaranty Corporation, Department of the Navy, General Services Administration, Small Business Administration, U.S. Citizenship and Immigration Services, and Federal Communications Commission) won their "E-Star" by making past requests and releases searchable via FOIAonline. FOIAonline also allows users to submit their FOIA requests digitally.
  • THE E-DELINQUENTS: WORST OVERALL AGENCIES (in alphabetical order)
  • Key Findings
  • Excuses Agencies Give for Poor E-Performance
  • Justice Department guidance undermines the statute. Currently, the FOIA stipulates that documents "likely to become the subject of subsequent requests" must be posted by agencies somewhere in their electronic reading rooms. The Department of Justice's Office of Information Policy defines these records as "frequently requested records… or those which have been released three or more times to FOIA requesters." Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice. The National Security Archive believes the addition of this "three or more times" language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse ("not requested three times yet!") not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post "all records, regardless of form or format that have been released in response to a FOIA request."
  • Disabilities Compliance. Despite the E-FOIA Act, many government agencies do not embrace the idea of posting their FOIA responses online. The most common reason agencies give is that it is difficult to post documents in a format that complies with the Americans with Disabilities Act, also referred to as being "508 compliant," and the 1998 Amendments to the Rehabilitation Act that require federal agencies "to make their electronic and information technology (EIT) accessible to people with disabilities." E-Star agencies, however, have proven that 508 compliance is no barrier when the agency has a will to post. All documents posted on FOIAonline are 508 compliant, as are the documents posted by the Department of Defense and the Department of State. In fact, every document created electronically by the US government after 1998 should already be 508 compliant. Even old paper records that are scanned to be processed through FOIA can be made 508 compliant with just a few clicks in Adobe Acrobat, according to this Department of Homeland Security guide (essentially OCRing the text, and including information about where non-textual fields appear). Even if agencies are insistent it is too difficult to OCR older documents that were scanned from paper, they cannot use that excuse with digital records.
  • Privacy. Another commonly articulated concern about posting FOIA releases online is that doing so could inadvertently disclose private information from "first person" FOIA requests. This is a valid concern, and this subset of FOIA requests should not be posted online. (The Justice Department identified "first party" requester rights in 1989. Essentially, agencies cannot use the b(6) privacy exemption to redact information if a person requests it for him or herself. An example of a "first person" FOIA would be a person's request for his own immigration file.)
  • Cost and Waste of Resources. There is also a belief that there is little public interest in the majority of FOIA requests processed, and hence it is a waste of resources to post them. This thinking runs counter to the governing principle of the Freedom of Information Act: that government information belongs to US citizens, not US agencies. As such, the reason that a person requests information is immaterial as the agency processes the request; the "interest factor" of a document should also be immaterial when an agency is required to post it online. Some think that posting FOIA releases online is not cost effective. In fact, the opposite is true. It's not cost effective to spend tens (or hundreds) of person-hours to search for, review, and redact FOIA requests only to mail the result to the requester and have them slip it into their desk drawer and forget about it. That is a waste of resources. The released document should be posted online for any interested party to utilize. This will only become easier as FOIA processing systems evolve to automatically post the documents they track. The State Department earned its "E-Star" status demonstrating this very principle: it spent no new funds and did not hire contractors to build its Electronic Reading Room, instead building a self-sustaining platform that will save the agency time and money going forward.
Gonzalo San Gil, PhD.

How the MPAA Can Become Great Again | Gary Shapiro | LinkedIn - 0 views

  •  
    "Gary Shapiro, President and CEO at Consumer Electronics Association" [How does one of the most famous and important American trade groups reinvent itself? For 30-plus years, I fought against and occasionally worked with the Motion Picture Association of America (MPAA). ...]
Gary Edwards

Developer: Dump JavaScript for faster Web loading | CIO - 0 views

  • Accomplishing the goal of a high-speed, responsive Web experience without loading JavaScript "could probably be done by linking anchor elements to JSON/XML (or a new definition) API endpoints [and] having the browser internally load the data into a new data structure," the proposal states.
  • The browser "then replaces DOM elements with whatever data that was loaded as needed."
  • The initial data and standard error responses could be in header fixtures, which could be replaced later if so desired. "The HTML body thus becomes a templating language with all the content residing in the fixtures that can be dynamically reloaded without JavaScript."
  •  
    "A W3C (World Wide Web Consortium) mailing list post entitled "HTML6 proposal for single-page Web apps without JavaScript" details the proposal, dated March 20. "The overall purpose [of the plan] is to reduce response times when loading Web pages," said Web developer Bobby Mozumder, editor in chief of FutureClaw magazine, in an email. "This is the difference between a 300ms page load vs 10ms. The faster you are, the better people are going to feel about using your Website." The proposal cites a standard design pattern emerging via front-end JavaScript frameworks where content is loaded dynamically via JSON APIs. "This is the single-page app Web design pattern," said Mozumder. "Everyone's into it because the responsiveness is so much better than loading a full page -- 10-50ms with a clean API load vs. 300-1500ms for a full HTML page load. Since this is so common now, can we implement this directly in the browsers via HTML so users can dynamically run single-page apps without JavaScript?" Accomplishing the goal of a high-speed, responsive Web experience without loading JavaScript "could probably be done by linking anchor elements to JSON/XML (or a new definition) API endpoints [and] having the browser internally load the data into a new data structure," the proposal states. The browser "then replaces DOM elements with whatever data that was loaded as needed." The initial data and standard error responses could be in header fixtures, which could be replaced later if so desired. "The HTML body thus becomes a templating language with all the content residing in the fixtures that can be dynamically reloaded without JavaScript." JavaScript frameworks and JavaScript are leveraged for loading now, but there are issues with these, Mozumder explained. "Should we force millions of Web developers to learn JavaScript, a framework, and an associated templating language if they want a speedy, responsive Web site out-of-the-box? This is a huge barrier for beginners, and right n
Alexandra IcecreamApps

Best Dating Apps of 2016 - Icecream Tech Digest - 0 views

  •  
    Online dating has become widespread due to the growth of services offering all sorts of match finding. There are dating sites that help connect people of various religions, ethnicities, orientations and other parameters. Some of them suggest …
Alexandra IcecreamApps

Top Google Chrome Extensions for Better Browsing - Icecream Tech Digest - 1 views

  •  
    The Google Chrome browser has become widely popular thanks to its high speed, elegant minimalistic interface, and built-in translator; and, well, it is a Google product after all. Thanks to its fame and tons of users, the number of available extensions…
Gonzalo San Gil, PhD.

Top 10 Open Source Developments of 2015 | Business | LinuxInsider - 0 views

  •  
    "Open source is driving an ever-expanding market. The notion of community-driven development is a growing disruption to proprietary software controlled by commercial vendors, and the free open source software concept has become a major disruption in industry and technology."