
Future of the Web - Group items tagged "open source Data"

Gonzalo San Gil, PhD.

[# ! #Tech:] How do I permanently erase hard disk? - 1 views

  •  
"I am going to sell my laptop soon. Before discarding my system, I want to make sure that no one is able to recover my personal data by any method (formatting alone does not work). Is there any open source software out there that can help me permanently erase my hard disk?"
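The usual open source answer to the question above is GNU coreutils' `shred` (or a zero-fill pass with `dd`). A minimal sketch, demonstrated on a scratch file rather than a real device, since wiping a disk is destructive:

```shell
# Demonstrated on a scratch file; to wipe a whole disk you would point
# shred at the block device instead (e.g. /dev/sdX, found with lsblk).
# That is irreversible, so triple-check the device name first.
echo "secret data" > /tmp/wipe-demo.bin

# Three random overwrite passes (-n 3), then a final zeroing pass (-z)
# so the file no longer looks conspicuously shredded.
shred -n 3 -z /tmp/wipe-demo.bin
```

For an actual drive, the same tool is run against the device node (for example `sudo shred -v -n 3 /dev/sdX`, where `/dev/sdX` is a placeholder). Note that on SSDs, wear-leveling can leave copies of data that overwriting never touches; the drive's own secure-erase command is the more reliable route there.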
Gonzalo San Gil, PhD.

Open source big data processing in education | Opensource.com - 0 views

  •  
    "The continuing growth of massive and diverse data volumes, and the growth of data intensive applications, has presented a need to find effective means of data management across all sectors. According to a recent report, businesses face a huge skill gap in the management of big data,"
Gonzalo San Gil, PhD.

What is SPDX? | SPDX - 2 views

  •  
    "The Software Package Data Exchange® (SPDX®) specification is a standard format for communicating the components, licenses and copyrights associated with a software package."
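A minimal sketch of what an SPDX document looks like in the tag-value format, with illustrative package, namespace, and license values (not taken from any real project):

```
SPDXVersion: SPDX-2.2
DataLicense: CC0-1.0
SPDXID: SPDXRef-DOCUMENT
DocumentName: example-package-sbom
DocumentNamespace: https://example.org/spdx/example-package-1.0
Creator: Tool: example-generator

PackageName: example-package
SPDXID: SPDXRef-Package
PackageVersion: 1.0.0
PackageDownloadLocation: NOASSERTION
PackageLicenseDeclared: MIT
PackageCopyrightText: Copyright (c) 2015 Example Author
```

Each package, file, and license relationship gets its own block of tags, which is what lets tooling exchange component and license inventories between organizations in a machine-readable way.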
Gonzalo San Gil, PhD.

MicroMappers analyzes big data for disaster relief | Opensource.com - 0 views

  •  
"Open source and crowdsourcing: uttering these words at a meeting of the United Nations before the year 2010 would have made you persona non grata. In fact, the fastest way to discredit yourself at any humanitarian meeting just five years ago was to suggest the use of open source software and crowdsourcing in disaster response"
Gonzalo San Gil, PhD.

Apache Spark: 100 terabytes (TB) of data sorted in 23 minutes | Opensource.com - 0 views

  •  
    "In October 2014, Databricks participated in the Sort Benchmark and set a new world record for sorting 100 terabytes (TB) of data, or 1 trillion 100-byte records. The team used Apache Spark on 207 EC2 virtual machines and sorted 100 TB of data in 23 minutes."
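The Sort Benchmark that Databricks entered uses a fixed record layout: 100-byte records, each a 10-byte key followed by a 90-byte payload, sorted by key. A small local Python sketch of that record format (the function names are illustrative, not Databricks' code, and this says nothing about how Spark distributes the work across machines):

```python
import os

RECORD = 100  # total record size in bytes
KEY = 10      # sort key is the first 10 bytes

def make_records(n: int) -> bytes:
    """Generate n pseudo-random 100-byte records (random key, zero payload)."""
    return b"".join(os.urandom(KEY) + bytes(RECORD - KEY) for _ in range(n))

def sort_records(data: bytes) -> bytes:
    """Split into 100-byte records and sort them by their 10-byte key prefix."""
    records = [data[i:i + RECORD] for i in range(0, len(data), RECORD)]
    records.sort(key=lambda r: r[:KEY])
    return b"".join(records)

data = make_records(1000)
out = sort_records(data)
```

At benchmark scale this same keyed sort is sharded: Spark range-partitions records by key across the cluster, sorts each partition, and the partitions concatenate into a globally sorted output.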