Science 2.0 group: items tagged "funding"

katarzyna szkuta

The #SciFund Challenge - 0 views

  • Last fall, scientists raised $76,230 for their research in the first round of the #SciFund Challenge. The second round launches on May 1, 2012! What? The #SciFund Challenge is a grand experiment in science funding. Can scientists raise money for their research by convincing the general public to open their wallets for small-amount donations?
david osimo

Research impact: Altmetrics make their mark : Naturejobs - 0 views

  • "Research Excellence Framework (REF), an evaluation of UK academia that influences funding"
iaravps

Research 2.0.3: The future of research communication : Soapbox Science - 0 views

  • Open Access has led directly to an increase in usage of platforms that make it easy for researchers to comply with this mandate by depositing open access versions of their papers. Examples of companies in this space are Academia.edu, ResearchGate.net and Mendeley. Open Access also means that anyone can contribute to the post-publication evaluation of research articles.
  • There are a number of initiatives focused on improving the process of peer review. Post-publication peer review, in which journals publish papers after minimal vetting and then encourage commentary from the scientific community, has been explored by several publishers, but has run into difficulties incentivizing sufficient numbers of experts to participate.  Initiatives like Faculty of 1000 have tried to overcome this by corralling experts as part of post-publication review boards.  And sometimes, as in the case of arsenic-based life, the blogosphere has taken peer review into its own hands.
  • Traditionally, the number of first- and senior-author publications, and the journal(s) in which those publications appear, has been the key criterion for assessing the quality of a researcher's work. Funding agencies use it to decide whether to award grants for future research, and academic research institutions use it to inform hiring and career progression decisions. However, this is actually a very poor measure of a researcher's true impact since a) it only captures a fraction of a researcher's contribution and b) more than 70% of published research cannot be reproduced, so the publication-based system rewards researchers for the wrong thing (the publication of novel research, rather than the production of robust research).
  • ...2 more annotations...
  • The h-index was one of the first alternatives proposed as a measure of scientific research impact. It and its variants rely on citation statistics, which is a good start, but introduce a delay that can be quite long, depending on how quickly papers are published in a particular field (a short computational sketch of the h-index follows this list). There are a number of startups that are attempting to improve the way a researcher's reputation is measured. One is ImpactStory, which is attempting to aggregate metrics from a researcher's articles, datasets, blog posts, and more. Another is ResearchGate.net, which has developed its own RG Score.
  • Which set of reputational signifiers rises to the top will shape the future of science itself.
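As a concrete illustration of the citation-based metric described above, here is a minimal sketch of how an h-index can be computed from a list of per-paper citation counts. The function name and the example counts are hypothetical illustrations, not taken from the bookmarked article.

    # Minimal sketch: compute an h-index from per-paper citation counts.
    def h_index(citation_counts):
        # The h-index is the largest h such that the researcher has at
        # least h papers with at least h citations each.
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical example: five papers with these citation counts.
    print(h_index([25, 8, 5, 3, 0]))  # -> 3 (three papers have >= 3 citations)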
Francesco Mureddu

Multimission Archive at STScI (MAST) - NASA Science - 0 views

  • The Multimission Archive at STScI (MAST) is a NASA-funded project to support and provide to the astronomical community a variety of astronomical data archives, with the primary focus on scientifically related data sets in the optical, ultraviolet, and near-infrared parts of the spectrum.
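For readers who want to pull MAST holdings programmatically, the archive can also be queried from Python. The sketch below assumes the community-maintained astroquery package (astroquery.mast) is installed; the target name and search radius are arbitrary examples, not part of the bookmarked page.

    # Minimal sketch: querying MAST with astroquery (assumed installed).
    from astroquery.mast import Observations

    # Cone search around a target resolved by name.
    obs = Observations.query_object("M31", radius="0.02 deg")
    print(len(obs), "observations found near M31")

    # Narrow the search to a single mission, e.g. HST.
    hst_obs = Observations.query_criteria(objectname="M31", obs_collection="HST")
    print(hst_obs[:5])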