
Science 2.0 / Group items tagged: paper


iaravps

Rise of 'Altmetrics' Revives Questions About How to Measure Impact of Research - Techno... - 0 views

  • "Campuswide there's a little sensitivity toward measuring faculty output," she says. Altmetrics can reveal that nobody's talking about a piece of work, at least in ways that are trackable—and a lack of interest is hardly something researchers want to advertise in their tenure-and-promotion dossiers. "What are the political implications of having a bunch of stuff online that nobody has tweeted about or Facebooked or put on Mendeley?"
    • iaravps
       
      What about uncited papers?
  • "The folks I've talked to are like, 'Yes, it does have some value, but in terms of the reality of my tenure-and-promotion process, I have to focus on other things,'" she says.
  • As that phrasing indicates, altmetrics data can't reveal everything. Mr. Roberts points out that if someone tweets about a paper, "they could be making fun of it." If a researcher takes the time to download a paper into an online reference manager like Mendeley or Zotero, however, he considers that a more reliable sign that the work has found some kind of audience. "My interpretation is that because they downloaded it, they found it useful," he says.
  • ...1 more annotation...
  • It's an interesting story in itself how the desire of librarians 50 years ago to know what journals to buy now propels the entire scientific enterprise across the globe.
iaravps

Research 2.0.3: The future of research communication : Soapbox Science - 0 views

  • Open Access has led directly to an increase in usage of platforms that make it easy for researchers to comply with this mandate by depositing open access versions of their papers. Examples of companies in this space are Academia.edu, ResearchGate.net and Mendeley. Open Access also means that anyone can contribute to the post-publication evaluation of research articles.
  • There are a number of initiatives focused on improving the process of peer review. Post-publication peer review, in which journals publish papers after minimal vetting and then encourage commentary from the scientific community, has been explored by several publishers, but has run into difficulties incentivizing sufficient numbers of experts to participate.  Initiatives like Faculty of 1000 have tried to overcome this by corralling experts as part of post-publication review boards.  And sometimes, as in the case of arsenic-based life, the blogosphere has taken peer review into its own hands.
  • Traditionally, the number of first and senior author publications, and the journal(s) in which those publications appear, have been the key criteria for assessing the quality of a researcher's work. These criteria are used by funding agencies to determine whether to award research grants for future work, as well as by academic research institutions to inform hiring and career progression decisions. However, this is actually a very poor measure of a researcher's true impact since a) it only captures a fraction of a researcher's contribution and b) because more than 70% of published research cannot be reproduced, the publication-based system rewards researchers for the wrong thing (the publication of novel research, rather than the production of robust research).
  • ...2 more annotations...
  • The h-index was one of the first alternatives proposed as a measure of scientific research impact (a minimal sketch of how it is computed follows this list). It and its variants rely on citation statistics, which is a good start, but they include a delay which can be quite long, depending on the rapidity with which papers are published in a particular field. There are a number of startups that are attempting to improve the way a researcher's reputation is measured. One is ImpactStory, which is attempting to aggregate metrics from a researcher's articles, datasets, blog posts, and more. Another is ResearchGate.net, which has developed its own RG Score.
  • Which set of reputational signifiers rise to the top will shape the future of science itself.
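As a concrete reference point for the h-index discussion above, here is a minimal Python sketch of how the metric is computed from a list of per-paper citation counts. The citation counts are invented for illustration and are not drawn from any of the sources bookmarked here.

    def h_index(citations):
        """Largest h such that at least h papers have at least h citations each."""
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Toy example: five of these eight papers have 5 or more citations, so h = 5.
    example_citations = [42, 17, 9, 6, 6, 3, 1, 0]
    print(h_index(example_citations))  # prints 5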
Francesco Mureddu

CMND2007.pdf (Oggetto application/pdf) - 0 views

  •  
    This paper describes the design of OpenFOAM, an object-oriented library for Computational Fluid Dynamics (CFD) and structural analysis. Efficient and flexible implementation of complex physical models in Continuum Mechanics is achieved by mimicking the form of the partial differential equations in software. The library provides Finite Volume and Finite Element discretisation in operator form and with polyhedral mesh support, with relevant auxiliary tools and support for massively parallel computing. Functionality of OpenFOAM is illustrated on three levels: improvements in linear solver technology with CG-AMG solvers, LES data analysis using Proper Orthogonal Decomposition (POD) and a self-contained fluid-structure interaction solver.
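One of the three capabilities the abstract lists, LES data analysis via Proper Orthogonal Decomposition (POD), is easy to illustrate outside of OpenFOAM. The sketch below is a generic snapshot POD computed with NumPy's SVD; it is not OpenFOAM code, and the random "snapshot" matrix merely stands in for velocity fields sampled over time.

    import numpy as np

    # Generic snapshot POD: columns of `snapshots` play the role of flow fields
    # sampled at successive times (random data here, purely for illustration).
    rng = np.random.default_rng(0)
    n_cells, n_snapshots = 1000, 40
    snapshots = rng.standard_normal((n_cells, n_snapshots))

    fluctuations = snapshots - snapshots.mean(axis=1, keepdims=True)

    # Thin SVD: columns of U are the spatial POD modes, s**2 their energy content.
    U, s, Vt = np.linalg.svd(fluctuations, full_matrices=False)
    energy_fraction = s**2 / np.sum(s**2)
    print("energy captured by the first 5 modes:", energy_fraction[:5].sum())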
Francesco Mureddu

IEEE Xplore - Abstract Page - 0 views

  •  
    Complex coupled multi-physics simulations are playing increasingly important roles in scientific and engineering applications such as fusion plasma and climate modeling. At the same time, extreme scales, high levels of concurrency and the advent of multicore and many-core technologies are making the high-end parallel computing systems on which these simulations run hard to program. While Partitioned Global Address Space (PGAS) languages attempt to address the problem, the PGAS model does not easily support the coupling of multiple application codes, which is necessary for coupled multi-physics simulations. Furthermore, existing frameworks that support coupled simulations have been developed for fragmented programming models such as message passing, and are conceptually mismatched with the shared memory address space abstraction in the PGAS programming model. This paper explores how multi-physics coupled simulations can be supported within the PGAS programming framework. Specifically, in this paper, we present the design and implementation of the XpressSpace programming system, which enables efficient and productive development of coupled simulations across multiple independent PGAS Unified Parallel C (UPC) executables. XpressSpace provides a global-view style programming interface that is consistent with the memory model in UPC, and provides an efficient runtime system that can dynamically capture the data decomposition of global-view arrays and enable fast exchange of parallel data structures between coupled codes. In addition, XpressSpace provides the flexibility to define the coupling process in a specification file that is independent of the program source codes. We evaluate the performance and scalability of the XpressSpace prototype implementation using different coupling patterns extracted from real-world multi-physics simulation scenarios, on the Jaguar Cray XT5 system at Oak Ridge National Laboratory.
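XpressSpace itself is not available for inspection here, so the sketch below only illustrates one ingredient of the problem the abstract describes: before two independently launched codes can exchange a global-view array, a coupler has to reconcile their (generally different) data decompositions. This is not XpressSpace's API; the 1-D block decomposition and all names are assumptions made purely for illustration.

    # Generic sketch (not XpressSpace code): a coupler must map one executable's
    # block decomposition of a global array onto another's, so that each rank
    # knows which index ranges to exchange with which peer.
    def block_ranges(n_global, n_ranks):
        """Contiguous (start, stop) ranges for an even 1-D block decomposition."""
        base, extra = divmod(n_global, n_ranks)
        ranges, start = [], 0
        for rank in range(n_ranks):
            stop = start + base + (1 if rank < extra else 0)
            ranges.append((start, stop))
            start = stop
        return ranges

    def exchange_plan(src_ranges, dst_ranges):
        """All (src_rank, dst_rank, (lo, hi)) triples where the ranges overlap."""
        plan = []
        for s, (s0, s1) in enumerate(src_ranges):
            for d, (d0, d1) in enumerate(dst_ranges):
                lo, hi = max(s0, d0), min(s1, d1)
                if lo < hi:
                    plan.append((s, d, (lo, hi)))
        return plan

    # e.g. one code running on 4 ranks coupled to another running on 3 ranks
    print(exchange_plan(block_ranges(100, 4), block_ranges(100, 3)))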
david osimo

Scientific Communication As Sequential Art - 0 views

  •  
    "Nature paper"
david osimo

- Article of the Future - 0 views

  •  
    Resulting from the Article of the Future project innovations, we are now able to announce the redesigned SciVerse ScienceDirect article page, with a new layout including a navigational pane and an optimized reading middle pane. The Article of the Future project is an ongoing initiative aiming to revolutionize the traditional format of the academic paper with regard to three key elements: presentation, content and context.
david osimo

Filter-then-publish vs. publish-then-filter | Sauropod Vertebra Picture of the Week - 2 views

  •  
    "Unlike many journals which attempt to use the peer review process to determine whether or not an article reaches the level of 'importance' required by a given journal, PLoS ONE uses peer review to determine whether a paper is technically sound and worthy of inclusion in the published scientific record. Once the work is published in PLoS ONE, the broader community is then able to discuss and evaluate the significance of the article (through the number of citations it attracts; the downloads it achieves; the media and blog coverage it receives; and the post-publication Notes, Comments and Ratings that it receives on PLoS ONE etc)."
katarzyna szkuta

SciVee | Making Science Visible - 0 views

  •  
    Share your science and technology through publications, posters, papers, or slides combined with video and science communities.
Francesco Mureddu

Identifying population differences in whole-brain... [Neuroimage. 2010] - PubMed - NCBI - 0 views

  •  
    Models of whole-brain connectivity are valuable for understanding neurological function, development and disease. This paper presents a machine learning based approach to classify subjects according to their approximated structural connectivity patterns and to identify features which represent the key differences between groups. Brain networks are extracted from diffusion magnetic resonance images obtained by a clinically viable acquisition protocol. Connections are tracked between 83 regions of interest automatically extracted by label propagation from multiple brain atlases followed by classifier fusion. Tracts between these regions are propagated by probabilistic tracking, and mean anisotropy measurements along these connections provide the feature vectors for combined principal component analysis and maximum uncertainty linear discriminant analysis. The approach is tested on two populations with different age distributions: 20-30 and 60-90 years. We show that subjects can be classified successfully (with 87.46% accuracy) and that the features extracted from the discriminant analysis agree with current consensus on the neurological impact of ageing.
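The classification pipeline described above (feature vectors of mean anisotropy per connection, reduced by PCA and classified by a discriminant analysis) can be sketched generically in Python. Note the hedges: the paper uses a maximum uncertainty LDA variant, for which standard scikit-learn LDA is substituted here, and the subject counts, feature dimensions and injected group difference are all invented for illustration.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for the real data: one feature per tracked connection
    # between the 83 regions of interest, one row per subject.
    rng = np.random.default_rng(0)
    n_subjects, n_connections = 80, 83 * 82 // 2
    X = rng.standard_normal((n_subjects, n_connections))
    y = np.repeat([0, 1], n_subjects // 2)   # 0 = younger group, 1 = older group
    X[y == 1] += 0.05                        # inject a small, artificial group effect

    # Dimensionality reduction followed by linear discriminant classification.
    pipeline = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
    scores = cross_val_score(pipeline, X, y, cv=5)
    print("cross-validated accuracy:", scores.mean())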
Francesco Mureddu

S145.full.pdf (Oggetto application/pdf) - 0 views

  •  
    A new program package, XEASY, was written for interactive computer support of the analysis of NMR spectra for three-dimensional structure determination of biological macromolecules. XEASY was developed for work with 2D, 3D and 4D NMR data sets. It includes all the functions performed by the precursor program EASY, which was designed for the analysis of 2D NMR spectra, i.e., peak picking and support of sequence-specific resonance assignments, cross-peak assignments, cross-peak integration and rate constant determination for dynamic processes. Since the program utilizes the X-window system and the Motif widget set, it is portable on a wide range of UNIX workstations. The design objective was to provide maximal computer support for the analysis of spectra, while providing the user with complete control over the final resonance assignments. Technically important features of XEASY are the use and flexible visual display of 'strips', i.e., two-dimensional spectral regions that contain the relevant parts of 3D or 4D NMR spectra, automated sorting routines to narrow down the selection of strips that need to be interactively considered in a particular assignment step, a protocol of resonance assignments that can be used for reliable bookkeeping, independent of the assignment strategy used, and capabilities for proper treatment of spectral folding and efficient transfer of resonance assignments between spectra of different types and different dimensionality, including projected, reduced-dimensionality triple-resonance experiments.
Francesco Mureddu

Access : Literature mining for the biologist: from information retrieval to biological ... - 0 views

  •  
    For the average biologist, hands-on literature mining currently means a keyword search in PubMed. However, methods for extracting biomedical facts from the scientific literature have improved considerably, and the associated tools will probably soon be used in many laboratories to automatically annotate and analyse the growing number of system-wide experimental data sets. Owing to the increasing body of text and the open-access policies of many journals, literature mining is also becoming useful for both hypothesis generation and biological discovery. However, the latter will require the integration of literature and high-throughput data, which should encourage close collaborations between biologists and computational linguists.
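The "keyword search in PubMed" baseline that the article mentions can be reproduced programmatically through NCBI's public E-utilities. The sketch below is a minimal example using the esearch endpoint; the query string is arbitrary, and real literature-mining pipelines would go on to fetch and parse the abstracts rather than stop at a list of IDs.

    import requests

    # Minimal PubMed keyword search via NCBI E-utilities (esearch).
    ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
    params = {
        "db": "pubmed",
        "term": "protein-protein interaction text mining",  # example query
        "retmax": 10,
        "retmode": "json",
    }
    response = requests.get(ESEARCH, params=params, timeout=30)
    response.raise_for_status()
    pmids = response.json()["esearchresult"]["idlist"]
    print(pmids)  # PubMed IDs of the top matching records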
Francesco Mureddu

vonmering2002.pdf (Oggetto application/pdf) - 0 views

  •  
    Comprehensive protein-protein interaction maps promise to reveal many aspects of the complex regulatory network underlying cellular function. Recently, large-scale approaches have predicted many new protein interactions in yeast. To measure their accuracy and potential as well as to identify biases, strengths and weaknesses, we compare the methods with each other and with a reference set of previously reported protein interactions.
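The comparison the abstract describes, measuring predicted interactions against a reference set of previously reported ones, boils down to set overlap. The sketch below shows that bookkeeping on toy data; the protein names and interaction lists are invented and are unrelated to the actual yeast datasets compared in the paper.

    # Score a set of predicted protein-protein interactions against a reference set.
    def normalize(pairs):
        """Treat interactions as unordered protein pairs."""
        return {frozenset(pair) for pair in pairs}

    def compare(predicted, reference):
        pred, ref = normalize(predicted), normalize(reference)
        overlap = pred & ref
        return {
            "overlap": len(overlap),
            "precision": len(overlap) / len(pred) if pred else 0.0,
            "recall": len(overlap) / len(ref) if ref else 0.0,
        }

    # Toy data: three predicted interactions, three reference interactions.
    predicted = [("YFG1", "YFG2"), ("YFG2", "YFG3"), ("YFG4", "YFG5")]
    reference = [("YFG2", "YFG1"), ("YFG3", "YFG6"), ("YFG4", "YFG5")]
    print(compare(predicted, reference))  # 2 shared pairs: precision = recall = 2/3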
Francesco Mureddu

Journal of Biomolecular NMR, Volume 6, Number 1 - SpringerLink - 0 views

  •  
    A new program package, XEASY, was written for interactive computer support of the analysis of NMR spectra for three-dimensional structure determination of biological macromolecules. XEASY was developed for work with 2D, 3D and 4D NMR data sets. It includes all the functions performed by the precursor program EASY, which was designed for the analysis of 2D NMR spectra, i.e., peak picking and support of sequence-specific resonance assignments, cross-peak assignments, cross-peak integration and rate constant determination for dynamic processes. Since the program utilizes the X-window system and the Motif widget set, it is portable on a wide range of UNIX workstations. The design objective was to provide maximal computer support for the analysis of spectra, while providing the user with complete control over the final resonance assignments. Technically important features of XEASY are the use and flexible visual display of 'strips', i.e., two-dimensional spectral regions that contain the relevant parts of 3D or 4D NMR spectra, automated sorting routines to narrow down the selection of strips that need to be interactively considered in a particular assignment step, a protocol of resonance assignments that can be used for reliable bookkeeping, independent of the assignment strategy used, and capabilities for proper treatment of spectral folding and efficient transfer of resonance assignments between spectra of different types and different dimensionality, including projected, reduced-dimensionality triple-resonance experiments.
Francesco Mureddu

The biological impact of mass-spectrometry-based prot... [Nature. 2007] - PubMed - NCBI - 0 views

  •  
    In the past decade, there have been remarkable advances in proteomic technologies. Mass spectrometry has emerged as the preferred method for in-depth characterization of the protein components of biological systems. Using mass spectrometry, key insights into the composition, regulation and function of molecular complexes and pathways have been gained. From these studies, it is clear that mass-spectrometry-based proteomics is now a powerful 'hypothesis-generating engine' that, when combined with complementary molecular, cellular and pharmacological techniques, provides a framework for translating large data sets into an understanding of complex biological processes.
david osimo

Research 2.0.2: How research is conducted : Soapbox Science - 0 views

  • Traditionally, research was conducted by a single scientist or a small team of scientists within a single laboratory. The scientist(s) would conduct the majority of required experiments themselves, even if they did not initially have the necessary expertise or equipment. If they could not conduct an experiment themselves, they would attempt to find a collaborator in another lab to help them by using a barter system. This barter system essentially involves one scientist asking for a favor from another scientist, with the potential upside being co-authorship on any publications that are produced by the work. This type of collaborative arrangement depends heavily on personal networks developed by scientists.
  • The amount of collaboration required in research will continue to increase, driven by many factors, including: the need for ever more complex and large-scale instrumentation to delve deeper into biological and physical processes; the maturation of scientific disciplines, which requires more and more knowledge in order to make significant advances, a demand which can often only be met by pooling knowledge with others; and an increasing desire to obtain cross-fertilization across disciplines.
  • So with large teams of scientists, often based at remote institutions, increasingly needing to work together to solve complex problems, there will be a demand for new tools to help facilitate collaboration. Specifically, there will be an increasing need for tools that allow researchers to easily find and access other scientists with the expertise required to advance their research projects. In my view, to operate most efficiently these tools also need new methods to reward researchers for participating in these collaborations.
  • ...1 more annotation...
  • One result of the rise in research requiring the combination of multiple specialized areas of expertise on ever-shortening time-scales is, unfortunately, a concomitant decrease in the reproducibility of published results (New York Times, Wall Street Journal and Nature). It is now apparent that independent validation of key experimental findings is an essential step that will need to be built into the research process.