Editorial in Nature about the way it handles misconduct allegations

In its editorial of 29 April, Nature details the procedure it follows when contacted with allegations of misconduct concerning one of its publications.
It is very interesting to see the whole process, and in particular the difficulties the editors sometimes encounter when trying to verify the allegations. We also like the fact that, by writing such an editorial, Nature proposes a common procedure to all scientific journals. Given Elsevier’s recent troubles (more here), it would be interesting to know how that company handles this kind of situation…

 

(May 2010)


Responsible scientific competitiveness – a responsibility index

In science, measuring the performance of a researcher, a lab or a country is a difficult and critical task. So far, evaluation has relied mostly on quantitative criteria, measured by bibliometrics (the number of publications, the impact factors of the corresponding journals, the number of citations) or by the number of patents.
However, quality, and not only quantity, matters for fostering competitive research, and other parameters, such as the working environment and the values it conveys, or the organisation of the research system, should not be overlooked.

Interestingly, the editorial of Nature of 29 January 2009 presents how Saudi Arabia developed an index for responsible competitiveness. The author then notes that “strong science-based innovation requires its own metrics of inputs and achievement” and proposes some qualitative metrics:
A first set of metrics would relate to scientific misconduct. On the one hand, it would refer to the mechanisms in place, at both the local and national levels, to declare, investigate and punish misconduct; on the other hand, it would emphasize the role of the working environment and of education in preventing misbehaviour.
A second set of metrics would “measure the transparency and objectivity of a nation’s systems of evaluation, funding, staff appointments and promotion.”
“A third set would evaluate a nation’s framework for science policy, and the extent to which it allows talented scientists to follow their noses while also giving societal values and economic needs their due priority.”
Finally, a fourth set would aim at measuring “openness”. Openness is seen both as receptivity to the ideas and practices of researchers in other countries or disciplines and as “a willingness to have ideas and conclusions publicly criticized”.

This initiative is quite interesting in that changing the criteria for evaluation amounts to changing objectives. For instance, what if the performance of a country were measured not by its GNP but by its efforts to develop environmental and social policies? The metrics presented here should be regarded as open suggestions, encouraging us to think about the values we want to promote and about how to define and measure the corresponding metrics.

(Feb 2009)