The Leiden Manifesto for research metrics


23 April 2015 | Diana Hicks, Paul Wouters and colleagues
The Leiden Manifesto for Research Metrics, authored by Diana Hicks, Paul Wouters, and colleagues, outlines ten principles to guide the evaluation of research using metrics. The manifesto addresses the growing reliance on quantitative metrics in scientific evaluation, which can lead to a loss of expert judgment and the misapplication of indicators. The ten principles are:

1. **Quantitative vs. Qualitative Assessment**: Quantitative metrics should support, not replace, expert judgment.
2. **Performance Against Research Missions**: Indicators should align with institutional or researcher goals and consider socio-economic and cultural contexts.
3. **Protecting Local Research Excellence**: Metrics should recognize and reward locally relevant research, especially in fields with regional or national engagement.
4. **Transparency and Simplicity**: Data collection and analysis processes should be open, transparent, and simple.
5. **Data Verification**: Those evaluated should be able to verify the data and analysis.
6. **Field-Specific Practices**: Publication and citation practices vary by field, so indicators should be normalized accordingly (see the sketch after this list).
7. **Qualitative Judgement of Researchers**: Assessment should consider a researcher's portfolio, including expertise, experience, and influence.
8. **Avoid False Precision**: Indicators should avoid conceptual ambiguity and false precision.
9. **Systemic Effects**: Indicators should be designed to avoid gaming and goal displacement.
10. **Regular Review and Update**: Indicator systems should be regularly scrutinized and updated to reflect evolving research missions.

The manifesto emphasizes combining robust statistics with qualitative evidence to make informed decisions about scientific research.
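To illustrate principle 6, field normalization is commonly operationalized by dividing a paper's citation count by the average citation count of papers from the same field and year. The sketch below is illustrative only and is not taken from the manifesto; the baseline figures and field labels are hypothetical.

```python
# Illustrative sketch of a field-normalized citation score (not from the manifesto).
# A score of 1.0 means the paper is cited at its field/year world average.

# Hypothetical average citation rates per (field, publication year)
field_baselines = {
    ("cell biology", 2012): 28.4,
    ("mathematics", 2012): 4.7,
}

def normalized_citation_score(citations: int, field: str, year: int) -> float:
    """Return citations relative to the field/year average (1.0 = world average)."""
    baseline = field_baselines[(field, year)]
    return citations / baseline

# A mathematics paper with 9 citations sits well above its field average,
# while a cell-biology paper with 20 citations sits below its own.
print(normalized_citation_score(9, "mathematics", 2012))    # ~1.91
print(normalized_citation_score(20, "cell biology", 2012))  # ~0.70
```

The point of the normalization is exactly the one the manifesto makes: raw citation counts are not comparable across fields with very different citation cultures.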