Replicable services for reproducible research: a model for academic libraries
https://doi.org/10.20913/1815-3186-2019-4-33-45
Abstract
Over the past decade, studies in disciplines ranging from biology to economics have shown that many scientific results cannot be reproduced. This has led to claims in the scientific press and the media that science is undergoing a "reproducibility crisis," and the crisis has prompted students, faculty, and the general public to question how much trust to place in research. University faculty build their own research on these results, while students and the public rely on them for a wide range of purposes, from patient care to public policy. To build a model for academic library support of reproducible research, the authors reviewed the principal recommendations issued by research funders, publishers, and professional societies and mapped them against existing university library services and librarians' expertise. The review shows that many of the recommendations for improving reproducibility fall within core areas of academic librarianship, including data management, scholarly communication, and methodological support for systematic reviews and data-intensive research. By broadening their view of how libraries, journals, funders, and the public approach reproducibility, and by reframing the expertise and services they already provide, academic libraries can become leaders in supporting reproducible research.
About the authors
F. Sayre
United States of America
Franklin Sayre, liaison librarian
Twin Cities
A. Riegelman
United States of America
Amy Riegelman, social sciences librarian
Twin Cities
References
1. Nosek B. A. et al. Estimating the reproducibility of psychological science. Science, 2015, 349 (6251), 943.
2. Baggerly K. A., Coombes K. R. Deriving chemosensitivity from cell lines: forensic bioinformatics and reproducible research in high-throughput biology. Annals of Applied Statistics, 2009, 3 (4), 1309–1334.
3. Ioannidis J. P. A. Why most published research findings are false. PLoS Medicine, 2005, 2 (8), e124, 696–701.
4. Gilmore R. O., Diaz M. T., Wyble B. A., Yarkoni T. Progress toward openness, transparency, and reproducibility in cognitive neuroscience. Annals of the New York Academy of Sciences, 2017, 1396 (1), 5–18.
5. Begley C. G., Ellis L. M. Drug development: raise standards for preclinical cancer research. Nature, 2012, 483 (7391), 531–533.
6. Gibb B. C. Reproducibility. Nature Chemistry, 2014, 6 (8), 653–654.
7. Benestad R. E., Nuccitelli D., Lewandowsky S., Hayhoe K., Hygen H. O., Van Dorland R., Cook J. Learning from mistakes in climate research. Theoretical and Applied Climatology, 2016, 126 (3/4), 699–703.
8. Maniadis Z., Tufano F., List J. A. To replicate or not to replicate? Exploring reproducibility in economics through the lens of a model and a pilot study. Economic Journal, 2017, 127 (605), F209–F235.
9. Herndon T., Ash M., Pollin R. Does high public debt consistently stifle economic growth? A critique of Reinhart and Rogoff. Working Paper Series of Political Economy Research Institute, 2013, 322, 1–25.
10. Makel M. C., Plucker J. A. Facts are more important than novelty: replication in the education sciences. Educational Researcher, 2014, 43 (6), 304–316.
11. Sayre F., Riegelman A. The reproducibility crisis and academic libraries. College & Research Libraries, 2018, 79 (1), 1–9.
12. Bollen K., Cacioppo J. T., Kaplan R. M., Krosnick J. A., Olds J. L. Social, behavioral, and economic sciences perspectives on robust and reliable science. Report of the Subcommittee on Replicability in Science Advisory Committee to the National Science Foundation Directorate for Social, Behavioral, and Economic Sciences. 2015. 29 p. URL: https://www.nsf.gov/sbe/AC_Materials/SBE_Robust_and_Reliable_Research_Report.pdf (accessed 23.01.2019).
13. Leek J. T., Jager L. R. Is most published research really false? Annual Review of Statistics and Its Application, 2017, 4, 109–122.
14. Stodden V., Bailey D. H., Borwein J., LeVeque R. J., Rider W., Stein W. (comps.) Setting the default to reproducible: reproducibility in computational and experimental mathematics. 2014. URL: http://stodden.net/icerm_report.pdf (accessed 23.01.2019).
15. Nosek B. A. et al. Promoting an open research culture. Science, 2015, 348 (6242), 1422–1425.
16. Center for Open Science announces Elsevier as new signatory to TOP guidelines. URL: https://cos.io/about/news/centre-open-science-announces-elsevier-newsignatory-top-guidelines/ (accessed 30.09.2017).
17. TOP Guidelines. URL: https://cos.io/our-services/top-guidelines/ (accessed 09.04.2018).
18. Guidelines for Transparency and Openness Promotion (TOP) in journal policies and practices. The TOP Guidelines. URL: https://www.fosteropenscience.eu/content/guidelines-transparency-and-opennesspromotion-top-journal-policies-and-practices-top (accessed 09.04.2018).
19. Broman K., Cetinkaya-Rundel M., Nussbaum A., Paciorek Ch., Peng R., Turek D., Wickham H. Recommendations to funding agencies for supporting reproducible research. American Statistical Association. 2017, 1–4. URL: https://www.amstat.org/asa/files/pdfs/POL-ReproducibleResearchRecommendations.pdf (accessed 09.04.2018).
20. Principles and guidelines for reporting preclinical research. National Institutes of Health. 2014. URL: https://www.nih.gov/research-training/rigorreproducibility/principles-guidelines-reportingpreclinical-research (accessed 09.04.2018).
21. A framework for ongoing and future National Science Foundation activities to improve reproducibility, replicability, and robustness in funded research. 2014. URL: https://www.nsf.gov/attachments/134722/public/Reproducibility_NSFPlanforOMB_Dec31_2014.pdf (accessed 25.05.2017).
22. Enhancing research reproducibility. Recommendations from the Federation of American Societies for Experimental Biology. 2016. 12 p. URL: http://www.faseb.org/Portals/2/PDFs/opa/2016/FASEB_Enhancing Research Reproducibility.pdf (accessed 11.01.2019).
23. Research practices for scientific rigor: a resource for discussion, training, and practice. Society for Neuroscience. 2015. URL: https://www.sfn.org/Advocacy/PolicyPositions/ResearchPractices-for-Scientific-Rigor (accessed 11.01.2019).
24. Munafò M. R., Nosek B. A., Bishop D. V. M., Button K. S., Chambers Ch. D., Du Sert N. P., Simonsohn U., Wagenmakers E.-J., Ware J. J., Ioannidis J. P. A. Manifesto for reproducible science. Nature Human Behaviour, 2017, 1, 0021, 1–9.
25. Mulligan R. Supporting digital scholarship. SPEC Kit 350. Washington, Association of Research Libraries, 2016. 205 p. DOI: 10.29242/spec.350.
26. About us. Mapping Prejudice. 2016. URL: https://www.mappingprejudice.org/about-us/ (accessed 31.10.2017).
27. Delegard K., Ehrman-Solberg K. Playground of the people? Mapping racial covenants in twentieth-century Minneapolis. Open Rivers: Rethinking the Mississippi, 2017, 6, 72–79.
28. Wilson G., Bryan J., Cranston K., Kitzes J., Nederbragt L., Teal T. K. Good enough practices in scientific computing. PLoS Computational Biology, 2017, 13 (6), e1005510, 1–20.
29. Sandve G. K., Nekrutenko A., Taylor J., Hovig E. Ten simple rules for reproducible computational research. PLoS Computational Biology, 2013, 9 (10), e1003285, 1–4.
30. Zhao Sh. Principles and practices for reproducible science. Decart Summer School. Data science for healthcare. 2017. URL: https://github.com/shirl0207/reproducible_science (accessed 06.10.2017).
31. Gore G. C., Jones J. Systematic reviews and librarians: a primer for managers. Partnership: the Canadian Journal of Library and Information Practice and Research, 2015, 10 (1), 1–16.
32. Koffel J. B. Use of recommended search strategies in systematic reviews and the impact of librarian involvement: a cross-sectional survey of recent authors. PLoS ONE, 2015, 10 (5), e0125931, 1–13.
33. Rethlefsen M. L., Farrell A. M., Osterhaus Trzasko L. C., Brigham T. J. Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. Journal of Clinical Epidemiology, 2015, 68 (6), 617–626.
34. Koffel J. B., Rethlefsen M. L. Reproducibility of search strategies is poor in systematic reviews published in high-impact pediatrics, cardiology and surgery journals: a cross-sectional study. PLoS ONE, 2016, 11 (9), e0163309, 1–16.
35. Thayer K. A., Wolfe M. S., Rooney A. A., Boyles A. L., Bucher J. R., Birnbaum L. S. Intersection of systematic review methodology with the NIH reproducibility initiative. Environmental Health Perspectives, 2014, 122 (7), A176–A177.
36. Cornell University Library systematic review service. URL: https://www.library.cornell.edu/services/systematic-review (accessed 23.01.2019).
37. Systematic review service. URL: https://www.lib.umn.edu/researchsupport/systematic-review-service (accessed 23.01.2019).
38. Equator Network. URL: www.equator-network.org/ (accessed 12.04.2018).
39. PRISMA: transparent reporting of systematic reviews and meta-analyses. URL: www.prisma-statement.org/ (accessed 09.04.2018).
40. Kirtley Sh. Impactful librarians: identifying opportunities to increase your impact. Journal of EAHIL, 2015, 11 (4), 23–28.
41. Librarian action plan. URL: www.equator-network.org/wp-content/uploads/2013/06/Librarian-Action-PlanSimple-Ideas.pdf (accessed 18.12.2017).
42. Spies J. R. The open science framework: improving science by making it open and accessible: dissertation. Charlottesville, 2013. 86 p. DOI: 10.31237/osf.io/t23za.
43. Hudson-Vitale C., Imker H., Johnston L. R., Carlson J., Kozlowski W., Olendorf R., Stewart C. Data Curation. SPEC Kit 354. Washington, Association of Research Libraries, 2017. 135 p. DOI: 10.29242/spec.354.
44. Steeves V., Chirigati F., Rampin R. Using ReproZip for reproducibility and library services. OSF home. 2018. URL: https://osf.io/8z73c/ (accessed 11.01.2019).
45. Bandrowski A., Brush M., Grethe J. S., Haendel M. A., Kennedy D. N., Hill S., Hof P. R., Martone M. E., Pols M., Tan S. C., Washington N., Zudilova-Seinstra E., Vasilevsky N. The resource identification initiative: a cultural shift in publishing. F1000Research, 2015, 4, 134, 1–18.
46. Nosek B. A. Center for Open Science. Strategic plan 2017–2020. 2018. 25 p. URL: https://osf.io/x2w9h/ (accessed 18.12.2017).
47. Bakker C. J. Data management in the lab. The Medical Library Association guide to data management for librarians. Lanham, 2016, 203–214.
48. Sherpa/Romeo. URL: www.sherpa.ac.uk/romeo/index.php (accessed 18.12.2017).
49. Registered reports. Center for Open Science. URL: https://cos.io/rr/ (accessed 09.04.2018).
50. Morey R. D., Chambers C. D., Etchells P. J., Harris C. R., Hoekstra R., Lakens D., Lewandowsky S., Morey C. C., Newman D. P., Schönbrodt F. D., Vanpaemel W., Wagenmakers E.-J., Zwaan R. A. The peer reviewers’ openness initiative: incentivizing open research practices through peer review. Royal Society Open Science, 2016, 3, 150547, 1–7.
51. Nosek B. A., Spies J. R., Motyl M. Scientific utopia. 2. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 2012, 7 (6), 615–631.
52. Collins F. S., Tabak L. A. NIH plans to enhance reproducibility. Nature, 2014, 505 (7485), 612–613.
53. Johnson R. P. Consume, reproduce, extend and connect: sustaining our research lifecycle. Bulletin of the Association for Information Science and Technology, 2017, 43, 24–29.
54. Makel M. C., Plucker J. A. An introduction to replication research in gifted education. Gifted Child Quarterly, 2015, 59 (3), 157–164.
55. Travers J. C., Cook B. G., Therrien W. J., Coyne M. D. Replication of special education research. Remedial and Special Education, 2016, 37 (4), 195–204.
For citation:
Sayre F., Riegelman A. Replicable services for reproducible research: a model for academic libraries. Bibliosphere. 2019;(4):33-45. (In Russ.) https://doi.org/10.20913/1815-3186-2019-4-33-45