
Classification model of funding for research institutions in Armenia



Improving the efficiency of public funds directed at scientific and technological research and development is one of the urgent problems of our time. One possible solution is to fund scientific organizations according to their results. To this end, a number of countries have introduced national-level evaluation of the productivity of their scientific organizations. The UK (the first country to introduce such a practice), Italy, and Russia evaluate the effectiveness of organizations engaged in scientific and technical R&D and distribute funding according to the results of that evaluation. Starting from 2020, the Republic of Armenia also plans to introduce a rating model for financing state scientific organizations.

The article presents the model of rating financing of state scientific organizations in Armenia, developed and proposed by the Committee of Science of the Republic of Armenia. The introduction of the model is expected to generate additional financial resources and increase the efficiency of budget financing, since funding will be allocated with account taken of each scientific organization's results.

About the Authors

Sh. A. Sargsyan
Institute for Informatics and Automation Problems, National Academy of Sciences of the Republic of Armenia; Medical Physics Department at the Yerevan State Medical University named after M. Heratsi

Center for Scientific Information Analysis and Monitoring at the Institute for Informatics and Automation Problems; Medical Physics Department at YSMU 


T. S. Harutyunyan
Public Administration Academy of the Republic of Armenia

Center for Regional Studies



V. H. Sahakyan
Committee of Science of the Republic of Armenia

S. G. Haroutiunian
Committee of Science of the Republic of Armenia



For citation:

Sargsyan S.A., Harutyunyan T.S., Sahakyan V.H., Haroutiunian S.G. Classification model of funding for research institutions in Armenia. Bibliosphere. 2019;(3):85-92. (In Russ.)


This work is licensed under a Creative Commons Attribution 4.0 License.

ISSN 1815-3186 (Print)