000 03943cam a2200313 i 4500
001 ocn946160420
003 OCoLC
007 ta
008 210528s2016 mau b 001 0 eng
020 _a9780262035125
_q(hardcover ;
_qalk. paper)
020 _a026203512X
_q(hardcover ;
_qalk. paper)
035 _a(OCoLC)946160420
_z(OCoLC)961875109
_z(OCoLC)969902328
_z(OCoLC)993582025
_z(OCoLC)1000144949
_z(OCoLC)1007247338
_z(OCoLC)1011906210
_z(OCoLC)1013336159
_z(OCoLC)1013885094
_z(OCoLC)1015523808
_z(OCoLC)1016959994
_z(OCoLC)1017801039
_z(OCoLC)1017960598
_z(OCoLC)1021798085
_z(OCoLC)1022772343
_z(OCoLC)1022790435
_z(OCoLC)1023520716
_z(OCoLC)1023540042
_z(OCoLC)1028189739
_z(OCoLC)1029502164
_z(OCoLC)1030816582
_z(OCoLC)1031052384
_z(OCoLC)1031688879
_z(OCoLC)1031963189
_z(OCoLC)1032578634
_z(OCoLC)1032721488
_z(OCoLC)1034628493
_z(OCoLC)1035519686
_z(OCoLC)1038465345
_z(OCoLC)1043127535
_z(OCoLC)1044443247
_z(OCoLC)1045342650
_z(OCoLC)1048429326
_z(OCoLC)1049774223
_z(OCoLC)1051449338
_z(OCoLC)1052741948
_z(OCoLC)1054013511
_z(OCoLC)1054880916
_z(OCoLC)1055984746
_z(OCoLC)1056534740
_z(OCoLC)1060914391
_z(OCoLC)1066374693
_z(OCoLC)1073071001
_z(OCoLC)1080082762
_z(OCoLC)1080677306
_z(OCoLC)1083042758
_z(OCoLC)1084491421
_z(OCoLC)1084961255
_z(OCoLC)1084963317
_z(OCoLC)1086258457
_z(OCoLC)1088448955
_z(OCoLC)1090021864
_z(OCoLC)1090381120
_z(OCoLC)1166137158
050 _aQ180.55.E9
_bG564 2016
100 1 _aGingras, Yves,
_d1954-
245 1 0 _aBibliometrics and research evaluation :
_buses and abuses /
_cYves Gingras.
260 _aCambridge, Massachusetts :
_bThe MIT Press,
_cc2016.
300 _axii, 119 p.
490 1 _aHistory and foundations of information science
500 _aTranslated from the French.
504 _aIncludes bibliographical references and index.
505 0 _aThe origins of bibliometrics -- What bibliometrics teach us about the dynamics of science -- The proliferation of research evaluations -- The evaluation of research evaluation -- Conclusion: the universities' new clothes.
520 _a"The research evaluation market is booming. "Ranking," "metrics," "h-index," and "impact factors" are reigning buzzwords. Government and research administrators want to evaluate everything (teachers, professors, training programs, universities) using quantitative indicators. Among the tools used to measure "research excellence," bibliometrics (aggregate data on publications and citations) has become dominant. Bibliometrics is hailed as an "objective" measure of research quality, a quantitative measure more useful than "subjective" and intuitive evaluation methods such as peer review that have been used since scientific papers were first published in the seventeenth century. In this book, Yves Gingras offers a spirited argument against an unquestioning reliance on bibliometrics as an indicator of research quality. Gingras shows that bibliometric rankings have no real scientific validity, rarely measuring what they pretend to. Although the study of publication and citation patterns, at the proper scales, can yield insights on the global dynamics of science over time, ill-defined quantitative indicators often generate perverse and unintended effects on the direction of research. Moreover, abuse of bibliometrics occurs when data is manipulated to boost rankings. Gingras looks at the politics of evaluation and argues that using numbers can be a way to control scientists and diminish their autonomy in the evaluation process. Proposing precise criteria for establishing the validity of indicators at a given scale of analysis, Gingras questions why universities are so eager to let invalid indicators influence their research strategy"--The publisher.
650 0 _aBibliometrics.
650 0 _aResearch
_xEvaluation.
650 0 _aEducation, Higher
_xResearch
_xEvaluation.
650 0 _aUniversities and colleges
_xResearch
_xEvaluation.
830 0 _aHistory and foundations of information science.
942 _2lcc
_cBK
999 _c2583
_d2583