001385797 000__ 03699cam\a2200541Ii\4500
001385797 001__ 1385797
001385797 003__ MaCbMITP
001385797 005__ 20240325105010.0
001385797 006__ m\\\\\o\\d\\\\\\\\
001385797 007__ cr\un\nnnunnun
001385797 008__ 170104t20162016maua\\\\ob\\\\001\0\eng\d
001385797 020__ $$a9780262337649$$q(electronic bk.)
001385797 020__ $$a0262337649$$q(electronic bk.)
001385797 020__ $$a9780262337663$$q(electronic bk.)
001385797 020__ $$a0262337665$$q(electronic bk.)
001385797 020__ $$a0262337657
001385797 020__ $$a9780262337656
001385797 020__ $$z9780262035125$$q(hardcover ;$$qalk. paper)
001385797 035__ $$a(OCoLC)967668558$$z(OCoLC)1058675103
001385797 035__ $$a(OCoLC-P)967668558
001385797 040__ $$aOCoLC-P$$beng$$erda$$epn$$cOCoLC-P
001385797 050_4 $$aQ180.55.E9$$bG56 2016eb
001385797 08204 $$a020.72/7$$223
001385797 1001_ $$aGingras, Yves,$$d1954-$$eauthor.
001385797 24010 $$aDérives de l'évaluation de la recherche.$$lEnglish
001385797 24510 $$aBibliometrics and research evaluation :$$buses and abuses /$$cYves Gingras.
001385797 264_1 $$aCambridge, Massachusetts :$$bThe MIT Press,$$c[2016]
001385797 264_4 $$c©2016
001385797 300__ $$a1 online resource (xii, 119 pages).
001385797 336__ $$atext$$btxt$$2rdacontent
001385797 337__ $$acomputer$$bc$$2rdamedia
001385797 338__ $$aonline resource$$bcr$$2rdacarrier
001385797 4901_ $$aHistory and foundations of information science
001385797 506__ $$aAccess limited to authorized users.
001385797 5203_ $$a"The research evaluation market is booming. "Ranking," "metrics," "h-index," and "impact factors" are reigning buzzwords. Government and research administrators want to evaluate everything -- teachers, professors, training programs, universities -- using quantitative indicators. Among the tools used to measure "research excellence," bibliometrics -- aggregate data on publications and citations -- has become dominant. Bibliometrics is hailed as an "objective" measure of research quality, a quantitative measure more useful than "subjective" and intuitive evaluation methods such as peer review that have been used since scientific papers were first published in the seventeenth century. In this book, Yves Gingras offers a spirited argument against an unquestioning reliance on bibliometrics as an indicator of research quality. Gingras shows that bibliometric rankings have no real scientific validity, rarely measuring what they pretend to. Although the study of publication and citation patterns, at the proper scales, can yield insights on the global dynamics of science over time, ill-defined quantitative indicators often generate perverse and unintended effects on the direction of research. Moreover, abuse of bibliometrics occurs when data is manipulated to boost rankings. Gingras looks at the politics of evaluation and argues that using numbers can be a way to control scientists and diminish their autonomy in the evaluation process. Proposing precise criteria for establishing the validity of indicators at a given scale of analysis, Gingras questions why universities are so eager to let invalid indicators influence their research strategy."
001385797 588__ $$aOCLC-licensed vendor bibliographic record.
001385797 650_0 $$aBibliometrics.
001385797 650_0 $$aResearch$$xEvaluation.
001385797 650_0 $$aEducation, Higher$$xResearch$$xEvaluation.
001385797 650_0 $$aUniversities and colleges$$xResearch$$xEvaluation.
001385797 653__ $$aINFORMATION SCIENCE/General
001385797 655_0 $$aElectronic books
001385797 852__ $$bebk
001385797 85640 $$3MIT Press$$uhttps://univsouthin.idm.oclc.org/login?url=https://doi.org/10.7551/mitpress/10719.001.0001?locatt=mode:legacy$$zOnline Access through The MIT Press Direct
001385797 85642 $$3OCLC metadata license agreement$$uhttp://www.oclc.org/content/dam/oclc/forms/terms/vbrl-201703.pdf
001385797 909CO $$ooai:library.usi.edu:1385797$$pGLOBAL_SET
001385797 980__ $$aBIB
001385797 980__ $$aEBOOK
001385797 982__ $$aEbook
001385797 983__ $$aOnline