Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/12896
Full metadata record
DC Field | Value | Language
dc.contributor.author | Bornmann, L. | -
dc.contributor.author | EGGHE, Leo | -
dc.date.accessioned | 2012-01-11T11:13:56Z | -
dc.date.available | 2012-01-11T11:13:56Z | -
dc.date.issued | 2012 | -
dc.identifier.citation | JOURNAL OF DOCUMENTATION, 68(4), p. 527-535 | -
dc.identifier.issn | 0022-0418 | -
dc.identifier.uri | http://hdl.handle.net/1942/12896 | -
dc.description.abstract | Purpose: Editorial peer review systems of journals do not always accept the best papers. Because of differing human perceptions, the evaluation of a paper by peer review (for a journal) can differ from the impact the paper has after publication (measured by the number of citations received) in this or another journal. This system (and its problems) is similar to the information retrieval process in a documentary system, which likewise does not always retrieve the most relevant documents for a given topic: the topic is described in the command language of the documentary system, and this command does not always fully cover the "real topic" one wants to describe. Design/methodology/approach: Based on this observation, we apply classical information retrieval evaluation techniques to the evaluation of peer review systems. Central to such an evaluation are the notions of precision and recall and the precision-recall curve; these notions are introduced here for the evaluation of peer review systems. Findings: The analogues of precision and recall are defined, and their curve is constructed from peer review data of the journal Angewandte Chemie - International Edition and from citation impact data of papers accepted by this journal or rejected but published elsewhere. We conclude that, because the peer review process (based on human evaluation) is imperfect, a journal that wants to publish a large number of qualified papers (the ones sought) will also accept several non-qualified papers. | -
dc.language.iso | en | -
dc.subject.other | Information science; Information retrieval; Periodicals; Peer review | -
dc.title | Journal peer review as an information retrieval process | -
dc.type | Journal Contribution | -
dc.identifier.epage | 535 | -
dc.identifier.issue | 4 | -
dc.identifier.spage | 527 | -
dc.identifier.volume | 68 | -
local.bibliographicCitation.jcat | A1 | -
local.type.refereed | Refereed | -
local.type.specified | Article | -
dc.bibliographicCitation.oldjcat | A1 | -
dc.identifier.doi | 10.1108/00220411211239093 | -
dc.identifier.isi | 000308836300006 | -
item.fullcitation | Bornmann, L. & EGGHE, Leo (2012) Journal peer review as an information retrieval process. In: JOURNAL OF DOCUMENTATION, 68(4), p. 527-535. | -
item.validation | ecoom 2013 | -
item.accessRights | Open Access | -
item.fulltext | With Fulltext | -
item.contributor | Bornmann, L. | -
item.contributor | EGGHE, Leo | -
crisitem.journal.issn | 0022-0418 | -
crisitem.journal.eissn | 1758-7379 | -
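The abstract describes peer review as an information retrieval process evaluated by precision and recall. As a minimal sketch of that analogy (the data and the citation threshold below are hypothetical, not taken from the paper): treat "qualified" papers as those reaching some citation count, precision as the share of accepted papers that are qualified, and recall as the share of all qualified papers (accepted or rejected-but-published-elsewhere) that were accepted.

```python
# Hedged sketch of the precision/recall analogy for peer review.
# All numbers are hypothetical illustrations, not data from the paper.

def precision_recall(accepted, rejected, threshold):
    """accepted/rejected: citation counts per paper; threshold: min citations
    for a paper to count as 'qualified'. Returns (precision, recall)."""
    qualified_accepted = sum(1 for c in accepted if c >= threshold)
    qualified_total = qualified_accepted + sum(1 for c in rejected if c >= threshold)
    precision = qualified_accepted / len(accepted)   # qualified share of accepted papers
    recall = qualified_accepted / qualified_total    # accepted share of all qualified papers
    return precision, recall

accepted = [12, 30, 2, 8, 25]  # citations of accepted papers (hypothetical)
rejected = [1, 15, 0, 3]       # citations of rejected papers published elsewhere

# Varying the threshold traces out a precision-recall curve:
for t in (5, 10, 20):
    p, r = precision_recall(accepted, rejected, t)
    print(f"threshold={t}: precision={p:.2f}, recall={r:.2f}")
```

Note how the imperfection described in the Findings shows up: lowering the qualification bar (or accepting more papers) raises the number of qualified papers captured, but non-qualified papers are accepted along with them.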
Appears in Collections:Research publications
Files in This Item:
File | Description | Size | Format
peer 1.pdf | Peer-reviewed author version | 205.36 kB | Adobe PDF
peer 2.pdf (Restricted Access) | Published version | 132.18 kB | Adobe PDF

Scopus Citations: 2 (checked on Sep 2, 2020)
Web of Science Citations: 2 (checked on Jul 18, 2024)
Page view(s): 188 (checked on Sep 7, 2022)
Download(s): 322 (checked on Sep 7, 2022)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.