Interrater reliability and agreement of performance ratings: A methodological comparison
- 1 March 1996
- journal article
- Published by Springer Science and Business Media LLC in Journal of Business and Psychology
- Vol. 10 (3), 367-380
- https://doi.org/10.1007/bf02249609
Abstract
No abstract available.
This publication has 10 references indexed in Scilit:
- Perceptions or reality: Is multi‐perspective measurement a means or an end? Human Resource Management, 1993
- A disagreement about within-group agreement: Disentangling issues of consistency versus consensus. Journal of Applied Psychology, 1992
- A generalized agreement measure. Educational and Psychological Measurement, 1990
- A generalization of Cohen's kappa agreement measure to interval measurement and multiple raters. Educational and Psychological Measurement, 1988
- Structured interviewing: Raising the psychometric properties of the employment interview. Personnel Psychology, 1988
- Synthetic validity: A conceptual and comparative review. Journal of Applied Psychology, 1984
- Rating the ratings: Assessing the psychometric quality of rating data. Psychological Bulletin, 1980
- Intraclass correlations: Uses in assessing rater reliability. Psychological Bulletin, 1979
- Interrater reliability and agreement of subjective judgments. Journal of Counseling Psychology, 1975
- Judgment of counseling process: Reliability, agreement, and error. Psychological Bulletin, 1972