Inter-observer agreement for quality measures applied to online health information.

  • 1 January 2004
    • journal article
    • Vol. 107, pp. 1308-1312
Abstract
Many quality criteria have been developed to rate the quality of online health information, but few instruments have been validated for inter-observer reliability. We therefore assessed the degree to which two raters agreed on the presence or absence of information for 22 popularly cited quality criteria across a sample of 21 complementary and alternative medicine websites. Our preliminary analysis showed poor inter-rater agreement on 10 of the 22 quality criteria. We therefore created operational definitions for each criterion, reduced the number of allowed response choices, and specified a location in which to look for the information. As a result, 15 of the 22 quality criteria reached a kappa > 0.6. We conclude that, even with precise operational definitions, some commonly used criteria for assessing the quality of online health information cannot be rated reliably; however, inter-rater agreement can be improved by providing such definitions.
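The agreement statistic reported here is Cohen's kappa, which corrects raw percent agreement between two raters for the agreement expected by chance. As an illustration only (the ratings below are invented, not taken from the study), a minimal Python sketch of computing kappa for two raters' present/absent judgments on a single criterion:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same set of items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items where both raters give the same label.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over labels of the product of each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[label] / n) * (freq_b[label] / n)
              for label in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical present (1) / absent (0) ratings of one criterion on 21 websites.
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1]
rater_b = [1, 1, 0, 1, 0, 1, 0, 0, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 1]

print(f"kappa = {cohen_kappa(rater_a, rater_b):.2f}")  # ~0.80; above the 0.6 threshold used in the abstract
```

In this sketch the 0.6 cut-off from the abstract is applied per criterion: a criterion whose two-rater kappa exceeds 0.6 would be counted among the reliably assessable ones.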