Abstract
Clostridium difficile–associated diarrhea (CDAD) is a unique hospital infection in that it occurs almost entirely in patients who have received prior antimicrobial treatment. Although not conclusively proven, antimicrobials presumably disrupt the normal gastrointestinal flora, which in turn enables ingested spores of toxigenic C. difficile to colonize the colon, produce toxins, and cause CDAD. Epidemiologic evidence suggests that specific antimicrobials and antimicrobial classes differ markedly in the CDAD risk they confer [1]. Two measures of that risk are of particular interest because of how they relate to antimicrobial use: the relative risk of CDAD associated with use of a specific antimicrobial, and the attributable risk of CDAD in a particular population. Attributable risk incorporates both the relative risk of an antimicrobial and the frequency with which that drug is used in the population; a moderately risky but heavily used drug can therefore account for more cases than a high-risk drug that is rarely prescribed. Clindamycin carried the highest CDAD risk in the 1970s, but its use in US and European hospitals has since declined, with a resultant reduction in the attributable risk of antibiotic-associated diarrhea and CDAD [2]. By the late 1980s and through the 1990s, cephalosporins, particularly second- and third-generation agents such as cefuroxime, cefotaxime, ceftazidime, and ceftriaxone, had become the agents with the highest relative risk of CDAD and, because of their frequent use in hospitals, the highest attributable risk as well [1].
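To make the relationship between the two measures concrete, the population attributable fraction of classical epidemiology (Levin's formula) combines relative risk with exposure prevalence. The formula and symbols below follow standard epidemiologic convention and are offered as an illustration; they are not stated in the source, and the numerical values in the worked example are hypothetical.

% Population attributable fraction (Levin's formula); standard
% epidemiologic convention, not a formula from the source.
% p_e : proportion of the population exposed to the antimicrobial
% RR  : relative risk of CDAD given exposure
\[
  \mathrm{PAF} \;=\; \frac{p_e\,(\mathrm{RR} - 1)}{1 + p_e\,(\mathrm{RR} - 1)}
\]
% Hypothetical example: a drug with RR = 3 used by 40% of patients gives
% PAF = 0.4(2)/(1 + 0.4(2)) = 0.8/1.8 ~ 0.44, whereas a drug with
% RR = 10 used by only 2% gives 0.02(9)/(1 + 0.02(9)) ~ 0.15.

Under these illustrative numbers, the widely used, moderate-risk agent accounts for a larger share of cases than the rarely used, high-risk agent, which mirrors the shift in attributable risk from clindamycin to the frequently prescribed cephalosporins described above.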